Category: Current and Future Trends

LG puts downward pressure on OLED TV prices

Article

OLED TV from LG drops to lowest price of the year | CNET

CNET video about the LG OLED TVs

My Comments

LG is pulling out all the stops to bring its OLED-based 4K flat-screen TVs within reach of most consumers. These sets use the same display technology as a lot of the high-end Android phones like the Samsung Galaxy S and Note series, a technology that allows for improved contrast and black levels as far as the picture goes. This is because each pixel emits its own light and switches off entirely for black, rather than relying on white LEDs as a backlight behind a colour liquid-crystal panel, with the light leakage that can occur in the dark areas of the picture.

But another advantage that OLED has allowed for when it comes to product design is that the manufacturer can work towards a very thin product.

For example, in the USA, LG is offering the B6 series of 55” and 65” 4K HDR-capable sets at US$2000 for the 55” variant (OLED55B6P) and US$3000 for the 65” variant (OLED65B6P). This is effectively a price reduction of US$500 for the 55” model and US$1000 for the 65” model.

Even Australian viewers haven’t missed out on the promotion effort, with LG and Harvey Norman promoting the OLED smart-TV range through a TV-advertising campaign run over the last few weeks in the run-up to the Christmas shopping season.

Personally, I could see this as a sign that OLED for large displays is becoming cheaper and a more viable competitor to LCD technology. This is more so for those of us who value high contrast in the pictures that we see, especially if we work with high-quality photos and videos.

A Surface Book variant is actually a portable gaming rig

Article

Should you buy the Surface Book with Performance Base for gaming? | Windows Central

My Comments

Microsoft Surface Book press picture courtesy of Microsoft

A performance variant of the Microsoft Surface Book shows up as a stealth gaming rig

There has been a rise in gaming-grade laptops that have enough CPU and graphics-processing acumen to handle the latest games on the market at full performance. Here, most of the manufacturers are releasing at least one model with these abilities, but most of these machines are styled like a muscle car or street rod.

But a few manufacturers are approaching this by offering performance computers with a “stealth” look, where their appearance doesn’t betray the performance under the hood. Microsoft has now joined the party with a performance variant of the Surface Book that works with a “Performance Base”, a keyboard base with the same graphics ability as the baseline variant of the Surface Studio all-in-one PC.

As with other Surface Book variants, the tablet detaches from the keyboard base so it can simply be a Windows 10 tablet but, when attached, it becomes a convertible laptop. Microsoft could provide the Performance Base not just as a system variant when you order the Surface Book, but also as an upgrade option for existing Surface Books, for those of us who want to turn them into a full-bore gaming or multimedia PC.

It is very similar to what can be done with Thunderbolt 3 over USB-C, where manufacturers can offer a graphics module that connects to portable, all-in-one or low-profile computers to enhance their graphics performance. But this idea could be taken further with coprocessors as part of “performance modules” that could improve existing computers’ performance for gaming and multimedia tasks.

Personally, I see the trend for portable and “all-in-one” computing being to offer gaming-rig performance as a model variant within one or more product lines rather than as its own product line, even where the manufacturer also offers a dedicated premium product line or brand focused on gaming. Such products would appeal to those of us who value the performance angle for multimedia creation or gaming but don’t think of it like owning a “street-machine” car.

ARCEP is heading towards an IPv6 France

Article – French language / Langue Française

L’ARCEP propose un plan d’action pour migrer vers l’IPv6 | Freenews.fr

My Comments

Freebox Révolution - courtesy Iliad.fr

The Freebox Révolution – the sign of an advanced Internet in France

France is intending to take bigger strides towards an IPv6 Internet.

Here, ARCEP, which is the country’s telecoms authority, is expediting this process through a series of steps.

Firstly, they will be moving the government’s public-facing Web sites towards IPv6 operation. Most likely, this will be a dual-stack affair to allow legacy networks to reach these sites.
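
As a quick illustration of what dual-stack operation means in practice, the Python sketch below asks the operating system’s resolver whether a host publishes both IPv4 (A) and IPv6 (AAAA) addresses. The hostname used here is only an example; any public-facing site could be substituted.

import socket

def dual_stack_addresses(hostname):
    """Return the address families (and addresses) a host resolves to."""
    families = set()
    for family, _, _, _, sockaddr in socket.getaddrinfo(hostname, 443):
        if family == socket.AF_INET:
            families.add("IPv4: " + sockaddr[0])
        elif family == socket.AF_INET6:
            families.add("IPv6: " + sockaddr[0])
    return families

if __name__ == "__main__":
    # A site is dual-stack if both IPv4 and IPv6 entries show up here
    for address in sorted(dual_stack_addresses("www.service-public.fr")):
        print(address)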

Then they will run a public-awareness and education campaign about IPv6, including identifying the obstacles that stand in the way of moving to this newer set of Internet protocols. Two main obstacles in this case would be computers running operating systems that don’t offer IPv6 dual-stack operation, and routers that don’t provide for IPv6 operation. This may not be an issue with the latest “n-boxes” that each of the French ISPs are offering to their customers, like the Freebox Révolution.

IPv6 logo courtesy of World IPv6 Launch program

The next stage would be to facilitate the move towards IPv6 by having it work across all of the providers competing with each other in that country.

Users will also benefit from improved information, especially about maintaining IPv4 equipment and networks. This applies especially to maintaining legacy IPv4 addresses, although the endpoint issue could be resolved with the various routing or tunnelling setups that IPv6 offers.

Last but not least, the French Internet backbone will move off IPv4 towards IPv6, probably only allowing IPv4 “at the edge”.

But some, if not most, of the ISPs serving the French market, especially Free, may be stepping forward towards IPv6 as part of the competitive marketplace. This includes releasing “n-box” routers that support this technology, or adding this level of support to some existing equipment through a firmware update. Let’s not forget that most operating systems for regular and mobile computing devices provide for IPv6 in a dual-stack form. Here, it underscores that France has been identified as one of the first countries to head towards IPv6 technology.

Why is there interest in Internet-assisted in-home healthcare?

There is a strong interest in using Internet-based connectivity as a tool for facilitating in-home healthcare.

A Bluetooth-connected pulse oximeter in action

This involves the use of a mix of sensor types that are typically used to observe a patient along with the use of regular, mobile and other computing devices to process and present this information to the carers and to the medical professionals. It also includes implementing voice and video telephony to allow medical professionals to communicate with the patient without the need to frequently travel to where they live.

Why the interest?

Ageing at home

This is where a senior citizen is able to live independently at their home as much as possible but have supporting care from relatives, friends and professional carers.

A tablet used as part of an in-home telemedicine setup

One of the reasons driving “ageing at home” is the fact that the generation of people born through World War II and the post-war Baby Boomers will be entering their senior years, which will place strong demands on the health-care and welfare facilities that cater to this group of people.

Another is the “I don’t want to end up in the nursing home” sentiment, fed by an increasing number of aged-care facilities being associated with substandard quality of care. This is brought about by more of us being aware of this level of care, whether through observing how people we have known in our life’s journey were treated when they were looked after in those facilities, or through hearing about instances of substandard care in the media, such as the infamous “kerosene bath” incident that hit the news in Australia in the early 2000s.

This has also been driven by the trend towards health-care deinstitutionalisation affecting geriatric and palliative care where there isn’t a desire to rely on large facilities for this kind of care.

Other healthcare needs

Increasingly, hospitals are looking towards “hospital in the home” or similar programs as a way to provide ongoing care for convalescent patients and those with illnesses that require long-term attention. Here, the care that would typically be provided in a hospital, mainly nursing-focused procedures, is offered at the patient’s home by visiting nurses, doctors and allied staff.

Even obstetric care is affected by this trend, with an increased preference for minimal hands-on professional care for low-risk mothers when they go into labour. Similarly, low-risk psychiatric care is being delivered at home thanks to telecommunications-based technologies.

The advantage being put forward for this kind of care is that the patient can stay in the familiar surroundings of their home, which, again, is underscored by the concept of deinstitutionalisation in healthcare. Governments and others also see it as a cost-saving because a hospital’s beds can be focused on those needing acute care.

Rural communities also see an application for this kind of technology as a way to avoid frequent long-distance travel, which would be of particular importance when it comes to specialised or advanced healthcare.

How is the kind of healthcare delivered?

Here, the focus is on observational healthcare, where medical professionals can assess the situation based either on the data that is collected or on communications with the patient. In some cases, it may work on an event-driven principle where the professional is alerted if the situation goes beyond certain limits.

This is facilitated through the concept of “telemedicine” where the data is conveyed through an Internet connection and has been facilitated through various technologies.

One of these is “machine vision”, where one or more cost-effective high-resolution cameras feed images into a platform-based computer running software that recognises and interprets these images for medical use. One application that was put forward was to observe a patient’s pulse using a camera that tracks the subtle changes in the brightness of the face as the heart beats. Another application is to use a smartphone’s or tablet’s camera to read fluid-analysis strips as part of assessing urine or blood, with an app on the device interpreting this information rather than a person comparing what is seen on the strip against a chart.
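
As a rough illustration of the pulse-observation idea, this Python sketch averages the green channel of each webcam frame and looks for the dominant frequency in that brightness signal, which is the essence of camera-based photoplethysmography. It assumes a webcam at index 0, a steady and well-lit face filling the frame, and a fixed frame rate; a clinical implementation would need far more rigour.

import cv2
import numpy as np

FPS = 30              # assumed camera frame rate
SECONDS = 15          # length of the sample window

cap = cv2.VideoCapture(0)
samples = []
for _ in range(FPS * SECONDS):
    ok, frame = cap.read()
    if not ok:
        break
    samples.append(frame[:, :, 1].mean())   # mean of the green channel (BGR order)
cap.release()

signal = np.asarray(samples) - np.mean(samples)
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1.0 / FPS)

# Only consider plausible heart rates (0.7-4.0 Hz, i.e. 42-240 bpm)
band = (freqs > 0.7) & (freqs < 4.0)
peak = freqs[band][np.argmax(spectrum[band])]
print("Estimated pulse: %.0f bpm" % (peak * 60))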

Another of these is the implementation of common communications technologies like Bluetooth, Zigbee or Wi-Fi in sensor devices. This can lead to cost-effective sensor devices that work with existing computing devices with minimal extra hardware, while cost-effective software on those devices interprets and presents the information. This has already led startup companies and tech innovators to develop devices like “wandering-alert” socks that work with Bluetooth and apps.
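
To give an idea of how little plumbing such a sensor needs, here is a sketch that subscribes to heart-rate notifications from a Bluetooth LE sensor using the third-party Python “bleak” library. The device address is a placeholder you would replace after scanning; the characteristic UUID is the standard Bluetooth Heart Rate Measurement characteristic (0x2A37).

import asyncio
from bleak import BleakClient

SENSOR_ADDRESS = "AA:BB:CC:DD:EE:FF"   # placeholder device address
HR_MEASUREMENT = "00002a37-0000-1000-8000-00805f9b34fb"

def on_heart_rate(_, data: bytearray):
    flags = data[0]
    # Bit 0 of the flags byte selects an 8-bit or 16-bit heart-rate value
    bpm = int.from_bytes(data[1:3], "little") if flags & 0x01 else data[1]
    print("Heart rate: %d bpm" % bpm)

async def main():
    async with BleakClient(SENSOR_ADDRESS) as client:
        await client.start_notify(HR_MEASUREMENT, on_heart_rate)
        await asyncio.sleep(60)     # listen for one minute
        await client.stop_notify(HR_MEASUREMENT)

asyncio.run(main())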

What needs to be done?

An issue that will affect in-home telehealth is where device manufacturers and health providers legally stand when it comes to providing these services.

One of the questions being raised concerns the use of non-medical sensor devices for medical applications. One scenario is the use of a general wellness device like a fitness band or a wellness-focused thermometer as a medical sensor for clinical purposes. Another scenario would be the use of a “non-wellness” sensor, like a security system’s PIR and door sensors, a home-automation sensor, or a smartphone’s camera, for medical-observation purposes, with these devices feeding their data to software running on a platform-based computing device.

These questions are being examined by the US’s Food and Drug Administration with respect to wellness-focused devices serving as medical devices in this context. But implementing home automation and security in this context may require a case-by-case assessment based on the actual installation and would only work with geriatric, psychiatric or allied situations where observational healthcare is the order of the day.

Similarly, software that uses devices like cameras for medical purposes such as “machine vision” may have to be certified by medical-device authorities to be sure that it provides accurate results no matter the input device. In the case of software that uses cameras, there would be a requirement for a minimum camera resolution to turn out consistently accurate results.

Conclusion

Once the issues that affect the provision of Internet-assisted in-home health care are identified and worked out, then it could be feasible for the home to be a place to deliver continual health care.

A logo-driven certification program arrives for USB-C chargers

Article

USB-IF announces compliance for USB Type-C devices | Android Authority

From the horse’s mouth

USB Implementers Forum

Press Release (PDF)

Certified USB Charger Logo and Compliance Program Infographic courtesy of USB Implementers Forum

My Comments

Over time, the USB standard has effectively become a “DC power supply” standard for smartphones and tablets. This has avoided the need to end up with a desk drawer full of power supplies and battery chargers, with the associated question of which one works with which device. It has also led to various points of innovation like USB external battery packs and multiple-outlet USB “charging bars”. Similarly, gadgets like lights, fans and cup warmers have appeared that can be powered from a computer’s USB port or a USB charger.

There was also the environmental view that we would see fewer chargers destined for landfill when devices are finally retired, or less need to supply chargers with mobile phones. But a common reality is that most of these USB chargers end up being kept near or plugged into power outlets around the house as a way of allowing “convenience charging” for our gadgets.

But a problem has surfaced where particular USB chargers don’t do the job properly when charging particular devices, especially high-end smartphones or tablets. Here, you need to be sure that you use something like a 2.1A charger for these devices and have them connected using a cable known to work.

The new USB Type-C standard is bringing this concept to newer smartphones as a low-profile connection, along with using the USB Power Delivery standard to extend this convenience to larger tablets and laptops. But substandard USB Type-C leads and chargers have been appearing on the market, placing our new gadgets at risk of damage through being improperly powered.

Now the USB Implementers Forum have brought forward a certification program for USB Type-C chargers and leads, augmented by a logo. What will happen is that a charger or external battery pack will have to show this logo and state its power capacity in watts, so you can be sure it will charge your Ultrabook or 2-in-1 as well as your smartphone.
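
To illustrate why the stated wattage matters, the sketch below leans on the USB Power Delivery “power rules”, which tie a charger’s wattage tier to the fixed voltages it has to offer. The device power requirements are invented examples, not measured figures.

# USB PD power rules: charger watts -> highest fixed-voltage profile required
PD_POWER_RULES = {
    15: "5V @ 3A",
    27: "9V @ 3A",
    45: "15V @ 3A",
    60: "20V @ 3A",
    100: "20V @ 5A",
}

def can_charge(charger_watts, device_needs_watts):
    """A charger can only charge a device at full speed if its rating covers the load."""
    return charger_watts >= device_needs_watts

devices = {"smartphone": 18, "2-in-1 Ultrabook": 45}   # example loads in watts
for name, needs in devices.items():
    for watts, profile in sorted(PD_POWER_RULES.items()):
        if can_charge(watts, needs):
            print("%s (%dW): smallest suitable certified charger is %dW (%s)"
                  % (name, needs, watts, profile))
            break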

What should be required is that the logo and the power output are stamped on the charger body itself, and that a colour code is standardised for the power output. Such a colour code could be useful for recognising which charger out of a bunch could handle your gadget, or which one is the right one to buy when you look at that display rack.

At least something is being done to make it easier to be sure we end up with the right USB Type-C power-supply device for that 2-in-1 Ultrabook or smartphone without the risk of the computer not charging or being damaged.

Pokemon GO and other similar situations may underscore the need for local micro-data-centers

Article – French language / Langue Française

A small server in an apartment block’s basement communications room could be part of a distributed computing setup

Le phénomène Pokémon GO révéle nos besoins en datacenters de proximité | L’Usine Digitale

My Comments

This year has shown a few key situations that are placing traditional data-center technology under stress. This technology is based around a few large data centers placed sparsely through a geographic area and used primarily to store or process data for many businesses.

One of these is the popularity of Pokemon GO. As people started playing this augmented-reality game on their smartphones, there was the need to draw down data representing the app from the different data centers associated with the mobile platforms’ app stores. Then, during play, the game would be exchanging more data with the servers at the data centers that Niantic Labs controls. In some cases, there were problems with underperformance due to latency associated with this data transfer.

Qarnot Q.Rad press image courtesy of Qarnot

…as could one of these Qarnot Q.rad room-heater servers

Then there was a recent attack, purported to be a denial-of-service attack, against the data centers that were being used to collect the data for the census taking place in Australia on Tuesday 9 August. This mattered because the census was being run primarily as an online effort, where households fill in Web pages with their data rather than filling out a large booklet that is dropped off and then collected.

Both these situations led to data-center computers associated with these tasks failing which effectively put a spanner in the works when it came to handling these online activities.

What is being shown is that there needs to be an emphasis on so-called “edge computing”, or the use of small localised data centers, also known as “cloudlets”, to store and process data generated in or called upon by a particular area like a suburb or an apartment block. These data centers would be linked to each other to spread the load and pass data to similar centers that need it.

One application that Netflix put forward was their “Open Connect Appliance”, which is a storage device that an ISP or telco could install in their equipment rack if they end up with significant Netflix traffic. This box caches the local Netflix library and is updated as newer content comes online and older, locally-untouched content is removed. Such a concept could be taken further with various content-delivery networks like Cloudflare or those implemented by the popular news services or social networks.
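
As a toy sketch of the caching idea behind this kind of appliance, the Python below keeps the most recently requested titles locally and only goes back to the distant origin on a miss. The titles, cache size and “origin server” are made up for illustration.

from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.store = OrderedDict()      # title -> content blob

    def fetch(self, title, origin):
        if title in self.store:
            self.store.move_to_end(title)        # refresh recency
            return self.store[title], "local hit"
        content = origin(title)                  # slow trip to the origin
        self.store[title] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)       # evict least-recently-used title
        return content, "fetched from origin"

def origin_server(title):
    return "<video data for %s>" % title

cache = EdgeCache()
for request in ["title-A", "title-B", "title-A", "title-C", "title-D", "title-A"]:
    _, result = cache.fetch(request, origin_server)
    print(request, "->", result)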

The trend I would initially see would be suburban telephone exchanges, cable-TV headends or similar facilities serving as a starting point for local micro data centers that serve a town or suburb. This could then evolve to street cabinets associated with traffic signals, FTTC/FTTN services and the like, or the basement telecommunications rooms in multi-tenancy buildings, being used for this purpose, with these smaller data centers serving their immediate local areas.

Qarnot, with its Q.Rad room heaters that are actually data servers, weighed in on the idea that a cluster of these room heaters in a premises or town could become effectively a local “micro data center”.

As for applications that I would see for these “micro data centers” that implement region-focused data processing, these could include: distributed content delivery of the Cloudflare or Netflix kind; localised “store and process” for data loads like a nation’s census; online gaming of the Pokemon GO kind; and distributed-computing applications that ask for high fault tolerance. There will still be the need for the “virtual supercomputer” for huge calculation loads like sophisticated financial calculations or 3D-animation renderings, something that a collection of these “micro data centers” could become.

Similarly, distributed localised computing concepts like edge computing and local “micro data centers” could reduce the need to create large data centers just for handling consumer-facing data.

What could end up shaping the direction of cloud-based computing is the implementation of localised processing and storage in smaller spaces rather than the creation of ever-larger data centers.

Qarnot uses computers to provide free room heat for buildings

Qarnot Q.Rad press image courtesy of Qarnot

Qarnot Q.rad heater is actually a computer

One of the common ways of using electricity to provide room heat in a building is a panel or column heater that has a material like oil heated by an electric element. A variant that existed in the UK and, to some extent, Australia was the “storage heater” or “heat bank”, which used a heavier material like bricks that stored more heat and was heated overnight when power was cheaper, then diffused that heat into the room through the day. These kinds of heaters can provide diffused heat that takes the chill off a room, but they were expensive to run.

But Qarnot, a French cloud-computing firm, has looked at using the waste heat from a computer integrated into such a heater to warm a room or building. Here, they have designed the Q.rad, which connects to your home network and electrical power and works as a data server for their distributed-computing effort while using its waste heat to heat the room.

It also implements an integrated power meter so that you can be reimbursed for the power it uses as part of the cloud-computing network, effectively providing “free heat”. But a question that can be raised for implementation in markets like Australia, New Zealand or, increasingly, the USA is the need to measure transferred data and establish a mechanism to refund users’ bandwidth charges for it. This is because of the practice where ISPs either charge for data transferred or throttle users’ bandwidth if they transfer more than an allotted amount of data.

Qarnot Q.Rad exploded view press image courtesy of Qarnot

Processing power inside this heater – the waste heat from that goes to keeping you warm

The data that Qarnot processes using these heaters is typically for the likes of research labs, banks and animation studios, who “offload” calculations to this cloud-computing array. They also have the ability to seek out distributed-computing research projects of the SETI or Folding@Home kind to keep the network alive and generating heat where needed. For data security, these heaters don’t implement any storage for the distributed-computing client’s data, while implementing end-to-end encryption for this data.

Qarnot will implement an “upgrade and replace” program so that higher-speed processors are used in the Q.Rad computing heaters and there is the ability to deal with failed equipment quickly and easily to assure high availability.

Householders are still able to adjust the heater to their preferred comfort level and make it reflect their lifestyle by using a smartphone app or the controls on the heater. This kind of thermostatic control is achieved by deflecting computing workload away from the heater when heat output isn’t needed.
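
A rough sketch of how such thermostat-by-workload control could behave is shown below; the setpoint, hysteresis and job numbers are invented for illustration rather than drawn from Qarnot’s design.

def jobs_to_run(room_temp_c, setpoint_c, max_jobs, hysteresis_c=0.5):
    """Scale the compute load with how far the room is below the setpoint."""
    if room_temp_c >= setpoint_c + hysteresis_c:
        return 0                                  # warm enough: shed all work
    deficit = setpoint_c - room_temp_c
    # One extra job per half-degree of shortfall, capped at capacity
    return min(max_jobs, max(1, int(deficit / 0.5)))

for temp in [17.0, 19.5, 20.4, 21.0]:
    print("%.1f degC -> run %d jobs"
          % (temp, jobs_to_run(temp, setpoint_c=20.0, max_jobs=8)))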

They rate the output of a single unit at around 500 watts, which would cover a 150-300 square-foot area in an insulated building. Qarnot are also pitching these heaters as part of the smart-building concept by having them able to be equipped with sensors and programmable for any IoT / building-automation application. Similarly, Qarnot have added functionality like USB and Qi wireless charging to these heaters so users can charge mobile devices on them.

At the moment, these heaters are being supplied to large buildings in Europe and the USA where 20 units or more need to be deployed. But in 2017, Qarnot wants to release these heaters to individuals who want to take advantage of this heating concept. For householders, this may be seen as advantageous for “always-needed low-output” heating applications such as kitchens, downstairs areas in split-level houses and similar areas.

In some cases, Qarnot could make it feasible to have the Q.Rad heaters provide services to a network, whether as a router, NAS, home-automation hub or something similar. This could be achieved through the use of extra hardware or software to fulfil these tasks.

What Qarnot has done is to harvest waste heat from computing processes and use this for heating rooms in buildings with little cost to the building owner.

Panasonic continues with a CD-capable multi-room system that respects most of us who keep CDs

Article – From the horse’s mouth

Panasonic

SC-ALL7CD Music System

Blog Post

Press Release

Specifications

My Comments

Panasonic are still furthering the Qualcomm AllPlay multi-room audio platform, this time with another music system that can share CDs or broadcast radio with other AllPlay speakers. Here, they are underscoring audio-content formats that may not be considered the way to go these days thanks to Internet-derived audio services.

The Panasonic SC-ALL7CD can be set up as a content source for AllPlay-compliant speakers, offering CDs played on the integral CD player or recorded to the integral 4GB storage; content held on a USB memory key; broadcast radio from FM or DAB+; or Bluetooth A2DP from a smartphone or similar device. The same system can also play anything offered up by other AllPlay sources on the same home network.

As for network connectivity, this music system, which looks like a traditional clock radio, can be connected to your home network via 2.4GHz or 5GHz Wi-Fi wireless or wired Ethernet, the latter also allowing it to work with HomePlug powerline networks when you use it with a “homeplug” adaptor. As for file-based audio, it can handle FLAC hi-res audio files and can work with most online audio services as long as you use the Panasonic-supplied AllPlay app on your mobile device.

The integral storage capacity is rated at 4GB, and you can store up to 5 CDs at best quality or 25 CDs at normal quality, with the ability to have them play sequentially or in random order.
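
As a back-of-envelope sanity check on those figures, assuming a typical 60-minute disc (the numbers here are rough illustrations only):

STORAGE_BYTES = 4 * 1000**3      # the quoted 4GB capacity, decimal gigabytes
ALBUM_SECONDS = 60 * 60          # assume a typical 60-minute CD

for cds, label in [(5, "best quality"), (25, "normal quality")]:
    per_cd = STORAGE_BYTES / cds
    kbps = per_cd * 8 / ALBUM_SECONDS / 1000
    print("%s: about %d MB per CD, roughly %d kbps average"
          % (label, per_cd / 1e6, kbps))

# "Best quality" works out above the 1411 kbps of raw CD audio, which
# suggests little or no compression; "normal quality" lands near a
# high-bitrate lossy encoding.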

The Panasonic SC-ALL7CD is rated at a power output of 20 watts per channel (1 kHz, 8 ohms, 10% total harmonic distortion) and plays the music into a stereo speaker setup that implements a 2-way speaker arrangement for each channel.

But this system is about continuing the ability to link a multi-room system based on the Qualcomm AllPlay platform with legacy sources like CDs and traditional radio, something that I see only Panasonic doing unless others contribute integrated music systems to this platform that maintain one or more similar sources.

DLNA 4 and VidiPath facilitate elaborate TV user interfaces for network devices

Thecus N5810PRO Small Business NAS press photo courtesy of Thecus

NAS units will be required to provide a rich user interface on the big screen without the need for an app

I have had a look through the DLNA 4.0 and VidiPath standards and found a feature that these standards provide for in the form of a “remote user interface”. This is where a server device can present a graphically-rich user interface on a separate client device, typically a Smart TV or video peripheral. It works in a very similar frame to Web browsing, where Web pages hosted on Web servers are streamed over a network to a Web browser on a client device.

The standards supported in this context are HTML5 and RVU (pronounced “R-View”), which facilitate this graphically-rich user interface. It was pitched more at pay-TV operators who provide their customers with a PVR or media gateway and want to share the same user interface across all of the household’s smart TVs, connected video peripherals (Blu-Ray players, games consoles, network media players), regular desktop/laptop computers and mobile computing devices (smartphones, phablets and tablets).

Here, this would facilitate operator-provided video-on-demand, interactive TV services, the electronic programme guide and value-added services, but allow the operator to present these services with their “skin” (branding and user experience) on all of the screens in a customer’s household. This is in contrast to services like programme guides, PVR content collections and recording schedules being presented using the device manufacturer’s user interface, which may not be consistent, especially at the lower end of the market. It wouldn’t matter whether the server device was “headless” (without a display or control surface) like a broadcast-LAN tuner, or had a display and control surface like the typical set-top PVR with its own remote control connected to the TV in the main lounge area.

But this technology appeals to another class of devices beyond the pay-TV set-top boxes and media gateways.

Increasingly, network-attached-storage vendors are partnering with software developers to develop and deploy advanced media-server software in their consumer-focused NAS products. Examples include the Plex Media Server being packaged with newer Western Digital premium consumer NAS products, and the media-server software that Synology is packaging as part of its latest DSM 6 NAS software. Typically these offer functionality like rich media information or improved search / browse functionality.

Some of the NAS devices offer PVR software that works with USB digital-TV-tuner modules or broadcast-LAN tuner boxes and are targeted towards markets where free-to-air TV or pay-TV delivered without operator-supplied equipment is highly valued.

As well, a lot of consumer-focused NAS devices are being marketed around the concept of the “personal cloud”, and these devices could benefit from a rich user interface that takes advantage of smart TVs.

It also includes the possibility of the Secure Content Storage Association pushing their Vidity “download-to-own” platform as a way to deliver the same kind of collectability and rich user experience that DVD and Blu-Ray box-sets are known for when supplying sell-through video content “over the wire”, or allowing customers to download DVD and Blu-Ray content to their home networks. This could also encompass using a NAS as an “offload device” for extra binge-watch content that you bring in using a PVR.

More and more, manufacturers will look at ways to add value to NAS devices or broadcast-LAN tuner devices as a way to have customers buy the newer devices rather than hang on to older devices.

When NAS suppliers want to offer this kind of functionality, they either implement a Web user interface, which may work best for regular computers and tablets but requires you to know IP addresses or device network names, or they require you to download and install companion client apps on each of your client devices. Neither approach really works well with a 10-foot “lean-back” experience.

But the reality is that this software can exploit the RVU or HTML5 remote-user-interface standards to render the user interface on the regular television screen. Typically, all it requires is that the devices use their existing Web-server software to implement the RVU or HTML5 remote user interface, and use UPnP, which is already in place for the DLNA content-server functionality, to expose this interface to TVs and similar devices.
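
As a rough sketch of what this could look like at the UPnP level, here is an illustrative fragment of a device description that advertises a remote-user-interface service alongside the usual DLNA ContentDirectory service. The friendly name, URLs and UDN are placeholders; the RemoteUIServer service type comes from the UPnP Remote UI specification.

# Illustrative UPnP device description advertising a remote UI alongside
# the DLNA ContentDirectory service. All identifiers are placeholders.
DEVICE_DESCRIPTION = """<?xml version="1.0"?>
<root xmlns="urn:schemas-upnp-org:device-1-0">
  <device>
    <deviceType>urn:schemas-upnp-org:device:MediaServer:1</deviceType>
    <friendlyName>Example NAS</friendlyName>
    <UDN>uuid:00000000-0000-0000-0000-000000000000</UDN>
    <serviceList>
      <service>
        <serviceType>urn:schemas-upnp-org:service:ContentDirectory:1</serviceType>
        <serviceId>urn:upnp-org:serviceId:ContentDirectory</serviceId>
        <SCPDURL>/cds.xml</SCPDURL>
        <controlURL>/cds/control</controlURL>
        <eventSubURL>/cds/event</eventSubURL>
      </service>
      <service>
        <serviceType>urn:schemas-upnp-org:service:RemoteUIServer:1</serviceType>
        <serviceId>urn:upnp-org:serviceId:RemoteUIServer</serviceId>
        <SCPDURL>/rui.xml</SCPDURL>
        <controlURL>/rui/control</controlURL>
        <eventSubURL>/rui/event</eventSubURL>
      </service>
    </serviceList>
  </device>
</root>"""

print(DEVICE_DESCRIPTION)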

For that matter, the ability to print out content from an interactive-TV show should be integrated into the RVU or HTML5 technology, because some shows and advertising, like cookery shows, encourage printing out value-added content for users to benefit from.

To the same extent, hotel applications could take this further by opening up virtual content sources for things like in-house video-on-demand or gaming, or even providing a user interface to services like in-room dining or booking the day-spa facilities.

What needs to happen is for remote-user-interface technology to be exploited beyond the set-top-box and media-gateway applications, and taken further to NAS and other server-role devices on a home or business network for a proper 10-foot experience.

DLNA 4.0 to support server-based media transcoding

Article – From the horse’s mouth

DLNA

Synology DiskStation DS415play NAS with media transcoding - Press image courtesy of Synology

Synology DiskStation DS415play – demonstrating the value of transcoding content to provide to DLNA devices

Press Release

My Comments

An issue that can easily beset DLNA / UPnP-AV content-delivery setups is that digital-image, audio and video content can be delivered in newer file formats, and it could be packaged for high-quality setups. A case in point could be 4K UHDTV video content, which would work with the newer 4K UHDTV sets; or audio content packaged in the FLAC lossless-compression file format rather than the MP3 or WMA file formats.

But the problem is that you are likely to have older or cheaper equipment that can’t handle the higher-quality content types. Some devices that can handle the higher-quality content may not be able to handle the file format it is delivered in unless their firmware is updated to take the newer filetypes. Typically, this ruins the experience because the device will throw up a confusing error message or show nothing at all.

A few UPnP-AV / DLNA media servers do support some form of filetype or content transcoding, with some Synology NAS units implementing this functionality at the hardware level. But there hasn’t been a way to be sure that a given NAS, broadcast-LAN tuner or similar device provides this kind of transcoding. The new DLNA 4.0 specification mandates that compliant server devices transcode the content they serve if the client device can’t handle it directly.

The questions worth raising about this required function are whether it applies to filetype transcoding only, or whether it also includes functionality like downscaling 4K video to Full HD for existing HDTVs. There is also the question of whether the transcoding takes place in the background for stored or downloaded media, or only in real time whenever legacy equipment wants the resource, something that would suit broadcast-LAN applications.
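
For a flavour of what the real-time case involves, here is a sketch that shells out to the ffmpeg command-line tool to downscale a 4K file to Full HD H.264 with stereo AAC audio. The filenames are placeholders, and a real server would pick codecs and resolution from the client device’s declared capabilities rather than hard-coding them.

import subprocess

def transcode_for_legacy_tv(source, destination):
    """Downscale and re-encode a video so a legacy Full HD client can play it."""
    subprocess.run([
        "ffmpeg",
        "-i", source,
        "-vf", "scale=1920:-2",      # Full HD width, height kept proportional
        "-c:v", "libx264",           # widely supported video codec
        "-c:a", "aac", "-ac", "2",   # stereo AAC audio
        destination,
    ], check=True)

transcode_for_legacy_tv("movie-4k.mkv", "movie-1080p.mp4")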

As far as NAS and DLNA media-server software design goes, one differentiating point will be the ability for the hardware and software to implement hardware-based transcoding. This is where a separate processor and RAM, like a GPU setup, is provided to transcode video content rather than the device’s main processor and RAM being used for the task. It is similar to using a computer equipped with a discrete video card or chipset to transcode video content, and it permits the NAS’s main processor to continue serving files without having to transcode them at the same time. At the moment, the Synology DS416play, the successor to the DS415play which was the first NAS to offer this feature, is the only one that implements hardware transcoding.

Personally, I would like to see these devices offer transcoding for QuickTime and Motion-JPEG video as used by some digital still cameras, and for FLAC and ALAC lossless audio, which are now valued as high-quality formats for “ripping” CDs or buying download-to-own music. This is because these formats are not universally handled in the DLNA network-media sphere.

Other functions that are part of this version include catering for IPv6 networks, which are fast becoming the way to go; inherent support for 4K and HDR video content; the requirement for a DLNA MediaServer to expose HD variants of more video filetypes; and VidiPath functionality being baked into the standard, which would be important especially for pay-TV applications.