Category: Current and Future Trends

Apple to use the UAC connector as its standard connector for headphones

Articles

Dell AE2 Performance USB Headset

Feature headphones with digital functionality will be using UAC as an accessory-side connector for widest compatibility

Relax, Apple isn’t introducing another new connector | The Verge

Apple’s Ultra Accessory Connector dashes any hopes of a USB-C iPhone | The Verge

Apple plans new smaller Ultra Accessory Connector (UAC) for Made-for-iPhone accessories | 9 To 5 Mac

What is UAC? Apple’s new ‘Made for iPhone’ accessory and port explained | Trusted Reviews

My Comments

If you have owned a Nikon digital camera, you may have dealt with the Ultra Accessory Connector (UAC) as a method to tether your camera to your computer, perhaps to download your photos. This would typically be facilitated using a USB to UAC cable that came in the box with your camera.

Apple is resurrecting this connector as part of its MFi (Made For iPhone) accessories program for iOS devices. There was a lot of confusion in the computing press regarding this connector because it could be taken to mean a different socket appearing on a subsequent iPhone or iPad, or devices and accessories not working unless “you get with the program” – that is, become part of the Apple ecosystem.

But the Ultra Accessory Connector is about its use as an intermediary or accessory-side connector on a pair of headphones. It is being called upon because an increasing number of newer smartphones and ultraportable laptops won’t be equipped with the traditional 3.5mm headset jack where you can connect a wired headset.

There is also the same appeal where headphones will have integral digital-analogue audio circuitry and there has to be a way to connect these to your smartphone if you are going the “wired” path. It is something very familiar to those of us who use a USB headset with our computers or a Bluetooth headset or audio adaptor with our smartphones. Here, manufacturers will see better digital-analogue circuitry and / or sound-processing technology such as microphone arrays, accessory-side sound-tuning and active noise cancellation as a way to differentiate their product ranges more effectively and innovate with their products.

Dell AE2 Performance USB Headset - USB plug

It will still be feasible to keep a level playing field for headphones that use USB or other wired digital links.

The approach that is being pushed here is for a headset or pair of headphones to have the UAC connection as an accessory-side connection. Typically this will be a “lump” on the headphone cable, similar to the in-line module used for a remote control or a microphone, which comes apart. On the other hand, the most probable implementation for a pair of traditionally-styled “cans” would be a socket installed on one of the earcups, similar to what happens for detachable-cord implementations. The headset would then be supplied with one or more application-specific connection cables that have a UAC connector on the accessory side and the appropriate connector (Apple Lightning, USB-A, USB-C or 3.5mm phone plug) on the equipment side. There is also a goal to have such cables available through the aftermarket thanks to accessory suppliers like Belkin.

The UAC connection is meant to facilitate a digital connection that works with USB or Apple Lightning norms along with the standard stereo analogue connection. Here, it means that an accessory cable can exist which has the traditional 3.5mm phone plug on it to allow use with equipment that still maintains this connection. This includes still being able to use the 6.35mm headphone jack adaptor to connect your headphones to hi-fi equipment or the two-pin airline adaptor to plug in to your aeroplane seat’s in-flight-entertainment connection. It also encompasses the goal with the Apple Lightning and USB-C standards to provide analogue pass-through from equipment-side digital-analogue circuitry to cater for the cheaper headset designs.

In the digital context, this can mean that the sound-processing circuitry can present itself to Apple’s iOS devices or “open-frame” USB Audio implementations properly, as the equipment expects. Apple still sees this as being important because their newer MacBook laptops are being equipped just with USB-C connections and macOS is still providing class-driver support for USB Audio devices. But most other regular-computer and mobile operating systems are providing a similar level of support for USB Audio.
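The class-driver arrangement mentioned above can be sketched conceptually: a USB device advertises a standard class code in its descriptors, and the operating system matches that code against its built-in class drivers rather than needing a vendor-specific driver. The following Python sketch is purely illustrative; the reduced class table and the driver names are assumptions, not any real operating system’s matching logic.

```python
# Illustrative sketch of USB class-driver matching - not real OS code.
# Class codes come from the USB-IF base class list; driver names are hypothetical.
CLASS_DRIVERS = {
    0x01: "usbaudio",     # Audio class - headsets, DACs, microphones
    0x03: "usbhid",       # Human Interface Devices - keyboards, remotes
    0x08: "usb-storage",  # Mass storage
}

def match_driver(interface_class):
    """Return a generic class driver for the advertised class, if one exists."""
    return CLASS_DRIVERS.get(interface_class, "vendor driver required")

# A UAC headset presenting itself as a standard USB Audio device
print(match_driver(0x01))  # usbaudio
print(match_driver(0xFF))  # vendor driver required
```

The appeal for headphone makers is that an accessory which enumerates under the standard Audio class works with this kind of generic support across the various desktop and mobile platforms.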

But what needs to happen in both camps is for proper operating-system-level support for audio input and output in both the communications and multimedia contexts, along with accessory-side remote control for call management, media transport control and volume control at least. It may also include the ability to use a basic display on the accessory to show information like current time, incoming calls and messages and media-play details, something that can earn its keep with in-line remote-control accessories.

The UAC connection type can lead to the idea of “feature modules” or “enhancement modules” that add extra functionality to or improve the sound quality of existing UAC headphones. For example, they could offer:

  • a highly-strung DAC circuit as an upgrade path for better sound quality with premium headphones;
  • a Bluetooth adaptor to add Bluetooth wireless functionality to a set of existing wired “cans”;
  • an advanced remote control with display so you can keep your device in your pocket;
  • or an extended-power module which allows you to use external battery packs to obtain long operating times out of your smartphone and advanced headset.

What Apple is pushing for with the UAC connector is the ability for headset manufacturers to continue to work on feature headsets that can work across all of the computing platforms. I also see the UAC connector as a pathway to innovation because manufacturers will be encouraged to work on features that work across all phone platforms. This is more so as we invest in premium headsets to go with our smartphones and computers so we can listen to music or watch videos while we are on the train.


Improved Wi-Fi technologies as the deluxe option for your Internet service

Article

Waoo Smart WiFi kit press picture courtesy of Waoo.dk

Waoo Smart WiFi kit offered in Denmark

Premium Wi-Fi is a growing opportunity for service providers, both to differentiate and to increase ARPU | Videonet.TV

From the horse’s mouth

Waoo (Danish ISP) – (Danish language)

Smart WiFi – Product Page

Promotional Video – Click or tap here to play

My Comments

Recently, at this year’s Consumer Electronics Show in Las Vegas, some of the major home-network hardware providers offered distributed Wi-Fi network setups which provide a simplified method to improve your home network’s Wi-Fi wireless coverage.

D-Link Covr router and wireless extender package press image courtesy of D-Link

D-Link Covr router and wireless extender package – could be offered by your ISP or telco

These have been offered either in a mesh-based setup or as a “router and extender” setup with simplified setup and operation procedures. The mesh setup creates a wireless backbone mesh between each of the “nodes” in such a way that any node can obtain a strong high-throughput signal from two other nodes and there is a failover process where if one node is out-of-action, other nodes can keep the coverage going. On the other hand, a “router and extender” setup works like most of the wireless extenders on the market but implements a simplified setup and roaming experience between the router and extenders.
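The mesh redundancy described above can be illustrated with a small sketch: treat the nodes and their backbone links as a graph, take one node out of action, and confirm that the remaining nodes can still reach the router. The node names and topology below are made up for the illustration; real mesh firmware does far more, factoring in signal metrics, channel selection and backhaul throughput.

```python
from collections import deque

# Hypothetical mesh: each node has backbone links to at least two others
links = {
    "router":  {"kitchen", "lounge"},
    "kitchen": {"router", "lounge", "bedroom"},
    "lounge":  {"router", "kitchen", "bedroom"},
    "bedroom": {"kitchen", "lounge"},
}

def reachable(start, failed=None):
    """Breadth-first search of nodes reachable from start, skipping a failed node."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in links[node] - seen - {failed}:
            seen.add(neighbour)
            queue.append(neighbour)
    return seen - {failed}

# With the lounge node out of action, the bedroom still reaches the router
print(sorted(reachable("router", failed="lounge")))  # ['bedroom', 'kitchen', 'router']
```

Because every node has two backbone paths, knocking out any single node still leaves the rest of the coverage intact, which is the failover behaviour the mesh vendors are selling.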

Some of the distributed Wi-Fi network setups also allow for the use of a wired backbone which can cater for difficult wireless-network situations, multiple building setups or even as a robust high-throughput option.

There has been a need for these setups thanks to increased streaming of video content like Netflix along with heavy use of highly-portable computer devices like laptops, tablets and smartphones. But the typical Wi-Fi setup ends up being compromised by many different situations such as routers being installed at one end of the premises, the use of dense or metallic building materials in our houses and apartments or even “white goods” or metallic furniture like filing cabinets installed in a cluster against interior walls. As well, the existence of multiple Wi-Fi networks in a neighbourhood can make things worse.

But some telcos, cable-TV providers and Internet service providers are offering distributed wireless setups as an extra-cost option for all of their customers, and / or as “part of the package” for their top-shelf packages. This kind of service is also of interest to other ISPs who want to offer more value to their customers, and is in response to complaints that customers aren’t benefiting from the headline or contracted bandwidth at their devices, especially when they are using the Wi-Fi wireless network.

Examples of this are Singtel in Singapore and Midco (Midcontinent Communications) in the USA, which offer a distributed Wi-Fi system as an extra-cost “premium Wi-Fi” option. Meanwhile, Waoo in Denmark offers it at no extra cost to subscribers who take up its premium Internet packages, with it available at extra cost for people who subscribe to the cheaper packages.

Here, the distributed Wi-Fi setup would be part of the modem-router normally offered as customer-premises equipment, with it being managed and serviced by the ISP. Some of these setups also have TV set-top boxes that work as access points or as part of the mesh ecosystem, typically using a wired (MoCA, HomePlug AV500) or wireless backhaul. There may also be dedicated access-point nodes around the premises to provide extra reach to the other areas.

The ISPs are, at the moment, seeing this as leading towards increased customer satisfaction due to the increased stability and throughput realised at the end devices. It is also seen as being equivalent to cable-TV services where customers rent a PVR-based set-top box, because such customers see this as being better value for money and are therefore less likely to walk away from the service.


Right-to-repair for consumer electronics being pushed forward in the USA

Articles

Dell Vostro 3550 business laptop

A demand is taking place to make sure that portable computers and similar equipment such as laptops, which suffer a lot of damage, are able to be repaired by independent technicians

Right to Repair bills introduced in five states | Engadget

Five States Are Considering Bills to Legalize the ‘Right to Repair’ Electronics | Motherboard

From the horse’s mouth

Electronic Frontier Foundation

Defend Your Right To Repair (Issue Page)

The Repair Association – representing independent repairers

Consumer Electronics (Issue Page)

My Comments

Samsung Galaxy Note Edge press image courtesy of Samsung

Even those smartphones that end up with cracked screens or are dropped in the swimming pool

An issue currently being raised in the United States Of America is the ability for us to repair our own consumer-electronics equipment or have it repaired by independent repair technicians. This is becoming more important with smartphones, tablets and laptops that often fall victim to accidental damage such as that familiar cracked screen. As well, the batteries in this portable equipment lose their performance over the years and an increasing number of this equipment is supplied with batteries that aren’t user-replaceable, which leads to this equipment being “disposable” once the batteries cease to hold their charge.

The manufacturers prefer us to have the equipment serviced by official outlets but this can be highly onerous both in cost and in time spent without the equipment. It is something that is made worse if a manufacturer doesn’t implement an authorised-repairer network for some or all of their products or severely limits the size and scope of an authorised-repairer network.

On the other hand, independent repairers like the phone-repair kiosks in the shopping centres are able to offer value for money and perform simple repairs like replacing damaged screens or end-of-life batteries quickly, but they find it hard to gain access to official parts, tools and know-how to perform these jobs. In some cases, it can lead to the equipment being fitted with “known-to-work” parts salvaged from other broken equipment or a grey market full of generic parts being available, some of which may have a huge question mark over their quality or provenance. These generic parts have come about because the parts manufacturers have been fulfilling enough orders of them that they can sell them as a commodity.

What is currently happening is that the manufacturers and distributors are exploiting various intellectual-property-rights legislation to prevent the sharing of repair knowledge to third-party repairers. As well, they have been reducing the number of official repair facilities along with reducing the availability of original spare parts and tools thus making it more onerous financially and time-wise to keep your device in good repair. In some cases like Apple with its iOS devices, they could limit the scope of their authorised-repair program so that it is harder for anyone but a select few to repair a particular class of device.

The issue that is being raised is the ability for an independent repair workshop to obtain proper spare parts, tools and knowledge from the products’ manufacturers or distributors so they can perform repairs on customers’ equipment at a cost-effective price. Here, they don’t need to be turning away customers because they don’t know how to fix a particular piece of equipment. This also includes the ability for independent repairers to discover solutions to common faults and share this knowledge, along with the ability for us to see our devices work in an optimum manner for a longer time, thus reducing the “e-waste” destined for landfill.

This call is also about legitimising the ability for independent technicians to modify equipment to suit newer needs. Examples of these procedures may include “upsizing” the storage in a device with fixed storage like a smartphone, PVR or games console to a higher capacity, modifying equipment so it is accessible to those with special needs or simply adding an officially-supplied “optional-function” module to existing equipment. As well, it also encompasses the ability to continually provide support to equipment that has been abandoned by the manufacturers.

A similar situation that has been happening in the motor-vehicle market is that as vehicles became equipped with highly-sophisticated computerised subsystems, it became harder for independent repairers to service newer vehicles. This typically ended up with motorists taking their vehicles to the official repair workshops that were part of motor vehicle dealerships to keep their vehicles in good order. But some recent activity in the USA has made sure that independent garages could continue to repair and service the newer vehicle fleet by requiring the vehicle builders there to share this knowledge with them.

What is happening now is that five US states (Kansas, Nebraska, Minnesota, Massachusetts and New York) are pushing forward laws that allow repairers to buy the tools and documentation from manufacturers. A similar law had been pushed in Wyoming to extend the “right-to-repair” principle to farm machinery. This action follows on from the Massachusetts effort in 2013 to establish “right-to-repair” for motor vehicles, causing a de-facto federal approach by the US’s vehicle builders to share this knowledge with the independent vehicle-repair and roadside-assistance trade.

The issue of “right-to-repair” also relates to the implementation of standards-based or platform-based design for equipment, along with competitive-trade and consumer-rights issues. In these cases, it could be about repairer availability whether based on locality or satisfying users’ needs; the ability to increase value for money when it comes to equipment maintenance or insurance coverage for equipment damage; along with the equipment being able to last longer and not end up as landfill.

Small businesses and community organisations are also in a position to benefit because their budgets aren’t affected as heavily by capital or operating expenses for the equipment they own. This is because they could seek repairs to broken-down equipment at a cost-effective price or have existing equipment overhauled more frequently so that it is highly available and helping them operate. They can also purchase a high-grade domestic-rated unit like, for example, a premium domestic “bean-to-cup” superautomatic espresso machine to be used as part of a coffee stall, without being refused repairs or servicing or having to pay a higher price because it is used in a “commercial” setting.

Nowadays, what needs to happen is that jurisdictions legislate or enforce “right-to-repair” laws that allow independent technicians access to parts and knowledge so they can keep consumer equipment lasting longer.


Consumer Electronics Show 2017–Computer Trends

I am writing up a series of articles about the trends that have been put forward at the 2017 Consumer Electronics Show in Las Vegas. The first article in this series covers all of the trends affecting personal computers.

1: Computer Trends

2: Accessories And The Home Network

Computers

Most manufacturers were exhibiting refreshed versions of their product ranges. This is where the computers were being equipped with up-to-date chipsets and had their RAM, storage and other expectations brought up to date.

Key trends affecting mainstream computers included:

  • the use of Intel Kaby Lake processors for the computers’ horsepower
  • solid-state storage capacity in the order of up to 1 Terabyte
  • RAM capacity in the order of up to 16Gb
  • at least one USB Type-C socket on mainstream units with Thunderbolt 3 on premium units and / or ultraportables using just USB-C connections, with some having 2 or more of these connectors

    Lenovo ThinkPad X1 Carbon USB-C Thunderbolt-3 detail image - press picture courtesy of Lenovo USA

    More of this year’s laptop computers will be equipped with these USB Type-C or Thunderbolt 3 sockets

  • Wi-Fi connectivity being 802.11ac multi-band with MU-MIMO operation

Another factor worth noticing is the increase in detachable or convertible “2-in-1” computers being offered by most, if not all, of the manufacturers; along with highly-stylish clamshell ultraportable computers. This class of computer is being brought on thanks to Microsoft’s Surface range of computers, with some of the computers in these classes also being about performance. The manufacturers are even offering a range of these “2-in-1” computers targeted towards business users with the security, manageability, durability and productivity features that this use case demands.

Dell XPS 13 2-in-1 convertible Ultrabook press picture courtesy of Dell USA

More of these convertibles and detachable 2-in-1 computers will appear in manufacturers’ product ranges

Nearly every manufacturer had presented at least one high-performance gaming laptop with an Intel Core i7 processor, at least 16Gb RAM, at least 128Gb solid-state storage and a dedicated graphics chipset. Most of these computers are even equipped with a Thunderbolt 3 connection to allow for use with external graphics docks, considered as a way for core gamers to “soup up” these machines for higher gaming performance.

Lenovo had refreshed most of their laptop range, especially the ThinkPad business range. Here, this is a product range that makes no distinction between the small-business/SOHO user class where a few of these computers are managed and the large-business/government user class where you are talking of a large fleet of computers handling highly-sensitive data.

Lenovo ThinkPad X1 Carbon press image courtesy of Lenovo

Lenovo’s ThinkPad X1 Carbon has been refreshed to newer expectations

The new ThinkPads come in the form of a newer ThinkPad Yoga business convertible, a refreshed ThinkPad X1 Carbon Ultrabook and a refreshed ThinkPad X1 Yoga convertible. For example, the ThinkPad Yoga 370 has the 13.3” Full HD screen, the classic ThinkPad TrackPoint button as a navigation option but is driven by Intel Kaby Lake horsepower. This machine can be specified up to 16Gb RAM and 1Tb solid-state storage and has a Thunderbolt 3 connection along with 2 USB 3.0 ports. Lenovo even designed in protection circuitry for the USB-C / Thunderbolt 3 port to protect the ThinkPad against those dodgy non-compliant USB-C cables and chargers. Like the rest of the new ThinkPad bunch, this computer comes with the Windows 10 Signature Edition software image which is about being free of the bloatware that fills most of today’s laptop computers. The computer will set you back US$1264.

Other ThinkPads will also come with either a USB-C or Thunderbolt 3 connection depending on their position in the model range. For example the T470 family and the T570 family will be equipped with the Thunderbolt 3 connections. Let’s not forget how the ThinkPad X1 Carbon and Yoga have been refreshed. The Carbon implements horsepower in the Intel Kaby Lake Core i family, a 14” Quad HD display, 16Gb RAM and 1Tb SSD storage, and an expected battery runtime of 15 hours along with Thunderbolt 3 connectivity. The X1 Yoga has been given the similar treatment with similar RAM and secondary-storage capacity but can be outfitted with an LTE-A wireless-broadband modem as an option.

Lenovo Legion Y720 gaming laptop - press picture courtesy of Lenovo USA

Lenovo Legion Y720 gaming laptop with Dolby Atmos sound

Gamers can relish the fact that Lenovo has premiered the Legion range of affordable high-performance gaming laptops. The Legion Y720 is the first of its kind to be equipped with Dolby Atmos sound. The Y520 has a Full HD IPS screen driven by an NVIDIA GeForce GTX1050Ti dedicated graphics chipset, the choice of an Intel Core i5 or i7 CPU, 16Gb RAM and hard disk storage between 500Gb and 1Tb or solid-state storage between 128Gb and 512Gb, and network connectivity in the form of 802.11ac Wi-Fi and Gigabit Ethernet. Peripheral connectivity is in the form of 1 x USB-C, 2 x USB 3.0 and 1 x USB 2.0 and an audio jack, with this computer asking for at least US$900. The better Y720, along with Dolby Atmos, has a bright IPS screen in either Full HD or 4K resolution, driven by an NVIDIA GeForce GTX1060 graphics chipset with 6Gb display memory. Lenovo was also offering a Miix 720 creative-arts mobile workstation that takes aim at the Apple MacBook Pro and Microsoft Surface Pro lineup.

Dell XPS 15 Notebook press image courtesy of Dell USA

Dell XPS 15 ultraportable in a 15″ size

Dell had refreshed the XPS 13 lineup of Ultrabooks, known for offering the right combination of features, durability, comfort and price. But they also offered a convertible 2-in-1 variant of the XPS 13, again offering that right combination of features, durability, comfort and price. They also released the XPS 15, which is the smallest 15.6” laptop, with Intel Kaby Lake processors, NVIDIA GeForce dedicated graphics and a fingerprint reader.

Dell XPS 27 all-in-one computer press image courtesy of Dell USA

Dell XPS 27 all-in-one computer with best bass response in its class

The XPS and Precision all-in-one desktop computers have had their sound quality improved rather than having it as an afterthought. This has led to audio quality from the XPS 27 and the Precision business equivalent being equivalent to that of a soundbar, thanks to the use of 10 speakers working at 50 watts per channel, including two downward-firing speakers to make the work surface augment the bass. Two passive radiators also augment the system’s bass response. Both have a 4K UHD touchscreen while the Precision certified workstation can work with AMD Radeon graphics and Intel Xeon CPUs.

Like Lenovo, Dell had exhibited their business-grade computers at a trade fair typically associated with goods targeted at the consumer. This could underscore realities like people who use business-tier computers for “work-home” use, including those of us who are running a business or practising a profession from our homes. Dell have been on a good wicket here because they have been selling computers direct to the public and to business users for a long time.

Here, Dell had refreshed their XPS, Inspiron, OptiPlex, Latitude and Precision computer lineups with new expectations. They would come with Kaby Lake horsepower under the bonnet, USB-C or Thunderbolt 3 connectivity depending on the unit, along with newer dedicated-graphics options from NVIDIA or AMD. The business machines would be equipped with Intel vPro manageability features to work with business-computer management software.

Dell Latitude 5285 business detachable 2-in-1 - press picture courtesy of Dell USA

Dell Latitude 5285 business detachable 2-in-1 – the most secure of its class

In the case of business computers, Dell had underscored a desire to integrate the aesthetics of consumer-tier ultraportable computers with the security, manageability and productivity wishes that the business community craves. For example, the latest Latitude Ultrabooks and 2-in-1s show the looks but come up with the goods as business workhorse computers. One of the systems in the Latitude lineup is the Latitude 7285 detachable 2-in-1 which implements WiTricity wireless charging and WiGig docking, while the Latitude 5285 detachable 2-in-1 sells on a strong security platform with Dell-developed data-protection / endpoint-protection software and the option of a fingerprint reader or smartcard reader.

Samsung had shown some Windows 10 tablets but they also presented the Notebook Odyssey gaming laptop, available as a 15” variant or a 17” higher-performing variant. Both of these implement “dual-storage” with a solid-state drive in the order of 256Gb for the 15” variant or 512Gb for the 17” variant, along with a 1Tb traditional hard disk. RAM is in the order of 32Gb, or 64Gb for the 17” variant, while these are driven by Intel Core i7 CPUs. Graphics is looked after by an NVIDIA GTX dedicated GPU with 2Gb or 4Gb display memory, but the 17” variant also has a Thunderbolt 3 connection for external graphics units.

There is also the Notebook 9 which implements a 15” HD display driven by NVIDIA 940MX graphics processor and Core i7 processor. Of note, the Notebook 9 implements a Windows Hello fingerprint reader along with a USB-C port which is its power socket thanks to USB Power Delivery.

HP was not silent but had fielded the Spectre x360 15” convertible Ultrabook, one of the few 15” portable computers that can be a tablet or laptop. It is driven by Intel Core i7 Kaby Lake horsepower and has the quota of 16Gb RAM and either 256Gb or 512Gb of solid-state storage. The 15” 4K IPS screen is driven by an NVIDIA GeForce 940MX graphics processor with 2Gb display memory, while the sound reproduction has been tuned by Bang & Olufsen and there is an HP-designed noise-cancelling microphone array. The webcam is an HP infra-red type which is Windows Hello compatible for facial-recognition login. Connectivity is in the form of an HDMI socket, 1 USB-C socket, 1 Thunderbolt 3 socket, 1 traditional USB Type-A socket and an SD-card slot. Expect this convertible’s battery to run for 12 hours and be ready to go after 90 minutes of quick charging. The expected price is US$1299 for the 256Gb variant and US$1499 for the 512Gb variant.

Another interesting trend highlighted at CES 2017 has been an increase in the number of “Next Unit Of Computing” midget computers.  This is thanks to use cases like augmented-reality / virtual-reality gaming and an emphasis on aesthetics for desktop-based computing and has been brought about by the likes of the Intel Skull Canyon NUC. One of these was a range offered by Elitegroup with computers powered by Intel Braswell, Apollo Lake and Kaby Lake processors.

Zotac Mini PC press photo courtesy of Zotac

The latest Zotac Mini PC that is the hub of a “hi-fi” approach to computing

But Zotac approached the NUC trend in a manner not dissimilar to the “micro component” hi-fi systems, especially some of the premium offerings that emerged from Japan through the early 80s. These premium “micro-component” systems offered for their amplification needs a control amplifier and a power amplifier so as to provide more power output, along with their source components being a tuner and a cassette deck. In the case of Zotac, they offered the C-Series NUC midget computer which could be powered through its USB-C port thanks to USB Power Delivery. It came with the Intel Kaby Lake processors, NVIDIA GeForce dedicated graphics, a Thunderbolt 3 connector along with a few other features. The C-Series even has corporate manageability and security abilities such as Intel vPro and AMT system management along with the UNITE secure conferencing feature.

But Zotac offered an external “card-cage” graphics dock with a PCI Express x16 expansion slot for graphics cards, 3 standard USB 3.0 ports and one USB 3.0 port supporting QuickCharge, while being able to supply power to the host computer via the Thunderbolt 3 port using USB Power Delivery. The graphics module’s power supply has a power budget of 400 watts and the module is known to be compatible with NVIDIA and AMD graphics cards. They even offered their own NVIDIA GeForce GTX 1080 Mini graphics card as a partner card for this dock.

The goal here was to supply a two-piece high-performance computer setup with a system unit and a module that can serve as its graphics subsystem and power supply. But users still had the ability to install better equipment when they felt like it. Or the graphics module could be purposed to provide extra graphics horsepower to portable, “all-in-one” and other small computers that are Thunderbolt-3-equipped as well as supplying necessary power through this port to host computers that honour USB Power Delivery.

Mobile Devices

Even though Samsung had suffered a deep blow with the exploding Galaxy Note 7 phablets, the mobile-computing platform has not died yet. It is just that we may be hanging on to our smartphones for longer than the typical two-year contract period in order to save money.

At the moment, the phones that are being given an airing are the mid-tier Android smartphones like the Huawei Honor 6X with a dual camera and the ASUS Zenfone 3 Zoom which is one of the first to have an optical zoom on the rear camera’s lens.

Samsung launched their Galaxy A3 and A5 Android smartphones which are still positioned in the mid-tier segment. This is while Sony came to the fore with the Xperia X2 premium smartphone which has a 5.5” 4K display and 5Gb RAM, just above the baseline expectations for RAM capacity in a desktop computer.

LG had launched a range of low-tier Android smartphones that are equipped with user-replaceable batteries. The K3 is a compact unit with a 4.5” display while the K4 comes with the standard 5” display. There is the K8 5” selfie smartphone which has a highly-optimised front camera for taking those selfies to appear on Instagram or Facebook. Then LG brought the 13 megapixel camera featured in the G series lineup to the K10 5.3” smartphone. They also offered a Stylus 3 phablet with an integrated fingerprint scanner.

The next in the series will cover high-resolution monitors, computer accessories and the home network including the distributed-WiFi trend.


Investing in an external graphics module for your laptop

Razer Blade gaming Ultrabook connected to Razer Core external graphics module - press picture courtesy of Razer

Razer Blade gaming Ultrabook connected to Razer Core external graphics module

Just lately, as more premium and performance-grade laptops are being equipped with a Thunderbolt 3 connection, the external graphics modules, also known as graphics docks or graphics docking stations, are starting to trickle out on to the market as a performance-boosting accessory for these computers.

The Thunderbolt 3 connection, which uses the USB Type-C plug and socket, is able to provide a throughput similar to a PCI-Express card bus and has put forward a method of improving a laptop’s, all-in-one’s or small-form-factor computer’s graphics ability. This is being facilitated using external graphics modules or docks that house graphics processors and link to the host computer using the above connection. What it means is that these computers can benefit from desktop-grade or performance-grade graphics without the need to heavily modify them. In the case of portable computers, it allows “performance” graphics to be enjoyed at home or in the office while you have battery-conserving baseline graphics on the road.
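To put that comparison in rough numbers, here is a back-of-the-envelope sketch using the published raw link rates, not measured throughput figures:

```python
# Rough bandwidth comparison between Thunderbolt 3 and the PCI Express
# links used by desktop graphics cards. These are the published raw
# link rates; real-world throughput is lower in every case.

def gbps_to_gbytes_per_sec(gbps):
    """Convert gigabits per second to gigabytes per second."""
    return gbps / 8

# Thunderbolt 3: 40 Gbit/s raw over the USB Type-C connector.
tb3 = gbps_to_gbytes_per_sec(40)                      # 5.0 GB/s

# PCIe 3.0 x4, the link width Thunderbolt 3 tunnels:
# 8 GT/s per lane with 128b/130b encoding, four lanes.
pcie3_x4 = gbps_to_gbytes_per_sec(4 * 8 * 128 / 130)  # ~3.94 GB/s

# PCIe 3.0 x16, the slot a desktop graphics card normally sits in.
pcie3_x16 = pcie3_x4 * 4                              # ~15.75 GB/s

print(round(tb3, 2), round(pcie3_x4, 2), round(pcie3_x16, 2))
```

So a graphics card in an external module sees roughly a quarter of the bandwidth of its usual desktop slot, which partly explains why the bottleneck in these setups often sits somewhere other than the GPU itself.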

Acer Aspire Switch 12S convertible 2-in-1 - press picture courtesy of Microsoft

Acer Aspire Switch 12S convertible 2-in-1 – can benefit from better graphics thanks to Thunderbolt 3 and an external graphics module

The devices come in two classes:

  • Integrated graphics chipset (Acer Graphics Dock) – devices of this class have a hardwired graphics chipset similar to what is implemented in an all-in-one or small-form-factor computer.
  • Card cage (Razer Core, Akitio Node) – These devices are simply a housing where you can install a PCI-Express desktop graphics card of your choice. They have a power supply and interface circuitry to present the desktop graphics card to the host computer via a Thunderbolt 3 connection.

What will they offer?

Akitio Node Thunderbolt 3 "card cage" external graphics module - press image courtesy of Akitio

Akitio Node Thunderbolt 3 “card cage” external graphics module

All these devices have their own video outputs, but what the high-performance graphics chipset provides can be seen through the host computer’s integral screen and the video outputs integrated with the host computer, as well as through the module’s own video outputs. This is in contrast to what used to happen with desktop computers, where the video outputs associated with the integrated graphics chipset became useless once you installed a graphics card.

I have read a few early reviews for the first generation of graphics modules and Thunderbolt-3 laptops. One of these covered Acer’s integrated graphics module, kitted out with an NVIDIA GTX 960M GPU, whose desktop equivalent is a modest performer but which is considered top-shelf for laptop applications. It was run alongside an Acer TravelMate P658 and an Acer Aspire Switch 12S, providing the best performance the graphics would allow while highlighting the weak link: the mobile-optimised Intel Core M processor in the Switch 12S convertible.

Simplified plug-in expansion for all computers

Intel Skull Canyon NUC press picture courtesy of Intel

The Intel Skull Canyon NUC can easily be “hotted up” with better graphics when coupled with an external graphics module

Another example was a manufacturer’s blog post about using their “card-cage” graphics dock with one of the Intel Skull Canyon “Next Unit Of Computing” midget computers which was equipped with the Thunderbolt 3 connection. This showed how the computer increased in graphics performance once teamed with the different graphics cards installed in that “card-cage” module.

It opened up the idea of using an “AV system” approach for enhancing small-form-factor and integrated computers. This is where you connect extra modules to these computers to increase their performance just like you would connect a better CD player or turntable or substitute an existing amplifier for something more powerful or plug in some better speakers if you wanted to improve your hi-fi system’s sound quality.

This usage case would earn its keep with an “all-in-one” computer which has the integrated monitor, the aforementioned “Next Unit Of Computing” midget computers or simply a low-profile desktop computer that wouldn’t accommodate high-performance graphics cards.

Software and performance issues can be a real stumbling block

What I gathered from the material I have read is that as long as the host computer has the latest version of the operating system, the latest BIOS and other firmware to support graphics via Thunderbolt 3, and the latest drivers for this functionality, it can perform at its best. As well, the weakest link can affect the overall performance of the system; this particularly applies to the various mobile system-on-chip chipsets tuned primarily to run cool and allow for a slim, lightweight computer that can run on its own battery for a long time.

At the moment, this product class is still not mature and there will be issues with compatibility and performance with the various computers and external graphics modules.

As well, not all graphics cards will work with every “card-cage” graphics module. This can be due to high-end desktop graphics cards drawing more current than the graphics module can supply, something that can be of concern with lower-end modules that have weaker power supplies, or software issues associated with cards that aren’t from the popular NVIDIA or AMD games-focused lineups. You may have to check with the graphics module’s vendor or the graphics card’s vendor for newer software or firmware to be assured of this compatibility.
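As a rough pre-purchase sanity check along those lines, you can compare a card’s rated board power against the module’s power budget for the card. The wattage figures and headroom factor below are illustrative assumptions, not any vendor’s specifications:

```python
# Quick check before installing a desktop graphics card in a
# "card-cage" external graphics module: the module's power budget for
# the card must cover the card's rated board power, ideally with some
# headroom. All wattages here are made-up examples.

def card_fits(card_watts, module_gpu_budget_watts, headroom=0.9):
    """Return True if the card's rated draw stays within the module's
    GPU power budget, derated by a safety headroom factor."""
    return card_watts <= module_gpu_budget_watts * headroom

print(card_fits(150, 375))  # True: mid-range card, generous module
print(card_fits(300, 300))  # False: budget fully consumed, no headroom
```

A check like this is no substitute for the vendors’ compatibility lists, but it quickly rules out pairings that could never work.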

Multiple GPUs – a possible reality

A situation that may have to be investigated as more of these products arrive is the concurrent use of multiple graphics processors in the same computer system, no matter the interface or vendor. The ability to daisy-chain six Thunderbolt-3 devices on the same Thunderbolt-3 connection, along with premium desktop motherboards sporting this kind of connection alongside their PCI-Express expansion slots, will make the concept attractive and easy to implement. Similarly, some vendors could start offering Thunderbolt-3 expansion cards that plug in to existing motherboards’ PCI-Express expansion slots to give existing desktop PCs this functionality.

Here, the goal would be to allow multiple GPUs from different vendors to work together to increase graphics performance for high-end games or multimedia-production tasks like video transcoding or rendering of video or animation projects. Or it could be about improving the performance and efficiency of a multiple-display setup by allocating particular graphics processors to particular displays, something that would benefit larger setups with many screens and, in some cases, different resolutions.

Highly-portable gaming setups being highlighted as a use case

A use case that was always put forward for these external graphics modules was the teenage games enthusiast who is finishing senior secondary school and is ready to study at university. Here, the use case underscored the situation where they could be living in student accommodation like a college dorm / residence hall or be living in a share-house with other students.

The application focuses on the use of a laptop computer that can be taken around the campus but be connected to one of these modules when the student is at their home. I would add to this the ability to carry the graphics module between their room and the main lounge area in their home so that they could play their games on the bigger TV screen in that area. This is due to the device being relatively compact and lightweight compared to most desktop computers.

That same application can cover people who are living in accommodation associated with their job, which is likely to change frequently as they take on different work placements. An example would be people whose work takes them away from home for significant amounts of time, like those who work on ships, oil rigs or mines. Here, some of these workers may use their laptop as part of their work during their shift where applicable, such as on a ship’s bridge, but use it as a personal entertainment machine in their cabin or the mess room while they are off-shift.

What we could see more of in these devices

Once the external graphics modules mature as a device class, they could end up moving towards two or three classes of device.

One of these would be the integrated modules with graphics chipsets considered modest for desktop use but premium for laptop use. The expansion abilities that these may offer could be in the form of a few extra USB connections, an SD card reader and / or a higher-grade sound module. Perhaps, they may come with an optical drive of some sort. Some manufacturers may offer integrated modules with higher-performance graphics chipsets along with more connections for those of us who want to pay a premium for extra performance and connectivity. These would be pitched towards people who want that bit more “pep” out of their highly-portable or compact computer that has integrated graphics.

Similarly, it could be feasible to offer larger-screen monitors which have discrete graphics chipsets integrated in them. They could also have the extra USB connections and / or secondary storage options, courting those users who are thinking of a primary workspace for their portable computer while desiring higher-performance graphics.

The card-cage variants could open up a class of device that has room for one or two graphics cards and, perhaps, sound cards or functionality-expansion cards. In some cases, this class of device could also offer connectivity and installation options for user-installable storage devices, along with extra sockets for other peripherals. This class of device could, again, appeal to those of us who want more out of the highly-compact computer they started with or that high-performance laptop rather than using a traditional desktop computer for high-performance computing.

Portable or highly-compact computers as a package

Manufacturers could offer laptops, all-in-one and other highly-compact or highly-portable computers that are part of matched-equipment packages where they offer one or more external graphics modules as a deal-maker option or as part of the package. These could differ by graphics chipset and by functionality such as external-equipment connectivity or integrated fixed or removable storage options.

This is in a similar vein to what has happened in the hi-fi trade since the 1970s, where manufacturers were offering matched-equipment packages from their lineup of hi-fi components. Here, they were able to allow, for example, multiple packages to have the same tape deck, turntable or CD player while each of the packages was differentiated with increasingly-powerful amplifiers or receivers driving speakers that had differing levels of audio performance and cabinet size. It was still feasible to offer better and more capable source components with the more expensive packages or allow such devices to be offered as a way to make the perfect deal.

Conclusion

Expect that as more computers equipped with the Thunderbolt 3 over USB-C connection come on the market, the external graphics module will become a simplified method of improving these computers’ graphics performance. It will be seen as a way of allowing highly-compact or highly-portable computers to benefit from high-performance graphics at some point in their life, something this class of computer normally wouldn’t be able to do.

Passive Wi-Fi–a new trend for battery-operated Wi-Fi network devices

Articles

‘Passive Wi-Fi’ researchers promise to cut Wi-Fi power by 10,000x | PC World (IDG)

New “Passive Wi-Fi” Could Drastically Cut Power Needs For Connected Devices | Fortune

Passive WiFi – 10,000 times less power consumption than trad WiFi | Telecom TV

US engineers unveil Passive Wi-Fi, which consumes 10,000 times less power | Android Authority

Video (Click / Tap to play)

My Comments

A new direction being looked at for the Wi-Fi wireless-network ecosystem is the use of “passive Wi-Fi”. This is where Wi-Fi endpoints won’t need power-hungry analogue RF amplification circuitry and can simply reflect wireless signals back to access points or routers.

Traditional active Wi-Fi setups work analogously to a torch (flashlight) that is being used where it is actively putting out the light thanks to its batteries. But passive Wi-Fi works in a similar vein to a mirror that simply reflects the light without using any energy.

The advantage here with passive Wi-Fi is that devices implementing that technology don’t need to draw lots of current for them to operate on the network. This is so appealing towards mobile devices implementing it as a battery-saving measure.

But it also appeals towards how devices related to the smart home or Internet-Of-Things will be designed. This is because these devices can be designed to work for a long time on up to three AA or AAA Duracells or a coin battery, or could use energy-harvesting technologies like solar power or kinetic energy but work with a Wi-Fi network rather than the Bluetooth LE, Zigbee or Z-Wave networks that are optimised for low energy.

Here, it may be feasible to directly connect these devices to your home network and the Internet without the need for bridge devices. That said, it can be feasible to integrate Bluetooth LE, Zigbee and/or Z-Wave bridging functionality in to a Wi-Fi-capable router or access point, especially if there is a market expectation for these devices to also serve as “smart-home” or “IoT” hubs.

At the moment, passive Wi-Fi can work at ranges between 30 and 100 feet, whether line-of-sight or through walls, while passing a bandwidth of up to 11Mbps. The prototypes have been demonstrated with traditional Wi-Fi network equipment including a router and smartphone, proving that they can work in a standard Wi-Fi network. But there have been issues raised about requiring routers and access points to broadcast a “wake-up” call for these devices to report their presence and status.

A question that can be asked as this technology is designed is whether it could be feasible to design a Wi-Fi front-end that switches between active and passive mode. Here, it could appeal to devices that enter passive mode simply to save energy but “go active” while in use, with obvious use cases being mobile devices or Wi-Fi-based handheld controllers.

What it could lead to is the goal of optimising all of the building-wide wireless-data technologies for low-power use being nearly complete, with devices that exploit these technologies able to run for a long time on ordinary batteries.

Microsoft answers Amazon and Google without reinventing the wheel

Articles

Acer Switch Alpha 12 2-in-1 with keyboard press image courtesy of Acer

These Windows 10 computers will be part of Microsoft’s smart-home vision

Microsoft takes aim at Amazon’s Echo with Windows 10 HomeHub feature | The Verge

Windows 10 “Home Hub” feature will take on Amazon Echo and more | ARS Technica

How and why Microsoft is stepping up its focus on ‘families’ with Windows 10 | ZDNet

Home Hub, la réponse de Microsoft à Amazon Echo et Google Home | Ere Numérique (French Language / Langue Française)

My Comments

Microsoft and Apple recently built their voice-driven personal assistants in to their regular-computer operating systems rather than confining this class of software to mobile devices. As well, Apple baked in the HomeKit smart-home framework in to the iOS mobile-device operating system to make it work with devices that represent the Internet Of Things or the smart home.

But Amazon and Google went ahead with voice-activated smart-home assistants being part of their network-connected wireless-speaker products. These would work with some of the smart-home devices and offer calendar and similar functionality for the home at your request.

Sony VAIO Tap 20 adaptive all-in-one computer as a desktop

These “adaptive all-in-one” computers like the Sony VAIO Tap 20 can be part of the “smart home”

Microsoft has decided to go another path for integrating the smart home and the voice-driven personal assistant concept by working on another function that will appear in an upcoming major functionality-driven Windows 10 update. This is to be called “Home Hub” which is destined for the “Redstone 3” Windows 10 functionality update, intended to appear after the “Creators Update”.

The software is intended to work on a regular desktop or laptop computer that can run the Windows 10 operating system. Here, it could easily put new life in to the “all-in-one” computer design including those “Adaptive All-In-One” computers of the Sony VAIO Tap 20 ilk, pushing them as a computer that can exist on the kitchen bench. It can also put the midget computers known as the “NUC” (Next Unit Of Computing) devices to use by having them connected to that small flatscreen TV typically used to watch daytime TV content. Let’s not forget that it will earn its keep with all of the detachable and convertible “2-in-1” computers working as tablets, and it can make more use out of existing desktop and laptop computers.

ASUS VivoStick press picture courtesy of ASUS

ASUS VivoStick – their answer to Intel’s Compute Stick – can repurpose that small flatscreen TV as a monitor for the central computer

Here, this functionality is centred around a common household account which appointments and other resources can be shared to. It effectively serves the same purpose as the fridge door which ends up as the household’s noticeboard. These events will appear on a lock-screen which shows a calendar, tasklist and other common information. There will be the ability for third-party application developers to develop apps that can share information to this “common display”, thanks to application-programming interfaces that Microsoft will offer as part of the equation.

Users can still log in to their own account using Windows Hello or their traditional login methods that the system supports to see a combined view of their personal information and the shared common information.

Let’s not forget that Microsoft wants to use the Cortana voice-driven personal assistant as part of this solution. But the problem with these voice-driven assistants is that they are usually trained to one operator and may not handle multiple users. In the home context, there is the issue of people’s voices changing as they get older, such as a young boy using the system initially but facing problems with Cortana when his voice breaks as he becomes a teenager.

Like with Amazon’s and Google’s implementations, it could be feasible for you to direct the Cortana implementation to stream music from your favourite third-party music services. This, again would be facilitated with the music services’ apps having API hooks to Cortana and the other software that is part of Windows 10 Home Hub.

But there will be the ability to have the Windows 10 Home Hub also work as part of the smart home by being a control or display surface for compatible smart lights, thermostats and door locks. This will be facilitated through the use of open-frame industry standards for communication between devices and the Windows 10 Home Hub. I would suspect that one of the most common applications for this would be to see status notifications for various systems on the lock-screen, or to have the ability to ask Cortana or operate a control on that lock-screen to do things like turn down the heating or close the garage door.

It has been one of Microsoft’s many efforts to provide family-focused home computing like offering some software as household-wide licenses or providing integral parental controls on the Windows platform.

But there are some questions to raise concerning Windows 10 and the Home Hub.

One of these is whether the professional, educational and enterprise variants of Windows 10 will be able to be equipped with the Home Hub. This is more so for the “work-home” laptop scenario where people use the same computing device between their workplace or place of study and their home.

Similarly, this extends to existing Windows 10 deployments where there is the desire to use existing computers that run the operating system, because a lot of households will maintain a few Windows 10 computers in some form. One of the questions is how simple it is to integrate extant computers and user accounts, including domain-linked workplace accounts, in to a Home Hub setup so as to benefit from the common calendar and lockscreen.

Apple could take a leaf out of Microsoft’s book and link Siri, HomeKit and the MacOS regular-computing platform to provide a similar “home-central” service for their platforms while avoiding the need to “reinvent the wheel”.

The way Microsoft has approached the smart-home trend and answered Amazon’s Echo and Google’s Home wireless speakers is to exploit its knowhow in Windows 10 and allow people to use existing computers and home networks to achieve the same goal.

Samsung implements auto-focus on the Galaxy S8 to make it a selfie smartphone

Article

Samsung Galaxy Note Edge press image courtesy of Samsung

The front camera on the next premium smartphones could end up being equipped with auto-focus technology

The Galaxy S8 may provide better selfies thanks to autofocus implementation | Android Authority

Previous coverage on “selfie” smartphones

What Makes That Smartphone A “Selfie” Smartphone

My Comments

Increasingly smartphone manufacturers are paying attention to the kind of photos a smartphone’s or tablet’s front-facing camera takes. This has been driven by the phenomenon where young people are using these cameras to take “selfies” – pictures of themselves. Even venue owners and event hosts are catering to this trend by providing “selfie photobooths” with the appropriate decorations and props so they can take the funniest-looking selfie.

The way most of the manufacturers have approached this issue includes front-facing cameras with a resolution not dissimilar to the rear-facing camera, use of a wide-angle lens on the front-facing camera or even integrating software logic to remove blemishes from the photos that are taken.

But Samsung has gone further with their front-facing camera by implementing an auto-focus mechanism. Typically, a smartphone would be equipped with auto-focus only on the rear-facing camera because this is the one used for general photography, even though the front-facing camera gets a fair bit of use for both videocalls and selfies. Until now, implementing auto-focus for both of the smartphone’s cameras was seen as costly and not worth it due to the close proximity of the subjects on the front camera.

Here, they have implemented auto-focus on both the front-facing camera and the rear-facing camera of their new Galaxy S8 Android smartphone. This will be seen as a way to differentiate their premium smartphones from the rest of the pack thanks to the ability to yield that sharp videocall image or selfie.

As the cost of auto-focus cameras for smartphones and tablets that yield acceptable resolution comes down, it could become a trend for front-facing cameras on smartphones, tablets, laptops and similar devices to have this feature for the best Skype videocall or selfie.

Silicon Valley starts a war against fake news

Article

Facebook and Google to block ads on fake news websites | Adnews

Facebook Employees Are In Revolt Over Fake News | Gizmodo

Google and Facebook Take Aim at Fake News Sites | New York Times

Does the internet have a fake-news problem? | CNet

Google CEO says fake news is a problem and should not be distributed | The Verge

Want to keep fake news out of your newsfeed? College professor creates list of sites to avoid | Los Angeles Times

My Comments

Since Donald Trump gained election victory in the USA, there has been some concern amongst a few of Silicon Valley’s tech companies regarding the existence of “fake news”.

This is typically a story that is presented as though it refers to an actual news event but doesn’t relate to any actual news event. In some cases, such stories are hyped-up versions of an existing news item, but in a lot of cases these stories are built up on rumours.

The existence of Internet-distributed fake news has been of concern amongst journalists especially where newsroom budgets are being cut back and more news publishers and broadcasters are resorting to “rip-and-read” journalism, something previously associated with newscasts provided by music-focused FM radio stations.

Similarly, most of us are using Internet-based news sources as part of our personal news-media options or as our only source of news, especially when we are using portable devices like ultraportable laptops, tablets or smartphones as our main Internet terminals for Web browsing.

Silicon Valley also see the proliferation of fake news as a threat to the provision of balanced coverage of news and opinion because they see this as a vehicle for delivering the populist political agenda rather than level-headed intelligent news. This is typically because the headline and copy in “fake news” reports is written in a way to whip up an angry sentiment regarding the topics concerned, thus discouraging further personal research.

But Facebook and Google are tackling this problem initially by turning off the advertising-money tap for fake-news sites. Facebook will apply this to ad-funded apps that work alongside these sites while Google will apply this as a policy for people who sign up to the AdSense online display-ads platform.

There is the issue of what kind of curating exists in the algorithms that list search results or news items on a search-engine or social-media page. It also includes how the veracity of news content is being deemed, even though Google and Facebook are avoiding being in a position where they can be seen as “arbiters of truth”.

The big question that can exist is what other actions could Silicon Valley take to curb the dissemination of fake news beyond just simply having their ad networks turn off the supply of advertising to these sites? This is because the popular search engines are essentially machine-generated indexes of the Web, while the Social Web and the blogosphere are ways where people share links to resources that exist on the Web.

Some people were suggesting the ability for a search engine like Google or a social network site like Facebook to have its user interface “flag” references to known fake-news stories, based on user or other reports. Similarly, someone could write desktop or mobile software like a browser add-on that does this same thing, or simply publish a publicly-available list of known “fake-news” Websites for people to avoid.

This is in fact an angle that a US-based college professor had taken, where she prepared a Google Docs resource listing the Websites hosting that kind of news in order to help people clean their RSS newsfeeds of misinformation, with some mainstream online news sources including New York Magazine providing a link to this resource.
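A sketch of how such a flagging add-on or desktop tool could use a published list is below. The domains are invented for illustration, and a real tool would load the community-maintained list rather than hard-coding it:

```python
from urllib.parse import urlparse

# Hypothetical blocklist; a real tool would load a published resource
# such as the professor's Google Docs list mentioned above.
FLAGGED_SITES = {"example-fake-news.com", "totally-made-up-herald.net"}

def is_flagged(url):
    """Return True if the URL's host, or any parent domain of it,
    appears on the blocklist."""
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    # Check the host itself and each parent domain so that
    # "www.example-fake-news.com" also matches.
    return any(".".join(parts[i:]) in FLAGGED_SITES for i in range(len(parts)))

print(is_flagged("http://www.example-fake-news.com/story"))  # True
print(is_flagged("https://www.bbc.co.uk/news"))              # False
```

The hard part isn’t the matching, of course, but curating the list itself, which is where the “arbiter of truth” question comes back in.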

The issue of fake news distributed via the Internet is becoming a real problem, but Silicon Valley is looking at different ways to solve it and bring to online news the same level of respect that was associated with traditional media.

You could be using your phone to sign in to Facebook on the big screen

Article

Apple TV 4th Generation press picture courtesy of Apple

You could be able to log in to Facebook on this device using your smartphone’s Facebook client

Facebook Login Updated for tvOS, FireTV, Android | AdWeek SocialTimes

From the horse’s mouth

Facebook

Developer News Press Release

Improving Facebook Login For TV and Android

My Comments

A holy grail that is being achieved for online services is to allow users to authenticate with these services when using a device that has a limited user interface.

TV remote control

A typical smart-TV remote control that can only offer “pick-and-choose” or 12-key data entry

An example of this is a Smart TV or set-top device, where the remote control for these devices has a D-pad and a numeric keypad. Similarly, you have a printer where the only interface is a D-pad or touchscreen, with a numeric keypad only for those machines that have fax capabilities.

Here, it would take a long time to enter one’s credentials for these services due to the nature of the interface. This is down to a very small software keyboard on a touchscreen, using “SMS-style” text entry on the keypad or “pick-and-choose” text entry using the D-pad.

Facebook initially looked at this problem by displaying an authentication code on the device’s user interface or printing this code out when you want to use it from that device. Then you go to a Web-enabled computer or mobile device and log in to facebook.com/device and transcribe that code in to the page to authenticate the device with Facebook.

Here, they are realising that these devices have some role with the Social Web, whether to permit single sign-on, allow you to view photos on your account or use it as part of a comment trail. But they also know that most of us are working our Facebook accounts from our smartphones or tablets very frequently and are doing so with their native mobile client app.

But they are taking a leaf out of DIAL (DIscovery And Launch), which is being used as a way to permit us to throw YouTube or Netflix sessions that we start on our mobile devices to the big screen via our home networks. It avoids the long rigmarole of finding a “pairing screen” on both the large-screen and mobile apps, then transcribing a PIN or association code from the large screen to the mobile client to have the session on the TV screen.
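For background, DIAL discovers TVs and set-top devices using an SSDP multicast search over the home network. The sketch below builds DIAL’s published search message; the discovery routine is a bare-bones illustration rather than a full client:

```python
import socket

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_dial_search(mx=3):
    """Build the SSDP M-SEARCH request that DIAL clients multicast to
    find devices able to launch apps like YouTube or Netflix."""
    return ("M-SEARCH * HTTP/1.1\r\n"
            f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
            'MAN: "ssdp:discover"\r\n'
            f"MX: {mx}\r\n"
            "ST: urn:dial-multiscreen-org:service:dial:1\r\n"
            "\r\n")

def discover(timeout=3):
    """Multicast the search and collect raw responses until the
    timeout lapses (sketch only; no retries or response parsing)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_dial_search(timeout).encode(), (SSDP_ADDR, SSDP_PORT))
    responses = []
    try:
        while True:
            data, addr = sock.recvfrom(4096)
            responses.append((addr, data.decode(errors="replace")))
    except socket.timeout:
        pass
    return responses
```

Each responding device points to an XML description from which the client learns how to launch or hand off an app, which is the step Facebook’s login flow effectively mirrors with its device requests.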

This is where you will end up authenticating that big-screen app’s Facebook login request

What Facebook are now doing for the 4th generation Apple TV (tvOS) and Android-based TV/video peripheral platforms (Android TV / Amazon FireTV) is to use the mobile client app to authenticate.

Here, you use a newer version of the Facebook mobile client, the Facebook Lite client or the Google Chrome Custom Tabs to authenticate with the big screen across the home network. The TV or set-top device, along with the mobile device running the Facebook mobile client both have to be on the same logical network which would represent most small networks. It is irrespective of how each device is physically connected to the network such as a mobile device using Wi-Fi wireless and the Apple TV connected via HomePlug AV500 powerline to the router for reliability.

What will happen is that the TV app that wants to use Facebook will show an authentication code on the screen. Then you go to the “hamburger” icon in your Facebook mobile client and select “Device Requests” under Apps. There will be a description of the app and the device that wants you to log in, along with the authentication code you saw on the TV screen. Once you are sure, you would tap “Confirm” to effectively log in from the big screen.
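The TV-side half of that flow boils down to displaying the code and polling until the confirmation arrives. The sketch below models that polling loop; the function names and status strings are illustrative assumptions, not Facebook’s documented API:

```python
import time

def poll_for_confirmation(fetch_status, user_code, interval=5, max_polls=12):
    """Poll the service until the code shown on the TV screen is
    confirmed in the user's mobile client, or give up.

    fetch_status(user_code) is expected to return one of "pending",
    "confirmed" or "expired"."""
    for _ in range(max_polls):
        status = fetch_status(user_code)
        if status != "pending":
            return status
        time.sleep(interval)
    return "expired"

# Simulated backend for illustration: pending twice, then confirmed.
replies = iter(["pending", "pending", "confirmed"])
print(poll_for_confirmation(lambda code: next(replies), "ABC123", interval=0))
# prints "confirmed"
```

The same shape of loop appears in most device-code login schemes, with the polling interval and code lifetime dictated by the service.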

At the moment, this functionality is being rolled out to tvOS and Android-based devices, these being the first two platforms to support the new and improved application programming interfaces. But I would see this being rolled out for more of the Smart TV, set-top box and similar device platforms as Facebook works through them all.

Spotify login screen

This kind of single-sign-on could apply to your Smart TV

One issue that may crop up would be catering for group scenarios, which is a reality with consumer electronics that end up being used by the whole household. Here, software developers may want to allow multiple people to log in on the same device, which may be considered important for games with a multiplayer element, or to allow multiple users to be logged in but with one user having priority over the device at a particular time, like during an on-screen poll or with a photo app.

Another question that could be raised is where Facebook is used as the “hub” of a user’s single-sign-on experience. Here, an increasing number of online services including games are implementing Facebook as one of the “social sign-on” options and the improved sign-on experience for devices could be implemented as a way to permit this form of social sign-on across the apps and services offered on a Smart TV for example. It could subsequently be feasible to persist current login / logout / active-user status across one device with all the apps following that status.

Other social-media, messaging or similar platforms can use this technology as a way to simplify the login process for client-side devices that use very limited user interfaces. This is especially where the smartphone becomes the core device where the user base interacts with these platforms frequently.
