Category: Computer Hardware Design

Designing for highly-compatible Internet Of Things

Article

D-Link DCH-3150 myDLink motion sensor

Smart Home and Internet Of Things devices need to be designed for compatibility and security before they become popular

How to bring true interoperability to the Internet of Things | Network World

My Comments

Increasingly, the concept of the “smart home” or Internet Of Things is becoming very real. Here, we are seeing a lot more consumer-electronics devices, home appliances and similar devices become connected to the home network and the Internet.

The “app-cessory” approach to network-controlled devices, where the only way to control these devices via your home network is through a manufacturer-supplied mobile-platform app, has now had its day. This approach typically required the device to be connected to your iOS or Android smartphone or tablet using one of three paths: a Bluetooth connection to the mobile device in the same vein as a Bluetooth headset; a Wi-Fi network created by the device that is controlled by the mobile-platform device; or the home network’s Wi-Fi segment.

The trend that is affecting these devices is to interlink them with a platform-based voice-driven “home assistant” of the Amazon Alexa or Google Home ilk. Here, the requirement is for the manufacturer to provide a “skill” or something similar to the “home-assistant” platform so that Alexa, for example, can interact with the device.

But the article is now highlighting the requirement for increased compatibility with the Internet Of Things. This is where the same device can operate across a range of different network setups and operating platforms.

Use of highly-capable hardware interfaces at the media-connection level

A direction that has assured “out-of-the-box” interoperability for regular-class and mobile-class computer devices along with an increasing number of consumer-electronics devices is to implement one or more multi-mode front-ends when handling the different interface types.

In the case of radio, it can mean being able to handle Wi-Fi, Bluetooth, Zigbee or similar technologies concurrently. With wired networks, it would be about working with different media protocols over the same kind of wire, whether that is Cat5 unshielded twisted pair, TV-antenna coaxial cable, the AC wiring that powers your appliances or traditional telephone wires.

Devolo Home Control Central Unit (Zentrale) press photo courtesy of Devolo

Devolo Home Control Central unit connected to router

In the case of a wireless connection, this is represented by the use of Bluetooth for peripheral-class device connection and Wi-Fi wireless networking to the latest standard for connecting to the home network and the Internet. Smartphones and some tablets will also implement a mobile-broadband modem that works across recent cellular mobile-telephony standards as well. As well, some consumer-electronics devices may implement a multifunction radio front-end that supports Zigbee or Z-Wave, typically to provide support for an RF-based remote control.

There are a significant number of “smart-home” or “Internet Of Things” devices that are designed to work solely with Bluetooth, Zigbee or Z-Wave. Examples include temperature sensors, smart locks and movement sensors. These typically battery-operated devices use one of these technologies because they are very thrifty with battery power, allowing them to run on up to three AA Duracells or a 3V “pill-size” battery for months on end, or even solely on “harvested” power like kinetic energy.

But, if they want to liaise with your home network and the Internet, they have to deal with a gateway device that links them to the home network. This is because, at the time of writing, no-one has effectively brought to market a Wi-Fi-capable single-mode or multimode radio front-end chipset that permits a battery-operated device to work in a power-efficient manner.

But another approach being called for is to have the Internet gateway device, i.e. the home or small-business router, equipped with support for Bluetooth, Zigbee and/or Z-Wave alongside Wi-Fi and Cat5 Ethernet for the home network. To the same extent, a Wi-Fi infrastructure device like an access point or range extender could simply be a bridge between other radio-network types like Zigbee or Bluetooth and the home network, facilitated by its Wi-Fi or wired home-network connection.

Some manufacturers even have an “IoT hub” or gateway that links their Bluetooth, Zigbee or Z-Wave devices to your home network via an Ethernet connection. Here, this is offered as part of enabling their devices for online control via a Web dashboard or mobile-platform app. The current situation is that most of these hubs, and the online services behind them, work only with that manufacturer’s own devices.

There needs to be the ability to facilitate setups involving multiple gateways that link the home network with Zigbee or similar “IoT” radio segments. This is a reality because most of these devices are limited in their radio coverage to conserve battery power, being expected to run on a commodity battery supply like two or three AA Duracells for months at a time or, in some cases, on harvested electrical energy. You may find that locating one of the gateways near an IoT endpoint device like a smart lock assures reliable connected operation from that device.

In these setups, there needs to be the ability to see a collection of these “IoT-specific” radio segments as one logical segment, along with the ability to discover and enumerate each device no matter which gateway or bridge device it is connected to and what kind of network is used as the backbone.
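The kind of enumeration this calls for can be sketched in a few lines of code. This is a minimal illustration only; every class, name and field below is hypothetical, standing in for whatever discovery protocol the gateways actually speak:

```python
# Sketch: presenting several IoT radio segments as one logical segment.
# All class names, method names and fields here are hypothetical.

class Gateway:
    """A bridge between the home network and one IoT radio segment."""
    def __init__(self, name, radio_type, devices):
        self.name = name                # e.g. "hallway-hub"
        self.radio_type = radio_type    # e.g. "zigbee", "z-wave", "bluetooth"
        self.devices = devices          # device-id -> human-readable description

def enumerate_devices(gateways):
    """Merge every gateway's device list into one logical view.

    Each entry records which gateway and radio type the device sits behind,
    so a controller can address it without caring about the backbone."""
    logical_segment = {}
    for gw in gateways:
        for device_id, description in gw.devices.items():
            logical_segment[device_id] = {
                "description": description,
                "via_gateway": gw.name,
                "radio": gw.radio_type,
            }
    return logical_segment

# Two gateways covering different corners of the home
gateways = [
    Gateway("hallway-hub", "zigbee", {"sensor-1": "temperature sensor"}),
    Gateway("front-door-hub", "z-wave", {"lock-1": "smart lock"}),
]
segment = enumerate_devices(gateways)
```

Here a controller sees “lock-1” and “sensor-1” in one list even though they sit behind different gateways and radio types, which is the behaviour being asked of the home network.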

Flexible software to the application level

Kwikset Kevo cylindrical deadbolt in use - Kwikset press image

To provide extended monitoring and control to the Kwikset Kevo deadbolt, you have to use a Bluetooth bridge supplied by Kwikset

Another issue raised regarding the Internet Of Things is compatibility across multiple software platforms and protocols.

A design practice that has proven successful is for recent network-connected home-AV equipment like Wi-Fi wireless speakers to support Apple AirPlay, Google Chromecast and DLNA “out of the box”. Here, you could stream content to these devices from most computer devices, whether your iPhone, Android tablet or Windows computer, or from content hosted on your NAS device.

Here, the goal is for a device to support many different software platforms, frameworks and protocols that are needed to do its job. To the same extent, it could be feasible for a device to work with different cloud services like Google Home, Amazon Alexa or IFTTT. What this can mean is that a device can work with different control and display surfaces from different manufacturers. It also means that the data that a piece of equipment shares is set in a known standard so that any software developer working on an IoT project can make use of this data in their code.

For example, network-connected devices would be required to support the Open Connectivity Foundation’s standards, which include the UPnP standards and are supported by the “open-frame” computing community, along with the Apple HomeKit framework.

Here, it will be about identifying every standard supported by the physical medium that the IoT device uses to link with other devices and the network, then implementing all of the current standards supported by that medium in a vendor-agnostic manner.
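The “data set in a known standard” idea mentioned above can be sketched simply: a device serialises its readings in a self-describing, vendor-neutral document that any developer’s code can parse without the manufacturer’s SDK. The field names below are illustrative, not any particular standard:

```python
import json

# Sketch: sharing device data in a known, vendor-neutral format.
# The field names are hypothetical, chosen only to illustrate the idea.

def publish_reading(device_id, quantity, value, unit):
    """Serialise a sensor reading so any IoT software stack can consume it."""
    return json.dumps({
        "device": device_id,
        "quantity": quantity,
        "value": value,
        "unit": unit,
    })

def read_temperature(document):
    """Any developer's code can recover the value without a vendor SDK."""
    reading = json.loads(document)
    return reading["value"], reading["unit"]

doc = publish_reading("sensor-1", "temperature", 21.5, "degC")
value, unit = read_temperature(doc)
```

Because the document describes itself, the same reading can feed a voice assistant’s skill, a Web dashboard or a hobbyist’s script without any of them knowing which manufacturer built the sensor.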

Secure by design

An issue that has been raised recently is the data-security practices implemented by the software that runs Internet-Of-Things and dedicated-purpose devices. Situations that have come to the fore include the Mirai botnet, which conscripted network video-surveillance cameras and home-network routers to perform distributed denial-of-service attacks against online resources like the Krebs On Security Website and the DNS records held by Dyn, a dynamic-DNS provider, affecting a large number of Internet household names.

Here, the issue being called out is designing the software in this class of device for security along with a continual software-maintenance cycle. But it also includes the implementation of secure-software-execution practices now common in the latest desktop and mobile operating systems. This includes secure boot, trusted execution and sandboxing to prevent unwanted code from running, along with data-in-transit protection and authentication at the network level.

A continual software-maintenance approach, where the firmware and other software associated with the Internet Of Things is kept up to date with updates installed “in the field” as they become available, allows software bugs and security exploits to be removed as they become known. It also allows the software to be “tuned” for best performance, and manufacturers can even roll out newer functionality for their devices.
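For in-field updates to be safe, a device has to verify that an update really came from its manufacturer before installing it. The sketch below illustrates that check in a simplified way; real devices use public-key signatures, so the shared HMAC key here is purely a stand-in to keep the example self-contained:

```python
import hashlib
import hmac

# Sketch: a device refusing a firmware update whose signature doesn't check out.
# Real devices verify public-key signatures; an HMAC with a device key stands
# in here so the example is self-contained. All names are hypothetical.

DEVICE_KEY = b"example-device-key"  # hypothetical key provisioned at the factory

def sign_firmware(image: bytes) -> str:
    """What the manufacturer's build server would do before publishing."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).hexdigest()

def install_update(image: bytes, signature: str) -> bool:
    """Install only if the image is authentic and unmodified in transit."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

good_image = b"firmware v2.1"
sig = sign_firmware(good_image)

assert install_update(good_image, sig)        # genuine update accepted
assert not install_update(b"tampered", sig)   # altered image rejected
```

The constant-time comparison (`hmac.compare_digest`) matters here: naive string comparison can leak timing information to an attacker probing the update mechanism.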

In some cases, it could even lead to a device becoming compatible with newer and revised standards and protocols rather than ending up limited because it doesn’t support the newer, better protocol. But there is the question of this kind of software update being used to enforce unpopular device-design requirements upon an existing installed base of devices, changing how they operate. This could be brought about by a government mandate or an industry expectation, such as an eco-requirement for HVAC equipment required by a state energy-conservation department or a digital-rights-management expectation required at the behest of Hollywood.

To make the IoT hardware and software ecosystem work properly, there needs to be an underscored requirement for compatibility with prior and newer devices along with the ability to work securely and with properly-maintained software.

What’s inside your computer (INFOGRAPHIC)

Some of you who have a traditional “three-piece” desktop computer system, where there is a separate box in which all the activity takes place, may refer to this box as the “hard disk” even though it is properly known as the “system unit”. This is because the hard disk, amongst other key computing subsystems like the CPU and the RAM, exists in that box.

This infographic shows what the key parts of your computer are and is based on one of the newer small-form-factor designs that are common in the office and home.

Desktop computer system unit - inside view

What’s inside your computer

 

What is a GPU all about?

Article

Lenovo ThinkPad X1 Carbon Ultrabook

The GPU, whether dedicated or integrated is what paints the picture on your computer screen

What Makes A GPU Different From A CPU? | Gizmodo

My Comments

A graphics processing unit, or GPU, is a special data-processing chipset that effectively “paints” the images you see on your computer screen. This is in contrast to the central processing unit, or CPU, which is the system’s “commander” processor, focused on handling the data your computer is dealing with at your command.

The idea of a separate processor is to work efficiently with the shapes, pixels and colours that constitute what you see on the screen, and highly-sophisticated GPUs handle this using multiple “cores” or unique processors. Another factor worth considering is that video editing, animation and transcoding programs make use of the GPU to transcode video material between different formats, or to render an animation or a sequence of shorter video clips into one longer video clip.

Gaming rig

A “gaming rig” tower desktop computer equipped with high-performance display cards

The higher-performance GPUs, typically offered as display cards installed in desktop computers, especially the “gaming rigs” set up by computer-games enthusiasts, use multiple “cores” or unique processors so they can render high-resolution graphics very quickly and responsively. Some of these cards even implement setups like “CrossFire”, with the ability to gang two display cards together for increased performance.

Integrated vs dedicated GPUs

Typically the difference between an integrated and a dedicated GPU is that a dedicated GPU has its own memory and other resources for “painting” graphics images, while an integrated GPU “borrows” resources like RAM from the system’s CPU. As well, a lot of these dedicated GPUs are designed and developed by companies who specialise in that field.

The benefit of a dedicated GPU is that it can turn out the graphics images required by demanding applications like games, video editing, CAD and the like efficiently because its resources are focused on what you see while the CPU and system RAM are focused on working out what is to happen.

Sony VAIO S Series ultraportable STAMINA-SPEED switch

Sony VAIO S Series – equipped with dual graphics with an easy-to-use operating-mode switch

For example, a game needs the use of the CPU to answer the players’ commands, apply the game’s rules and position each of the elements while it needs the GPU to visually represent where everything is. Here, the dedicated GPU can handle how everything is represented without encumbering the CPU’s tasks relating to how the game runs.

The main disadvantage with dedicated GPUs that affects laptops and other portable computers is that they can quickly drain the computer’s battery. This has been answered in a few ways, like equipping laptops with both integrated and dedicated graphics chipsets and adding logic like NVIDIA’s Optimus to switch between them, in a similar vein to how the overdrive or “sports mode” in some cars works. In most cases, this logic engages the dedicated graphics if the computer is running a graphics-intensive program like a game or video-editing software, or is running on external power.
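The switching decision described above boils down to a simple policy. The sketch below is a simplification for illustration, not NVIDIA’s actual Optimus logic, which also weighs per-application profiles and driver hints:

```python
# Sketch of an Optimus-style GPU-switching policy, simplified for illustration.
# Real implementations also consult per-application driver profiles.

def choose_gpu(on_external_power: bool, graphics_intensive: bool) -> str:
    """Engage the dedicated GPU only when the workload or power source
    justifies the extra battery drain; otherwise stay on integrated."""
    if graphics_intensive or on_external_power:
        return "dedicated"
    return "integrated"

# Gaming on battery still gets the dedicated GPU; web browsing on battery
# stays on the frugal integrated chipset.
assert choose_gpu(on_external_power=False, graphics_intensive=True) == "dedicated"
assert choose_gpu(on_external_power=True, graphics_intensive=False) == "dedicated"
assert choose_gpu(on_external_power=False, graphics_intensive=False) == "integrated"
```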

External GPU docks

Alienware high-performance laptop computer with Graphics Amplifier external GPU module

A new trend that is starting to appear and benefit laptop-computer users is the “external GPU” dock or module that connects to the laptop computer. These appear in two different forms – a “card cage” like the Alienware Graphics Amplifier where a user can install a desktop graphics card, or a graphics module which has the graphics hardware installed by the manufacturer.

Initially these devices were connected to the host computer using a manufacturer-proprietary connection, but now they implement Intel’s Thunderbolt 3 over the USB Type-C connector because it offers PCI Express data-transfer bandwidth, thus allowing increased interchangeability between computers and docks. Most of these implementations can send the graphics back to the host computer’s screen or to an external display connected directly to the external GPU module.

Alienware Graphics Amplifier expansion module

A graphics expansion module that could option up budget and mainstream laptops

These devices appeal as a way to “option up” laptop, all-in-one and similar computers for high-performance dedicated graphics. This is more so where you don’t need dedicated graphics all the time, only when you have that laptop or 2-in-1 “back home” and ready to work or play.

Conclusion

The graphics processor, or GPU, whether integrated on a computer’s motherboard, installed on a display card or housed in an external GPU module, is the processor that looks after “painting” the images you see on your computer’s screen.

Google’s Project Ara modular smartphone is for real

Article

Google Project Ara modular phone - for real

Google’s Project Ara: build your dream phone | The Age

My Comments

There has been some previous coverage about Google’s “Project Ara” modular smartphone, but there was some doubt about this phone being for real.

This mobile phone, like the LG G5, can be improved by buying and adding extra modules that offer additional or better functionality. It is very similar to how the IBM PC evolved, where it was feasible to add extra parts to improve the computer’s functionality.

Google had put the Project Ara concept smartphone on the back burner while LG advanced their take on the idea in the form of the G5, with an improved camera or a hi-fi-grade audio DAC module available as options. Now Google have come forth with a firm proof-of-concept to be offered to developers so they can design the modular hardware, with a system ready for the masses by next year.

Google will push the idea of requiring the modules to be certified by themselves in order to assure quality control, and the user experience for installing or upgrading any of these modules will be very similar to replacing a microSD card in your Android smartphone. Here, you tell the operating system that you wish to remove the card before you open up your phone and swap it out, something I do with my Samsung phone when I want to play different music because I see microSD cards as though they were cassettes or MiniDiscs containing music ready to play.

Like with computers, the modular phones will still appeal to those of us who are tech enthusiasts and don’t mind customising our phones to suit our needs. Personally, I would like to see this same modularity looked at for tablets, 2-in-1s and laptops pitched at this same user class who values modularity and customisability.

HP gives the convertible 2-in-1 the Ford Mustang treatment

Article

HP’s Pavilion x360 affordable convertible comes in 15-inch version now | The Verge

HP’s new Pavilion PCs include a 15-inch hybrid laptop | Engadget

From the horse’s mouth

HP

Press Release

Video – Click or tap to play

My Comments

When Ford launched the Mustang in 1964, they used a strategy where they could pitch it as a car affordable to young Americans yet something that would appeal to them. Here, they offered a reasonably-equipped baseline model at an affordable price but provided a litany of options that they could “buy on” such as powerful engines, transmissions that suited their needs, air-conditioning, radios with different capabilities and the like. This tactic followed through across Detroit with most of the vehicle builders offering youth-focused “pony cars” that were designed and offered in a similar vein to the Mustang.

HP have taken this approach with the launch of the latest range of Pavilion x360 convertible 2-in-1 computers for this model year. Here, they offer the fold-over convertible computers in different screen sizes, including a 15” variant which some would consider too big for a tablet but big enough for a mainstream laptop. This may appeal for the common activity of viewing photos and video content on portable computers, and it could allow you to make best use of the touch-enabled apps and games filling up the Windows Store. The range also comes in differing colours like silver, gold, red, purple or blue.

As for processors, there are variants with horsepower ranging from Intel Celeron to Intel Core i7. You can have your system with up to 8GB RAM and up to a 1TB hard disk or 128GB of solid-state storage. There are the expected features like 802.11ac Wi-Fi and B&O sound tuning, plus connectivity in the form of three USB sockets, an SD card reader for your “digital film” and an HDMI video port.

The 15” model is being pitched as an alternative to the traditional mainstream 15” laptop that most students would end up with. As I have seen in the video clip, I see the newer HP Pavilion x360 being pitched also as an alternative to, guess what, the 15” Apple MacBook Pro especially when it comes to creative work or to be seen in the DJ booth at the trendiest nightclubs.

The 3.5mm digital-analogue audio socket is still relevant for today’s portable computing equipment

Laptops like the Toshiba Satellite Radius 12 could benefit from a 3.5mm digital-analogue audio output jack for an audio connection

Increasingly, there has been a rise of portable audio equipment associated with computers, and there are opportunities to exploit what this is all about for better sound.

Most of this equipment is implementing a 3.5mm tip-ring-sleeve phone socket for analogue line-level audio input or output connections. This is because the socket type is considered to be a “low-profile” connection that allows the equipment to be designed to be slim and neat. The same connector even appeals to the traditional PC expansion cards where the socket can easily exist on the card’s bracket.

Sony MZ-R70 MiniDisc Walkman image courtesy of Pelle Sten (Flickr http://flickr.com/people/82976024@N00)

One of the Sony MiniDisc Walkmans that implemented a 3.5mm digital-analogue audio input jack

Some devices, namely video projectors, use the 3.5mm phone jack for this purpose, whether as an input or an output, while this connector is also used as an ad-hoc “walk-up” audio input or output connection on amplifiers or stereo systems, such as an auxiliary input. This exploits the ability of “phone” sockets to survive being connected and disconnected repeatedly, thanks to their original use on the old telephone switchboards.

Sony took this connection type further during the MiniDisc era by equipping some of their CD Walkmans with a line-out jack that also had an LED in it, and their MiniDisc Walkman recorders with a mic/line input jack that had a photodiode in the socket. The user would then connect a fibre-optic cable with 3.5mm fibre-optic connectors on each end between the CD Walkman and the MiniDisc Walkman to digitally copy a CD to MD, with the sound transferred in the S/PDIF digital domain.

Economy data projector with VGA input sockets

HDMI-equipped projectors could even exploit the 3.5mm digital-analogue output connection for use with sound systems that have a digital input

You would still be able to connect the portable device to a normally-sessile device like a digital amplifier by using an adaptor cable with a Toslink plug on one end and a 3.5mm fibre-optic connector on the other, or a Toslink-3.5mm adaptor with an existing Toslink fibre-optic cable.

A few other companies exploited this connection beyond the portable realm, with Pace implementing it as a digital audio output on some of their cable-TV / satellite-TV set-top boxes. The set-top application used this connection just for digital audio, while the analogue audio connection was facilitated through the multi-pin SCART connection.

This same single-socket connection could easily be implemented on a video projector with an HDMI input and a 3.5mm audio-output jack, so you could have a digital connection to a sound system rather than an analogue audio link. This can also apply to laptop computers, which are increasingly being purposed as party jukeboxes by younger people, along with “mini-DJ” accessories pitched at iPod/iPhone users who want to play DJ.

Conversely, TVs and stereo systems could implement the same digital/analogue input for the auxiliary-input connection that is reserved on a TV for computer equipment or on the front of a stereo system for “walk-up” connection of portable digital-audio players.

As far as equipment design is concerned, a single socket for an S/PDIF optical digital or analogue audio connection saves on designing and budgeting for two sockets if you want to facilitate both digital and analogue audio connections. This is more important if the goal is to design equipment that has a low profile or is highly portable. The same approach can also appeal for providing an ad-hoc digital/analogue input or output where the connection exists on an “as-needed” basis rather than a permanent one.

Personally, I would like to see the 3.5mm digital-analogue audio jack that Sony valued during MiniDisc’s reign as something that can work with portable computing equipment, video projectors, smartphones and the like for transmitting audio in the S/PDIF digital domain.

Using Bluetooth for wireless keyboards, mice and game controllers

Bluetooth could be the preferred way to go for all wireless keyboard and mice applications

A lot of wireless mice and keyboards offered at affordable prices and pitched for use with desktop computers are implementing a proprietary wireless setup which requires them to use a special USB transceiver dongle.

This is compared to the wireless mice, keyboards and games controllers offered for laptops and tablets, which have integral Bluetooth support. This is because laptops and tablets are the main computers that come with Bluetooth on board, unlike desktops, mainly traditional “three-piece” desktops, which don’t have this feature and require a USB Bluetooth dongle to gain Bluetooth connectivity.

Wireless mouse dongle

The typical easy-to-lose dongle that comes with most wireless mice

A reality that is becoming crystal clear is that the laptop computer, along with the all-in-one desktop, is being seen as a viable alternative to the traditional “three-piece” desktop computer as one’s main computing device. This is underscored by laptops being taken between work and home, and by the quite a few computer setups I have seen where a laptop computer is hooked up to a traditional keyboard and mouse and one or two desktop-grade monitors. Some of these setups even run the laptop’s screen as part of a multi-screen setup.

Sony VAIO J Series all-in-one computer keyboard

Bluetooth shouldn’t just be for mobile keyboards

To the same extent, most of the “all-in-one” desktops are being equipped with Bluetooth functionality as a matter of course. This is more so where the goal is to compete with the Apple iMac range of “all-in-ones” or make this class of computer more impressive.

The Bluetooth advantage does away with the need to install a USB wireless dongle for that wireless keyboard or mouse, and with the risk of losing one of these dongles. Traditional desktop users can use and keep one Bluetooth dongle, which works well if you want to move a Bluetooth keyboard and/or mouse between a secondary laptop and the desktop computer. Similarly, the same Bluetooth dongle can support multiple devices like a keyboard, mouse, game controller and multipoint-capable Bluetooth headset.

The gap I am drawing attention to is the lack of traditional-sized keyboards, trackballs and mice fit for use with desktop computers, including novelty mice like the “model-car” mice, that work using Bluetooth. Manufacturers could offer a range of traditional-sized input devices that work with Bluetooth, preferably having Bluetooth LE (Smart) support, as part of their product ranges to cater for laptop-based and all-in-one-based personal computing setups.

Having Bluetooth LE (Smart) support would benefit this class of device because users shouldn’t need to change the peripherals’ batteries frequently, something that can affect Bluetooth setups.

As well, there can be an effort towards improving responsiveness for Bluetooth keyboards, mice and games controllers to maintain Bluetooth’s appeal to the gaming community. Here, this would also be about working alongside other Bluetooth device clusters, such as in a LAN-party environment where the goal for gamers is to frag each other out rather than be “trampled on” by the enemy.

What really should be looked at is to standardise on Bluetooth as a way to wirelessly connect input devices like keyboards and mice to computer equipment.

USB.org to introduce authentication into the USB Type-C platform

Article

The USB Type-C connection will now be able to be authenticated irrespective of vendor

New USB Type-C Authentication spec can stop faulty cables before they do damage | Windows Central

From the horse’s mouth

USB.org

Press Release (via BusinessWire)

My Comments

Increasingly the USB connection standard has shown up a need to verify or authenticate device connections on a hardware level. Initially Apple had engaged in this practice with their iOS devices that use the Lightning connector to make sure that properly licensed Lightning cables are used with these devices. But there have been other reasons that this kind of authentication is needed.

One of those reasons is the existence of fake charging devices typically installed in public locations. These espionage tools look like plug-in AC chargers or “charging bars”, but are really computing devices designed to harvest personal and corporate data from visitors’ smartphones and tablets. The mobile operating systems have been reworked to address this problem, whether by asking users what role the mobile device plays when it is connected to a host computing device, or whether they trust the host device their mobile device is connected to.

But concern has also been raised about ultra-cheap USB Type-C cables, typically Type-A adaptor cables, that aren’t wired to standard and could place your laptop, smartphone or tablet at risk of damage. In this case, users want to be sure they are using good-quality, properly-designed cables and power-supply equipment so that their devices aren’t at risk.

The USB Implementers Forum has established a connection-level authentication protocol for USB Type-C connections. This implements some of the authentication methods used by Apple for their Lightning connection to verify cables, along with the ability to verify the devices on the other end of a USB Type-C connection.

For example, a traveller could rectify the “fake charger” situation by setting their mobile gadgets to charge only from certified USB Type-C chargers. Similarly, a business can use this low-level authentication to verify and approve the USB storage devices and modems connected to the computers under its control, in order to prevent espionage and sabotage. Vehicle builders that supply software updates for their vehicles to rectify cyberattacks on vehicle control units can use this technique as part of their arsenal for authenticating any of these updates delivered to customers via USB sticks.

What needs to be established is that the USB interface chipsets installed on motherboards and other circuit boards must be able to support this kind of authentication. Similarly, operating systems and device firmware would need to support the low-level authentication in order to reflect the user’s choice or company’s policy and communicate the status of USB Type-C devices properly to the end-user.
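The “charge only from certified chargers” policy mentioned above amounts to checking a credential presented by the charger against a trust list before drawing power. The sketch below mocks that decision with a simple set of trusted issuer names; the actual USB Type-C Authentication specification uses X.509 certificate chains and a challenge-response exchange over the connection, so every name here is a stand-in:

```python
# Sketch: a "charge only from certified chargers" policy decision.
# The certificate check is mocked with a set of trusted issuer names;
# the real spec uses X.509 chains and challenge-response. All names
# below are hypothetical.

TRUSTED_ISSUERS = {"USB-IF Certified CA"}  # hypothetical trust-root list

def charger_allowed(certificate: dict, policy_certified_only: bool) -> bool:
    """Decide whether the device should draw charge from this charger."""
    if not policy_certified_only:
        return True  # user accepts any charger
    return certificate.get("issuer") in TRUSTED_ISSUERS

certified = {"issuer": "USB-IF Certified CA", "model": "TravelCharger-X"}
unknown = {"issuer": "NoName CA", "model": "AirportKiosk"}

assert charger_allowed(certified, policy_certified_only=True)
assert not charger_allowed(unknown, policy_certified_only=True)
assert charger_allowed(unknown, policy_certified_only=False)
```

The same shape of check, with the trust list and policy set by an administrator instead of the user, covers the business case of approving only known USB storage devices and modems.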

At least it is an industry-wide effort, rather than a vendor-specific one, to verify and authenticate USB devices at the electrical-connection level rather than at higher levels.

Acer advances a Thunderbolt 3 graphics dock for their laptops

Article

Acer unveils its first external mobile GPU dock powered via Thunderbolt 3  | Neowin

My Comments

I had previously covered the issue of using Intel’s Thunderbolt 3 technology to facilitate the design and use of an external graphics module or dock for laptops. This idea was put forward by Sony with the VAIO Z-Series premium Ultrabook and by Alienware through the use of a “card-cage” dock that worked with some of their laptops. Both these devices illustrated the possibility of improved graphics on portable or compact equipment, whether through a graphics module that has the graphics chipset integrated in its circuitry or a “card-cage” expansion module that allows you to install one or two desktop graphics cards into that module.

But Thunderbolt 3, which uses the USB Type-C connector as its physical connection, offers bandwidth comparable to the PCI Express internal connection used to attach display cards to the motherboard in a regular computer. This appeals because there is no need to reinvent the wheel when designing an external-graphics-module solution for a portable or low-profile computing product.

Now Acer have premiered a Thunderbolt 3 external graphics dock for their laptop products and demonstrated it working with their Core-M-powered Switch 12.5 convertible laptop. This graphics module implements an NVIDIA GTX-960M graphics chipset in a small dedicated box and adds extra connectivity to the host laptop in the form of 3 extra USB 3.0 ports, an Ethernet port and the ability to connect to external displays via HDMI or 2 DisplayPort connections. It also exploits USB Power Delivery over the USB Type-C connection to power and charge the host laptop, thanks to a DC power-supply input on the graphics module itself.

This has shown real graphics performance benefits in the 3DMark 11 synthetic graphics benchmark, where the Switch 12.5 scored 940 on its own graphics chipset and 4048 when used with this dock. This device is the first of its kind to have a price announced, at around €300, but there isn’t an estimated release date yet.
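For context, those two scores imply roughly a four-fold graphics uplift. A quick back-of-the-envelope check (using only the figures quoted above):

```python
# Approximate graphics uplift implied by the reported 3DMark scores
# for the Acer Switch 12.5 with and without the graphics dock.
integrated_score = 940   # on the laptop's own graphics chipset
docked_score = 4048      # with the Thunderbolt 3 graphics dock

speedup = docked_score / integrated_score
print(f"Approximate speedup: {speedup:.1f}x")  # Approximate speedup: 4.3x
```

A synthetic benchmark ratio like this is only a rough guide; real-world game performance will also depend on the CPU and on the Thunderbolt 3 link overhead.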

For Acer, it could be feasible for them to use the same external graphics docks across most, if not all, of their consumer and business laptop range that has the Thunderbolt 3 connection.

The question that will arise with the Thunderbolt 3 graphics-module application is whether an external graphics module or card-cage module made by one manufacturer will work at its full potential with Thunderbolt-3-equipped laptops offered by other manufacturers.

If so, this could encourage computer manufacturers to offer Thunderbolt 3 on their portable, all-in-one or low-profile computers as a graphics-expansion option without needing to supply their own graphics dock, while computer-peripheral manufacturers could make external graphics solutions, such as graphics expansion docks and desktop monitors with integrated graphics subsystems, that work with other manufacturers’ computers.

I see this concept appealing in a few ways:

  • An ultraportable computer being able to benefit from discrete graphics when used “at the desk” or “at home” thanks to an external graphics dock. This could open up the ability for a user to have one graphics dock at the office and another at home with these devices serving a “work-home-travel” computer.
  • The possibility of offering an affordable laptop or all-in-one desktop computer to most customers with the ability for these customers to expand their computer’s capabilities to suit their needs thanks to an external graphics module.
  • The ability for gaming-grade or workstation-grade computers that don’t offer much in the way of graphics-upgrade potential like laptops or all-in-ones to be upgraded to multiple-GPU performance and the latest graphics-processor technology thanks to an add-on graphics module or card-cage. In some ways, it could bring the separate-boxes “hi-fi approach” to the concept of improving personal computer equipment.

Once a level playing field is achieved for Thunderbolt 3 over USB Type-C graphics docks through the use of open standards, it can lead to the idea of allowing low-profile and portable computers to benefit from high-performance graphics.


InFocus ups the capacity for its Kangaroo mini-PC

Article

The $170 Kangaroo Plus pocketable PC doubles the RAM and storage | Windows Central

Previous Coverage on HomeNetworking01.info

InFocus Presents A PC As Big As A Smartphone

From the horse’s mouth

InFocus

Press Release

My Comments

InFocus, known for their range of value-priced projectors, had previously released the Kangaroo mini-PC, which is about the size of a smartphone. But like most of the “Next Unit Of Computing” devices which represent the ultra-small “fixed-location” computers, this model used an Intel Atom CPU, 2GB of RAM and up to 32GB of solid-state storage. This made people think of them as “toys” rather than tools.

But InFocus raised the game for this series of computers by offering the Kangaroo Plus, a “deluxe” version of their small-form-factor computer. This one is equipped with 64GB of data storage and 4GB of RAM, which most computer users consider a realistic baseline amount of memory. It was offered in response to customers expressing a need for real capacities in both the “primary-storage” RAM and the non-volatile secondary storage.

There is still the ability to use an Apple iPad as the display and input surface for the InFocus Kangaroo through the use of a special cable and an iOS app. It also works with the Kangaroo Dock expansion module so you can safely upgrade your existing Kangaroo pocket computer to the bigger-capacity model without dumping that accessory.

Could this be a sign of hope for small-form-factor desktop computers to have specifications that can allow for most elementary desktop uses? Would this also be a sign that these computers could end up being specified as part of a standard operating environment?
