Category: Graphics subsystems

Pocket-sized NVIDIA external graphics module about to arrive

Article

Pocket AI external graphics module press image courtesy of Adlink

Pocket AI portable external graphics module

Pocketable RTX 3050 GPU Has Thunderbolt Power | Tom’s Hardware (tomshardware.com)

Adlink readies Pocket AI palm-sized portable Nvidia GPU for AI applications – NotebookCheck.net News

From the horse’s mouth

Adlink

Pocket AI | Portable GPU with NVIDIA RTX GPU (Product Page)

ADLINK launches portable GPU accelerator with NVIDIA RTX A500 in NVIDIA GTC (Press Release)

My Comments

Sonnet offered one of the first Thunderbolt-3-connected portable external graphics modules that you can put in your laptop bag or backpack to improve your laptop’s graphics performance when you are away from home or work. It doesn’t matter whether you are using an external monitor or the laptop’s own display to see the results. Subsequently a few other manufacturers offered similarly-portable external graphics modules that achieve the same goal in something that can be easily transported.

Dell XPS 13 8th Generation Ultrabook at QT Melbourne rooftop bar

The Dell XPS 13 series of ultraportable computers and other Thunderbolt-3-equipped laptops can benefit from external graphics modules

But a mixture of circumstances, ranging from COVID with its associated supply-chain issues to the “Crypto Gold Rush” bubble, impacted the availability and cost of graphics processors. This meant that it was hard to find a portable external graphics module within the price range of even the most serious creator or gamer.

Subsequently the supply-chain issues started to ease and the Crypto Gold Rush bubble burst, which made more graphics-processor silicon available. Some manufacturers even made a point of positioning particular graphics silicon away from crypto mining through means like software crippling or creating auxiliary processor cards dedicated to that task. This has freed up the supply of graphics silicon for high-performance-computing tasks like gaming, creator / prosumer computing, certified workstations and artificial intelligence.

That led to Sonnet refreshing their portable external graphics module with newer AMD graphics silicon and offering it as two different units powered by different GPUs. That allowed for some form of product differentiation in this product class.

Now Adlink are promoting a pocketable external graphics module for use with Thunderbolt-3-equipped laptops and “next-unit-of-computing” computers. This integrated external graphics module, which is powered by NVIDIA RTX silicon, is being pitched at artificial-intelligence use cases but can come into play for gamers, creators and the like.

The Pocket AI external graphics module uses a soldered-in NVIDIA RTX A500 mobile graphics processor with 4GB of display memory. The Tom’s Hardware article reckoned that this offered performance akin to the NVIDIA RTX 3050 mobile graphics processor offered for recent entry-level discrete-GPU laptops.

Rather than using a device-specific power supply, the Pocket AI uses a USB Power-Delivery-compliant power supply with at least 45 watts output. This means you could get away with a USB-PD charger that works from household AC mains power or from DC power used in automotive, marine or aircraft setups; or even a suitably-rated powerbank.
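As a rough illustration of that power requirement (not vendor code), you could sanity-check whether a charger’s advertised fixed USB-PD profiles reach the 45-watt floor; the voltage / amperage figures below are hypothetical examples of typical charger profiles:

```python
# Illustrative sketch only: check whether a USB-PD charger's advertised
# fixed power profiles (voltage, amperage pairs) can cover a device's
# stated minimum power draw. The profile figures are hypothetical.

def meets_pd_requirement(profiles, required_watts=45.0):
    """Return True if any advertised (volts, amps) profile reaches the floor."""
    return any(volts * amps >= required_watts for volts, amps in profiles)

# A typical 65 W laptop charger advertises something like these fixed profiles:
laptop_charger = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 3.25)]
phone_charger = [(5.0, 3.0)]  # a 15 W charger falls short

print(meets_pd_requirement(laptop_charger))  # the 15 V / 3 A profile hits 45 W
print(meets_pd_requirement(phone_charger))
```

The same arithmetic applies to a powerbank: it is the wattage of its best USB-PD profile, not its amp-hour capacity, that determines whether it can drive the module.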

It then connects to the host computer using a Thunderbolt 3 or 4 connection, as expected for an external graphics module. But there don’t seem to be any external connections for other devices like monitors or other USB devices.

Here, you would benefit from the essential features offered by NVIDIA for its recent GeForce RTX graphics processors including RTX Broadcast which would benefit videocall and online broadcast use cases.

It is a pocketable unit, which you may say is equivalent to something like a pocket radio or early-generation pocket scientific calculator that is run on AA batteries. This leads to something that doesn’t take much room in a messenger style laptop bag, briefcase or backpack.

The Pocket AI external graphics module is being pitched towards artificial-intelligence and machine-learning use cases where a secondary processor is needed to facilitate these tasks.

But it is also being pitched towards gamers, with the commentary being about adding a bit of “pep” to the gaming experience on something like a Dell XPS 13. It is also said to come into its own for work like graphics and multimedia, statistics or computer programming on those kinds of computers. It is also about bringing NVIDIA RTX-specific features to your computer for use cases like video conferencing, video editing and similar uses, and would easily court the “creator” or “prosumer” user class.

From what I see of the Adlink Pocket AI external graphics module, it could benefit from further connectivity options like DisplayPort or HDMI ports for external monitors. Or it could be about having a downstream Thunderbolt 3/4 port to allow it to be used with Thunderbolt 3 hubs or docks.

What this device shows is a slow resurgence of reasonably-priced discrete GPUs, including external graphics modules, that has come about after the crypto-asset bubble burst. It may even allow computer manufacturers to invest in Thunderbolt 4 and related connection options on laptops, all-in-one computers and low-profile desktop computers for these devices; as well as to see a wide range of external graphics modules on the shelf.

It could be a chance for more manufacturers to offer highly-portable external graphics modules with entry-level or modest graphics silicon to court people who want that bit more out of their thin-and-light notebook computers.

Intel to offer integrated graphics fit for newer video games

Article

Intel Xe graphics strategy slide courtesy of Intel Corporation

Intel’s GPU strategy is rooted in Xe, a single architecture that can scale from teraflops to petaflops. At Architecture Day in August 2020, Intel Chief Architect Raja Koduri, Intel fellows and architects provided details on the progress Intel is making. (Credit: Intel Corporation)

Intel’s Xe Graphics Might Mean You No Longer Need a Separate Graphics Card to Play Games | Gizmodo Australia

Intel Xe Graphics: Release Date, Specs, Everything We Know | Tom’s Hardware

From the horse’s mouth

Intel

Intel Delivers Advances Across 6 Pillars of Technology, Powering Our Leadership Product Roadmap (Press Release)

My Comments

When one thinks of Intel’s graphics processor technology, they often think of the integrated graphics processors that use system RAM to “paint” the images you see on the screen. Typically these graphics processors are not considered as capable as the dedicated graphics processors of the kind NVIDIA or AMD offer, which use their own display memory.

Such processors are often associated with everyday business and personal computing needs like Web browsing, office productivity applications or consuming video content. They could be useful for basic photo editing or playing casual or “retro” games that aren’t graphically demanding, but wouldn’t do well with high-demand tasks like advanced photo/video editing or today’s video-game blockbusters.

Integrated graphics technology is typically preferred within laptops, tablets and 2-in-1s as an everyday graphics option for tasks like word-processing, Web surfing, basic video playback and the like. This is especially because these computers need to run in a power-efficient and thermal-efficient manner, due to them being designed for portability and to be run on battery power. Let’s not forget that laptops with discrete graphics also implement integrated graphics for use as a power-efficient “lean-burn” option.

This same graphics technology also appeals to low-profile desktop computers including some “all-in-ones” and “next unit of computing” systems due to the chipsets yielding less heat and allowing for the compact design.

But most regular computers running desktop operating systems are nowadays specified with at least 8GB of system RAM, if not 16GB. Here, the amount of memory claimed by the integrated graphics may be inconsequential for some graphics tasks using the computer’s own screen. Let’s not forget that the Full HD (1080p) screen resolution is often recommended for a laptop’s integrated screen due to it being a power-efficient specification.
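To put rough numbers on that, here is a back-of-the-envelope sketch of raw framebuffer sizes at those two resolutions. It deliberately ignores textures, intermediate buffers and the driver’s shared-memory reservation, which claim considerably more, so treat it as illustrative arithmetic only:

```python
# Back-of-the-envelope: a single 32-bit framebuffer is tiny next to 8-16 GB
# of system RAM, one reason the integrated-graphics memory carve-out is
# often inconsequential for basic display tasks.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one frame at the given resolution, in mebibytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

print(round(framebuffer_mb(1920, 1080), 1))  # Full HD: roughly 7.9 MB per frame
print(round(framebuffer_mb(3840, 2160), 1))  # 4K UHD: roughly 31.6 MB per frame
```

Even the 4K figure is a fraction of one percent of an 8GB system, which is why the resolution argument for laptops centres on power draw rather than memory.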

Intel has defined its new Xe graphics infrastructure platform that will be part of the Tiger Lake computing platform to be more capable than this. These GPU chips will maintain the same physical die size as prior Intel integrated graphics chips so as to avoid the need to reengineer computer designs when a silicon refresh to Tiger Lake is needed.

The more powerful Intel Xe variants will be offered with more powerful Tiger Lake CPUs. It will be similar to the current-issue Intel Iris Plus integrated graphics processors, and will be pitched for content creators. But I would say that these will simply appear in products similar to the former “multimedia laptops” that have increased multimedia performance.

One of the design goals for the Intel Xe LP (low-power) integrated GPUs, especially the higher-performance variants, is to play a graphically-rich AAA action game at Full HD resolution with a good frame rate. Being able to play such a game that way would cater towards the preference for Full HD displays within 13”-15” laptops and similar portable computers, due to this display specification being more power-efficient than 4K UHD displays for that screen-size range.

A question I would raise is whether the frame rate would approach the 60-frames-per-second standard, and how much of a power load this places on the computer’s batteries. As well, one would need to know how much of the game’s “eye-candy” is enabled during play on an Intel Xe LP integrated graphics setup.

Intel Xe-HP graphics chipset presentation slide courtesy of Intel Corporation

Xe-HP is the industry’s first multitiled, highly scalable, high-performance architecture, providing data center-class, rack-level media performance, GPU scalability and AI optimization. It covers a dynamic range of compute from one tile to two and four tiles, functioning like a multicore GPU. At Architecture Day in August 2020, Intel Chief Architect Raja Koduri, Intel fellows and architects provided details on the progress Intel is making. (Credit: Intel Corporation)

Intel also intends to offer a dedicated graphics processor in the form of the Xe HP chipset codenamed DG1. It will be the first dedicated GPU that Intel has offered since 1998-2000, when they partnered on a graphics card for use with Pentium III and Celeron CPUs. This GPU will be capable of ray-tracing amongst other high-end gaming activities, and it could be interesting to see how this chipset stands up to AMD or NVIDIA performance gaming GPUs.

The Intel Xe HP graphics platform will primarily be pitched at data center and server applications. But Intel is intending to offer a “client-computing” variant of this high-performance graphics platform as the Xe HPG. Here, this will be pitched at enthusiasts and gamers who value performance. But I am not sure what form factors this will appear in, be it a mobile dedicated GPU for performance-focused laptops and “all-in-ones” or small external graphics modules, or as a desktop expansion card for that gaming rig or “card-cage” external graphics module.

But Intel would need to offer this GPU not just as a “contract install” unit for computer builders to supply on a line-fit basis, but also through the “build-it-yourself” / computer-aftermarket sectors that serve hobbyist “gaming-rig” builders and the external graphics module sector. This is the sector where NVIDIA and AMD dominate.

The accompanying software will implement adaptive graphics optimisation approaches including “there-and-then” performance tuning in order to cater towards new high-performance software needs. This would be seen as avoiding the need to update graphics driver software to run the latest games.

It could be seen as an attempt by Intel to cover the spread between entry-level graphics performance and mid-tier graphics performance. This could be a chance for Intel to make a mark for themselves when it comes to all-Intel computers pitched for everyday or modest computing expectations.

I also see Intel’s Xe graphics processor products as a way for them to be a significant third force when it comes to higher-performance “client computer” graphics processing technology. With NVIDIA and AMD working on newer graphics silicon platforms, this could definitely “liven up” the market there.

But it could lead to one or two of these companies placing a lot of effort on the high-end graphics technology space including offering such technology to the aftermarket. This is while one or two maintain an effort towards supplying entry-level and mid-tier graphics solutions primarily as original-equipment specification or modest aftermarket options.

Windows 10 to offer greater control over graphics processors

Article

Dell XPS 17 laptop press picture courtesy of Dell Australia

Microsoft will be introducing support for better management of graphics processors in computers like the Dell XPS 17 that have Thunderbolt 3 and onboard discrete graphics processors

Windows 10 to give power users more control over their GPUs | Bleeping Computer

New Windows 10 Build 20190 brings default GPU & GPU per application settings along with new post-update experience in Dev channel | Windows Central

My Comments

Previously Windows provided operating-system-level support for handling multiple graphics processors. This is due to the reality of many mainstream laptops being equipped with a discrete graphics processor alongside the integrated graphics chipset, giving users the ability to switch between high performance and power efficiency depending on whether the computer is on AC power or batteries.

Akitio Node Thunderbolt 3 "card cage" external graphics module - press image courtesy of Akitio

.. so users can choose whether an app uses the onboard discrete graphics processor or a desktop graphics card installed in an external graphics module like this Akitio Node unit

This was achieved either at hardware or graphics-chipset level with special software like NVIDIA Optimus achieving this goal. Then Microsoft recently added this functionality to Windows to allow you to determine whether an application chooses the higher-performance or power-efficient graphics processor.

But in an upcoming feature upgrade, Microsoft will allow fine-tuned control over which graphics chipset applications use. The function is in testing through their Windows Insider beta-testing program.

It will cater towards users who run two or more graphics processors on their system, such as a gaming rig with two graphics cards or a laptop / all-in-one / low-profile desktop that has a discrete GPU and Thunderbolt 3 and is connected to an external graphics module. This kind of configuration is primarily offered on Intel-powered premium consumer and business clamshell laptops with a screen size of 15” or more, as well as most of the Intel-powered performance-focused laptops.

Here, the users can specify which graphics processor is the one for high-performance computing or can specify that a particular application uses a particular graphics processor.
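For the curious, Windows 10 stores this per-application choice as string values under the `HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences` registry key, one value per executable path. The sketch below builds that value string; the mapping is my reading of public documentation rather than Microsoft code, and the application path shown is hypothetical:

```python
# Sketch of the per-app GPU preference string Windows 10 keeps under
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences - an assumption
# based on public documentation, not Microsoft's own code.

PREFERENCE_CODES = {
    "let-windows-decide": 0,
    "power-saving": 1,       # usually the integrated GPU
    "high-performance": 2,   # usually the discrete or external GPU
}

def gpu_preference_value(mode: str) -> str:
    """Build the registry value string, e.g. 'GpuPreference=2;'."""
    return f"GpuPreference={PREFERENCE_CODES[mode]};"

print(gpu_preference_value("high-performance"))  # GpuPreference=2;

# On Windows you would write it with the standard winreg module, keyed by
# the application's full path (hypothetical path shown):
#   winreg.SetValueEx(key, r"C:\Games\example.exe", 0, winreg.REG_SZ,
#                     gpu_preference_value("high-performance"))
```

The Settings app writes the same values through its Graphics Settings page, so editing the registry directly is rarely necessary.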

It is also being driven by the rise of USB4 and Thunderbolt 4, where there will be an effort to make these high-performance USB-C-based peripheral-connection ports ubiquitous and affordable. This could then open up the path for laptops and low-profile / all-in-one desktops to have these ports with their presence being sold on the ability to upgrade the computer’s graphics with an external graphics module.

In the case of a laptop equipped with a discrete GPU and Thunderbolt 3, users may find that the onboard discrete GPU is really a “mobile-grade” type that is intended to be power-efficient but doesn’t perform as well as a desktop graphics card. Here, they would install a desktop graphics card into a “card-cage” external graphics module and connect it to the computer for better graphics performance.

This may work as a way to allow the use of “fit-for-purpose” graphics processors like a mobile-workstation GPU for a CAD program alongside a gaming-grade GPU for a game. Or a user could run a video-editing program specifically with a graphics processor that is good at rendering the video content while they have another graphics processor available for other tasks.

Personally, I would also like to see Windows offer the ability for users to create an order of preference for high-performance graphics processors, either for default high-performance use or for a particular application’s needs. This would come into its own for graphics-card reviewers who are comparing against their “daily-driver” graphics card, or people who are moving a Thunderbolt-3-equipped laptop between multiple external graphics modules.

Similarly, the control over multiple-graphics-processor setups that Windows is to offer could also evolve towards “task-specific” GPU use. Here, it could be about focusing a graphics chipset towards batch calculation workloads rather than display-focused workloads. This is because people involved with video-editing, media transcoding, statistics, cryptocurrency or similar tasks may prefer to use the kind of chipset that is a “number-cruncher” for those tasks rather than one that excels at interactive computing.

At least Microsoft is working towards answering the needs of power users who deal with two or more graphics processors as part of their personal-computing lives.

Intel to make graphics driver updates independent of PC manufacturer customisations

Article

Dell XPS 13 Kaby Lake

Laptops with Intel graphics infrastructure like this Dell XPS 13 will benefit from having any manufacturer-specific customisations to the graphics driver software delivered as a separate item from that driver code

Intel graphics drivers can now be updated separately from OEM customizations | Windows Central

From the horse’s mouth

Intel

Intel Graphics – Windows 10 DCH drivers (Latest download site)

My Comments

Intel is now taking a different approach to packaging the necessary Windows driver software for its graphics infrastructure. This will affect any of us who have Intel graphics infrastructure in our computers, including those of us who have Intel integrated-graphics chipsets working alongside third-party discrete graphics infrastructure in our laptops as an energy-saving measure.

Previously, computer or motherboard manufacturers who wanted to apply any customisations to their Intel integrated-graphics driver software for their products had to package the customisations with the driver software as a single entity. Typically it was to allow the computer manufacturer to optimise the software for their systems or introduce extra display-focused features peculiar to their product range.

Dell Inspiron 15 Gaming laptop

.. even if the Intel graphics architecture is used as a “lean-burn” option for high-performance machines like this Dell Inspiron 15 7000 Gaming laptop when they are run on battery power

This caused problems for those of us who wanted to keep the driver software up-to-date to get the best out of the integrated graphics infrastructure in our Intel-based laptops.

If you wanted to benefit from the manufacturer-supplied software customisations, you had to go to the manufacturer’s software-support Website to download the latest drivers which would have your machine’s specific customisations.

Here, the latest version of the customised drivers may be out-of-step with the latest graphics-driver updates offered by Intel at its Website and if you use Intel’s driver packages, you may not benefit from the customisations your machine’s manufacturer offered.

The different approach Intel is using is to have the graphics driver and the customisations specific to your computer delivered as separate software packages.

Here, Intel will be responsible for maintaining their graphics-driver software as a separate generic package which will have API “hooks” for any manufacturer-specific customisation or optimisation code to use. Users can pick this up from the Intel driver-update download site, the manufacturer’s software update site or Windows Update. Then the computer manufacturer will be responsible for maintaining the software peculiar to their customisations and offering the updates for that software via their support / downloads Website or Microsoft’s Windows Update.

It may be seen as a two-step process if you are using Intel’s and your computer manufacturer’s Websites or software-update apps for this purpose. On the other hand, if you rely on Windows Update as your driver-update path, this process would be simplified.

The issue of providing computer-specific customisations for software drivers associated with computer hardware subsystems will likely be revisited after Intel’s effort. This will be more so with sound subsystems for those laptops that have their audio tuned by a name of respect in the audio industry, or common network chipsets implemented in a manufacturer-peculiar manner.

At least you can have your cake and eat it when it comes to running the latest graphics drivers on your Intel-based integrated-graphics-equipped laptop.

The trends affecting personal-computer graphics infrastructure

Article

AMD Ryzen CPUs with integrated Vega graphics are great for budget-friendly PC gaming | Windows Central

My Comments

Dell Inspiron 13 7000 2-in-1 Intel 8th Generation CPU at QT Melbourne hotel

Highly-portable computers of the same ilk as the Dell Inspiron 13 7000 2-in-1 will end up with highly-capable graphics infrastructure

A major change that will affect personal-computer graphics subsystems is that those subsystems that have a highly-capable graphics processor “wired-in” on the motherboard will be offering affordable graphics performance for games and multimedia.

One of the reasons is that graphics subsystems delivered as an expansion card are becoming very pricey, even astronomically expensive, thanks to the Bitcoin gold rush. This is because the GPUs (graphics processors) on the expansion cards are being used simply as dedicated computational processors for mining Bitcoin. This situation is placing higher-performance graphics out of the reach of most home and business computer users who want to benefit from this feature for work or play.

But the reality is that we will be asking our computers’ graphics infrastructure to realise images that have a resolution of 4K or more with high colour depths and dynamic range on at least one screen. There will even be the reality that everyone will be dabbling in games or advanced graphics work at some point in their computing lives and even expecting a highly-portable or highly-compact computer to perform this job.

Integrated graphics processors as powerful as economy discrete graphics infrastructure

One of the directions Intel is taking is to design their own integrated graphics processors that use the host computer’s main RAM but serve with the equivalent performance of a baseline dedicated graphics processor that uses its own memory. It is also taking advantage of the fact that most recent computers are being loaded with at least 4GB of system RAM, if not 8GB or 16GB. This is to support power economy when a laptop is powered by its own battery, but these processors can even support some casual gaming or graphics tasks.

Discrete graphics processors on the same chip die as the computer’s main processor

Intel Corporation is introducing the 8th Gen Intel Core processor with Radeon RX Vega M Graphics in January 2018. It is packed with features and performance crafted for gamers, content creators and fans of virtual and mixed reality. (Credit: Walden Kirsch/Intel Corporation)

This Intel CPU+GPU chipset will be the kind of graphics infrastructure for portable or compact enthusiast-grade or multimedia-grade computers

Another direction that Intel and AMD are taking is to integrate a discrete graphics subsystem on the same chip die (piece of silicon) as the CPU i.e. the computer’s central “brain” to provide “enthusiast-class” or “multimedia-class” graphics in a relatively compact form factor. It is also about not yielding extra heat or drawing too much power. These features make it appeal towards laptops, all-in-one computers and low-profile desktops such as the ultra-small “Next Unit of Computing” or consumer / small-business desktop computers, where it is desirable to have silent operation and highly-compact housings.

Both CPU vendors are implementing AMD’s Radeon Vega graphics technology on the same die as some of their CPU designs.

Interest in separate-chip discrete graphics infrastructure

Dell Inspiron 15 Gaming laptop

The Dell Inspiron 15 7000 Gaming laptop – the kind of computer that will maintain traditional soldered-on discrete graphics infrastructure

There is still an interest in discrete graphics infrastructure that uses its own silicon but soldered to the motherboard. NVIDIA and AMD, especially the former, are offering this kind of infrastructure as a high-performance option for gaming laptops and compact high-performance desktop systems; along with high-performance motherboards for own-build high-performance computer projects such as “gaming rigs”. The latter case would typify a situation where one would build the computer with one of these motherboards but install a newer better-performing graphics card at a later date.

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module press picture courtesy of Sonnet Systems

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module – the way to go for ultraportables

This same option is also being offered as part of the external graphics modules that are being facilitated thanks to the Thunderbolt 3 over USB-C interface. The appeal of these modules is that a highly-portable or highly-compact computer can benefit from better graphics at a later date thanks to one plugging in one of these modules. Portable-computer users can benefit from the idea of working with high-performance graphics where they use it most but keep the computer lightweight when on the road.

Graphics processor selection in the operating system

For those computers that implement multiple graphics processors, Microsoft is making it easier to determine which graphics processor an application is to use, with the view of allowing the user to select whether the application should work in a performance or power-economy mode. This feature is destined for the next major iteration of Windows 10.

Here, it avoids the issues associated with NVIDIA Optimus and similar multi-GPU-management technologies where this feature is managed with an awkward user interface. They are even making sure that a user who runs external graphics modules has that same level of control as one who is running a system with two graphics processors on the motherboard.

What I see now is an effort by the computer-hardware industry to make graphics infrastructure for highly-compact or highly-portable computers offer similar levels of performance to baseline or mid-tier graphics infrastructure available to traditional desktop computer setups.

Microsoft to improve user experience and battery runtime for mobile gaming

Article – From the horse’s mouth

Candy Crush Saga gameplay screen Android

Microsoft researching how games like Candy Crush Saga can work with full enjoyment but not demanding much power

Microsoft Research

RAVEN: Reducing Power Consumption of Mobile Games without Compromising User Experience (Blog Post)

My Comments

A common frustration that we all face when we play video games on a laptop, tablet or smartphone is that these devices run out of battery power after a relatively short amount of playing time. It doesn’t matter whether we use a mobile-optimised graphics infrastructure like what the iPad or our smartphones are equipped with, or a desktop-grade graphics infrastructure like the discrete or integrated graphics chipsets that laptops are kitted out with.

What typically happens in gameplay is that the graphics infrastructure paints multiple frames to create the illusion of movement. But most games tend to show static images for a long time, usually while we are planning the next move in the game. In a lot of cases, only a relatively small area of the screen is animated, be it to show a move taking place or to show a “barberpole” animation, which is a looping animation that exists for effect when no activity takes place.

Microsoft is working on an approach for “painting” the interaction screens in a game so as to avoid the CPU and graphics infrastructure devoting too much effort towards this task. This is a goal to allow a game to be played without consuming too much power and takes advantage of human visual perception for scaling frames needed to make an animation. There is also the concept of predictability for interpreting subsequent animations.

But a lot of the theory behind the research is very similar to how most video-compression codecs and techniques work. Here, these codecs use a “base” frame that acts as a reference and data that describes the animation that takes place relative to that base frame. Then during playback or reception, the software reconstructs the subsequent frames to make the animations that we see.
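As a toy sketch of that base-frame-plus-deltas idea (a deliberately simplified illustration, not any particular codec), treating a frame as a flat list of pixel values:

```python
# Toy illustration of base-frame + delta coding, not a real video codec:
# store only the pixels that changed, then rebuild the frame on playback.

def encode_deltas(base, frame):
    """Map pixel index -> new value for every pixel that differs from base."""
    return {i: new for i, (old, new) in enumerate(zip(base, frame)) if new != old}

def decode_frame(base, deltas):
    """Reconstruct the frame from the base frame plus the stored deltas."""
    return [deltas.get(i, pixel) for i, pixel in enumerate(base)]

base = [10, 10, 10, 10]
frame = [10, 99, 10, 42]
deltas = encode_deltas(base, frame)   # only the two changed pixels are stored
print(decode_frame(base, deltas) == frame)
```

A mostly-static game screen yields a tiny delta set, which is exactly why this style of encoding pays off for the long planning pauses described above.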

The research is mainly about an energy-efficient approach to measuring these perceptual differences during interactive gameplay based on the luminance component of a video image. Here, the luminance component of a video image would be equivalent to what you would have seen on a black-and-white TV. This therefore can be assessed without needing to place heavy power demands on the computer’s processing infrastructure.
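That luminance signal is cheap to derive; a minimal sketch using the classic ITU-R BT.601 weights (the same weighting behind that black-and-white picture) might look like this. It illustrates the general idea of a luma-based frame-difference metric, not the RAVEN implementation itself:

```python
# Minimal sketch: per-pixel luma via the ITU-R BT.601 weights, then a mean
# absolute luma difference between two frames as a cheap perceptual metric.
# An illustration of the general idea, not Microsoft's RAVEN code.

def luma(r, g, b):
    """BT.601 luma - roughly what a black-and-white TV would display."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def mean_luma_diff(frame_a, frame_b):
    """Average absolute luma change across two equal-length pixel lists."""
    diffs = [abs(luma(*a) - luma(*b)) for a, b in zip(frame_a, frame_b)]
    return sum(diffs) / len(diffs)

still = [(120, 80, 40)] * 4
moved = [(120, 80, 40)] * 3 + [(200, 200, 200)]
print(mean_luma_diff(still, still))   # 0.0 - a static scene needs no repaint
print(mean_luma_diff(still, moved) > 0)
```

Because only one channel per pixel is compared, such a metric costs far less than a full-colour comparison, which is the power-saving point of the research.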

The knowledge can then be used for designing graphics-handling software for games that are to be played on battery-powered devices, or to allow a “dual-power” approach for Windows, MacOS and Linux games. This is where a game can show a detailed animation with high performance on a laptop connected to AC power but allow it not to show that detailed animation while the computer is on battery power.

Windows to fully manage multiple graphics processor setups

Article – From the horse’s mouth

Dell Inspiron 15 Gaming laptop

The Dell Inspiron 15 7000 Gaming laptop – the process of selecting which graphics processor an app or game should use in this Optimus-equipped laptop will soon be managed by Windows 10

Microsoft

Announcing Windows 10 Insider Preview Build 17093 for PC (Windows Experience Blog)

Previous Coverage

What is a GPU all about?

My Comments

Over the last few years, an increasing number of laptop-computer manufacturers worked with graphics-card vendors to implement dual-graphics-processor setups in their portable computing products.

This offered a function that works in a similar manner to the “performance / economy” or “sports mode” switch present in an increasing number of cars. Here, the transmission can be set to give the car sports-like performance or to allow it work more efficiently, typically by determining when the transmission changes gear in relation to the engine’s RPM. NVIDIA markets this function as Optimus while AMD markets it as Dynamic Switchable Graphics.

Sony VAIO S Series ultraportable STAMINA-SPEED switch

Sony VAIO S Series – equipped with dual graphics with an easy-to-use operating-mode switch

Initially Sony implemented a hardware switch to select the graphics processor on their VAIO S Series that I previously reviewed, but you manage this function through a control app offered by NVIDIA or AMD depending on the discrete graphics chipset installed. From my experience, these programs can be very confusing to operate, especially if you want to allow particular software to run in high-performance or economy mode, or simply override these settings.

Intel Corporation is introducing the 8th Gen Intel Core processor with Radeon RX Vega M Graphics in January 2018. It is packed with features and performance crafted for gamers, content creators and fans of virtual and mixed reality. (Credit: Walden Kirsch/Intel Corporation)

This Intel CPU+GPU chipset will be the reason Microsoft will be providing operating-system management of multiple graphics processors

Microsoft has now integrated into a preview build of the next iteration of Windows 10 the ability to manage these settings through the operating system's own interface. This also applies to desktop systems equipped with two discrete GPUs, such as a baseline graphics card alongside a performance-focused graphics card, and to systems connected to an external graphics module. It can even cater for a computer that has two built-in graphics processors plus an external graphics module, a situation made real by Intel's new CPU-plus-discrete-GPU system-on-chip or a gaming laptop with a regular games-grade GPU, provided the computer also has a Thunderbolt 3 port.

Akitio Node Thunderbolt 3 "card cage" external graphics module - press image courtesy of Akitio

.. as will external graphics modules like this Akitio Node Thunderbolt 3 “card cage” external graphics module

Here, you select a program, be it a Classic (traditional Windows desktop) app or a Universal (Windows Store) app, then choose whether to let the system pick the GPU, always use the GPU offering the highest performance, or always use the most power-efficient GPU. This approach can also cater for external graphics modules or systems with three graphics processors by choosing the one with the most horsepower, typically the graphics processor in the external graphics module.
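As an illustration of where these per-app choices end up, recent Windows 10 builds persist the setting as a string value in the user's registry hive. The sketch below builds and writes that value from Python; the key path and the `GpuPreference` value format are assumptions based on observed behaviour of these builds rather than a documented API, so treat this as an illustrative sketch, not a supported method.

```python
import sys

# Observed GpuPreference values in recent Windows 10 builds (assumption):
# 0 = let Windows decide, 1 = power saving, 2 = high performance.
GPU_PREFERENCES = {"auto": 0, "power-saving": 1, "high-performance": 2}

# Assumed key path where the Settings app stores per-app choices.
REG_KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"

def preference_value(choice: str) -> str:
    """Build the registry value string for a per-app GPU preference."""
    return f"GpuPreference={GPU_PREFERENCES[choice]};"

def set_app_gpu_preference(exe_path: str, choice: str) -> None:
    """Persist the preference for one program (Windows only)."""
    if sys.platform != "win32":
        raise RuntimeError("Per-app GPU preferences only exist on Windows")
    import winreg
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, REG_KEY) as key:
        # The value name is the full path to the program's executable.
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          preference_value(choice))
```

For example, `set_app_gpu_preference(r"C:\Games\game.exe", "high-performance")` would mirror choosing "High performance" for that game in the Settings app, assuming the registry layout holds.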

An application or game can still choose which graphics processor to work with, and this management facility won't override that choice. Being able to nominate a power-saving or higher-performance graphics processor for a program makes it feasible to work with setups where you connect or disconnect GPUs on a whim, such as when you use external graphics modules.

What users may eventually want is to allow Windows to select the graphics processor for an application based on the kind of power source the host computer is using. Here, such an option could allow an app to use high-performance graphics like a discrete graphics chipset while the computer is running from AC power, but use a power-conserving graphics setup while running on batteries.
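That power-source-driven policy boils down to a very small selection rule. The Python sketch below is purely hypothetical (the GPU names and their efficiency ordering are my own assumptions) and simply shows the shape such a policy could take:

```python
def pick_gpu(on_ac_power: bool, gpus: list[str]) -> str:
    """Choose a GPU based on the host's power source.

    Assumes `gpus` is ordered from most power-efficient (integrated)
    to most powerful (discrete or external). On AC power we take the
    most powerful; on battery we fall back to the most efficient.
    """
    if not gpus:
        raise ValueError("no GPUs available")
    return gpus[-1] if on_ac_power else gpus[0]

# Hypothetical setup: integrated chipset plus an external module.
gpus = ["Integrated", "External"]
print(pick_gpu(True, gpus))   # plugged in: use the external GPU
print(pick_gpu(False, gpus))  # on battery: use the integrated GPU
```

A real implementation in the operating system would also have to migrate running applications gracefully when the power source changes, which is far harder than the selection rule itself.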

Other goals could include the ability for Windows to marshal multiple graphics processors to optimise graphics and system performance for particular situations. This could range from using the integrated graphics processor for supplementary tasks while a discrete or external graphics processor handles the main graphics load, through to allocating particular GPUs to particular display clusters.

At least Microsoft has started on the idea of “baking in” multiple-graphics-processor management into Windows 10 rather than relying on software supplied by graphics-processor vendors to do the job.

Sonnet shows off a highly-portable external graphics module

Articles

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module press picture courtesy of Sonnet Systems

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module – the way to go for ultraportables

This Little Box Can Make Even the Junkiest Laptop a Gaming PC | Gizmodo

From the horse’s mouth

Sonnet

eGFX Breakaway Puck (Product Page, Press Release)

My Comments

Increasingly, external graphics modules that connect to your laptop or small-form-factor desktop computer via its Thunderbolt 3 port have been on the rise. They have allowed this class of computer to benefit from better graphics hardware even though there is no room inside to fit a graphics card. Similarly, they would appeal to users who have an ultraportable computer and mainly want the advanced graphics in a particular environment like home or the office, but don't care about it on the road.

A highly-portable approach to giving the Dell XPS 13 Kaby Lake and its ilk discrete graphics

But most of these devices have come in the form of a "card cage" which houses a desktop-grade graphics card, which is all well and good if you are thinking of using gaming-grade or workstation-grade desktop graphics hardware. As well, these "card-cage" units take up a lot of space, something that may not be beneficial in cramped desktop or entertainment-unit spaces.

Acer previously issued one of these external graphics modules with an integrated NVIDIA graphics chipset, but Sonnet has now come to the fore with the eGFX Breakaway Puck, which uses integrated AMD Radeon graphics silicon. The device is sold as two different models: one equipped with the AMD Radeon RX 560 GPU and another with the better-performing Radeon RX 570 GPU.

The Sonnet eGFX Breakaway Puck can be stuffed into a backpack's pocket, making it appeal to users who are likely to be travelling more. As well, Sonnet offers a VESA-compliant bracket so that this external graphics module can be mounted on a display stand or arm for those of us who want as much space on the desktop as possible.

Connectivity for external displays is in the form of three DisplayPort outlets and one HDMI 2.0b outlet to cater for multi-monitor setups. It also exploits the USB Power Delivery standard to supply up to 45 watts to the host computer, which can mean you don't need to bring the computer's own charger along to power it.

There could be some improvements regarding connectivity, like having another Thunderbolt 3 / USB-C port for connecting other peripherals, something that matters for ultraportables that offer very few connections. But I see this opening up the idea of similarly-sized integrated-chipset external graphics modules, both as highly-portable "add-ons" for laptop computers and as "building blocks" for small-form-factor "NUC-style" desktop computer setups.

NVIDIA offers external graphics module support for their Quadro workstation graphics cards

Articles

Razer Blade gaming Ultrabook connected to Razer Core external graphics module - press picture courtesy of Razer

NVIDIA allows you to turn a high-performance Ultrabook like the Razer Blade in to a mobile workstation when you plant a Quadro graphics card in an external graphics module like the Razer Core

Nvidia rolls out external GPU support for Nvidia Quadro | CNet

NVIDIA External GPUs Bring New Creative Power to Millions of Artists and Designers | MarketWired

From the horse’s mouth

NVIDIA

Press Release

My Comments

Over the last year, there has been a slow trickle of external graphics modules that "soup up" the graphics capabilities of computers like laptops, all-in-ones and highly-compact desktops by using outboard graphics processors. Typically these devices connect to the host computer via a Thunderbolt 3 connection, which tunnels a four-lane PCI Express link of the kind used for desktop expansion cards.
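Some back-of-envelope arithmetic shows why that comparison holds, and also where it falls short of a full desktop slot. The figures below come from the published Thunderbolt 3 (40 Gbit/s link) and PCI Express 3.0 (8 GT/s per lane, 128b/130b encoding) specifications; the sketch is illustrative only.

```python
def pcie3_gbps(lanes: int) -> float:
    """Usable PCIe 3.0 bandwidth in Gbit/s for a given lane count.

    8 GT/s per lane, with 128b/130b line encoding eating a small
    fraction of the raw rate.
    """
    return lanes * 8 * (128 / 130)

tb3_link = 40.0  # Gbit/s, shared with any tunnelled DisplayPort traffic

print(round(pcie3_gbps(4), 1))    # the tunnelled x4 link: about 31.5 Gbit/s
print(pcie3_gbps(4) < tb3_link)   # it fits comfortably within Thunderbolt 3
print(pcie3_gbps(16) > tb3_link)  # but a desktop x16 slot far exceeds it
```

So an external graphics module gets a connection comparable to a PCIe 3.0 x4 slot, which is why very high-end cards can lose some performance in these enclosures compared to a desktop's x16 slot.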

At the moment, this approach for improving a computer system’s graphics abilities has been focused towards gaming-grade graphics cards and chipsets, which has left people who want workstation-grade graphics performance in the lurch.

But NVIDIA has answered this problem by providing a driver update for their TITAN X and Quadro workstation graphics cards. This allows Windows to work with these cards even if they are installed in a “card-cage” external graphics module rather than on the host computer’s motherboard.

Not just that, NVIDIA is to start allowing external-graphics-module manufacturers to tender their products for certification, so they are proven by NVIDIA to let these cards work reliably at optimum performance. This is not too different from the certified-workstation concept, where all the components in a particular computer are certified by Autodesk and similar software vendors to work reliably and perform at their best with their CAD or similar software.

What is being pitched in this context is a "thin-and-light" laptop of the Dell XPS 13 kind (including the 2-in-1 variant), an "all-in-one" desktop computer like the HP Envy 34 Curved All-In-One, or an ultra-compact "next unit of computing" device like the Intel Skull Canyon being able to do workstation-class tasks with the kind of graphics card that best suits this computing requirement.

Some workstation users will then ask whether the computer's main processor and RAM are up to these tasks even with a workstation-grade graphics card added on, and may consider this approach unsatisfactory even though the host computer has a lot of RAM and / or runs a Core i7 CPU. But something like a gaming laptop with a gaming-calibre graphics chipset may benefit from a Quadro in an external "card cage" graphics module when that system is destined to do a lot of video editing, CAD or animation work.

Personally, I see the concept of the Quadro workstation graphics chipset in an external graphics module as a way to allow hobbyists and small-time professionals to slowly put their foot in the door of high-performance workstation computing.

Investing in an external graphics module for your laptop

Razer Blade gaming Ultrabook connected to Razer Core external graphics module - press picture courtesy of Razer

Razer Blade gaming Ultrabook connected to Razer Core external graphics module

Just lately, as more premium and performance-grade laptops are being equipped with a Thunderbolt 3 connection, external graphics modules, also known as graphics docks or graphics docking stations, are starting to trickle on to the market as a performance-boosting accessory for these computers.

The Thunderbolt 3 connection, which uses the USB Type-C plug and socket, provides a throughput similar to a PCI Express card slot and has put forward a method of improving the graphics ability of a laptop, all-in-one or small-form-factor computer. This is facilitated by external graphics modules or docks that house graphics processors in external boxes and link them to the host computer over that connection. What it means is that these computers can benefit from desktop-grade or performance-grade graphics without heavy modification and, in the case of portable computers, you can enjoy "performance" graphics at home or in the office while keeping battery-conserving baseline graphics on the road.

Acer Aspire Switch 12S convertible 2-in-1 - press picture courtesy of Microsoft

Acer Aspire Switch 12S convertible 2-in-1 – can benefit from better graphics thanks to Thunderbolt 3 and an external graphics module

The devices come in two classes:

  • Integrated graphics chipset (Acer Graphics Dock) – devices of this class have a hardwired graphics chipset similar to what is implemented in an all-in-one or small-form-factor computer.
  • Card cage (Razer Core, Akitio Node) – These devices are simply a housing where you can install a PCI-Express desktop graphics card of your choice. They have a power supply and interface circuitry to present the desktop graphics card to the host computer via a Thunderbolt 3 connection.

What will they offer?

Akitio Node Thunderbolt 3 "card cage" external graphics module - press image courtesy of Akitio

Akitio Node Thunderbolt 3 “card cage” external graphics module

All these devices have their own video outputs, but what the high-performance graphics chipset provides can appear on the host computer's integral screen and on the video outputs integrated with the host computer, as well as on the module's own outputs. This is in contrast to what used to happen with desktop computers, where the video outputs associated with the integrated graphics chipset became useless once you installed a graphics card.

I have read a few early reviews covering the first generation of graphics modules and Thunderbolt 3 laptops. One of these covered Acer's integrated graphics module kitted out with an NVIDIA GTX 960M GPU, a chipset considered modest by desktop standards but top-shelf for laptop applications. This was run alongside an Acer TravelMate P658 and an Acer Aspire Switch 12S, providing as much as the graphics hardware would allow but highlighting where the weakness was: the mobile-optimised Intel Core M processor in the Switch 12S convertible.

Simplified plug-in expansion for all computers

Intel Skull Canyon NUC press picture courtesy of Intel

The Intel Skull Canyon NUC can easily be “hotted up” with better graphics when coupled with an external graphics module

Another example was a manufacturer's blog post about using their "card-cage" graphics dock with one of the Intel Skull Canyon "Next Unit Of Computing" midget computers, which is equipped with a Thunderbolt 3 connection. This showed how the computer's graphics performance increased once teamed with the different graphics cards installed in that "card-cage" module.

It opened up the idea of an "AV system" approach to enhancing small-form-factor and integrated computers. Here, you connect extra modules to these computers to increase their performance, just as you would connect a better CD player or turntable, substitute a more powerful amplifier, or plug in better speakers to improve your hi-fi system's sound quality.

This usage case would earn its keep with an “all-in-one” computer which has the integrated monitor, the aforementioned “Next Unit Of Computing” midget computers or simply a low-profile desktop computer that wouldn’t accommodate high-performance graphics cards.

Software and performance issues can be a real stumbling block

What I gathered from the material I read was that the host computer needs the latest version of the operating system, the latest BIOS and other firmware supporting graphics over Thunderbolt 3, and the latest drivers for this functionality before it can perform at its best. As well, the weakest link can drag down the overall performance of the system, which particularly applies to mobile system-on-chip chipsets tuned primarily to run cool and to allow for a slim, lightweight computer that can run on its own batteries for a long time.

At the moment, this product class is still not mature and there will be issues with compatibility and performance with the various computers and external graphics modules.

As well, not all graphics cards will work with every "card-cage" graphics module. This can be due to high-end desktop graphics cards drawing more current than the module can supply, something of concern with lower-end modules that have weaker power supplies, or due to software issues with cards that aren't from the popular NVIDIA or AMD games-focused lineups. You may have to check with the module's vendor or the card's vendor for newer software or firmware to be assured of compatibility.
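The current-draw side of that compatibility check can be reduced to simple arithmetic before you buy a card. This hypothetical Python sketch compares a card's rated board power against the enclosure's advertised GPU power budget; the 10% transient headroom is my own assumption, not a vendor specification.

```python
def card_fits(card_tdp_w: int, enclosure_gpu_budget_w: int,
              headroom: float = 0.10) -> bool:
    """Return True if the card's rated board power fits within the
    enclosure's GPU power budget.

    `headroom` leaves a margin for short power spikes above the rated
    figure; the default 10% is an assumption, not a vendor number.
    """
    return card_tdp_w * (1 + headroom) <= enclosure_gpu_budget_w

# Hypothetical figures: a 150 W card in a module with a 375 W GPU budget.
print(card_fits(150, 375))  # plenty of margin
print(card_fits(350, 375))  # too close to the limit once headroom is applied
```

The software-compatibility half of the question, of course, can't be reduced to arithmetic and still needs the vendor check described above.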

Multiple GPUs – a possible reality

A situation that may have to be investigated as more of these products arrive is the concurrent use of multiple graphics processors in the same computer system, no matter the interface or vendor. The ability to daisy-chain six Thunderbolt 3 devices on the same connection, along with premium desktop motherboards sporting this connection alongside their PCI Express expansion slots, will make the concept attractive and easy to implement. Similarly, some vendors could start offering Thunderbolt 3 expansion cards that plug into an existing motherboard's PCI Express slots to give existing desktop PCs this functionality.

Here, the goal would be to allow multiple GPUs from different vendors to work together to increase graphics performance for high-end games or multimedia-production tasks like video transcoding or rendering of video or animation projects. Or it could be about improving the performance and efficiency of a multiple-display setup by allocating particular graphics processors to particular displays, something that would benefit larger setups with many screens and, in some cases, different resolutions.

Highly-portable gaming setups being highlighted as a use case

A usage class always put forward for these external graphics modules was the teenage games enthusiast who is finishing senior secondary school and about to study at university. Here, the usage case underscored the situation where they could be living in student accommodation like a college dorm / residence hall, or in a share-house with other students.

The application focuses on the use of a laptop computer that can be taken around the campus but be connected to one of these modules when the student is at their home. I would add to this the ability to carry the graphics module between their room and the main lounge area in their home so that they could play their games on the bigger TV screen in that area. This is due to the device being relatively compact and lightweight compared to most desktop computers.

That same application covers people living in accommodation tied to their job, which is likely to change frequently as they take up different work placements. An example would be people whose work keeps them away from home for significant amounts of time, such as those who work on ships, oil rigs or mines. Here, some of these workers may use their laptop as part of their work during their shift where applicable, such as on a ship's bridge, but use it as a personal entertainment machine in their cabin or the mess room while off-shift.

What we could see more of in these devices

Once the external graphics modules mature as a device class, they could end up moving towards two or three classes of device.

One of these would be integrated modules with graphics chipsets considered modest for desktop use but premium for laptop use. The expansion abilities these may offer could be a few extra USB connections, an SD card reader and / or a higher-grade sound module; perhaps they may even come with an optical drive of some sort. Some manufacturers may offer integrated modules with higher-performance graphics chipsets along with more connections for those of us willing to pay a premium for extra performance and connectivity. These would be pitched towards people who want that bit more "pep" out of a highly-portable or compact computer that has integrated graphics.

Similarly, it could be feasible to offer larger-screen monitors which have discrete graphics chipsets integrated in them. They could also have the extra USB connections and / or secondary storage options, courting those users who are thinking of a primary workspace for their portable computer while desiring higher-performance graphics.

The card-cage variants could open up a class of device that has room for one or two graphics cards and, perhaps, sound cards or functionality-expansion cards. In some cases, this class of device could also offer connectivity and installation options for user-installable storage devices, along with extra sockets for other peripherals. This class of device could, again, appeal to those of us who want more out of the highly-compact computer they started with or that high-performance laptop rather than using a traditional desktop computer for high-performance computing.

Portable or highly-compact computers as a package

Manufacturers could offer laptops, all-in-ones and other highly-compact or highly-portable computers as part of matched-equipment packages, where one or more external graphics modules are offered as a deal-maker option or as part of the package. These could differ by graphics chipset and by functionality, such as external-equipment connectivity or integrated fixed or removable storage options.

This is in a similar vein to what has happened in the hi-fi trade since the 1970s, where manufacturers offered matched-equipment packages from their lineup of hi-fi components. Here, multiple packages could share the same tape deck, turntable or CD player, while each package was differentiated with increasingly-powerful amplifiers or receivers driving speakers of differing audio performance and cabinet size. It was still feasible to offer better and more capable source components with the more expensive packages, or to offer such devices as a way to sweeten the deal.

Conclusion

Expect that as more computers equipped with the Thunderbolt 3 over USB-C connection come on the market, the external graphics module will become a simplified method of improving these computers' graphics performance. It will be seen as a way of allowing highly-compact or highly-portable computers to benefit from high-performance graphics at some point in their life, something this class of computer normally wouldn't be able to do.