Computer building and repair Archive

Intel to offer integrated graphics fit for newer video games

Article

Intel Xe graphics strategy slide courtesy of Intel Corporation

Intel’s GPU strategy is rooted in Xe, a single architecture that can scale from teraflops to petaflops. At Architecture Day in August 2020, Intel Chief Architect Raja Koduri, Intel fellows and architects provided details on the progress Intel is making. (Credit: Intel Corporation)

Intel’s Xe Graphics Might Mean You No Longer Need a Separate Graphics Card to Play Games | Gizmodo Australia

Intel Xe Graphics: Release Date, Specs, Everything We Know | Tom’s Hardware

From the horse’s mouth

Intel

Intel Delivers Advances Across 6 Pillars of Technology, Powering Our Leadership Product Roadmap (Press Release)

My Comments

When one thinks of Intel’s graphics processor technology, one often thinks of the integrated graphics processors that use system RAM to “paint” the images you see on the screen. These graphics processors are typically not considered as capable as the dedicated graphics processors that NVIDIA or AMD offer, which use their own display memory.

Such processors are often associated with everyday business and personal computing needs like Web browsing, office productivity applications or consuming video content. They could be useful for basic photo editing or playing casual or “retro” games that aren’t graphically demanding, but wouldn’t do well with high-demand tasks like advanced photo/video editing or today’s video-game blockbusters.

Integrated graphics technology is typically preferred within laptops, tablets and 2-in-1s as an everyday graphics option for tasks like word-processing, Web surfing, basic video playback and the like. This is especially because these computers need to run in a power-efficient and thermal-efficient manner, being designed for portability and battery operation. Let’s not forget that laptops with discrete graphics also implement integrated graphics as a power-efficient “lean-burn” option.

This same graphics technology also appeals to low-profile desktop computers, including some “all-in-ones” and “next unit of computing” systems, due to the chipsets generating less heat and allowing for the compact design.

But most regular computers running desktop operating systems are nowadays specified with at least 8GB of system RAM, if not 16GB. Here, the amount of memory the integrated graphics borrows may be inconsequential for many graphics tasks using the computer’s own screen. Let’s not forget that the Full HD (1080p) screen resolution is often recommended for a laptop’s integrated screen due to it being a power-efficient specification.
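Some rough arithmetic shows why integrated graphics’ claim on system RAM is often inconsequential for display work on a machine with 8GB or more. This sketch only counts the display buffers themselves – drivers typically reserve extra shared memory for textures and the like, and all figures here are illustrative:

```python
# Rough framebuffer arithmetic: what triple-buffered display output costs
# an integrated graphics chipset in shared system RAM. Illustrative only;
# real drivers reserve additional shared memory beyond the display buffers.

BYTES_PER_PIXEL = 4  # 32-bit colour (RGBA)

def framebuffer_mb(width, height, buffers=3):
    """Approximate memory for a set of display buffers, in megabytes."""
    return width * height * BYTES_PER_PIXEL * buffers / 1024**2

full_hd = framebuffer_mb(1920, 1080)   # triple-buffered 1080p
uhd_4k = framebuffer_mb(3840, 2160)    # triple-buffered 4K UHD

system_ram_mb = 8 * 1024  # an 8GB machine
print(f"1080p buffers: {full_hd:.0f} MB ({full_hd / system_ram_mb:.2%} of 8GB)")
print(f"4K buffers:    {uhd_4k:.0f} MB ({uhd_4k / system_ram_mb:.2%} of 8GB)")
```

Even a 4K display setup only accounts for around 1% of an 8GB machine’s RAM in display buffers, which supports the Full HD recommendation being about power rather than memory.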

Intel has designed its new Xe graphics architecture, which will be part of the Tiger Lake computing platform, to be more capable than this. These GPU chips will maintain the same physical die size as prior Intel integrated graphics chips so that computer designs won’t need to be re-engineered when a silicon refresh to Tiger Lake is needed.

The more powerful Intel Xe variants will be offered with the more powerful Tiger Lake CPUs. These will be positioned similarly to the current-issue Intel Iris Plus integrated graphics processors and pitched at content creators. But I would say that these will simply appear in products similar to the former “multimedia laptops” that have increased multimedia performance.

One of the design goals for the Intel Xe LP (low-power) integrated GPUs, especially the higher-performance variants, is to play a graphically-rich AAA action game at Full HD resolution with a good frame rate. This would cater towards the preference for Full HD displays within 13”-15” laptops and similar portable computers, that display specification being more power-efficient than 4K UHD for that screen-size range.

A question I would raise is whether the frame rate would approach the 60fps standard, and how much of a power load this places on the computer’s battery. As well, one would need to know how much of the game’s “eye-candy” is enabled during play on an Intel Xe LP integrated graphics setup.

Intel Xe-HP graphics chipset presentation slide courtesy of Intel Corporation

Xe-HP is the industry’s first multitiled, highly scalable, high-performance architecture, providing data center-class, rack-level media performance, GPU scalability and AI optimization. It covers a dynamic range of compute from one tile to two and four tiles, functioning like a multicore GPU. At Architecture Day in August 2020, Intel Chief Architect Raja Koduri, Intel fellows and architects provided details on the progress Intel is making. (Credit: Intel Corporation)

Intel also intends to offer a dedicated graphics processor in the form of the Xe-LP-based chipset codenamed DG1. It will be the first dedicated GPU Intel has offered since the i740 graphics card of the late 1990s, which it pitched for use alongside its CPUs of that era. The higher-end Xe GPUs will be capable of ray-tracing amongst other high-end gaming features, and it could be interesting to see how these chipsets stand up against AMD or NVIDIA performance gaming GPUs.

The Intel Xe HP graphics platform will primarily be pitched at data-centre and server applications. But Intel intends to offer a “client-computing” variant of this high-performance graphics platform as the Xe HPG, pitched at enthusiasts and gamers who value performance. I am not sure what form factors this will appear in, be it a mobile dedicated GPU for performance-focused laptops, “all-in-ones” or small external graphics modules, or a desktop expansion card for that gaming rig or “card-cage” external graphics module.

But Intel would need to offer this GPU not just as a “contract install” unit for computer builders to supply on a line-fit basis, but also through the “build-it-yourself” computer-aftermarket sector that serves hobbyist “gaming-rig” builders and the external-graphics-module scene. This is the sector that NVIDIA and AMD currently dominate.

The accompanying software will implement adaptive graphics-optimisation approaches, including “there-and-then” performance tuning, in order to cater towards new high-performance software needs. This would avoid the need to update graphics driver software just to run the latest games.

It could be seen as an attempt by Intel to cover the spread between entry-level and mid-tier graphics performance. This could be a chance for Intel to make a mark for themselves when it comes to all-Intel computers pitched at everyday or modest computing expectations.

I also see Intel’s Xe graphics processor products as a way for them to become a significant third force in higher-performance “client computer” graphics processing technology. With NVIDIA and AMD both working on newer graphics silicon platforms, this could definitely “liven up” the market there.

But it could lead to one or two of these companies placing a lot of effort on the high-end graphics technology space, including offering such technology to the aftermarket, while the others maintain an effort towards supplying entry-level and mid-tier graphics solutions primarily as original-equipment specification or modest aftermarket options.


Windows 10 to offer greater control over graphics processors

Article

Dell XPS 17 laptop press picture courtesy of Dell Australia

Microsoft will be introducing support for better management of graphics processors in computers like the Dell XPS 17 that have Thunderbolt 3 and onboard discrete graphics processors

Windows 10 to give power users more control over their GPUs | Bleeping Computer

New Windows 10 Build 20190 brings default GPU & GPU per application settings along with new post-update experience in Dev channel | Windows Central

My Comments

Windows has previously provided operating-system-level support for handling multiple graphics processors. This reflects the reality of many mainstream laptops being equipped with a discrete graphics processor alongside the integrated graphics chipset, giving users the ability to switch between high performance and power efficiency depending on whether the computer is on AC power or batteries.

Akitio Node Thunderbolt 3 "card cage" external graphics module - press image courtesy of Akitio

.. so users can choose whether an app uses the onboard discrete graphics processor or a desktop graphics card installed in an external graphics module like this Akitio Node unit

This was previously achieved at the hardware or graphics-chipset level, with special software like NVIDIA Optimus managing the switching. Microsoft then added this functionality to Windows, allowing you to determine whether an application uses the higher-performance or the power-efficient graphics processor.

But in an upcoming feature upgrade, Microsoft will allow fine-tuned control over which graphics chipset applications use. The function is in testing through their Windows Insider beta-testing program.

It will cater towards users who run two or more discrete graphics processors on their system, such as a gaming rig with two graphics cards, or a laptop / all-in-one / low-profile desktop that has a discrete GPU plus Thunderbolt 3 and is connected to an external graphics module. This kind of configuration is primarily offered on Intel-powered premium consumer and business clamshell laptops with a screen size of 15” or more, as well as most of the Intel-powered performance-focused laptops.

Here, the users can specify which graphics processor is the one for high-performance computing or can specify that a particular application uses a particular graphics processor.
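Under the hood, Windows records these per-application choices as string values in the user’s registry, reportedly under `HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences`, with the executable’s path as the value name. The exact value format sketched below is an assumption based on observed behaviour, not official documentation:

```python
# Sketch of the string Windows appears to store for a per-application GPU
# preference. The GpuPreference= format and the meaning of the numbers are
# assumptions based on observed registry values, not an official API.

GPU_PREFERENCES = {
    "auto": 0,             # let Windows decide
    "power_saving": 1,     # usually the integrated GPU
    "high_performance": 2, # usually the discrete or external GPU
}

def gpu_preference_value(mode: str) -> str:
    """Build the registry string for a GPU preference, e.g. 'GpuPreference=2;'."""
    return f"GpuPreference={GPU_PREFERENCES[mode]};"

# On Windows, such a value would be written with the winreg module, e.g.:
# import winreg
# key = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
#                        r"Software\Microsoft\DirectX\UserGpuPreferences")
# winreg.SetValueEx(key, r"C:\Games\game.exe", 0, winreg.REG_SZ,
#                   gpu_preference_value("high_performance"))

print(gpu_preference_value("high_performance"))
```

The Settings app is simply a friendlier front-end over this kind of per-executable record.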

It is also being driven by the rise of USB4 and Thunderbolt 4, where there will be an effort to make these high-performance USB-C-based peripheral-connection ports ubiquitous and affordable. This could then open up the path for laptops and low-profile / all-in-one desktops to have these ports with their presence being sold on the ability to upgrade the computer’s graphics with an external graphics module.

In the case of a laptop equipped with a discrete GPU and Thunderbolt 3, users may find that the onboard discrete GPU is really a “mobile-grade” type that is intended to be power-efficient but doesn’t perform as well as a desktop graphics card. Here, they would install a desktop graphics card into a “card-cage” external graphics module and connect it to the computer for better graphics performance.

This may work as a way to allow the use of “fit-for-purpose” graphics processors like a mobile-workstation GPU for a CAD program alongside a gaming-grade GPU for a game. Or a user could run a video-editing program specifically with a graphics processor that is good at rendering the video content while they have another graphics processor available for other tasks.

Personally, I would also like to see Windows offer the ability for users to create an order of preference among high-performance graphics processors, either for default high-performance use or for a particular application’s needs. This would come into its own for graphics-card reviewers who are comparing against their “daily-driver” graphics card, or people who are moving a Thunderbolt-3-equipped laptop between multiple external graphics modules.

Similarly, the control over multiple-graphics-processor setups that Windows is to offer could also evolve towards “task-specific” GPU use. Here, it could be about focusing a graphics chipset towards batch calculation workloads rather than display-focused workloads. This is because people involved with video-editing, media transcoding, statistics, cryptocurrency or similar tasks may prefer to use the kind of chipset that is a “number-cruncher” for those tasks rather than one that excels at interactive computing.

At least Microsoft is working towards answering the needs of power users who deal with two or more graphics processors as part of their personal-computing lives.


Intel to make graphics driver updates independent of PC manufacturer customisations

Article

Dell XPS 13 Kaby Lake

Laptops with Intel graphics infrastructure like this Dell XPS 13 will benefit from having any manufacturer-specific customisations to the graphics driver software delivered as a separate item from the driver code itself

Intel graphics drivers can now be updated separately from OEM customizations | Windows Central

From the horse’s mouth

Intel

Intel Graphics – Windows 10 DCH drivers (Latest download site)

My Comments

Intel is now taking a different approach to packaging the necessary Windows driver software for its graphics infrastructure. This will affect any of us who have Intel graphics infrastructure in our computers, including those of us who have Intel integrated-graphics chipsets working alongside third-party discrete graphics infrastructure in our laptops as an energy-saving measure.

Previously, computer or motherboard manufacturers who wanted to apply any customisations to their Intel integrated-graphics driver software for their products had to package the customisations with the driver software as a single entity. Typically it was to allow the computer manufacturer to optimise the software for their systems or introduce extra display-focused features peculiar to their product range.

Dell Inspiron 15 Gaming laptop

.. even if the Intel graphics architecture is used as a “lean-burn” option for high-performance machines like this Dell Inspiron 15 7000 Gaming laptop when they are run on battery power

This caused problems for those of us who wanted to keep the driver software up-to-date to get the best out of the integrated graphics infrastructure in our Intel-based laptops.

If you wanted to benefit from the manufacturer-supplied software customisations, you had to go to the manufacturer’s software-support Website to download the latest drivers which would have your machine’s specific customisations.

Here, the latest version of the customised drivers may be out-of-step with the latest graphics-driver updates offered by Intel at its Website and if you use Intel’s driver packages, you may not benefit from the customisations your machine’s manufacturer offered.

The different approach Intel is using is to have the graphics driver and the customisations specific to your computer delivered as separate software packages.

Here, Intel will be responsible for maintaining their graphics-driver software as a separate generic package which will have API “hooks” for any manufacturer-specific customisation or optimisation code to use. Users can pick this up from the Intel driver-update download site, the manufacturer’s software update site or Windows Update. Then the computer manufacturer will be responsible for maintaining the software peculiar to their customisations and offering the updates for that software via their support / downloads Website or Microsoft’s Windows Update.
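The split described above resembles a plugin arrangement: a generic driver that exposes hook points, with separately-delivered OEM customisation packages registering against them. This hypothetical Python sketch illustrates the idea only and does not reflect Intel’s actual driver internals:

```python
# Hypothetical sketch of a generic driver exposing "hooks" that a
# separately-delivered OEM customisation package can register against.
# All names and structure are illustrative, not Intel's actual design.

class GenericGraphicsDriver:
    def __init__(self):
        self._hooks = {"post_init": [], "display_mode_set": []}

    def register_hook(self, event: str, callback):
        """OEM customisation packages attach their tweaks here."""
        self._hooks[event].append(callback)

    def set_display_mode(self, width, height):
        settings = {"width": width, "height": height, "colour_profile": "sRGB"}
        # Let each installed customisation adjust the settings in turn.
        for hook in self._hooks["display_mode_set"]:
            settings = hook(settings)
        return settings

# A separately-updated OEM package would only ship something like this:
def oem_colour_tuning(settings):
    settings["colour_profile"] = "OEM-calibrated"
    return settings

driver = GenericGraphicsDriver()
driver.register_hook("display_mode_set", oem_colour_tuning)
print(driver.set_display_mode(1920, 1080)["colour_profile"])
```

Because the hook interface stays stable, Intel can update the generic driver and the manufacturer can update its tuning package on independent schedules, which is the point of the new packaging approach.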

It may be seen as a two-step process if you are using Intel’s and your computer manufacturer’s Websites or software-update apps for this purpose. On the other hand, if you rely on Windows Update as your driver-update path, this process would be simplified.

The issue of providing computer-specific customisations for software drivers associated with other hardware subsystems may end up being revisited after Intel’s effort. This will be more so with sound subsystems in laptops that have their audio tuned by a name of respect in the audio industry, or common network chipsets implemented in a manufacturer-peculiar manner.

At least you can have your cake and eat it when it comes to running the latest graphics drivers on your Intel-based integrated-graphics-equipped laptop.


External or portable USB storage devices–what is the difference

Article

USB portable hard disk

An example of a portable hard disk

Portable vs external hard drive: Which one to buy | Gadgets Now

My Comments

Outboard storage devices are becoming differentiated into two distinct classes – a desktop-grade “external” device and a smaller “portable” device.

Typically, a desktop-grade “external” hard disk or enclosure will house a 3.5” or larger desktop-grade storage device intended for installation in a traditional desktop computer or a server. As well, they will have their own power supply, provided either by integrated electronics or a power brick like what you would use for a laptop computer. Better designs may offer their own USB hub that will allow you to connect more USB devices to your computer, and some will even be multiple-disk RAID setups. This class of device also extends to external graphics modules and other USB expansion devices.

On the other hand, a portable hard disk or enclosure will house a 2.5” or smaller storage device and be powered from the host computer via its USB port. These are optimised for use on the road, and their hard disks implement data-protection measures to factor in the unpredictable nature of that use. These units will typically be more expensive “per gigabyte” than the desktop-grade units.

Hard-disk-based devices will typically use a 3.5” drive spinning at 7200rpm for external units or a 2.5” drive spinning at 5400rpm for portable units, with the higher-speed drives assumed to offer better data-access performance. There are also solid-state-storage devices being released as outboard storage, primarily for portable use, and most of these will be more expensive per gigabyte than portable hard disks.
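The “per gigabyte” comparison above is easy to run for yourself when shopping. All prices in this sketch are made-up examples, not quotes for any real product:

```python
# Illustrative price-per-gigabyte comparison between the outboard storage
# classes discussed above. Prices are invented examples for the arithmetic.

def price_per_gb(price_dollars: float, capacity_gb: int) -> float:
    """Unit cost of storage in dollars per gigabyte."""
    return price_dollars / capacity_gb

drives = {
    'desktop external (3.5", 4TB)': price_per_gb(120, 4000),
    'portable (2.5", 2TB)': price_per_gb(90, 2000),
    'portable SSD (1TB)': price_per_gb(150, 1000),
}

for name, ppg in drives.items():
    print(f"{name}: ${ppg:.3f}/GB")
```

The usual pattern this illustrates: desktop-grade externals cost the least per gigabyte, portable hard disks sit in the middle, and portable SSDs cost the most.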

The desktop-grade “external” devices may be a better solution if you intend to have the device staying at one location all the time. This could be to cater towards extra storage for a desktop or laptop computer where you intend to use the data at home or the workplace, to expand a NAS’s storage capacity or to connect to a router for use as a baseline NAS.

On the other hand, the portable devices can be useful if it is very likely that you will take your data with you or move the device around. This could be to: use as external backup or offload storage for your laptop computer; to store a disk full of confidential data in a locked filing cabinet and only connect the disk to your computer on an “as-needed” basis; create a “grab and run” backup of critical data; or to have a large amount of data that could be used on other computers.

What is worth noting is that the portable devices are designed to handle unpredictable storage environments where increased movement is likely, while the desktop “external” hard disks are primarily suited to a normally-sessile usage environment.


The trends affecting personal-computer graphics infrastructure

Article

AMD Ryzen CPUs with integrated Vega graphics are great for budget-friendly PC gaming | Windows Central

My Comments

Dell Inspiron 13 7000 2-in-1 Intel 8th Generation CPU at QT Melbourne hotel

Highly-portable computers of the same ilk as the Dell Inspiron 13 7000 2-in-1 will end up with highly-capable graphics infrastructure

A major change that will affect personal-computer graphics subsystems is that those subsystems that have a highly-capable graphics processor “wired-in” on the motherboard will be offering affordable graphics performance for games and multimedia.

One of the reasons is that graphics subsystems delivered as an expansion card have become very pricey, even exorbitantly expensive, thanks to the cryptocurrency gold rush. This is because the GPUs (graphics processors) on these expansion cards are being used simply as dedicated computational processors for mining cryptocurrency. This situation is placing higher-performance graphics out of the reach of most home and business computer users who want to benefit from this feature for work or play.

But the reality is that we will be asking our computers’ graphics infrastructure to realise images with a resolution of 4K or more, with high colour depths and dynamic range, on at least one screen. There is even the reality that everyone will dabble in games or advanced graphics work at some point in their computing lives, and even expect a highly-portable or highly-compact computer to perform this job.

Integrated graphics processors as powerful as economy discrete graphics infrastructure

One of the directions Intel is taking is to design their own integrated graphics processors that use the host computer’s main RAM but serve with the equivalent performance of a baseline dedicated graphics processor that uses its own memory. This takes advantage of the fact that most recent computers are being loaded with at least 4GB of system RAM, if not 8GB or 16GB. The aim is to support power economy when a laptop is powered by its own battery, while still supporting some casual gaming or graphics tasks.

Discrete graphics processors on the same chip die as the computer’s main processor

Intel Corporation is introducing the 8th Gen Intel Core processor with Radeon RX Vega M Graphics in January 2018. It is packed with features and performance crafted for gamers, content creators and fans of virtual and mixed reality. (Credit: Walden Kirsch/Intel Corporation)

This Intel CPU+GPU chipset will be the kind of graphics infrastructure for portable or compact enthusiast-grade or multimedia-grade computers

Another direction that Intel and AMD are taking is to integrate a discrete graphics subsystem on the same chip die (piece of silicon) as the CPU i.e. the computer’s central “brain”, to provide “enthusiast-class” or “multimedia-class” graphics in a relatively compact form factor without generating extra heat or drawing too much power. These features make it appeal towards laptops, all-in-one computers and low-profile desktops such as the ultra-small “Next Unit of Computing” or consumer / small-business desktop computers, where silent operation and highly-compact housings are desirable.

Both CPU vendors are implementing AMD’s Radeon Vega graphics technology alongside some of their CPU designs – AMD on the same die in its Ryzen APUs, and Intel on the same package in its Core chips with Radeon RX Vega M graphics.

Interest in separate-chip discrete graphics infrastructure

Dell Inspiron 15 Gaming laptop

The Dell Inspiron 15 7000 Gaming laptop – the kind of computer that will maintain traditional soldered-on discrete graphics infrastructure

There is still an interest in discrete graphics infrastructure that uses its own silicon soldered to the motherboard. NVIDIA and AMD, especially the former, are offering this kind of infrastructure as a high-performance option for gaming laptops and compact high-performance desktop systems, along with high-performance motherboards for own-build computer projects such as “gaming rigs”. The latter case would typify a situation where one builds the computer with one of these motherboards but installs a newer, better-performing graphics card at a later date.

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module press picture courtesy of Sonnet Systems

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module – the way to go for ultraportables

This same option is also being offered as part of the external graphics modules that are being facilitated thanks to the Thunderbolt 3 over USB-C interface. The appeal of these modules is that a highly-portable or highly-compact computer can benefit from better graphics at a later date thanks to one plugging in one of these modules. Portable-computer users can benefit from the idea of working with high-performance graphics where they use it most but keep the computer lightweight when on the road.

Graphics processor selection in the operating system

For those computers that implement multiple graphics processors, Microsoft is making it easier to determine which graphics processor an application is to use, with the view of allowing the user to select whether the application should work in a performance or power-economy mode. This feature is destined for the next major iteration of Windows 10.

Here, it avoids the issues associated with NVIDIA Optimus and similar multi-GPU-management technologies where this feature is managed through an awkward user interface. Microsoft is even making sure that a user who runs external graphics modules has the same level of control as one running a system with two graphics processors on the motherboard.

What I see now is an effort by the computer-hardware industry to make graphics infrastructure for highly-compact or highly-portable computers offer similar levels of performance to baseline or mid-tier graphics infrastructure available to traditional desktop computer setups.


Microsoft to improve user experience and battery runtime for mobile gaming

Article – From the horse’s mouth

Candy Crush Saga gameplay screen Android

Microsoft is researching how games like Candy Crush Saga can offer full enjoyment without demanding much power

Microsoft Research

RAVEN: Reducing Power Consumption of Mobile Games without Compromising User Experience (Blog Post)

My Comments

A common frustration that we all face when we play video games on a laptop, tablet or smartphone is that these devices run out of battery power after a relatively short amount of playing time. It doesn’t matter whether we use a mobile-optimised graphics infrastructure like what the iPad or our smartphones are equipped with, or a desktop-grade graphics infrastructure like the discrete or integrated graphics chipsets that laptops are kitted out with.

What typically happens in gameplay is that the graphics infrastructure paints multiple frames to create the illusion of movement. But most games tend to show static images for a long time, usually while we are planning the next move in the game. In a lot of cases these situations may use a relatively small area where animation takes place, be it to show a move occurring or a “barberpole” animation – a looping animation that exists for effect when no activity takes place.

Microsoft is working on an approach to “painting” the interaction screens in a game that avoids the CPU and graphics infrastructure devoting too much effort to the task. The goal is to allow a game to be played without consuming too much power, taking advantage of human visual perception when scaling the frames needed to make an animation. There is also the concept of predictability for interpreting subsequent animations.

But a lot of the theory behind the research is very similar to how most video-compression codecs and techniques work. Here, these codecs use a “base” frame that acts as a reference and data that describes the animation that takes place relative to that base frame. Then during playback or reception, the software reconstructs the subsequent frames to make the animations that we see.

The research is mainly about an energy-efficient approach to measuring these perceptual differences during interactive gameplay based on the luminance component of a video image. Here, the luminance component of a video image would be equivalent to what you would have seen on a black-and-white TV. This therefore can be assessed without needing to place heavy power demands on the computer’s processing infrastructure.
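The luminance-difference idea described above can be sketched in a few lines: compare successive frames on their luminance (Y) component only, and skip re-rendering when the perceptual change falls below a threshold. This is my own minimal illustration of the concept, not Microsoft’s RAVEN code; the threshold value is an arbitrary assumption, and the luma coefficients are the standard ITU-R BT.601 ones:

```python
# Minimal sketch of luminance-based frame comparison, in the spirit of the
# RAVEN research described above (not Microsoft's actual implementation).
# Frames are nested lists of (R, G, B) tuples; threshold is an assumption.

def luminance(r, g, b):
    """ITU-R BT.601 luma - roughly what a black-and-white TV displayed."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def mean_luma_difference(frame_a, frame_b):
    """Average absolute luminance change per pixel between two frames."""
    total, count = 0.0, 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pixel_a, pixel_b in zip(row_a, row_b):
            total += abs(luminance(*pixel_a) - luminance(*pixel_b))
            count += 1
    return total / count

def should_rerender(frame_a, frame_b, threshold=2.0):
    """Only spend CPU/GPU effort when the scene visibly changed."""
    return mean_luma_difference(frame_a, frame_b) > threshold

still = [[(10, 20, 30)] * 4] * 4   # a static interaction screen
moved = [[(200, 20, 30)] * 4] * 4  # the same screen after visible motion

print(should_rerender(still, still))  # static screen - save power
print(should_rerender(still, moved))  # visible change - redraw
```

Because only one channel per pixel is examined, such a check stays cheap enough that it doesn’t itself become a power drain, which is the point the research makes.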

The knowledge can then be used for designing graphics-handling software for games that are to be played on battery-powered devices, or to allow a “dual-power” approach for Windows, MacOS and Linux games. This is where a game can show a detailed animation with high performance on a laptop connected to AC power but allow it not to show that detailed animation while the computer is on battery power.


Windows to fully manage multiple graphics processor setups

Article – From the horse’s mouth

Dell Inspiron 15 Gaming laptop

The Dell Inspiron 15 7000 Gaming laptop – the process of selecting which graphics processor  an app or game should use in this Optimus-equipped laptop will soon be managed by Windows 10

Microsoft

Announcing Windows 10 Insider Preview Build 17093 for PC (Windows Experience Blog)

Previous Coverage

What is a GPU all about?

My Comments

Over the last few years, an increasing number of laptop-computer manufacturers worked with graphics-card vendors to implement dual-graphics-processor setups in their portable computing products.

This offers a function that works in a similar manner to the “performance / economy” or “sports mode” switch present in an increasing number of cars, where the transmission can be set to give the car sports-like performance or to allow it to work more efficiently, typically by determining when the transmission changes gear in relation to the engine’s RPM. NVIDIA markets this function as Optimus while AMD markets it as Dynamic Switchable Graphics.

Sony VAIO S Series ultraportable STAMINA-SPEED switch

Sony VAIO S Series – equipped with dual graphics with an easy-to-use operating-mode switch

Initially, Sony implemented a hardware switch to select the graphics processor on their VAIO S Series that I previously reviewed, but typically you manage this function through a control app offered by NVIDIA or AMD depending on the discrete graphics chipset installed. From my experience, these programs can be very confusing to operate, especially if you want to allow particular software to run in high-performance or economy mode, or simply override these settings.

Intel Corporation is introducing the 8th Gen Intel Core processor with Radeon RX Vega M Graphics in January 2018. It is packed with features and performance crafted for gamers, content creators and fans of virtual and mixed reality. (Credit: Walden Kirsch/Intel Corporation)

This Intel CPU+GPU chipset will be the reason Microsoft will be providing operating-system management of multiple graphics processors

Microsoft have now integrated into a preview build of the next iteration of Windows 10 the ability to manage these settings through the operating system’s interface. This setup also applies to desktop systems equipped with two discrete GPUs, such as a baseline graphics card and a performance-focused graphics card, or systems connected to an external graphics module. It can even cater towards a computer equipped with two built-in graphics processors plus an external graphics module – a situation that can be made real when a machine with Intel’s new CPU+discrete-GPU system-on-chip, or a gaming laptop with a regular games-grade GPU, also has a Thunderbolt 3 port.

Akitio Node Thunderbolt 3 "card cage" external graphics module - press image courtesy of Akitio

.. as will external graphics modules like this Akitio Node Thunderbolt 3 “card cage” external graphics module

The user experience requires you to select a program, be it a Classic (traditional Windows desktop) app or a Universal (Windows Store) app, then choose whether to let the system choose the GPU to use, to use the GPU offering the highest performance, or to use the GPU that is the most economical. This caters for external graphics modules or systems with three graphics processors by choosing the one with the most horsepower, typically the graphics processor in an external graphics module.

An application or game can still choose the graphics processor it works with, and this management ability won’t override that choice. The ability to choose a program’s graphics processor on the basis of power-saving or higher performance makes it feasible to work with setups where you may connect or disconnect GPUs on a whim, such as when you use external graphics modules.

What users may eventually want is to allow Windows to select the graphics processor for an application based on the kind of power source the host computer is using. Here, such an option could allow an app to use high-performance graphics like a discrete graphics chipset while the computer is running from AC power, but use a power-conserving graphics setup while running on batteries.

Other goals could include the ability for Windows to marshal multiple graphics processors to optimise graphics and system performance for particular situations. This could range from using the integrated graphics processor for supplementary tasks in a setup that relies on a discrete or external graphics processor for its main graphics needs, to allocating particular GPUs to particular display clusters.

At least Microsoft has started on the idea of “baking in” multiple-graphics-processor management into Windows 10 rather than relying on software supplied by graphics-processor vendors to do the job.


Sonnet shows off a highly-portable external graphics module

Articles

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module press picture courtesy of Sonnet Systems

Sonnet eGFX Breakaway Puck integrated-chipset external graphics module – the way to go for ultraportables

This Little Box Can Make Even the Junkiest Laptop a Gaming PC | Gizmodo

From the horse’s mouth

Sonnet

eGFX Breakaway Puck (Product Page, Press Release)

My Comments

There has been a rise in external graphics modules that connect to your laptop or small-form-factor desktop computer via its Thunderbolt 3 port. They allow this class of computer to benefit from better graphics hardware even though you can't fit a graphics card inside them. They also appeal to users who have an ultraportable computer and mainly want the advanced graphics in a particular environment like home or the office, but don't care about it on the road.

A highly-portable approach to giving the Dell XPS 13 Kaby Lake and its ilk discrete graphics

But most of these devices have come in the form of a “card cage” which houses a desktop-grade graphics card, which is all well and good if you are thinking of using gaming-grade or workstation-grade desktop graphics hardware. However, these “card-cage” units take up a lot of space, something that may not suit cramped desktop or entertainment-unit spaces.

Acer previously offered one of these external graphics modules with an integrated NVIDIA graphics chipset, but Sonnet has now come to the fore with the eGFX Breakaway Puck, which uses integrated AMD Radeon graphics silicon. The device is sold as two models: one equipped with the AMD Radeon RX 560 GPU and another with the better-performing Radeon RX 570.

The Sonnet eGFX Breakaway Puck can be stuffed into a backpack's pocket, making it appeal to users who are likely to travel more. Sonnet also offers a VESA-compliant bracket so the module can be mounted on a display stand or arm for those of us who want as much free space on the desktop as possible.

Connectivity for external displays comes in the form of three DisplayPort outputs and one HDMI 2.0b output to cater for multi-monitor setups. It also uses the USB Power Delivery standard to supply up to 45W to the host computer, which can mean you don't need the computer's own charger.

There could be some improvements regarding connectivity, like another Thunderbolt 3 / USB-C port for connecting other peripherals, something of concern with ultraportables that have very few ports. But I see this opening up the idea of similarly-sized integrated-chipset external graphics modules, both as highly-portable add-ons for laptop computers and as “building blocks” for small-form-factor “NUC-style” desktop computer setups.


WD cracks the 14 Terabyte barrier for a standard desktop hard disk

Article

HGST Ultrastar HS14 14TB hard disk press image courtesy of Western Digital

Western Digital 14TB hard drive sets storage record | CNet

From the horse’s mouth

HGST by Western Digital

Ultrastar HS14 14TB hard disk

Product Page

Press Release

My Comments

Western Digital has broken the record for data stored on a 3.5” hard disk with the HGST by WD Ultrastar HS14 hard disk.

This 3.5” hard disk can store 14TB of data, a significant increase in data density for mechanical disk-based storage. It implements HelioSeal construction, a hermetically-sealed enclosure filled with helium that permits thinner platters while reducing cost, cooling requirements and power consumption.

At the moment, this hard disk is being pitched at heavy-duty enterprise, cloud and data-center computing rather than regular desktop or small-NAS applications. Where I see these ultra-high-capacity hard disks earning their keep is in localised data-processing applications where non-volatile secondary storage is an important part of the equation.

Such situations would include content-distribution networks, such as those behind the Netflix application, or edge / fog computing where data has to be processed and held locally. These applications depend on relatively small devices that can be installed close to where the data is created or consumed, like telephone exchanges, street cabinets or telecommunications rooms.

I would expect this level of data density to flow through to other hard disks and the devices based on them. For example, applying it to the 2.5” form factor could see those hard disks approaching 8TB or more, yielding highly capacious compact storage devices. The same capacity could also be made available in hard drives that suit regular desktop computers and NAS units.
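A rough back-of-the-envelope check of that 2.5” figure, assuming capacity scales with recordable platter area and using the nominal form-factor diameters as a stand-in for real platter sizes (my own illustrative assumptions, not WD specifications):

```python
# Back-of-the-envelope: what the Ultrastar HS14's areal density could mean
# for a 2.5-inch drive. Nominal form-factor diameters are used as a proxy
# for actual platter diameters, and platter count is assumed unchanged,
# both of which are simplifications.

def scaled_capacity_tb(capacity_tb: float,
                       from_diameter_in: float,
                       to_diameter_in: float) -> float:
    """Scale capacity by platter area, which is proportional to diameter squared."""
    return capacity_tb * (to_diameter_in / from_diameter_in) ** 2

if __name__ == "__main__":
    estimate = scaled_capacity_tb(14.0, 3.5, 2.5)
    print(f"{estimate:.1f} TB")  # ~7.1 TB at the same platter count
```

That lands near the 8TB mark quoted above; squeezing extra platters into a thicker 2.5” enclosure is what would push it past it.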


NVIDIA offers external graphics module support for their Quadro workstation graphics cards

Articles

Razer Blade gaming Ultrabook connected to Razer Core external graphics module - press picture courtesy of Razer

NVIDIA allows you to turn a high-performance Ultrabook like the Razer Blade into a mobile workstation when you plant a Quadro graphics card in an external graphics module like the Razer Core

Nvidia rolls out external GPU support for Nvidia Quadro | CNet

NVIDIA External GPUs Bring New Creative Power to Millions of Artists and Designers | MarketWired

From the horse’s mouth

NVIDIA

Press Release

My Comments

Over the last year, there has been a slow trickle of external graphics modules that “soup up” the graphics capabilities of laptops, all-in-ones and highly-compact desktops using outboard graphics processors. Typically these devices connect to the host computer via Thunderbolt 3, which carries a PCI Express 3.0 x4 connection, the same standard that desktop expansion cards use, albeit with fewer lanes than a typical x16 slot.
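The numbers behind that comparison are worth spelling out. Thunderbolt 3 signals at 40 Gbit/s and tunnels a PCIe 3.0 x4 link; PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so the usable lane bandwidth works out as follows:

```python
# Rough numbers behind the Thunderbolt 3 / PCI Express comparison.
# PCIe 3.0 signals at 8 GT/s per lane, and 128b/130b line encoding means
# 128 of every 130 transferred bits are payload.

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Usable PCIe 3.0 bandwidth in Gbit/s after 128b/130b encoding."""
    return lanes * 8.0 * 128 / 130

if __name__ == "__main__":
    x4 = pcie3_bandwidth_gbps(4)    # ~31.5 Gbit/s, what Thunderbolt 3 tunnels
    x16 = pcie3_bandwidth_gbps(16)  # ~126 Gbit/s, a full desktop x16 slot
    print(f"x4: {x4:.1f} Gbit/s, x16: {x16:.1f} Gbit/s")
```

So an external module gives a graphics card roughly a quarter of the bandwidth of a desktop x16 slot, which is plenty for most workloads but worth knowing about.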

At the moment, this approach for improving a computer system’s graphics abilities has been focused towards gaming-grade graphics cards and chipsets, which has left people who want workstation-grade graphics performance in the lurch.

But NVIDIA has answered this problem by providing a driver update for their TITAN X and Quadro workstation graphics cards. This allows Windows to work with these cards even if they are installed in a “card-cage” external graphics module rather than on the host computer’s motherboard.

NVIDIA is also starting to allow external-graphics-module manufacturers to submit their products for certification, so they are proven by NVIDIA to let these cards work reliably at optimum performance. This differs from the certified-workstation concept, where all the components in a particular computer are certified by Autodesk and similar software vendors to work reliably and perform at their best with their CAD or similar software.

What is being pitched in this context is a “thin-and-light” laptop of the Dell XPS 13 kind (including the 2-in-1 variant), an “all-in-one” desktop like the HP Envy 34 Curved All-In-One, or an ultra-compact “next unit of computing” device like the Intel Skull Canyon being able to do workstation-class tasks with the kind of graphics card that best suits the job.

The question some workstation users will then raise is whether the computer's main processor and RAM are up to these tasks even with a workstation-grade graphics card added on; they may consider the approach unsatisfactory even if the host computer has plenty of RAM and/or runs a Core i7 CPU. But something like a gaming laptop with a gaming-calibre graphics chipset may benefit from a Quadro in an external “card-cage” module when it is destined to do a lot of video editing, CAD or animation work.

Personally, I see the Quadro workstation graphics chipset in an external graphics module as a way for hobbyists and small-time professionals to get a foot in the door of high-performance workstation computing.
