The Dell XPS 13 range of ultraportable laptops and 2-in-1 computers has over the last few years been seen as the “top of the pack” for that class of computer. Here, it has been about delivering the right mix of features, functionality and build quality for the price with this being reflected through the different generations of that computer.
In this case, there was an emphasis on the quality aspect of the Tiger Lake silicon refresh for the Dell XPS 13 series. This was about a faster range of CPUs, integrated graphics silicon that is on a par with baseline mobile discrete graphics silicon, and the use of Thunderbolt 4 connectivity, which improves on Thunderbolt 3 for reliability and connectivity. Here, the graphics improvement was about combining an ultraportable computer design with graphics processing technology that isn’t a wimp.
Even as a 2-in-1 that has been engineered to work with higher-power processors but not overheat
The computers will have a thinner, lighter design, with the 2-in-1 variant having improved thermal design to cater for the use of more powerful processing silicon. But that variant will be limited to the Intel Core i7-1165G7 as the most powerful CPU that can be specified. It will have the smallest integrated camera ever, clocking in at 2.25mm. The XPS 13 traditional laptop variant will use an edge-to-edge keyboard and achieve a 91.5% screen-to-body ratio.
XPS 13 computers that are specified with the 4K UHD+ display will have that display certified for HDR and Dolby Vision use. But computers specified with the Full HD screen will have a battery runtime rated for 19 hours. The question is whether that 19 hours holds up with a mixture of activities ranging from Web browsing and word processing to viewing video content or playing a game like Civilization 6 on that long flight or roadtrip.
These computers will normally be delivered with Windows 10 but Dell is offering the XPS 13 traditional clamshell laptop as a “Developer Edition” variant. Here, this will be preloaded with Ubuntu 18 Linux, which will please software tinkerers and open-source computing advocates.
The minimum prices for Australian users are AUD$2999 for the 2-in-1 variant and AUD$2499 for the clamshell variant. It will be interesting to see what the press reviews come up with when the review units start to appear – whether they underscore Dell’s commitment to keeping the right mix of features, functionality, build quality and price for these computers.
As I have previously reported, computer-equipment manufacturers are waking up to the realisation that prosumers and content creators are a market segment to address. This group of users was heavily courted by Apple with the MacOS platform but Windows-based computer vendors are answering this need as a significant amount of advanced content-creation and content-presentation software is being written for or ported to Windows 10.
Here, the vendors are tailoring the specifications of some of their performance-focused computers towards the kind of independent content creator or content presenter who seeks out their own work and manages their own IT. This can range from hobbyists, through those of us who create online content to supplement other activities, to small-time professionals who get work “by the job”. It can also appeal to small organisations that create or present content but don’t necessarily have their own IT departments, or at least not the kind of IT department that big corporations have.
Lenovo answered this market with a range of prosumer computers in the form of the Creator Series which encompassed two laptops and a traditional tower-style desktop. Now Dell is stepping up to the plate with their Creator Edition computer packages. Here, the approach is to have computers that are specified for content creation or content presentation, but aren’t workstation-class machines, identified with a distinct “Creator Edition” logo.
The first of these are the Creator Edition variants of the latest Dell XPS 17 desktop-replacement laptop. These have, for their horsepower, an Intel Core i7-10875H CPU and a discrete GPU in the form of the NVIDIA GeForce RTX 2060 with 6GB of display memory, based on the NVIDIA Max-Q mobile graphics approach. This will run RTX Studio graphics drivers that are tuned for content-professional use and will be part of the RTX Studio program that NVIDIA runs for content professionals.
The display used in these packages is a 17” 4K UHD touch display that is rated for 100% Adobe RGB colour accuracy. The storage capacity on these computers is 1 Terabyte in the form of a solid-state disk. The only difference between the two packages is that the cheaper variant runs with 16GB of system RAM while the premium variant has 32GB.
Dell is also offering a Creator Edition variant of its XPS-branded desktop computer products. This will be in the form of a traditional tower-style desktop computer but is equipped with the latest Intel Core i9 CPU and NVIDIA GeForce RTX 2070 Super graphics card, and able to be specced with up to 64GB of RAM and up to 2TB of storage. It has all the expandability of a traditional form-factor desktop computer, something that would come in handy for project studios where special audio and video interface cards come into play.
What is being shown here is that computer manufacturers are recognising the content-creator and prosumer market segment that wants affordable but decent hardware that can do the job. It will be interesting to see which other large computer manufacturers will come up to the plate with a product range courting the content creators and prosumers.
A problem with laptop design is that you can’t effectively mix the idea of a portable aesthetically-pleasing computer with a performance-focused design. It is still the Holy Grail of laptop design to combine these aspects in one machine.
This comes down to the requirement to provide enough power to the computer’s main processors – the central processing unit and the graphics processor for them to work your data and “paint” your screen. In some applications, the graphics processor is tasked with performing supplementary processing activities like rendering or transcoding edited video files or calculating statistics. As well there is the need to remove waste heat generated by the processing silicon so it can perform to expectation even when working hard.
As well, there is the proper full-size full-function keyboard on this gaming laptop
What typically happens is that a lightweight highly-portable computer won’t be engineered for anything beyond everyday computing tasks. This is while a performance-focused computer fit for gaming, photo/video editing or CAD will be a heavier and thicker machine that doesn’t look as aesthetically pleasing as the lightweight machines. Some of these computers even convey the look equivalent to an American or Australian muscle-car of the 1970s but most convey a look very similar to medium or large family cars that appeared at the end of the 20th century.
Lenovo is getting close to this Holy Grail by designing a 15” gaming laptop that is slimmer and lighter than typical gaming or other high-performance laptops of the same screen size. This laptop, known as the Legion Slim 7i, has had a significant amount of hardware and firmware engineering to achieve this goal of combining portability and performance.
It will use 10th-generation Intel Core i-series CPU silicon and NVIDIA Max-Q graphics silicon, with the latter known to avoid yielding too much waste heat for mobile use. But even Max-Q graphics silicon cannot handle excess waste heat, and the Intel Core silicon will underperform if there is too much of that heat.
Lenovo is implementing Dynamic Boost technology to steer power to the graphics processor where needed during graphics-intensive tasks like fast-paced gaming. It is augmented by NVIDIA’s Advanced Optimus technology that allows for task-appropriate graphics processor switching – whether to work with Intel integrated graphics for everyday computing as a “lean-burn” approach or to work the NVIDIA GPU for graphics-intensive activity.
There is also ColdFront 2.0 hardware-and-software-based thermal engineering which is about increasing airflow within the computer while under load. There are small perforations above the keyboard to allow the computer to draw in air for cooling along with a many-bladed fan that comes in when needed to move the air across three heat pipes.
The Legion Slim 7i gaming laptop will have the full-sized keyboard with a numeric keypad and media keys. This will have a feel similar to a desktop mechanical keyboard. There is a 71 watt-hour battery in the computer which could last up to 7.75 hours.
The baseline variant will weigh in at 2 kilograms and cost $1329. But it can be specced up to an Intel Core i9 CPU and NVIDIA RTX 2060 Max-Q graphics silicon. It can also have a maximum of 32GB of current-spec RAM and 2TB of NVMe solid-state storage. The screens are available either as a 4K UHD 60Hz display, a Full HD 144Hz display or a Full HD 60Hz display.
For connectivity, these units offer Thunderbolt 3 which means access to external graphics modules, along with Wi-Fi 6 and Bluetooth 5 support. You may have to consider using a USB-C or Thunderbolt 3 dock with an Ethernet connection if you are considering low-latency game-friendly Ethernet or HomePlug powerline network technology.
The Lenovo Legion Slim 7i gaming laptop is expected to be on the market by November this year in the USA at least. Personally, I could see this as a push towards performance being about beauty as well as grunt.
The Dell XPS 13 series of ultraportable computers uses a combination of Intel integrated graphics and Thunderbolt 3 USB-C ports
Increasingly, laptop users want to make sure their computers earn their keep for computing activities that are performed away from their home or office. But they also want the ability to do some computer activities that demand more from these machines like playing advanced games or editing photos and videos.
What is this about?
Integrated graphics infrastructure like the Intel UHD and Iris Plus GPUs allows your laptop computer to run for a long time on its own battery. This is thanks to the infrastructure using the system RAM to “paint” the images you see on the screen, along with being optimised for low-power mobile use. This is more so if the computer is equipped with a screen resolution of not more than the equivalent of Full HD (1080p), which also doesn’t put much strain on the computer’s battery capacity.
They may be seen as being suitable for day-to-day computing tasks like Web browsing, email or word-processing or lightweight multimedia and gaming activities while on the road. Even some games developers are working on capable playable video games that are optimised to run on integrated graphics infrastructure so you can play them on modest computer equipment or to while away a long journey.
There are some “everyday-use” laptop computers that are equipped with a discrete graphics processor along with the integrated graphics, with the host computer implementing automatic GPU-switching for energy efficiency. Typically the graphics processor doesn’t really offer much for performance-grade computing because it is a modest mobile-grade unit but may provide some “pep” for some games and multimedia tasks.
Thunderbolt 3 connection on a Dell XPS 13 2-in-1
But if your laptop has at least one Thunderbolt 3 USB-C port along with the integrated graphics infrastructure, it will open up another option. Here, you could use an external graphics module, also known as an eGPU unit, to add high-performance dedicated graphics to your computer while you are at home or the office. As well, these devices provide charging power for your laptop which, in most cases, would relegate the laptop’s supplied AC adaptor to an “on-the-road” or secondary charging option.
A use case often cited for this kind of setup is a university student who is studying on campus and wants to use the laptop in the library to do their studies or take notes during classes. They then want to head home, whether that is student accommodation like a dorm / residence hall on the campus, an apartment or house shared by a group of students, or their parents’ home within a short affordable commute from the campus. The use case typifies the idea of the computer being able to support gaming as a rest-and-recreation activity at home once everything they need to do is done.
Razer Core external graphics module with Razer Blade gaming laptop
Here, the idea is to use the external graphics module with the computer and a large-screen monitor, having the graphics power come into play during a video game. As well, if the external graphics module is portable enough, it may be about connecting the laptop to a large-screen TV installed in a common lounge area at their accommodation on an ad-hoc basis, so they benefit from that large screen when playing a game or watching multimedia content.
The advantage in this use case would be to have the computer affordable enough for a student at their current point in life thanks to it not being kitted out with a dedicated graphics processor that may turn out to be underwhelming anyway. But the student can save towards an external graphics module of their choice and get that at a later time when they see fit. In some cases, it may be about using a “fit-for-purpose” graphics card like an NVIDIA Quadro with the eGPU if they maintain interest in that architecture or multimedia course.
It also extends to business users and multimedia producers who prefer to use a highly-portable laptop “on the road” but use an external graphics module “at base” for those activities that need extra graphics power. Examples include rendering video projects or playing a more-demanding game as part of rest and relaxation.
Sonnet eGFX Breakaway Puck integrated-chipset external graphics module – the way to go for ultraportables
There are a few small external graphics modules that are provided with a soldered-in graphics processor chip. These units, like the Sonnet Breakaway Puck, are small enough to pack in your laptop bag, briefcase or backpack and can be seen as an opportunity to provide “improved graphics performance” when near AC power. There will be some limitations with these devices like a graphics processor that is modest by “desktop gaming rig” or “certified workstation” standards, or reduced connectivity for extra peripherals. But they will put a bit of “pep” into your laptop’s graphics performance at least.
Some of these small external graphics modules would have come about as a way to dodge the “crypto gold rush” where traditional desktop-grade graphics cards were very scarce and expensive. This was due to them being used as part of cryptocurrency mining rigs to facilitate the “mining” of Bitcoin or Ethereum during that “gold rush”. The idea behind these external graphics modules was to offer enhanced graphics performance for those of us who wanted to play games or engage in multimedia editing rather than mine Bitcoin.
Who is heading down this path?
At the moment, most computer manufacturers are configuring a significant number of Intel-powered ultraportable computers along these lines i.e. with Intel integrated graphics and at least one Thunderbolt 3 port. Good examples of this are the recent iterations of the Dell XPS 13 and some of the Lenovo ThinkPad X1 family like the ThinkPad X1 Carbon.
Of course some of the computer manufacturers are also offering laptop configurations with modest-spec discrete graphics silicon along with the integrated-graphics silicon and a Thunderbolt 3 port. This is typically pitched towards premium 15” computers including some slimline systems, but these graphics processors may not offer much when it comes to graphics performance. In this case, they are most likely to be equivalent in performance to a current-spec baseline desktop graphics card.
The Thunderbolt 3 port on these systems would be about using something like a “card-cage” external graphics module with a high-performance desktop-grade graphics card to get more out of your games or advanced applications.
Trends affecting this configuration
The upcoming USB4 specification is meant to be able to bring Thunderbolt 3 capability to non-Intel silicon thanks to Intel assigning the intellectual property associated with Thunderbolt 3 to the USB Implementers Forum.
As well, Intel has put forward the next iteration of the Thunderbolt specification in the form of Thunderbolt 4. It is more of an evolutionary revision in relation to USB4 and Thunderbolt 3 and will be part of Intel’s next iteration of Core silicon. But it is also intended to be backwards compatible with these prior standards and uses the USB-C connector.
What can be done to further legitimise Thunderbolt 3 / USB4 and integrated graphics as a valid laptop configuration?
What needs to happen is that the use case for external graphics modules needs to be demonstrated with USB4 and subsequent technology. As well, this kind of setup needs to appear on AMD-equipped computers as well as devices that use silicon based on ARM microarchitecture, along with Intel-based devices.
Personally, I would like to see Thunderbolt 3 or USB4 technology made available on more of the popularly-priced laptops sold to householders and small businesses. The ideal would be to allow the computer’s user to upgrade towards better graphics at a later date by purchasing an external graphics module.
This is in addition to a wide range of external graphics modules available for these computers with some capable units being offered at affordable price points. I would also like to see more of the likes of the Lenovo Legion BoostStation “card-cage” external graphics module that have the ability for users to install storage devices like hard disks or solid-state drives in addition to the graphics card. Here, these would please those of us who want extra “offload” storage or a “scratch disk” just for use at their workspace. They would also help people who are moving from the traditional desktop computer to a workspace centred around a laptop.
The validity of a laptop computer being equipped with a Thunderbolt 3 or similar port and an integrated graphics chipset needs to be recognised. This is more so where improving one of these systems with an external graphics module that has a fit-for-purpose dedicated graphics chipset can be considered viable.
Lenovo Flex 5G / Yoga 5G convertible notebook which runs Windows on Qualcomm ARM silicon – the first laptop computer to have 5G mobile broadband on board
Increasingly, regular computers are moving towards the idea of having processor power based around either classic Intel (x86/x64) or ARM RISC microarchitectures. This is being driven by the idea of portable computers heading towards the latter microarchitecture as a power-efficiency measure, a concept driven by its success with smartphones and tablets.
This entails a different approach to designing silicon, especially RISC-based silicon, where different entities are involved in design and manufacturing. Previously, Motorola took the same approach as Intel and other silicon vendors, designing and manufacturing their desktop-computing CPUs and graphics infrastructure themselves. Now ARM design the microarchitecture themselves, while other entities like Samsung and Qualcomm design the exact silicon for their devices.
Apple to move the Macintosh platform to their own ARM RISC silicon
As well, the Linux community have established Linux-based operating systems on ARM microarchitecture. This has led to Google running Android on ARM-based mobile and set-top devices and offering Chromebooks that use ARM silicon; along with Apple implementing it in their operating systems. Not to mention the many NAS devices and other home-network hardware that implement ARM silicon.
Initially, these alternative microarchitectures were about more sophisticated use cases like multimedia or “workstation-class” computing compared to basic word-processing and allied computing tasks. Think of the Motorola-based early Apple Macintosh computers, the Commodore Amiga with its many “demos” and games, or the RISC/UNIX workstations like the Sun SPARCstation that existed in the late 80s and early 90s. Now it is about power and thermal efficiency for a wide range of computing tasks, especially where portable or low-profile devices are concerned.
Already mobile and set-top devices use ARM silicon
There will be an expectation for computer operating systems and application software to be written and compiled for both classic Intel x86 and ARM RISC microarchitectures. This will require software development tools to support compiling and debugging on both platforms and, perhaps, microarchitecture-agnostic application-programming approaches. It is also driven by the use of ARM RISC microarchitecture in mobile and set-top/connected-TV computing environments, with a desire to allow software developers to have software that is useable across all computing environments.
.. as do a significant number of NAS units like this WD MyCloud EX4100 NAS
Some software developers, usually small-time or bespoke-solution developers, will end up using “managed” software development environments like Microsoft’s .NET Framework or Java. These allow the programmer to turn out an executable file that depends on pre-installed run-time elements in order to run. These run-time elements are installed in a manner specific to the host computer’s microarchitecture and take advantage of the host computer’s capabilities. These environments may allow the software developer to “write once, run anywhere” without knowing whether the computer the software is to run on uses an x86 or ARM microarchitecture.
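As a minimal sketch of this microarchitecture-agnostic approach, the following Python snippet uses the standard `platform` module to report which instruction set the host is running on. The same script runs unchanged on x86-64 or ARM machines because only the interpreter underneath is architecture-specific; the `describe_host` function name and wording are illustrative.

```python
import platform

def describe_host() -> str:
    """Report the CPU microarchitecture the interpreter is running on.

    The script itself never needs recompiling for a different
    instruction set - the runtime underneath handles that.
    """
    machine = platform.machine().lower()
    if machine in ("x86_64", "amd64"):
        return "classic Intel/AMD x86-64 silicon"
    if machine in ("arm64", "aarch64"):
        return "ARM RISC silicon"
    return "other microarchitecture: " + machine

print(describe_host())
```

The same idea underpins the Java and .NET runtimes: the bytecode the developer ships is identical on every host, and the architecture-specific work happens inside the pre-installed runtime.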
There may also be an approach towards “one-machine two instruction-sets” software development environments to facilitate this kind of development where the goal is to simply turn out a fully-compiled executable file for both instruction sets.
It could be in an accepted form like run-time emulation or machine-code translation, as is used to allow MacOS or Windows to run extant software written for different microarchitectures. Or one may have to look at what went on with some early computer platforms like the Apple II, where a user-installable co-processor card with the required CPU (the Z-80 SoftCard that brought CP/M software to that machine being the classic example) would allow the computer to run software for another microarchitecture and platform.
Computer Hardware Vendors
For computer hardware vendors, there will be an expectation towards positioning ARM-based silicon towards high-performance power-efficient computing. This may be about highly-capable laptops that can do a wide range of computing tasks without running out of battery power too soon. Or “all-in-one” and low-profile desktop computers will gain increased legitimacy when it comes to high-performance computing while maintaining the svelte looks.
Personally, if ARM-based computing was to gain significant traction, it may have to be about Microsoft encouraging silicon vendors other than Qualcomm to offer ARM-based CPUs and graphics processors fit for “regular” computers. As well, Microsoft and the Linux community may have to look towards legitimising “performance-class” computing tasks like “core” gaming and workstation-class computing on that microarchitecture.
There may be the idea of using 64-bit x86 microarchitecture as a solution for focused high-performance work. This may be due to the large amount of high-performance software code written to run with the classic Intel and AMD silicon. It will most likely exist until a significant amount of high-performance software is written to run natively on ARM silicon.
Thanks to Apple and Microsoft heading towards ARM RISC microarchitecture, the computer hardware and software community will have to look at working with two different microarchitectures especially when it comes to regular computers.
This week, Apple used its WWDC software developers’ conference to announce that the Macintosh regular-computer platform will move away from Intel’s silicon to their own ARM-based silicon. This is to bring that computing platform into line with their iOS/iPadOS mobile computing platform, their tvOS Apple TV set-top platform and their Watch platform, all of which use Apple’s own silicon.
Here, this silicon will use the ARM RISC instruction-set microarchitecture rather than the x86/x64 architecture used with Intel silicon. But Apple is no stranger to moving the Macintosh computing platform between microarchitectures.
Initially this platform used Motorola 680x0 silicon, then moved to PowerPC silicon which used a RISC instruction-set microarchitecture. This platform initially had more chops compared to Intel’s x86 platform, especially when it came to graphics and multimedia. Then, when Apple realised that Intel offered cost-effective microprocessors using the x86-64 microarchitecture with the same kind of multimedia prowess as the PowerPC processors, they moved the Macintosh platform to the Intel silicon.
But Apple had to take initiatives to bring the MacOS and Mac application software over to this platform. This required them to supply software development tools to the software-development community so that programs could be compiled for both PowerPC and Intel instruction sets. They also furnished an instruction-set translation layer called Rosetta to Mac users who had Intel-based Macs so they could run extant software that was written for PowerPC silicon.
For a few years, this caused some awkwardness for Mac users, especially early adopters, due to the limited availability of software natively compiled for Intel silicon. Or they were finding that their existing PowerPC-native software was running too slowly on their Intel-based computers, thanks to the Rosetta instruction-set-translation software working between their program and the computer’s silicon.
Apple will be repeating this process in a very similar way to the initial Intel transition, by providing software-development tools that build for both Intel x86-64 silicon and their own ARM-based silicon. As well, they will issue Rosetta 2, which does the same job as the original Rosetta but translates x86-64 CISC machine instructions to the ARM RISC instruction set that their own silicon uses. Rosetta 2 will be part of the next major version of MacOS, which will be known as Big Sur.
The question that will be raised amongst developers and users of high-resource-load software like games or engineering software is what impact this conversion will have on that level of software. Typically most games are issued for the main games consoles and Windows-driven Intel-architecture PCs over Macs or tvOS-based Apple TV set-top devices, with software ports for these platforms coming later on in the software’s evolution.
There is an expectation that the Rosetta 2 translation software could handle this kind of software properly, to the point where it can satisfactorily perform on a computer using integrated graphics infrastructure and working at Full HD resolution. Then there will be the issue of making sure it works with a Mac that uses discrete graphics infrastructure and higher display resolutions, thus giving the MacOS platform some “gaming chops”.
I see the rise of ARM RISC silicon in the traditional regular-computing world, existing alongside classic Intel-based silicon as is happening with Apple and Microsoft, as a challenge for computer software development. This is although some work has taken place within the UNIX / Linux space to facilitate the development of software for multiple computer types, leading that space to bring forth the open-source and shared-source software movements. This is more so with Microsoft, where there is an expectation to have Intel-based silicon and ARM-based silicon exist alongside each other for the life of a common desktop computing platform, with each silicon type serving particular use cases.
More of the Thunderbolt-3 external graphics modules are appearing on the scene but most of these units are primarily heavy units with plenty of connectivity on them. This is good if you wish to have this external graphics module as part of your main workspace / gaming space rather than something you will be likely to take with you as you travel with that Dell XPS 13 Ultrabook or MacBook Pro.
Dell XPS 13 9360 8th Generation clamshell Ultrabook – an example of an ultraportable computer that can benefit from one of the portable external graphics modules
Windows Central have called out a selection of these units that are particularly portable in design to allow for ease of transport. This will appeal to gamers and the like who have access to a large-screen TV in another room that they can plug video peripherals in to such as university students living in campus accommodation or a sharehouse. It can also appeal to those of us who want to use the laptop’s screen with a dedicated graphics processor such as to edit and render video footage they have captured or play a game with best video performance.
Most of the portable external graphics modules will be embedded with a particular graphics chipset and a known amount of display memory. In most cases this will be a high-end mobile GPU which may be considered low-spec by desktop (gaming-rig) standards. There will also be reduced connectivity options especially with the smaller units but they will have enough power output to power most Thunderbolt-3-equipped Ultrabooks.
An exception that the article called out was the Akitio Node Pro which is a “card cage” that is similar in size to one of the new low-profile desktop computers. This unit also has a handle and a Thunderbolt-3 downstream connection for other peripherals based on this standard. It would need an active DisplayPort-HDMI adaptor or a display card equipped with at least one HDMI port to connect to the typical large-screen TV set.
Most of the very small units or units positioned at the cheap end of the market would excel at 1080p (Full HD) graphics work. This would be realistic for most flatscreen TVs that are in use as secondary TVs, or for using the laptop’s own screen if you stick to the advice to specify Full HD (1080p) as a way to conserve battery power on your laptop.
The exception in this roundup of portable external graphics modules was the AORUS Gaming Box which is kitted out with the NVIDIA GeForce GTX 1070 graphics chipset. This would be considered a high-performance unit.
Here, these portable external graphics modules are being identified as being something of use where you are likely to take them between locations but don’t mind compromising when it comes to functionality or capability.
It can also appeal to first-time buyers who don’t want to spend much on their first external graphics module to put a bit of “pep” in to their suitably-equipped laptop’s or all-in-one’s graphics performance. Then if you are thinking of using a better external graphics module, perhaps a “card-cage” variety that can work with high-performance “gaming-rig” or “desktop-workstation” cards, you can then keep one of these external graphics modules as something to use on the road for example.
Intel and the USB Implementers Forum have worked together towards the USB 4.0 specification. This will be primarily about an increased-bandwidth version of USB that will also bake in Thunderbolt 3 technology for further-increased throughput.
USB 4.0 will offer twice the bandwidth of USB 3.2 thanks to more “data lanes”. This will lead to 40Gbps throughput along the line. It will use the USB Type-C connector and will take a very similar approach to the USB 3.0 standard, which relied on the older USB connection types like USB-A, where a “best-case” situation takes place regarding bandwidth while allowing for backward compatibility. There will also be the requirement to use higher-performance cables rated for this standard when connecting your host system to a peripheral device using this standard.
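To put those figures in perspective, here is a minimal Python sketch of the raw arithmetic: the best-case wire time for a hypothetical 50GB video project over a 40Gbps link versus a 20Gbps one. The function name and file size are illustrative, and real-world throughput will be lower once protocol overhead and storage speed are factored in.

```python
def best_case_transfer_seconds(file_size_gb: float, link_gbps: float) -> float:
    """Best-case wire time: gigabytes -> gigabits, divided by the link rate.

    This is only the raw arithmetic; it ignores protocol overhead
    and the speed of the storage at either end.
    """
    return (file_size_gb * 8) / link_gbps

# A hypothetical 50GB video project over both link speeds:
usb4_time = best_case_transfer_seconds(50, 40)    # 40Gbps link -> 10.0 seconds
usb32_time = best_case_transfer_seconds(50, 20)   # 20Gbps link -> 20.0 seconds
print(usb4_time, usb32_time)
```

Halving the wire time like this is what makes the extra “data lanes” matter for bandwidth-hungry peripherals such as external graphics modules and direct-attached storage.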
Opening up Thunderbolt 3
Intel is opening up Thunderbolt 3 with a royalty-free non-exclusive licensing regime. This is in addition to baking the Thunderbolt 3 circuitry into their standard system-on-chip designs rather than requiring a particular “Alpine Ridge” interface chip to be used by both the host and peripheral. This will open up Thunderbolt 3 to interface chipset designers and the like, including the possibility of computing applications based on AMD or ARM-microarchitecture silicon benefiting from this technology.
This effort can make Thunderbolt-3-equipped computers and peripherals more affordable and can open this standard towards newer use cases. For example, handheld games consoles, mobile-platform tablets or ultraportable “Always Connected” laptops could benefit from features like external graphics modules. It may also benefit people who build their own computer systems such as “gaming rigs” by allowing Thunderbolt 3 to appear in affordable high-performance motherboards and expansion cards, including “pure-retrofit” cards that aren’t dependent on any other particular circuitry on the motherboard.
It is also about integrating the Thunderbolt specification into the USB 4 specification as a “superhighway” option rather than calling it a separate feature. As well, Thunderbolt 3 and the USB 4 specification can be the subject of increased innovation and cost-effective hardware.
Where USB 4.0 will appear initially
Initially I would see USB 4.0 appear in “system-expansion” applications like docks or external graphics modules, and perhaps also in “direct-attached-storage” applications, which are USB-connected high-performance hard-disk subsystems. Of course it will lead towards the possibility of a laptop, all-in-one or low-profile computer being connected to an “extended-functionality” module with dedicated high-performance graphics, space for hard disks or solid-state storage, and perhaps an optical drive, amongst other things.
Another use case that would be highlighted is virtual reality and augmented reality, where you are dealing with headsets that have many sensors along with integrated display and audio technology. They would typically be hooked up to computer devices, including devices the size of the early-generation Walkman cassette players that you wear on you, or even the size of a smartphone. This is more so with the rise of ultra-small “next-unit-of-computing” devices which pack typical desktop-computer power in a highly-compact housing.
Of course, this technology will roll out initially as a product differentiator for newer premium equipment that will be preferred by those wanting “cutting-edge” technology. Then it will appear across a wider usage base as more chipsets with this technology appear and are turned out in quantity.
Expect the USB 4.0 standard to be seen as evolutionary as more data moves quickly along these lines.
The microSD card, which is used as a removable storage option in most “open-frame” smartphones and tablets and is increasingly being used in laptops, has gained two significant improvements at this year’s Mobile World Congress in Barcelona.
The first of these improvements is the launch of microSD cards that can store 1 terabyte of data. Micron pitched the first of these devices, while SanDisk, owned by Western Digital and also a strong player with the SD Card format, offered their 1TB microSD card, which is the fastest microSDXC card at this capacity.
The new SD Express card specification, part of the SD 7.1 Specification, provides a “best-case high-throughput” connection based on the same interface technology used in a regular computer for fixed storage or expansion cards. The microSD Express variant which is the second improvement launched at this same show takes the SD Express card specification to the microSD card size.
The SD Express specification, now applying to the microSD card size, achieves a level of backward compatibility with host devices that implement orthodox SD-card interfaces. This is achieved through a set of electrical contacts on the card for the PCI Express and NVMe interfaces alongside the legacy SD Card contacts, with the interfacing to the storage silicon taking place in the card.
As well, there isn’t the need to create a specific host-interface chipset for SD card use if the application is expressly designed around this technology, and it still has the easy upgradeability associated with the SD card. But most SD Express applications will also have the SD card interface chipset to support the SD cards that are in circulation.
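The backward-compatibility behaviour described above can be sketched as a simple fallback decision. This is a hypothetical illustration only: the `Card`, `Host` and `select_interface` names are my own inventions for the sake of the example, not part of any SD Association interface.

```python
# Hypothetical sketch of the SD Express backward-compatibility logic.
# The class and function names here are illustrative, not a real API.

from dataclasses import dataclass

@dataclass
class Card:
    has_pcie_contacts: bool    # SD Express cards carry extra PCIe/NVMe contacts
    has_legacy_contacts: bool  # all cards keep the legacy SD Card contacts

@dataclass
class Host:
    supports_pcie: bool        # host wired for the PCIe/NVMe "best case" path
    supports_legacy_sd: bool   # host has a conventional SD interface chipset

def select_interface(card: Card, host: Host) -> str:
    """Pick the fastest mutually-supported interface, falling back to legacy SD."""
    if card.has_pcie_contacts and host.supports_pcie:
        return "PCIe/NVMe"    # SD Express high-throughput path
    if card.has_legacy_contacts and host.supports_legacy_sd:
        return "legacy SD"    # older host or older card
    return "unsupported"

# An SD Express card in an orthodox SD-card slot still works, just slower:
print(select_interface(Card(True, True), Host(False, True)))  # legacy SD
```

The point of the design is that the fallback happens at the electrical-contact level on the card itself, so neither side needs extra negotiation silicon to stay compatible with the cards and slots already in circulation.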
This will lead to the idea of fast, high-capacity, compact removable solid-state storage for a wide range of computing applications, especially where size matters. This could mean either a smaller finished product or a higher effective circuit density, leading to more functionality within the same physical “box”.
One use case that was pitched is the idea of laptops or tablets, especially ultraportable designs, implementing this technology as primary storage. Here, the microSD Express cards don’t take up the same space as traditional SATA or M.2 solid-state storage devices. There is also the ability for users to easily upsize their computer’s storage capacity to suit their current needs, especially if they bought the cheapest model with the lowest amount of storage.
Photography and videography will be another key use case especially when the images or footage being taken are of a 4K UHDTV or higher resolution and/or have high dynamic range. It will also be of benefit for highly-compact camera applications like “GoPro-style” action cams or drone-mount cameras. It will also benefit advanced photography and videography applications like 360-degree videos.
Another strong use case being pitched is virtual-reality and augmented-reality technology, where there will be a dependence on computing power within a headset or a small lightweight pack attached to the headset. Here, the idea would be to have the headset and any accessories able to be comfortably worn by the end-user while they engage in virtual reality.
Some of the press coverage talked about use of a 1TB microSD card in a Nintendo Switch handheld games console and described it as being fanciful for that particular console. But this technology could have appeal for newer-generation handheld games consoles, especially where these consoles are used for epic-grade games.
Another interesting use case would be automotive applications, whether on an OEM basis supplied by the vehicle builder or on an aftermarket basis installed by the vehicle owner. This could range from a large library of high-quality audio content, to large mapping areas, to support for many apps and their data.
The microSD card improvements will be at the “early-adopter” stage where they will be very expensive and have limited appeal. As well, there may need to be a few bugs ironed out regarding their design or implementation while other SD-card manufacturers come on board and offer more of these cards.
At the moment, there aren’t the devices or SD card adaptors that take advantage of SD Express technology but this will have to happen as new silicon and finished devices come on to the scene. USB adaptors that support SD Express would need to have the same kind of circuitry as a portable hard drive along with USB 3.1 or USB Type-C technology to support “best case” operation with existing host devices.
This technology could become a game-changer for removable or semi-removable storage media applications across a range of portable computing devices.
After the GDPR effort for data protection and end-user privacy in our online lives, the European Union wants to take further action regarding data security. But this time it is about achieving a “secure by design” approach for connected devices, software and online services.
There will be a wider remit for the EU Agency for Cybersecurity (ENISA) concerning cybersecurity issues that affect the European Union. But the key issue here is to have a European-Union-based framework for cybersecurity certification, which will affect online services and consumer devices, with this certification valid throughout the EU. It is internal-market legislation that affects the security of connected products, including the Internet Of Things, as well as critical infrastructure and online services.
The certification framework will be about having the products being “secure-by-design” which is an analogy to a similar concept in building and urban design where there is a goal to harden a development or neighbourhood against crime as part of the design process. In the IT case, this involves using various logic processes and cyberdefences to make it harder to penetrate computer networks, endpoints and data.
It will also be about making it easier for people and businesses to choose equipment and services that are secure. The computer press were making an analogy to the “traffic-light” coding on food and drink packaging to encourage customers to choose healthier options.
European Commission Vice-President Andrus Ansip (Digital Single Market): “In the digital environment, people as well as companies need to feel secure; it is the only way for them to take full advantage of Europe’s digital economy. Trust and security are fundamental for our Digital Single Market to work properly. This evening’s agreement on comprehensive certification for cybersecurity products and a stronger EU Cybersecurity Agency is another step on the path to its completion.”
What the European Union are doing could have implications beyond the European Economic Area. Here, the push for a “secure-by-design” approach could make things easier for people and organisations in and beyond that area to choose IT hardware, software and services satisfying these expectations thanks to reference standards or customer-facing indications that show compliance.
It will also raise the game towards higher data-security standards from hardware, software and services providers especially in the Internet-of-Things and network-infrastructure-device product classes.