Tag: ARM

Microsoft makes ARM-64 regular computers more legitimate

Through the late 1980s and the 1990s, a range of regular personal desktop computers used processors built on RISC (Reduced Instruction Set Computing) technology.

The most common of these were Apple's Macintosh computers, which used PowerPC silicon in the years before Apple moved to Intel processors, with models like the original iMac being examples. As well, the Sony PlayStation 3, Microsoft Xbox 360 and Nintendo Wii games consoles had PowerPC RISC silicon at their heart, so you may have played with this technology without knowing it.

But there were also computers with niche appeal, like the SGI Indigo or Sun Microsystems' SPARC-based workstations, that used high-powered RISC chipsets with limited mainstream appeal. Acorn even sold a range of computers pitched at the education sector, the Archimedes and the RISC PC, which were the first to implement today's ARM RISC technology.

These systems were more about maximum graphics and multimedia power, or high-load workstation-class computing achieved in an efficient manner. But Intel and Microsoft eventually brought a very similar level of power to computers based on their traditional x86-based CISC (Complex Instruction Set Computing) processors.

Through the 2000s and 2010s, Apple implemented Intel x86-64 (64-bit x86 microarchitecture) silicon in their lineup of MacOS-based regular computers and adapted their platform to that microarchitecture. Now they have licensed ARM64 (64-bit ARM RISC) microprocessor technology and used it to build their own M-Series system-on-chip processors.

Qualcomm Snapdragon X Elite processor press image courtesy of Qualcomm

Qualcomm Snapdragon X Elite processor chip – maturing ARM64 RISC computing for the Windows-based regular computer

Now Microsoft is developing their desktop operating system, application software and software-development tools to also work with the ARM64 RISC microarchitecture. They are also partnering with Qualcomm on a series of Snapdragon ARM64-based system-on-chip processors for laptops that run Windows, with this effort coming to maturity this year in the form of the Snapdragon X system-on-chip processors.

This is because ARM-based RISC computing is already being used in portable and low-power computing setups like mobile-platform smartphones and tablets. It is also being implemented in set-top boxes, smart TVs, network-attached storage devices and similar devices where a low-profile or flexible design is preferred. For portable devices, this will be about longer battery life, such as many days on a charge, or being equipped with smaller batteries. For stationary devices, it will be about compact or flexible power-efficient device designs.

Dell Inspiron 14 Plus CoPilot+ laptop with Qualcomm Snapdragon X Elite silicon press image courtesy of Dell Technologies

Dell Inspiron 14 Plus CoPilot+ laptop with Qualcomm Snapdragon X Elite silicon

As well, Apple and Microsoft are moving towards ARM64 microarchitecture for their regular-computer hardware and software to slow down the decline in business and consumer interest in this client-side computer class.

For Microsoft, Windows 11 has made it possible to emulate a 64-bit Intel operating environment on ARM64 computers like those using the Qualcomm Snapdragon X system-on-chip. This would allow most of today’s software and games to run on these computers. As well, Qualcomm and Microsoft are driving the on-device AI abilities associated with Snapdragon X by branding these computers as “Copilot+ PCs”.
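
To make the emulation idea concrete (a minimal sketch of mine rather than anything from Microsoft’s or Qualcomm’s material), the following C program shows how software built for classic Intel/AMD silicon could work out that it is actually running through the emulation layer on one of these ARM64 computers. It relies on the documented IsWow64Process2() Windows API call, available from Windows 10 version 1709 onwards.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        USHORT processMachine = 0, nativeMachine = 0;

        /* Ask Windows what the process and the underlying hardware really are */
        if (IsWow64Process2(GetCurrentProcess(), &processMachine, &nativeMachine)) {
            if (nativeMachine == IMAGE_FILE_MACHINE_ARM64) {
    #if defined(_M_ARM64)
                printf("Native ARM64 build running on an ARM64 PC\n");
    #else
                printf("x86/x64 build running through emulation on an ARM64 PC\n");
    #endif
            } else {
                printf("Running on classic Intel/AMD silicon\n");
            }
        }
        return 0;
    }

A software vendor could use a check like this to suggest that the user install a native ARM64 build of the program where one exists, which is where the real performance and battery-life gains lie.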

These computers will answer most mainstream computing tasks especially on highly-portable or “all-in-one” computers. But to see stronger appeal, there may be market pressure to have a wide range of core games or creative software developed for or ported to ARM64 platforms.

Lenovo Yoga Slim 7X 2-in-1 laptop with Qualcomm Snapdragon X Elite silicon press image courtesy of Lenovo

Lenovo Yoga Slim 7X 2-in-1 laptop with Qualcomm Snapdragon X Elite silicon

Creative software that was written for or ported to Apple Macs using M-Series silicon could just as easily be developed for ARM-based Windows computers. This has also increased the legitimacy of regular computers using ARM64-based technology amongst the creator / prosumer community. Similarly, games studios that wrote for games consoles using PowerPC or ARM technology, as well as for x86/x64-based desktop computers, will be able to adapt easily to this new reality.

At the moment, the regular-computer scene will still end up as a “horses for courses” environment, with Intel/AMD-based silicon being for high-power computing tasks like core gaming or certified workstations, or where the highest level of hardware and software compatibility is desired. That is while the ARM64-based computers will hold their ground for an increasing range of mainstream computing tasks.

But I would still consider the ARM64-based computers a viable alternative to x86-64-based Intel or AMD powered regular computers that run Windows or Linux as their operating systems.

ARM to introduce new performance chip design for laptops

Article

Lenovo Yoga 5G convertible notebook press image courtesy of Lenovo

More powerful CPU designs await ARM-based computers like the Lenovo Flex 5G / Yoga 5G Always Connected PC convertible notebook which runs Windows on ARM

Here’s how Arm’s latest CPU targets laptop and handheld console performance | Android Authority

From the horse’s mouth

ARM Holdings

Arm Cortex-A78C CPU: Secure and scalable performance for next-generation on-the-go devices (Blog Post)

My Comments

With some computer manufacturers offering regular computers that use ARM microarchitecture, it was only a matter of time before ARM Holdings introduced a performance-focused variant of its RISC-based CPU design.

This comes in the form of the Cortex-A78C CPU design, which offers increased performance over the current ARM-based CPU designs used in some Chromebooks or the Always Connected PCs that run Windows 10. It is being seen as an upgrade path for use cases where increased performance is desired, like games or multimedia.

Snapdragon smartphone electronics in 2-in-1 laptop press picture courtesy of Qualcomm

This will give Always Connected PCs that run Windows on ARM silicon more credibility

This is not really about clawing back the position that RISC-based microarchitecture held during the late 80s and early 90s for increased multimedia prowess, even though this was facilitated with Motorola silicon. Rather, this chip design is about blending performance and power efficiency, making it appealing to a performance class of highly-portable computing device. Think of devices like the Always Connected PC notebook or Chromebook computer, a mobile-platform tablet with gaming or advanced multimedia prowess, or a handheld gaming console.

Here the idea may be to keep the same battery type and thermal design for the device in question but allow more performance out of that device. This is very similar to what happened with portable audio equipment through the 1970s, where manufacturers improved on a device’s design while keeping the power-supply requirement the same across the years for that device class. This led to amplifier and speaker designs that allowed for increased sound quality, which in turn led to product differentiation and improvement.

But where do I see this taking place for something like an Always Connected PC laptop that runs Windows 10 on ARM, an ARM-based Chromebook or even a mobile-platform tablet? I would see it come about in the form of product differentiation around CPU-level performance, where manufacturers can offer device models that factor in performance. This avoids computers in the Always Connected PC or Chromebook class being relegated to “baseline duty machines” and allows them to be on a par with traditional Windows 64-bit x86-based computers when it comes to gaming or multimedia.

The same also holds true for mobile-platform tablets of the same ilk as the iPad or Samsung Galaxy Tab S. Here, it could be feasible for manufacturers to open up interest in gaming or multimedia-focused Android tablets that are about performance. That is especially where a tablet’s larger display surface can make it appeal as a gaming companion device to a smartphone.

Let’s not forget companies like Nintendo, who have a strong legacy with handheld games consoles from the Game & Watch devices of the early 80s through the Game Boy devices of the 1990s to the current Nintendo Switch. Here, they could work towards more powerful iterations of their current platforms, whether you consider them a “timewaster” or a “guilty pleasure”. These platforms could even host some more highly-capable games while using higher-resolution displays.

What will need to happen is for the likes of Qualcomm and Samsung to build this design into the actual CPU processors in order to have it appear in newer computer devices. As well, Microsoft would have to encourage the creation of games and similar software for ARM-based Windows setups especially those that use more powerful silicon.

This could then place ARM-based and x86-based mobile computing on a par with each other when it comes to performance but allow ARM to gain the edge in power efficiency for portable use cases.

Do I see regular computing targeting both x86 and ARM microarchitectures?

Lenovo Yoga 5G convertible notebook press image courtesy of Lenovo

Lenovo Flex 5G / Yoga 5G convertible notebook which runs Windows on Qualcomm ARM silicon – the first laptop computer to have 5G mobile broadband on board

Increasingly, regular computers are moving towards the idea of having processor power based around either the classic Intel (x86/x64) or ARM RISC microarchitectures. This is being driven by portable computers heading towards the latter microarchitecture as a power-efficiency measure, a concept proven by its success with smartphones and tablets.

This represents a different approach to designing silicon, especially RISC-based silicon, where different entities are involved in design and manufacturing. Previously, Motorola took the same approach as Intel and other silicon vendors, designing and manufacturing its desktop-computing CPUs and graphics infrastructure itself. Now ARM design the microarchitecture themselves, while other entities like Samsung and Qualcomm design and produce the exact silicon for their devices.

Apple MacBook Pro running MacOS X Mavericks - press picture courtesy of Apple

Apple to move the Macintosh platform to their own ARM RISC silicon

A key driver of this is Microsoft with their Always Connected PC initiative, which uses Qualcomm ARM silicon similar to what is used in a smartphone or tablet. This is to have the computer able to work on basic productivity tasks for a whole day without needing to be on AC power. Then Apple intended to pull away from Intel and use their own ARM-based silicon for their Macintosh regular computers, a sign of them going back to the platform’s RISC roots but not in a monolithic manner.

As well, the Linux community have established Linux-based operating systems on the ARM microarchitecture. This has led to Google running Android on ARM-based mobile and set-top devices and offering Chromebooks that use ARM silicon, along with Apple implementing it in their operating systems. Not to mention the many NAS devices and other home-network hardware that implement ARM silicon.

Initially the RISC-based computing approach was about more sophisticated use cases like multimedia or “workstation-class” computing compared to basic word-processing and allied computing tasks. Think of the early Apple Macintosh computers, the Commodore Amiga with its many “demos” and games, or the RISC/UNIX workstations like the Sun SPARCStation that existed in the late 80s and early 90s. Now it is about power and thermal efficiency for a wide range of computing tasks, especially where portable or low-profile devices are concerned.

Software development

Already mobile and set-top devices use ARM silicon

There will be an expectation for computer operating systems and application software to be written and compiled for both the classic Intel x86 and ARM RISC microarchitectures. This will require software development tools to support compiling and debugging on both platforms and, perhaps, microarchitecture-agnostic application-programming approaches. It is also driven by the use of the ARM RISC microarchitecture in mobile and set-top/connected-TV computing environments, with a desire to allow software developers to have software that is useable across all computing environments.
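
As a rough illustration of what “writing and compiling for both” can look like in practice (my own hedged sketch rather than any particular vendor’s guidance), the same C source can carry both instruction sets by leaning on the compilers’ predefined architecture macros, then being built once per target:

    #include <stdio.h>

    /* GCC and Clang predefine __x86_64__ and __aarch64__; Microsoft's
       compiler predefines _M_X64 and _M_ARM64.  One source file can
       therefore select the odd architecture-specific code path itself. */
    static const char *build_architecture(void)
    {
    #if defined(__x86_64__) || defined(_M_X64)
        return "x86-64 (classic Intel/AMD)";
    #elif defined(__aarch64__) || defined(_M_ARM64)
        return "ARM64 (ARM RISC)";
    #else
        return "another architecture";
    #endif
    }

    int main(void)
    {
        printf("This binary was compiled for %s\n", build_architecture());
        return 0;
    }

The file would then be compiled twice, once with an x86-64 toolchain and once with an ARM64 toolchain, yielding the two executables a software vendor would need to ship or bundle.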

WD MyCloud EX4100 NAS press image courtesy of Western Digital

.. as do a significant number of NAS units like this WD MyCloud EX4100 NAS

Some software developers, usually small-time or bespoke-solution developers, will end up using “managed” software development environments like Microsoft’s .NET Framework or Java. These allow the programmer to turn out a machine-executable file that depends on pre-installed run-time elements in order to run. These run-time elements are installed in a manner that is specific to the host computer’s microarchitecture and take advantage of the host computer’s capabilities. Such environments may allow the software developer to “write once, run anywhere” without knowing whether the computer the software is to run on uses an x86 or ARM microarchitecture.

There may also be an approach towards “one-machine two instruction-sets” software development environments to facilitate this kind of development where the goal is to simply turn out a fully-compiled executable file for both instruction sets.

It could be in an accepted form like run-time emulation or machine-code translation, as is used to allow MacOS or Windows to run extant software written for different microarchitectures. Or one may have to look at what went on with some early computer platforms like the Apple II, where a user-installable co-processor card with the required CPU would allow the computer to run software for another microarchitecture and platform.

Computer Hardware Vendors

For computer hardware vendors, there will be an expectation towards positioning ARM-based silicon towards high-performance power-efficient computing. This may be about highly-capable laptops that can do a wide range of computing tasks without running out of battery power too soon. Or “all-in-one” and low-profile desktop computers will gain increased legitimacy when it comes to high-performance computing while maintaining the svelte looks.

Personally, if ARM-based computing was to gain significant traction, it may have to be about Microsoft encouraging silicon vendors other than Qualcomm to offer ARM-based CPUs and graphics processors fit for “regular” computers. As well, Microsoft and the Linux community may have to look towards legitimising “performance-class” computing tasks like “core” gaming and workstation-class computing on that microarchitecture.

There may be the idea of using the 64-bit x86 microarchitecture as a solution for focused high-performance work. This may be due to the large amount of high-performance software code written to run with the classic Intel and AMD silicon. It will most likely persist until a significant amount of high-performance software is written to run natively with ARM silicon.

Conclusion

Thanks to Apple and Microsoft heading towards ARM RISC microarchitecture, the computer hardware and software community will have to look at working with two different microarchitectures especially when it comes to regular computers.

Apple to use the ARM microarchitecture in newer Mac computers

Article

Apple MacBook Pro running MacOS X Mavericks - press picture courtesy of Apple

The Apple Mac platform is to move towards Apple’s own silicon that uses ARM RISC microarchitecture

It’s Official: The Mac Is Transitioning to Apple-Made Silicon | Gizmodo

My Comments

This week, Apple used its WWDC software developers’ conference to announce that the Macintosh regular-computer platform will move away from Intel’s silicon to their own ARM-based silicon. This is to bring that computing platform into line with their iOS/iPadOS mobile computing platform, their tvOS Apple TV set-top platform and their Watch platform, all of which use Apple’s own silicon.

Here, this silicon will use the ARM RISC instruction-set microarchitecture rather than the x86/x64 architecture used with Intel silicon. But Apple is no stranger to moving the Macintosh computing platform between microarchitectures.

Initially this platform used Motorola 680x0 silicon, then PowerPC silicon which used a RISC instruction-set microarchitecture. This platform initially had more chops compared to Intel’s x86 platform, especially when it came to graphics and multimedia. Then, when Apple realised that Intel offered cost-effective microprocessors using the x86-64 microarchitecture with the same kind of multimedia prowess as the PowerPC processors, they moved the Macintosh platform to Intel silicon.

But Apple had to take initiatives to bring MacOS and Mac application software over to this platform. This required them to supply software development tools to the software-development community so that programs could be compiled for both the PowerPC and Intel instruction sets. They also furnished an instruction-set translator called Rosetta to Mac users who had Intel-based Macs so they could run extant software that was written for PowerPC silicon.

For a few years, this caused some awkwardness for Mac users, especially early adopters, due either to the limited availability of software natively compiled for Intel silicon, or to their existing PowerPC-native software running too slowly on their Intel-based computers thanks to the Rosetta instruction-set translation working between the program and the computer’s silicon.

Apple will be repeating this process in a very similar way to the initial Intel transition, by providing software-development tools that build for Intel x86-64-based silicon and their own ARM RISC-based silicon. As well, they will issue Rosetta 2, which does the same job as the original Rosetta but translates x86-64 CISC machine instructions to the ARM RISC instruction set that their own silicon uses. Rosetta 2 will be part of the next major version of MacOS, which will be known as Big Sur.
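
To give a feel for what this means at the program level (a speculative sketch assuming Apple’s documented sysctl.proc_translated value, rather than anything from Apple’s announcement itself), an Intel-native Mac program can ask MacOS whether Rosetta 2 is translating it:

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/sysctl.h>

    /* Returns 1 if this process is being translated by Rosetta 2,
       0 if it is running natively, -1 if the answer is unknown
       (for example on older MacOS versions without Rosetta 2). */
    static int running_under_rosetta(void)
    {
        int translated = 0;
        size_t size = sizeof(translated);
        if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
            return -1;
        return translated;
    }

    int main(void)
    {
        switch (running_under_rosetta()) {
        case 1:  printf("Intel build being translated by Rosetta 2 on Apple silicon\n"); break;
        case 0:  printf("Running natively on this Mac\n"); break;
        default: printf("Translation status unknown\n"); break;
        }
        return 0;
    }

The longer-term answer remains a native build that carries both instruction sets in the one application, which is what Apple’s development tools are geared towards producing.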

The question that will be raised amongst developers and users of high-resource-load software like games or engineering software is what impact this conversion will have on that level of software. Typically most games are issued for the main games consoles and Windows-driven Intel-architecture PCs over Macs or tvOS-based Apple TV set-top devices, with software ports for these platforms coming later on in the software’s evolution.

There is an expectation that the Rosetta 2 translation software could handle this kind of software properly, to the point that it performs satisfactorily on a computer using integrated graphics infrastructure and working at Full HD resolution. Then there will be the issue of making sure it works with a Mac that uses discrete graphics infrastructure and higher display resolutions, thus giving the MacOS platform some “gaming chops”.

I see the rise of ARM RISC silicon in the traditional regular-computing world, existing alongside classic Intel-based silicon in this computing space as is happening with Apple and Microsoft, as a challenge for computer software development. That said, some work has taken place within the UNIX / Linux space to facilitate the development of software for multiple computer types, which led to this space bringing forth the open-source and shared-source software movements. This is more so with Microsoft, where there is an expectation to have Intel-based silicon and ARM-based silicon exist alongside each other for the life of a common desktop computing platform, with each silicon type serving particular use cases.

What to expect in personal IT over 2019

Internet and Network technologies

Netgear Nighthawk 5G Mobile Hotspot press image courtesy of NETGEAR USA

Netgear Nighthawk 5G Mobile Hotspot – first retail 5G device

5G mobile broadband will see more carriers deploying this technology in more locations whether as a trial setup or to run with it as a full revenue service. It will also see the arrival of client devices like smartphones or laptops rather than just USB modems or modem routers supporting this technology.

Some users will see 5G mobile broadband as supplanting fixed broadband services but the fixed broadband technologies will be improved with higher data throughput that competes with that technology. As well, fixed broadband especially fibre-based next-generation broadband will also be required to serve as an infrastructure-level backhaul for 5G mobile broadband setups.

Wi-Fi 6 a.k.a. 802.11ax Wi-Fi wireless will be officially on the scene with more devices becoming available. It may also mean the arrival not just of new access points and routers supporting this standard but the arrival at least of client-side chipsets to allow laptops, tablets and smartphones to work with the new technology. Some countries’ radio-regulation authorities will look towards opening up the 6GHz spectrum for Wi-Fi purposes.

It also runs alongside the increased deployment of distributed-Wi-Fi systems with multiple access points linked by a wired or wireless backhaul. This will be facilitated with Wi-Fi EasyConnect and EasyMesh standards to create distributed-Wi-Fi setups with equipment from different vendors, which means that vendors don’t need to reinvent the wheel to build a distributed-Wi-Fi product line.

Consumer electronics and home entertainment

LG 4K OLED TVs press picture courtesy of LG America

LG 4K OLED TVs – a technology that could become more affordable over 2019

4K UHDTV with HDR technology will head towards its evolution phase with it maturing as a display technology. This will be with an increased number of sets implementing OLED, QLED or similar display technologies. It will also lead to more affordable HDR-capable TV models coming on to the scene.

Screen sizes of 75” and more will also cut into affordable price ranges. This will be augmented with OLED-based screens becoming available in a “rollup” form that comes in and out like a blind or a traditional pull-down screen. Similarly, there will be a look towards the concept of “visual wallpaper” in order to justify the use of large screens in peoples’ households, including using the screen as a way to show messages or other information.

Online services will still become the primary source of 4K HDR TV content but the 4K UHD Blu-Ray disc will increase its foothold as the “packaged collectable” distribution medium for 4K video content. ATSC 3.0 and DVB-T2 will be pushed as a way to deliver 4K UHDTV content over the traditional TV aerial with this method of TV reception regaining its importance amongst the “cord-cutting” generations who dump cable and satellite TV.

JBL Link View lifestyle press image courtesy of Harman International

More of these voice-driven home-assistant devices with screens over this year

Another major direction affecting the home network and consumer electronics is an increased presence of voice-driven home-assistant services in this class of device. Typically this will be in the form of soundbars, wireless speakers, TV remote controls and similar home-entertainment equipment having endpoint functionality for Amazon Alexa or Google Assistant.

As well, the “smart screens” like what Lenovo, JBL and Amazon are offering will become more ubiquitous, with the ability to augment responses from a voice-driven home assistant. It will be part of having more household appliances and other gadgets work tightly with voice-driven home assistants.

It may be seen as an effort to bridge the multiple network-based multiroom audio platforms so you can run equipment from different vendors as part of one system. But the problem here will be that such setups may end up being more awkward to use.

The smartphone will be facing some key challenges what with people hanging on to these devices for longer and / or running two of them – one for their work or business along with one for personal life. Some new form-factors like folding smartphones will be demonstrated while some of them will be optimised for high-performance activities like gaming.

These devices are being augmented with the return of feature phones or basic mobile phones. These phones are like the mobile phones that were on the market through the 1990s and 2000s and either don’t connect to the home network or Internet, or use these resources in a very limited way. They are appearing due to people wanting detachment from online life like the Social Web, usually as part of a “back to basics” life calling, or simply as a fail-over mobile telephony device.

But as laptops and tablets become full-on computing and communications devices, the feature phones and basic phones will simply work in a complementary way to allow voice telephony or text messaging on the same service in a handheld form.

This situation is being underscored by more mobile carriers offering mobile telecommunications services that aren’t necessarily bound to one particular device. This is to face realities like the connected car, smartwatches with mobile broadband, Mi-Fi devices amongst other things which will be expected to use the same mobile service.

In the same context, there will be a market requirement for mobile communications devices, especially mobile phones, to support two or more services including multiple numbers on the same service. Primarily this will be driven by eSIM technology and over-the-air provisioning, but it will facilitate ideas like totally separate services for one’s business and private lives, or to cater towards people who regularly travel across international borders.

Security and regulatory issues

I do see a strong push towards more secure Internet-of-Things devices for residential, commercial and other applications over this year. This is as regulators in Europe and California put the pressure on IoT vendors to up their game regarding “secure-by-design” products. There is also the expectation that the Internet Of Things needs to be fit for purpose with transport applications, utilities, medical applications and the like where there is an expectation for safe secure reliable operation that cannot be compromised by cyber-attacks.

Here, it may be about the establishment of device-firmware “bug-bounty” programs by manufacturers, industry bodies and others to unearth software weaknesses. Then it will lead towards regular maintenance updates becoming the norm for dedicated-purpose devices. It may also include a requirement for device vendors and end-users to support automatic installation of these maintenance updates but allow for manual installation of major “feature-addition” updates.

This is in conjunction with the Silicon Valley behemoths like Amazon, Facebook, Apple and Google having to change their ways due to them being under increased scrutiny from governments, media, activist investors, civil society and end-users. It will affect issues like end-user privacy and data transparency, financial and corporate-governance / human-resources practices, along with the effective market power that they have across the globe.

Equipment design

Use of Gallium Nitride transistors for power conversion

A major trend to see more of this year is the increased use of Gallium Nitride transistor technology. This is beyond using this chemical compound for optoelectronics such as blue, white or multicolour LEDs, or the laser diodes installed in Blu-Ray players and BD-ROM drives for the purpose of reading these optical discs.

Here, it is to multiply the effect silicon had on the design of audio equipment through the 1970s leading to highly-powerful equipment in highly-compact or portable forms. This is through improved heat management that leads to the compact form alongside more powerful transistors for switch-mode circuits.

One of the initial applications will be in the form of highly-compact USB-C Power-Delivery-compliant chargers for laptops and smartphones. This year will be about an increased number of finished products and reference designs that, depending on the application,  yield more than 45W of DC power for USB-C PD applications from either 100-250VAC mains power or 12-24VDC vehicle / marine power. It could then be affecting multiple-outlet “charging bars” and similar devices where the goal is to have something highly compact and portable to power that Dell XPS 13 or Nintendo Switch alongside your smartphone.

I see it also affecting how power-supply circuitry for computers, peripherals, network equipment and the like is designed. This can lead towards equipment having the compact profile along with reduced emphasis on factoring in thermal management in the design like use of fans or venting.

ARM-based microarchitecture to compete with Intel’s traditional microarchitecture

In the late 1980s, the then-new RISC (Reduced Instruction Set Computing) microarchitecture excelled with graphics and multimedia applications. This is while Intel’s x86-based 16-bit traditional microarchitecture used in the IBM PC and its clones was focused simply on number-crunching.

But 32-bit iterations of the x86 microarchitecture were able to encroach on graphics and multimedia from the early 1990s. Eventually this led to Apple moving the Macintosh platform away from RISC-based PowerPC CPUs towards Intel-based x86 and x64 traditional microarchitecture.

This was while Acorn Computers and a handful of other computer names worked towards ARM RISC microarchitecture which ended up in smartphones, tablets, set-top boxes and similar applications.

Now this microarchitecture is making a comeback with the Always-Connected PCs which are laptops that run Windows 10 on Qualcomm ARM processors for higher power efficiency. It was brought about with Microsoft releasing a Windows 10 variant that runs on ARM microarchitecture rather than classic microarchitecture.

This will lead to some computer vendors running with at least one or two of these computers in their ultraportable product ranges. But there is investigation in to taking ARM technology to higher-power computing applications like gaming and server setups.

The big question for Intel is what can they offer when it comes to microprocessor technology that can answer what Qualcomm and others are offering using their ARM processors.

Increased SSD capacity

The solid-state drive will start to approach bill-of-materials per-gigabyte price parity with the 500GB hard disk. Here, it could lead towards laptops and ultra-compact desktop computers coming with 512GB SSDs in the affordable configurations. This also applies to USB-based external storage devices as well as what is integrated in a computer.

Here, the concept of high-speed energy-saving non-volatile storage that would satisfy a “sole computer” situation for a reasonable outlay is coming to fruition. What will still happen with the traditional mechanical hard disk is that it will end up satisfying high-capacity storage requirements like NAS units or servers. In some situations, it may lead towards more NAS units supporting multi-tier storage approaches like bringing frequently-used data forward.

Conclusion

This is just a representative sample of what 2019 is likely to bring about for one’s personal and business online life, but as with each year, more situations will crop up over the year.

Windows 10 on Qualcomm ARM chips–to be real

Articles

Snapdragon smartphone electronics in 2-in-1 laptop press picture courtesy of Qualcomm

Implementing high-end smartphone electronics into an ultraportable laptop

Smartphone Guts Are Coming to Windows Laptops, and It Could Triple Your Battery Life | Gizmodo

Microsoft reveals ‘Always Connected PCs’ from HP and ASUS with Windows 10 on ARM | Windows Central

From the horse’s mouth

Microsoft

Always Connected PCs enable a new culture of work (Windows Experience Blog)

Qualcomm

Qualcomm Launches Technology Innovation with Advancements in the Always Connected PC and its Next-Generation Qualcomm Snapdragon Mobile Platform (Press Release)

A day in the life with the Snapdragon 835 powered HP Envy x2 PC (OnQ Blog)

Video – Click or tap to play

My Comments

Microsoft had made some attempts at bringing Windows to the ARM RISC microarchitecture with a view to bringing forth cheaper computers. But they had failed thanks to silicon based on traditional Intel x86/x64 microarchitecture being offered at very cheap price points and able to natively run a large roster of software already available for that platform.

But they, along with Qualcomm who supply the silicon for most of today’s smartphones, have re-approached this through the vision of an ultraportable laptop computer or tablet that implements the same technology as one of the recent high-end smartphones and phablets. This has been drawn out alongside the recent crop of highly-capable 11”-14” 2-in-1 laptops that are making a strong appeal as a highly-capable alternative to the iPad and Android-based tablets.

But the computers that represent the “Always Connected PC” product class integrate a large battery along with an LTE-based wireless-broadband modem, both of which allow for a long stretch of computer activity without the need for Wi-Fi or daily charging. These would also support eSIM, which allows for over-the-air provisioning of mobile broadband service, including the ability to provide “international-focused” service for people roaming around the world. HP and ASUS have premiered a detachable 2-in-1 and a convertible 2-in-1 which are based on this technology.

Microsoft is pushing the Always-Connected PC for the workplace with a focus towards a managed computing environment. Here, it is about avoiding the need to connect to insecure public-access Wi-Fi networks or worry about whether you have the laptop’s power supply with you when you head to work or make that business trip.

I see it more as an answer to Apple’s iOS platform, Google’s ChromeOS platform and Samsung’s interpretation of the Android platform where the goal is to cater to a mainstream productivity-focused computing environment for work or school.

Here, the focus would be about interacting with cloud-based business / education software whether as a Web app or as platform-native software or simply working with information using standard office-productivity software, perhaps with some video playback or mobile-grade gaming. I also see this as a way for Microsoft to aggressively compete against the iPad in the household, education and business environment by encouraging its partners to offer tablets and 2-in-1s that have the same operational qualities as that tablet.

But it wouldn’t displace the Intel / AMD x86/x64-based computers which would be focused towards applications where performance is of importance such as serious gaming or photo / video editing. But as for running Windows software, the ARM-based variants of Windows will be implementing an x86 emulation layer that allows 32-bit Windows software to run on these computers. This is while Windows software developers who package software for the Windows Store will be encouraged to deploy code native to x86, x64 and ARM microarchitectures.

The big challenge now is for software developers and games studios to port the software that is on the iOS or Android platforms towards the Windows 10 platform on all these microarchitectures. It would then make it viable for Windows to continue as a third force for “non-handheld” mobile computing.

Another attempt at security for the Internet Of Things

Article

Google and others back Internet of Things security push | Engadget

My Comments

An issue that is perplexing the personal-computing scene is data security and user privacy in the context of dedicated-function devices including the Internet Of Things. This has lately come to the fore thanks to the KRACK WPA2 wireless-network security exploit which mainly affects Wi-Fi client devices. In this situation, it would be of concern regarding these devices due to the fact that the device vendors and the chipset vendors don’t regularly update the software for their devices.

But ARM Holdings, the British chip designer behind the ARM RISC microarchitecture used in mobile devices and most dedicated-function devices, has joined with Google Cloud Platform and others to push for an Internet-Of-Things data security platform. This is very relevant because the ARM RISC microarchitecture satisfies the needs of dedicated-function device designs due to its ability to yield greater functionality with lean power requirements compared to the traditional microarchitecture.

Here, the effort is centred around open-source firmware known as “Firmware-M” that is to be pitched for ARMv8-M CPUs. The Platform Security Architecture will allow hardware / software / cloud-system designers to tackle IoT threat models and analyse the firmware from a security angle. This means that they can work towards hardware and firmware architectures that take a “best-practice approach” to security and user-friendliness for devices likely to be used by the typical householder.

There is still the issue of assuring software maintenance over the lifecycle of the typical IoT and dedicated-function device. This will include how newer updated firmware should be deployed to existing devices and how often such updates should take place. It will also have to include practices associated with maintaining devices abandoned by their vendors such as when a vendor ceases to exist or changes hands or a device reaches end-of-life.

But at least it is another effort by industry to answer the data-security and user-privacy realities associated with the Internet Of Things.

Consumer Electronics Show 2013–Part 3

Introduction

In Part 1, I covered the home entertainment direction with such technologies as the 4K UHDTV screens, smart TV, and the presence of alternate gaming boxes. Then in Part 2, I covered the rise of touchscreen computing, increased pixel density and the 802.11ac Wi-Fi network segment amongst other things. Now I am about to cover mobile-computing technology, which is in fact a strong part of the connected lifestyle.

Mobile technology

Smartphones

A major direction that is showing up for smartphones is the 5” large-screen devices that have been brought about by the Samsung Galaxy Note series of smartphones. These are described as “phablets” because they are a bridge device between the traditional 4” smartphone and the 7” coat-pocket tablet.

Sony premiered their new premium Xperia Android phones, the Xperia Z and Xperia ZL, both 5” handsets with 1080p displays. The Xperia ZL is a dual-SIM variant of the Xperia Z. As well, Huawei have increased their foothold in the US market by offering more of their reasonably-priced regular smartphones.

There has been some more effort towards standardised wireless charging for the smartphone, although there are two groups promoting their standards – the Power Matters Alliance and the Wireless Power Consortium, who maintain the Qi (“chee”) wireless-charging standard. Examples of this include Toyota implementing the Qi standard in their 2013 Avalon vehicles and Nokia integrating it into their Lumia 920 smartphones.

On the accessories front, Invoxia had launched an iPhone dock which connected two desk phones to the iPhone. The original device used the iPhone as an outside line for the desk phones whereas the current version launched here also works as a VoIP terminal for the desk phones. It also works with a supplied iOS softphone app to have the iPhone as a softphone for the VoIP setup.

Tablets

Now there is an increasing number of the 7” coat-pocket tablets which were previously dismissed in the marketplace but made popular by the Google Nexus 7 and Amazon Kindle Fire. The Windows-RT-based devices were showing up more as a 10” tablet or a detachable-keyboard hybrid device.

Polaroid, trying to keep their brand alive in consumers’ minds after the demise of their legendary instant-picture cameras, have launched a few Android tablets. One is a 7” unit pitched for use by children. This model uses 8GB of onboard storage with microSD expansion, has a 2-megapixel camera and works only with 802.11g/n Wi-Fi networks. It is built in a rugged form to withstand little ones’ handling but can also work well in environments where a coat-pocket tablet could cop a lot of hard wear-and-tear. The M10 is a 10” variant with a brushed-metal finish.

RCA fielded an 8” Android tablet that is made by Digital Stream and has integrated TV tuners. Here, it could pick up conventional ATSC digital TV and mobile ATSC (Dyle) broadcasts, and runs Android ICS. Personally, I would suspect that this device could be sold in to other markets, perhaps under other brands and equipped with local-spec tuners like DVB-T tuners.

Mobile technology

The ARM-based microprocessor has raised the ante for more powerful work by offering the same number of processor cores as the newer 32-bit and 64-bit x86 processors used in regular computers. Yet this could allow for increased computing power with lower power requirements, thus making the embedded devices, smartphones and tablets that use RISC processing do more.

Here, NVIDIA launched the Tegra 4, which is a 4-core ARM CPU that can yield faster response from tablets and smartphones. Samsung raised the bar with their Exynos 5 Octa, which is an 8-core ARM CPU.

Samsung used this event to show a prototype 5.5” (1280×720) flexible screen and a 55” flexible screen as a proof-of-concept. As well, LG increased the pixel density by exhibiting a 5.5” 1080p smartphone screen.

The connected home

There has been very little happening concerning home automation and security through the past years of the Consumer Electronics Show but this year, the connected home has increased its foothold here.

This is demonstrated through the concept of mobile apps being used to control or monitor appliances, thermostats, security systems and the like.

Here, Motorola demonstrated a “Connected Home” router, being a device that allows you to control a network-enabled central-heating thermostat using an app on an Android phone. What I liked about this was that the mobile device used to manage that thermostat wasn’t just the Apple iPhone, and you were able to move away from that hard-to-program wall thermostat.

This has been brought about through the Nest thermostat opening up the market for user-friendly thermostats for heating / cooling systems. Here, this could lead to a commercial-style heating-control setup with a small wall-mounted box that works as a temperature sensor but may have a knob or two buttons for you to adjust the comfort level “on the fly”. Then you use your smartphone, tablet or computer that runs an easy-to-understand app to program comfort levels for particular times of particular days.

Alarm.com, a firm who provide monitoring for home automation and security sold through large retailers, has provided a “dashboard app” for their equipment that works on their platform. This app runs on the common mobile-phone platforms (iOS, Android, Blackberry and Windows Phone 8) so you can use your phone to check on the state of things with your Alarm.com setup.

Similarly, the Securifi Almond+ 802.11ac Wi-Fi router was exhibited at this year’s CES. This is a regular home network router but has integrated Zigbee / Z-Wave wireless home-automation-network support. Here, this device can be seen as a dashboard for the connected home and they are intending to fund this with a Kickstarter campaign.

As for appliances, Dacor integrated a 7” Android tablet into their high-end wall oven and this provides for guided cooking including recipe lookup. Of course, Samsung hasn’t let go of the Internet fridge dream and exhibited a four-door fridge with an integrated app-driven screen that can work alongside their Android phones and tablets. They also exhibited a top-loading washing machine that uses an LCD control panel and is able to be controlled with a smartphone.

This is part of the “Internet of things” and this concept was underscored by a few manufacturers becoming charter members of the “Internet Of Things Consortium”. It is about an open-frame vendor-independent infrastructure for interlinking home automation / security, consumer entertainment, and computing devices using the common standards and common application-programming interfaces.

Automotive Technology

Of course the car is not forgotten about at the Consumer Electronics Show, and is considered as an extension of our connected lives.

A main automotive drawcard feature for this year is the self-driving car, but the core feature for now is the app platform for vehicle infotainment systems. In fact, Ford and GM are encouraging people to develop software for their infotainment setups. This exploits the fact that midrange and premium cars are increasingly being equipped with Internet connections and highly-sophisticated infotainment systems that have navigation, mobile phone integration and media playback.

Here, you might think of navigation, Internet radio / online content services and communications services. It may also include “one-touch” social destination sharing amongst other things.

For example, Google Maps is to come to Hyundai and Kia cars as part of their UVO connected infotainment platform. The first vehicle to have this is the Kia Sorento (model-year 2014). Similarly, Hyundai are implementing the MirrorLink smartphone-user-interface-replication technology in their infotainment setups.

As well, TuneIn Radio and Apple Siri integration are to be part of model-year 2013 Chevrolet Sonic & Spark cars. Ford has implemented the Glympse social-destination-sharing software as part of their SYNC AppLink platform.

Similarly, Pioneer are extending the AppRadio functionality across most of their head-units so you can have certain iOS apps managed from the dashboard. They have also provided connectivity options for Apple’s iPhone 5 device with its Lightning connector and iOS 6 platform.

Last but not least

Pebble were showing a Kickstarter-funded concept of an E-paper smartwatch that interlinks with your smartphone. Here, I was wondering whether E-paper and E-ink could become the new LCD display for devices that can rely on an available-light display. It was also a way that these “smartwatches” had us thinking back to the 80s, when the more features and functions a digital watch had, the better it was and the more you could show that watch off to your friends.

Conclusion

This year has underscored a few key trends:

  • the 4K UHDTV display and displays with increased pixel density being mainstream,
  • the acceptance of touchscreen computing with regular computers courtesy of Windows 8,
  • the arrival of very lightweight laptop computers,
  • NFC becoming a common setup method for smartphones and consumer AV,
  • the draft 802.11ac Gigabit Wi-Fi network segment being exhibited with relatively-mature equipment,
  • the 5” smartphone and 7” tablet becoming mainstream mobile options

and has shown up what can be capable in our connected lives. Who knows what the next major trade shows will bring forth, whether as a way to “cement” these technologies or launch newer technologies. Similarly, it would be interesting whether these technologies would catch on firmly in to the marketplace.

ARM-based microarchitecture — now a game-changer for general-purpose computing

Article:

ARM The Next Big Thing In Personal Computing | eHomeUpgrade

My comments

I have previously mentioned NVIDIA developing an ARM-based CPU/GPU chipset and have noticed that this class of RISC chipset is about to resurface in the desktop and laptop computer scene.

What is ARM and how it came about

Initially, Acorn, a British computer company well known for the BBC Model B computer which was used as part of the BBC’s computer-education program in the UK, had pushed on with a RISC processor-based computer in the late 1980s. This became a disaster due to the dominance of the IBM PC and Apple Macintosh computer platforms as general-purpose computing platforms, even though Acorn were trying to push the computer as a multimedia computer for the classroom. This is although the Apple Macintosh and the Commodore Amiga, which were the multimedia computer platforms of that time, were based on Motorola 680x0 processors.

Luckily they didn’t give up on the RISC microprocessor and had this class of processor pushed into dedicated-purpose computer setups like set-top boxes, games consoles, mobile phones and PDAs. This chipset and class of microarchitecture became known as ARM (originally Acorn RISC Machine, later Advanced RISC Machines).

The benefit of the RISC (Reduced Instruction Set Computing) class of microarchitecture was to achieve an efficient instruction set that suited the task-intensive requirements of graphics-rich multimedia computing, compared to the CISC (Complex Instruction Set Computing) microarchitecture that was practised primarily with Intel 80x86-based chipsets.

There was reduced interest in this class of RISC chipset after Motorola pulled out of the desktop-processor game in the mid 2000s when they ceased manufacturing PowerPC processors. Here, Apple had to build the Macintosh platform for the Intel architecture because it offered that level of performance at a lower cost to Apple, and started selling Intel-based Macintosh computers.

How is this coming about

An increasing number of processor makers who have made ARM-based microprocessors have pushed for these processors to return to general-purpose computing as a way of achieving power-efficient highly-capable computer systems.

This has come along with Microsoft offering a Windows build for the ARM microarchitecture as well as for the Intel microarchitecture. Similarly, Apple bought a chipset designer which developed ARM-based chipsets.

What will this mean for software development

There will be a requirement for software to be built for the ARM microarchitecture as well as for the Intel microarchitecture because these work on totally different instruction sets. This may be easier for Apple and Macintosh software developers because, when the Intel-based Macintosh computers came along, they had to work out a way of packaging software for both the PowerPC and Intel processor families. Apple marketed these software builds as “Universal” builds because of the need to suit the two main processor types.
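
As a hedged illustration of that “Universal” approach (the commands below assume Apple’s current Xcode command-line tools, building today’s x86-64/ARM64 equivalent of the old PowerPC/Intel fat binaries, rather than anything described in the linked article), one C source file can be compiled into a single executable that carries machine code for two instruction sets at once:

    /* hello.c - trivial program used to demonstrate a "Universal" build.
       With Apple's toolchain, a two-architecture binary is produced with:

           clang -arch x86_64 -arch arm64 -o hello hello.c
           lipo -info hello        # reports: x86_64 arm64

       The one executable then runs natively on Intel-based Macs and on
       ARM-based Macs alike, with the loader picking the right slice. */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello from whichever architecture this Mac uses\n");
        return 0;
    }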

Windows developers will need to head down this same path, especially if they work with orthodox code where they fully compile their programs to machine code themselves. This may not be as limiting for people who work with managed code like the Microsoft .NET platform, because the runtime packages can simply be prepared for the instruction set that the host computer uses.

Of course, Java programmers won’t need to face this challenge due to the language being designed around a “write once, run anywhere” scheme, with “virtual machines” that work between the computer and the compiled Java code.

For the consumer

This may require that people who run desktop or laptop computers that use ARM processors will need to look for packaged software or downloadable software that is distributed as an ARM build rather than for Intel processors. This may be made easier through the use of “universal” packages that are part of the software distribution requirement.

It may not worry people who run Java or similar programs because Oracle and others who stand behind these programming environments will be needing to port the runtime environments to these ARM systems.

Conclusion

This has certainly shown that the technology behind the chipsets that powered the computing environments considered more exciting through the late 1980s is now relevant in today’s computing life. These chipsets will even provide a competitive development field for the next generation of computer systems.

The next version of Windows is to have an ARM build as well as an Intel build. Apple, used to delivering MacOS X for PowerPC RISC as well as Intel CPUs, is to implement its own ARM processors on Macintosh laptops.

Processor Chipsets with built-in Graphics

 

BBC News – Intel to launch chipsets with built-in graphics

My comments

With Intel now showing interest in supplying a processor chip with an integrated graphics processor, this will raise the stakes when it comes to supplying single-chip CPU / GPU solutions.

Why supply a single-chip CPU/GPU solution

There is the obvious benefit in design size that it would yield. This would of course allow for more compact applications and, of course, the bill-of-materials costs would be reduced thus allowing for cheaper devices. Another key benefit would be that the single-chip solution would have reduced power needs, which is important for battery-operated devices like laptops, tablet computers and, especially, smartphones.

There is also the reality that most consumer electronics devices like electronic picture frames, digital cameras, TVs / video peripherals and hi-fi equipment are being designed like the general-purpose computers and most of them will also benefit from these CPU/GPU chips. This has become evident with most of these devices offering network and Internet connectivity in a way to augment their primary function or beyond that primary function.  They will also yield the reduced “bill-of-materials” costs and the reduced power demands for this class of device which will become a market requirement.

Similarly, an increasing number of office equipment / computer peripherals, home appliances and “backbone” devices (HVAC / domestic-hot-water, building safety / security, etc) are becoming increasingly sophisticated and offering a huge plethora of functions. I had noticed this more so with the multifunction printers that I have reviewed on this site where most of them use a colour bitmap LCD display and a D-toggle control as part of their user interfaces.

Therefore manufacturers who design these devices can benefit from these single-chip CPU/graphics solutions in order to support these requirements through reduced supporting-power requirements or design costs. In the case of “backbone” devices, which typically require users to operate them from remotely-located user-interface panels such as programmable thermostats or codepads, there isn’t the need to draw too much power from the host device to support one or more of these panels, even if a panel is to provide access to extended functions.

The market situation

The Intel Sandy Bridge, which is just about to be launched at the time of publication, would provide improved graphics. This is in a market which AMD has just entered with their Zacate CPU / graphics chip and which has been dominated by ARM, who have been involved in the smartphone scene. This firm’s design was in fact used as part of the Apple A4 chip used in the iPhone 4 and iPad.

With three companies in the market, this could yield a highly-competitive environment with a run for high-quality quickly-drawn graphics, quick CPU response, power conservation / long battery runtime and small circuit size / reduced bill-of-materials. This may also yield a “run for the best” which also yields desirable functionality being available at prices that most people can afford.

The only limitation with this concept is that the single-chip design may shrink the market for discrete graphics chipsets and cards to only those people who value extreme-performance graphics.

Conclusion

The reduced size of these new single-chip CPU/GPU setups could replicate what happened with the arrival of the 80486 processor and its integrated floating-point coprocessor. It could then make for longer battery runtime in portable applications and lead to smaller, cooler-running computers for most applications.