Tag: computer design

The classes of computers are becoming blurred – is this the trend?

Article

A dual Windows-Android machine: PC industry savior or non-starter? | Mobile – CNET News

My Comments

Sony VAIO Duo 11 slider-convertible tablet

A computer that slides to become a tablet or laptop

Previously, a computer with a screen greater than 11”, a physical QWERTY keyboard and a desktop operating system like Windows, MacOS X or Linux was a separate class of computer from something that had a smaller screen, no physical keyboard and ran a mobile operating system.

Now we are starting to see these classes become blurred by the arrival of 7” and 10” tablets running Windows 8.1 on the Intel x86 microarchitecture, along with a plethora of ultra-portable laptops with integrated physical keyboards that convert to tablets, whether by folding the keyboard under the screen or by detaching it. This is augmented by a new trend where computers can boot between Windows 8.1 and Android or run both operating systems concurrently, now that Android is being ported to the classic Intel microarchitecture.

HP Envy X2 Detachable-Keyboard Hybrid Tablet

The HP X2 family – showcasing the trend for a detachable-keyboard tablet computer

What is happening is that both consumers and business users will find it hard to determine which kind of computer is exactly right for their needs. Operating systems and baseline hardware configurations may lose their position as the deciding factor in a computer’s suitability for a particular task.

Rather, I see factors like the screen size, which typically determines the computer’s size and form factor; the graphics or audio chipsets; the existence of a physical keyboard and its actual size; the unit’s connectivity, RAM and secondary-storage capacity; and the presence and runtime of an integrated battery as what determines a computer’s suitability for the tasks and operating conditions a user will put it to.

Sony VAIO Fit 15e on dining table

The 15″ mainstream laptop will still earn its keep as an option for one’s “digital hub”

For example, if you are creating a lot of documents and other textual content, a full-sized physical keyboard would be considered important. Similarly, the screen size, the computer’s form factor and the integrated battery will also affect its portability and its suitability for certain tasks.

In a lot of cases, you may end up with multiple devices where each device suits a particular task or activity. For example, a 7”-8” tablet that you can stuff into a coat pocket may come into its own when you want something to refer to while you are on the road. Meanwhile, you might keep a 10”-14” ultraportable computer for when you are “doing a bit more”, like taking notes or creating content on the road, and a 15”-17” laptop or a larger-screen desktop computer as your main “digital hub”.

Sessile desktops like traditional three-piece desktops and “all-in-one” desktops will typically end up in applications where the computer is used in one place only, whereas “adaptive all-in-one” computers of the Sony VAIO Tap 20 ilk, along with 15”-17” high-end laptops, will end up in situations where the computer is shifted around as required.

What will come of this is that you will look at a computer’s particular features, size and form factor to rate its suitability for the task you are targeting it at, rather than thinking that one computer will suit all your needs.

The evolution of the mobile lifestyle summed up in a video

GSMA YouTube videoclip

http://youtu.be/wOryfTLTc1o – Click to view (if you can’t see it below or want to “throw” it to a Smart TV with a TwonkyMedia solution)

My Comments

This video sums up in two minutes and thirty seconds how the online lifestyle has evolved from the hobbyist computers of the late 70s, through the popularity of video games, to the brick-like mobile phones and the desktop computers coming into every office.

It then shows how the mobile online life is becoming integrated into everyone’s daily lifestyle and workstyle. Take a moment to look at this clip!

An all-in-one PC now with gaming credentials

Article

Maingear introduces first boutique gaming all-in-one PC | Reviews – Desktops – CNET Reviews

My Comments

Previously, a computer that had serious gaming credibility, commonly described as a “gaming rig”, was a full-size tower-style PC that was decked out with “hotted-up” processors, highly-strung graphics-card circuitry and other components. These setups needed intense cooling and, in some applications, used elaborate cooling systems as part of some wild case designs. They were typically connected to large displays and gaming-optimised input devices as well as intense surround-sound systems.

Now Maingear have redefined how a gaming computer should be designed by releasing the Alpha 24 Super. This is an “all-in-one” computer that is able to take a full-size PCI Express graphics card and use it to drive the main screen. It has a similar kind of expandability to the HP Z1 all-in-one workstation which, although pitched as a CAD or graphics-arts workstation, can be built out as an intense gaming rig.

It can support a 256GB mSATA SSD and a 3TB regular hard disk as its main secondary storage, as well as having two mini-PCI Express slots for further expandability. Maingear are offering it with the NVIDIA GeForce GTX-650 or GTX-680, which support Optimus automatic graphics switching. This means it can save on energy costs and cooling needs when undertaking regular Web browsing or office work. As for the display, this unit supports a 24” HD touchscreen for Windows 8 and has an HDMI input so it can work as a display for video peripherals. North-American users can have this computer equipped with a CableCARD-compliant TV tuner for use as the “all-in-one” bachelor-pad entertainment setup covering regular computing use, games, TV, DVD and online video.

What impresses me about this computer is that it is another “all-in-one” that allows you to upgrade, expand or repair it yourself, allowing the computer to have a very long useful life. I would also reckon that it could be considered a “poor man’s” alternative to the HP Z1 Workstation.

The idea of the convertible ultrabook becomes real with ASUS

Articles

Asustek to showcase swivel-screen notebook at 2012 Computex | DigiTimes

Un ultrabook convertible chez Asus ? | Le Journal Du Geek (France – French language)

My Comments

A question that many people ponder nowadays when they consider a secondary computing device is whether to get a small laptop computer like a netbook or Ultrabook, or a tablet computer like the iPad along with an accessory keyboard. There are tradeoffs with each platform, such as software availability and user-interface requirements.

This will become more so when Windows 8 arrives, with its Metro touch user interface being part of the operating system and becoming full-bore competition for the Apple iOS platform.

But ASUS have answered with an Ultrabook that can bridge the notebook / laptop and tablet form factors in the cost-effective and power-efficient way that has been required of the Ultrabook. This machine will be the first “convertible” Ultrabook with the “swivel-head” screen design like the one I experienced on the Fujitsu TH550M convertible notebook.

This will work tightly with the integrated touchscreen interface that Windows 8 provides, rather than following the previous practice where manufacturers fabricated their own touch-optimised shells for these computers.

The ASUS convertible Ultrabook could offer a tablet-style user interface for casual computing needs yet have the full physical keyboard that appeals when working on emails or documents, while also providing the benefits that tablets like the iPad offer, such as quick start-up and long battery runtime.

The main question is whether other manufacturers will adopt the convertible Ultrabook form factor and make these computers cost-effective and widely available, or whether they will settle for supplying tablets as a distinct touchscreen product class.

What Intel’s Ivy Bridge next-generation chipset intends to offer

Article

Intel’s Ivy Bridge chip packs understated goodies | Business Tech – CNET News

My Comments

Intel are working on the next-generation “Ivy Bridge” computing chipset which will be considered the technical successor to the successful Sandy Bridge chipset.

High-performance integrated graphics

One major benefit that this chipset will offer is graphics performance. These chipsets will be tuned for better performance than Sandy Bridge’s “Intel HD” graphics, leading to more powerful integrated graphics that can also improve power economy; this may improve a laptop’s credentials as a gaming machine. It is also augmented by integrated DirectX 11 support for games and advanced graphics applications.

The obvious question is whether it will put AMD and NVIDIA “on notice” as far as their role in supplying discrete graphics chipsets is concerned. I would see this as allowing both companies to focus their efforts on developing their graphics chipsets as “performance chipsets”, in a similar vein to the likes of Creative Labs, who provide highly-tuned sound subsystems for computers.

Here, it could allow companies intending to offer high-performance computers for CAD and hardcore gaming to implement improved dual-chipset setups, while giving mainstream users, including average game players, access to improved graphics performance. AMD and NVIDIA could focus on making highly-tuned graphics subsystems that show their prowess at the LAN party or in the design office.

USB 3.0

Another bonus that will come out of this would be an improved USB chipset. This will provide low-latency USB data transfer and streaming, as well as inherent support for USB 3.0. This compares with the current USB 3.0 implementation, where a separate chipset serves one or two USB 3.0 ports while another serves a few USB 2.0 ports.

Windows 8

This chipset is intended to coincide with the impending arrival of Windows 8, and these functions will provide a direct tie-in with the new operating system. This is more so with the USB 3.0 and improved USB functionality, which is supported by a new USB driver stack in Windows 8.

Conclusion

I would see this new chipset improving all of the computing sectors and putting performance graphics within reach of average computer users, who will be exposed to more intense graphics and multimedia. The improved data throughput will benefit laptop users who frequently use external storage or USB audio / video peripherals.

At least it is a step towards power-effective, cost-effective high-performance computing for the mainstream.

ARM-based microarchitecture — now a game-changer for general-purpose computing

Article:

ARM The Next Big Thing In Personal Computing | eHomeUpgrade

My comments

I have previously mentioned NVIDIA developing an ARM-based CPU/GPU chipset and have noticed that this class of RISC chipset is about to resurface on the desktop and laptop computer scene.

What is ARM and how it came about

Initially, Acorn, a British computer company well known for the BBC Model B computer used as part of the BBC’s computer-education program in the UK, pushed on with a RISC-processor-based computer in the late 1980s. This became a disaster due to the dominance of the IBM-PC and Apple Macintosh platforms as general-purpose computing platforms, even though Acorn were trying to push the computer as a multimedia computer for the classroom. This was despite the Apple Macintosh and the Commodore Amiga, the multimedia computer platforms of that time, being based on Motorola 68000-family processors.

Luckily they didn’t give up on the RISC microprocessor and pushed this class of processor into dedicated-purpose computer setups like set-top boxes, games consoles, mobile phones and PDAs. This chipset and class of microarchitecture became known as ARM (originally the Acorn RISC Machine, later Advanced RISC Machines).

The benefit of the RISC (Reduced Instruction Set Computing) class of microarchitecture was an efficient instruction set that suited the task-intensive requirements of graphics-rich multimedia computing, compared to the CISC (Complex Instruction Set Computing) microarchitecture practised primarily in Intel 80x86-based chipsets.

There was reduced interest in RISC chipsets for general-purpose computing after Motorola pulled out of the processor game in the mid-2000s, ceasing manufacture of the PowerPC processors. Here, Apple had to rebuild the Macintosh platform for the Intel architecture, which offered comparable performance at a cheaper cost to Apple, and started selling Intel-based Macintosh computers.

How is this coming about

An increasing number of processor makers who have made ARM-based microprocessors have pushed for these processors to return to general-purpose computing as a way of achieving power-efficient highly-capable computer systems.

This has come along with Microsoft offering a Windows build for the ARM microarchitecture as well as for the Intel microarchitecture. Similarly, Apple bought a chipset designer which developed ARM-based chipsets.

What will this mean for software development

There will be a requirement for software to be built for the ARM microarchitecture as well as for the Intel microarchitecture because these work on totally different instruction sets. This may be easier for Apple and Macintosh software developers because, when the Intel-based Macintosh computers came along, they had to work out a way of packaging software for both the PowerPC and Intel processor families. Apple marketed these builds as “Universal” software builds because of the need to suit the two main processor types.

Windows developers will need to head down the same path, especially if they work with orthodox native code where they fully compile the programs to machine code themselves. This may not be as limiting for people who work with managed code like the Microsoft .NET platform, because the runtime packages just need to be prepared for the instruction set that the host computer uses.

Of course, Java programmers won’t need to face this challenge, due to the language being designed around a “write once, run anywhere” scheme with “virtual machines” that sit between the computer and the compiled Java code.
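To illustrate the point, here is a minimal Java sketch (the class name and output wording are my own, purely for illustration) showing why the same compiled bytecode doesn’t care which processor family it lands on: the JVM installed on the machine does the translating, and the program can simply ask that JVM which architecture it happens to be running on.

// ArchReport.java - minimal illustration that compiled Java bytecode is
// architecture-neutral. The same .class file runs unchanged on an ARM or
// Intel machine; only the JVM underneath differs.
public class ArchReport {
    public static void main(String[] args) {
        // Standard system properties reported by whichever JVM runs the bytecode;
        // "os.arch" may come back as "amd64", "x86" or "aarch64", for example.
        System.out.println("Processor architecture: " + System.getProperty("os.arch"));
        System.out.println("Operating system:       " + System.getProperty("os.name"));
        System.out.println("Java runtime version:   " + System.getProperty("java.version"));
    }
}

Compile it once with “javac ArchReport.java” and the resulting class file can be copied between an Intel laptop and an ARM-based machine and run with “java ArchReport” on both, provided each machine has a Java runtime installed.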

For the consumer

This may require that people who run desktop or laptop computers with ARM processors look for packaged or downloadable software that is distributed as an ARM build rather than an Intel one. This could be made easier through “universal” packages becoming part of the software-distribution requirement.

It may not worry people who run Java or similar programs, because Oracle and others who stand behind these programming environments will need to port the runtime environments to these ARM systems.

Conclusion

This has certainly shown that the technology behind the chipsets that powered the more exciting computing environments of the late 1980s is still relevant in today’s computing life. It will even provide a competitive development field for the next generation of computer systems.

The next Windows will have an ARM build as well as an Intel build. Apple, used to delivering MacOS X for Motorola PowerPC RISC as well as Intel CPUs, may implement its own ARM processors in Macintosh laptops.

Understanding the new Thunderbolt peripheral-connection technology

Another of the new technologies that Intel has been promoting alongside its “Sandy Bridge” processor architecture has been the “Thunderbolt” peripheral connector.

Capabilities

This connector has a current raw transfer speed of 10Gbps, but could have a theoretical maximum of 40Gbps (20Gbps up and 20Gbps down) when both pairs of wires are used. You can use this same “pipe” to pass a DisplayPort-based audio-video stream for a display as well as a PCI Express-based data stream.
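As a rough back-of-envelope figure (assuming the full raw 10Gbps link rate, and ignoring protocol overhead and the speed limits of the drives themselves), that bandwidth works out to about 1.25 gigabytes per second:

\[
10\ \text{Gbit/s} \div 8 \approx 1.25\ \text{GB/s}, \qquad \frac{1000\ \text{GB}}{1.25\ \text{GB/s}} = 800\ \text{s} \approx 13\ \text{minutes}
\]

So, under ideal conditions, filling a 1TB external drive over the link would take on the order of a quarter of an hour, rather than the hours a USB 2.0 connection would need.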

Up to seven Thunderbolt devices can be daisy-chained, but at the moment there can be less than 3 metres of cable between the devices.

Thunderbolt at the moment

This technology will complement USB and other connection technologies, but its rollout will be like what happened with USB in the mid-90s. This means that it will initially be an Apple-only technology, appearing on the latest run of MacBook Pro laptops.

It will appear on PC-based computers early next year. As far as retrofit opportunities go, Intel have mentioned that it could be available on new motherboards, but nothing much was said about availability as an add-in expansion card.

The main peripheral applications would be external storage subsystems like the LaCie “Little Big Disk” storage array, as well as displays. Peripherals that have this connection will typically be marketed as “Thunderbolt-ready”.

What could it offer

Another storage-expansion connection for computing devices

One key application would be to provide a high-bandwidth direct connection between computer devices and one or more external hard-disk storage subsystems. The reason I use the term “computer devices” is that such devices could encompass PVRs, which could benefit from capacity expansion; routers and network devices that turn attached external hard-disk subsystems into network-attached storage; as well as general-purpose computers.

Multifunction devices that are fit for the new generation of compact high-performance computers

There is the possibility for one to exploit the Thunderbolt concept to design a multifunction desktop console unit. Here, this unit could house a screen, audio subsystem, video camera, removable storage such as an optical drive or SDXC card reader and/or a USB hub. Another variant could house a keyboard instead of a screen and connect to one or more external displays using DisplayPort or regular monitor connectors.

This display unit would be connected to an ultracompact system unit that has only the processor, RAM, graphics-processor, network connectivity and a hard disk, plus some USB sockets for a desktop application. On the other hand, this display could serve as a “desktop display” for a subnotebook or ultraportable computer. The USB hub would come in handy for connecting keyboards, mice, USB memory keys and similar devices.

Here, these multifunction devices can be designed so that they are not treated as “second-class citizens” just because they have multiple functions. This means they could render multiple video streams as well as support high-capacity removable storage technologies like Blu-ray Disc or SDXC cards.

This is more so as the Intel Sandy Bridge technology makes it feasible for small computers like book-sized ultracompact desktops and notebooks of the “subnotebook” or “ultraportable” class to “have all the fruit” as far as performance goes.

Issues that may be of concern

One main issue that I would have with the Thunderbolt technology is that Intel could limit it to computer applications centred around its own chipsets. This would make it harder for competing processor designers like AMD or NVIDIA to implement the technology in their chipset designs. It would also place the same implementation limits on system designers who want to use chipsets that offer improved performance or better value for money alongside Intel processors on their motherboards.

This is like the Intel Wireless Display technology, which allows an Intel-based laptop computer to send its pictures over a Wi-Fi wireless network to a special adaptor attached to a display device. That functionality only works with computers that have certain Intel chipsets and couldn’t be retroactively applied to older computers.

Another issue would be encouraging implementation in “embedded” and dedicated-purpose devices like PVRs and routers as well as in general-purpose computers. For some applications, like the previously-mentioned storage-expansion application, this could add value and a longer service life to these devices.

Conclusion

Once the Thunderbolt technology is implemented in a competitive manner, it could open up a new class of devices and applications for the computing world by making proper use of the “big fat pipe” that it offers.

Another major change for the Intel-based PC platform will shorten the boot-up cycle

News articles

Getting a Windows PC to boot in under 10 seconds | Nanotech – The Circuits Blog (CNET News)

BBC News – Change to ‘Bios’ will make for PCs that boot in seconds

My comments

The PC BIOS legacy

The PC BIOS, the functional bridge between the time you turn a personal computer on and when the operating system can boot, was defined in 1979 when personal computers of reasonable sophistication came on the scene. At that time the best peripheral mix for a personal computer was a “green-screen” text display, two to four floppy disk drives, a dot-matrix printer and a keyboard. Rudimentary computers at that time used a cassette recorder rather than floppy-disk drives as their secondary storage.

Through the 1980s, there was improved BIOS support for integrated colour graphics chipsets and the ability to address hard disks. In the 1990s there were newer changes such as support for networks, mice, higher-resolution graphics and alternative storage types, but the BIOS wasn’t improved for these newer needs. In some cases, the computer had to have extra “sidecar” ROM chips installed on VGA cards or network cards to permit support for VGA graphics or booting from the network. Similarly, interface cards like SCSI cards or add-on IDE cards couldn’t support “boot disks” unless they had specific “sidecar” ROM chips to tell the BIOS that there were boot disks on these cards.

These BIOS setups were only able to boot to one operating environment or, in some cases, to an alternative operating environment such as a BASIC interpreter that used a cassette recorder as secondary storage. If a user wanted a choice of operating environments, the computer had to boot to a multi-choice “bootloader” program, which was a miniature operating system in itself and presented a menu of operating environments to boot into. This approach was extended to the lightweight Web browsers, email clients and media players used in some of the newer laptops for “there-and-then” computing tasks.

The needs of a current computer, with its newer peripheral types and connection methods, were too demanding on this old code and typically meant that the computer took a significant amount of time from switch-on to when the operating system could start. In some cases there were reliability problems, as the BIOS had to cope with existing peripheral types being connected via newer connection methods, such as Bluetooth wireless keyboards or keyboards that connect via the USB bus.

The Unified Extensible Firmware Interface (UEFI) improvement

This is a new improvement that replaces the BIOS as the bootstrap software that runs just after you turn on the computer in order to start the operating system. The way this aspect of a computer’s operation is designed has been radically improved, with the software being programmed in C rather than assembly language.

Optimised for today’s computers rather than yesterday’s computers

All of the computer’s peripherals are identified by function rather than by where they are connected. This allows console devices such as the keyboard and the mouse to work properly whether they are connected via a link like the USB bus or via wireless connectivity. It also allows for different scenarios like “headless” boxes that are managed by a Web front-end, a Remote Desktop Protocol session or a similar network-driven remote-management setup. That ability appeals to businesses who have large racks of servers in a “data room” or wiring closet, where the IT staff want to manage these servers from their desk or their home network.

Another, more obvious benefit is that computer devices boot more quickly, because of the new functions that UEFI allows for and because the UEFI code is optimised for today’s computer devices rather than 1979-81-era ones. It is also designed to work with future connection methods and peripheral types, which means there won’t be a need for “sidecar” BIOS or bootstrap chips on interface cards.

Other operational advantages

There is support in the UEFI standard for the bootstrap firmware to provide a multi-boot setup for systems that have multiple operating environments, thus avoiding the need for a “bootloader” menu program on the boot disk to allow the user to select the operating environment. It will also yield the same improvements for those computers that allow the user to boot to a lightweight task-specific operating environment.

When will this be available

This technology has been implemented in some newer laptops and a lot of business-class servers, but from 2011 onwards it will become available in most desktop and laptop computers that appeal to home users and small-business operators. People who have their computers built by an independent reseller or who build their own PCs will likely find this function integrated in motherboards released from this model year onwards.