Category: Computer building and repair

Could an upgrade to Windows 8 yield a performance boost for your computer?

Article

Installing Windows 8 on your old PC could turn it into Greased Lightning | ZDNet

My Comments

Some of us might think about upgrading an existing computer system to Windows 8 when this operating system is released rather than staying with Windows 7. There would commonly be a lot of doubt about this process but, in some cases, it could improve the computer’s performance.

A Windows computer that was built in the last four years, i.e. since Windows Vista was launched, could easily benefit from an upgrade to Windows 8. This is due to the reworked code that is written to work best with the recent generation of hardware.

The speed increase also comes due to natively-integrated desktop security software as well as Internet Explorer 10 and integrated cloud-computing support.

But I would recommend that the system being upgraded meets current expectations for RAM and secondary-storage capacity. This would be 4GB of RAM and 128GB of secondary storage as a bare minimum. If the machine uses a hard disk rather than solid-state storage, I would expect it to have at least 320GB. As I subsequently mention, the operating system is at a price point that may allow you to budget in a hardware upgrade to meet these expectations.
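If you want to check a particular machine against these expectations before committing to the upgrade, a short script can report the installed RAM and system-disk capacity. The following is a minimal sketch, assuming a Windows machine with Python installed; the 4GB and 128GB thresholds are the figures I suggest above rather than Microsoft’s official minimums.

```python
# Minimal sketch: report installed RAM and system-disk capacity on Windows,
# then compare them against the suggested 4GB / 128GB expectations.
import ctypes
import shutil

def installed_ram_bytes():
    # GlobalMemoryStatusEx reports the physical memory that Windows can see
    class MEMORYSTATUSEX(ctypes.Structure):
        _fields_ = [
            ("dwLength", ctypes.c_ulong),
            ("dwMemoryLoad", ctypes.c_ulong),
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),
            ("ullTotalPageFile", ctypes.c_ulonglong),
            ("ullAvailPageFile", ctypes.c_ulonglong),
            ("ullTotalVirtual", ctypes.c_ulonglong),
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]
    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    return status.ullTotalPhys

if __name__ == "__main__":
    ram_gb = installed_ram_bytes() / 1024 ** 3
    disk_gb = shutil.disk_usage("C:\\").total / 1024 ** 3
    print(f"RAM: {ram_gb:.1f} GB, system disk: {disk_gb:.0f} GB")
    if ram_gb < 4 or disk_gb < 128:
        print("Consider a RAM or storage upgrade before moving to the new operating system")
```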

Users can upgrade their Windows computers to the new operating system at a cost-effective price due mainly to the electronic distribution that Microsoft is using, i.e. you buy and download the operating system online rather than having to buy a packaged copy with an optical disk. This is similar to how Apple are distributing the MacOS X Lion and Mountain Lion operating-system upgrades. It also comes with some attractive licensing terms, including the ability for those of us who are building our own systems to legitimately purchase a “system builder” package.

It would be OK to go ahead with the upgrade if you can handle the changes to how the operating environment works, such as the new touch-focused Start “dashboard” and having to “descend further” to get to the standard operating interface. But I would recommend that those who aren’t computer-competent stay with Windows 7 unless they are buying a new computer that comes with Windows 8.

Hitachi outs a pair of 4TB HDDs for your storing pleasure — Engadget

Article

Hitachi outs a pair of 4TB HDDs for your storing pleasure — Engadget

My Comments

Hitachi has raised the ante again for hard-disk storage by delivering a 4TB 3.5” hard-disk unit. They have packaged it as a retail-sold aftermarket retrofit kit with SATA connectivity for around US$399 and as a USB 3.0-connected external hard disk for US$420.

The Engadget article noted that although many of us think of cloud storage as the way to go for personal data storage, this drive would please those of us who place emphasis on desktop-local or NAS-hosted data storage. This would include most business operators who want direct control over their business data. I also see this hard disk as being relevant to the network-attached storage sector, where emphasis is placed on data capacity as these devices become local warehouses for high-definition video, high-quality music and high-resolution photos.

A question that may need to be raised with NAS applications is whether the NAS’s firmware / operating system can address single physical disks with a capacity of 4 or more terabytes. Here, I would suspect that most Linux-based firmwares could do so; but even if the current firmware can’t address a 4TB-or-larger physical disk, a subsequent version could support the volume size.
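If you are curious whether a particular unit is being seen in full, the capacity the kernel reports can be read directly. The following is a minimal sketch, assuming a Linux-based NAS (or any Linux box) with shell access and Python available; the device name “sda” is only an example.

```python
# Minimal sketch: read the kernel's view of a disk's capacity on a Linux-based
# system to check whether a large drive such as a 4TB unit is addressed in full.
def disk_capacity_bytes(device="sda"):
    # /sys/block/<dev>/size reports the number of 512-byte sectors
    with open(f"/sys/block/{device}/size") as f:
        sectors = int(f.read().strip())
    return sectors * 512

if __name__ == "__main__":
    capacity = disk_capacity_bytes("sda")
    print(f"sda capacity: {capacity / 1e12:.2f} TB")
    # Anything larger than 2 TiB needs GPT partitioning rather than the old MBR
    # scheme, so firmware that reports the full size here but still creates MBR
    # partition tables would not be able to use the whole disk.
    if capacity > 2 * (1024 ** 4):
        print("Disk exceeds 2 TiB - GPT partitioning is required to use it fully")
```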

Of course, as more hard-disk plants in Thailand get back to full steam after the floods and more of the 4TB hard disks come on the market, prices could fall to the point where this capacity becomes more reasonable for home and small-business users. Other interesting developments that could come of this include single-unit 2.5” hard disks with capacities of 1TB or greater, or smaller hard disks with higher capacities that would appeal to those of us with a need for higher mobile data capacity.

Don’t forget that the Browser Choice Screen is your one-stop Web browser port-of-call

Previous Coverage – HomeNetworking01.info

Understanding The Browser Choice Screen (EN, FR)

Web site

Browser Choice Screen – http://browserchoice.eu

My Comments

Previously, I have covered the Browser Choice Screen, which was part of Microsoft’s anti-trust settlement with the European Commission concerning Internet Explorer. This was to be for consumer and small-business Windows setups in the European Union, where people were to be offered a choice of Web browser for their operating environment.

But I still see this menu Web page as a “one-stop” port-of-call for people anywhere in the world who want to install new Web browsers or repair a damaged Web-browser installation. This resource came in handy when I was repairing a houseguest’s computer that was damaged by a “system-repair” Trojan Horse. Here, I knew where to go to collect the installation files for the Firefox Web browser that I had to reinstall so I could restore their Web environment.

If you are creating a system-repair toolkit on a USB memory key, you may visit this resource to download installation packages for the Web browsers to that memory key. Or you can create a shortcut file to this site and store it on the memory key.
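As a rough illustration of putting such a toolkit together, here is a minimal sketch in Python; the drive letter, file names and download URLs are placeholders only, to be replaced with the actual links you collect from the Browser Choice Screen or the browser vendors’ own sites.

```python
# Minimal sketch: download a set of browser installers to a USB memory key
# so they are on hand for repairing a damaged Web-browser installation.
import urllib.request
from pathlib import Path

INSTALLERS = {
    # Placeholder file names and URLs - not the vendors' actual links
    "firefox-setup.exe": "https://example.com/firefox-setup.exe",
    "opera-setup.exe": "https://example.com/opera-setup.exe",
}

def build_toolkit(usb_root="E:/browser-toolkit"):
    dest = Path(usb_root)
    dest.mkdir(parents=True, exist_ok=True)
    for filename, url in INSTALLERS.items():
        target = dest / filename
        print(f"Downloading {filename} ...")
        urllib.request.urlretrieve(url, str(target))  # save installer to the memory key
    print(f"Toolkit written to {dest}")

if __name__ == "__main__":
    build_toolkit()
```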

ARM-based microarchitecture — now a game-changer for general-purpose computing

Article:

ARM The Next Big Thing In Personal Computing | eHomeUpgrade

My comments

I have previously mentioned NVIDIA developing an ARM-based CPU/GPU chipset and have noticed that this class of RISC chipset is about to resurface in the desktop and laptop computer scene.

What is ARM and how it came about

Initially, Acorn, a British computer company well known for the BBC Model B computer which was used as part of the BBC’s computer-education program in the UK, had pushed on with a RISC-processor-based computer in the late 1980s. This became a commercial disaster due to the dominance of the IBM PC and Apple Macintosh platforms as general-purpose computing platforms, even though Acorn were trying to push the computer as a multimedia computer for the classroom. The Apple Macintosh and the Commodore Amiga, which were the multimedia computer platforms of that time, were based on Motorola 68000-family processors, which were CISC rather than RISC designs.

Luckily they didn’t give up on the RISC microprocessor and had this class of processor pushed into dedicated-purpose computer setups like set-top boxes, games consoles, mobile phones and PDAs. This chipset and class of microarchitecture became known as ARM, originally standing for Acorn RISC Machine and later for Advanced RISC Machines.

The benefit of this RISC (Reduced Instruction Set Computing) class of microarchitecture was an efficient instruction set that suited the task-intensive requirements of graphics-rich multimedia computing, compared to the CISC (Complex Instruction Set Computing) microarchitecture that was practised primarily with Intel 80x86-based chipsets.

Interest in RISC chipsets for desktop computing waned when Motorola pulled out of the processor game in the mid 2000s and ceased manufacturing the PowerPC processors. Here, Apple had to build the Macintosh platform for the Intel architecture because this offered RISC-class performance at a cheaper cost to Apple, and started selling Intel-based Macintosh computers.

How is this coming about

An increasing number of processor makers who have made ARM-based microprocessors have pushed for these processors to return to general-purpose computing as a way of achieving power-efficient highly-capable computer systems.

This has come along with Microsoft offering a Windows build for the ARM microarchitecture as well as for the Intel microarchitecture. Similarly, Apple bought out a chipset designer which developed ARM-based chipsets.

What will this mean for software development

There will be a requirement for software to be built for the ARM microarchitecture as well as for the Intel microarchitecture because these work on totally different instruction sets. This may be easier for Apple and Macintosh software developers because, when the Intel-based Macintosh computers came along, they had to work out a way of packaging software for both the PowerPC and the Intel processor families. Apple marketed these builds as “Universal” software builds because of the need to suit the two main processor types.

Windows developers will need to head down this same path, especially if they work with orthodox code where they fully compile the programs to machine code themselves. This may not be as limiting for people who work with managed code like the Microsoft .NET platform, because the runtime packages could simply be prepared for the instruction set that the host computer uses.

Of course, Java programmers won’t need to face this challenge because the language was designed around a “write once, run anywhere” scheme, with “virtual machines” that sit between the computer and the compiled Java code.
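To illustrate the kind of decision an installer or launcher has to make on such systems, here is a minimal sketch of processor-architecture detection in Python; the build file names are hypothetical and this is not any vendor’s actual packaging logic.

```python
# Minimal sketch: pick the native build that matches the host processor,
# the sort of choice a "universal" installer or launcher has to make.
import platform

def pick_native_build():
    machine = platform.machine().lower()  # e.g. "amd64", "x86_64", "arm64", "aarch64"
    if machine in ("arm64", "aarch64") or machine.startswith("arm"):
        return "myapp-arm.exe"       # hypothetical ARM build
    if machine in ("x86_64", "amd64", "i386", "i686"):
        return "myapp-x86.exe"       # hypothetical Intel build
    raise RuntimeError(f"No native build available for {machine}")

if __name__ == "__main__":
    print("Would install:", pick_native_build())
```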

For the consumer

This may mean that people who run desktop or laptop computers that use ARM processors will need to look for packaged or downloadable software that is distributed as an ARM build rather than an Intel build. This may be made easier through the use of “universal” packages as part of software-distribution practice.

It may not worry people who run Java or similar programs, because Oracle and others who stand behind these programming environments will need to port the runtime environments to these ARM systems.

Conclusion

This has certainly shown that the technology behind the chipsets that powered the more exciting computing environments of the late 1980s is now relevant in today’s computing life. It will even provide a competitive development field for the next generation of computer systems.

In short, the next Windows is to have an ARM build as well as an Intel build; and Apple, used to delivering MacOS X for PowerPC RISC as well as Intel CPUs, may implement its own ARM processors in Macintosh laptops.

The new CPU/GPU processor platforms – what change would there be for computing?

Articles

Sony Unveils its new premium VAIO S Series laptops

My comments about the new trend

Cost-effective system design

Due to the integration of the CPU and the graphics processor in the one chip, we will find that most computer systems will become cheaper to purchase. This will also mean that graphics performance for most multimedia and games activity will start to come at a cheaper price and be available in product classes that wouldn’t otherwise have it like mainstream-priced computers and the subnotebook / ultraportable class of portable computer.

Dual-mode graphics

There will also be an increased use of dual-mode graphics technology as a product differentiator for midrange and high-end machines. This is where a computer is equipped with integrated graphics as well as a discrete graphics chipset and the computer uses integrated graphics for most tasks but uses the discrete graphics for video editing and intense gameplay.

This could be seen as the computer-graphics equivalent of the “overdrive” or “sports mode” switch used on some cars as a way of allowing the car to work in a performance-enhanced way. Here, the user benefits from reduced energy needs and reduced battery drain when they use the integrated graphics, but can call on the discrete graphics chipset when they need the extra graphics performance.

Could this change the positioning and pricing of computers?

This may have some effect on the prices of most of the mainstream computer ranges, especially if the equipment in question is to be sold with “single-mode” graphics. Of course, “dual-mode” graphics will still be pitched at the market segment that places heavy importance on graphics performance, like line-of-profession imaging (CAD/CAM, graphic arts, medical imaging, etc) and “LAN-party” hardcore gamers, and will still command a price premium. Here, the manufacturers can still work on performance-optimised discrete GPUs for this market and offer them in the “dual-mode” computers.

Some people may also reckon that the ability for computers based on these chipsets to perform to mainstream expectations for multimedia and gaming may allow people who value these functions to spend less on the equipment that they want. They can also place importance on “size and style” without sacrificing graphics performance.

It can therefore lead to ultra-compact computer types like 12”-14” subnotebook / ultraportable computers and small-form-factor desktop computers being offered with decent rather than second-rate graphics performance. This could, for example, make the subnotebook more appealing as a “travel workstation” for a photo journalist or other professional photographer to use when editing or previewing photographs and video footage in the field.

How to factor this in when buying a computer through this year

What I would reckon you should do is determine what class of computer suits your needs, including your minimum specifications for functionality. This includes hard disk capacity, RAM capacity, screen size, user interface, operating system and other factors. Then look for the good deals where you can save money on the prospective computer purchase.

It may also affect the pricing and positioning of computers based on existing “separate-GPU” graphics technology, especially as manufacturers move towards the new combined CPU/GPU technologies. Here, they will want to clear the warehouses of these machines and you may find that the deals on these computers are favourable to you. As I said before, work out your system needs and shop around for the cheapest and best machine that will suit those needs. Also take advantage of “deal-makers” that will be offered, like applications software, higher-tier operating systems (Windows 7 Professional at Windows 7 Home Premium price), and extra RAM and hard-disk capacity.

Conclusion

Once the new CPU/GPU chipsets become the mainstream for desktop and portable computers, this could bring about a subtle but real change affecting the design, product-positioning and pricing of these devices.

Another major change for the Intel-based PC platform will shorten the boot-up cycle

News articles

Getting a Windows PC to boot in under 10 seconds | Nanotech – The Circuits Blog (CNET News)

BBC News – Change to ‘Bios’ will make for PCs that boot in seconds

My comments

The PC BIOS legacy

The PC BIOS, which was the functional bridge between the time you turn a personal computer on and the point where the operating system can be booted, was defined in 1979 when personal computers of reasonable sophistication came on the scene. At that time the best peripheral mix for a personal computer was a “green-screen” text display, two to four floppy disk drives, a dot-matrix printer and a keyboard. Rudimentary computers at that time used a cassette recorder rather than floppy-disk drives as their secondary storage.

Through the 1980s, there was improved BIOS support for integrated colour graphics chipsets and the ability to address hard disks. In the 1990s, there were newer changes such as support for networks, mice, higher-resolution graphics and alternative storage types, but the BIOS wasn’t improved for these newer needs. In some cases, the computer had to have extra “sidecar” ROM chips installed on VGA cards or network cards to permit support for VGA graphics or booting from the network. Similarly, interface cards like SCSI cards or add-on IDE cards couldn’t support “boot disks” unless they had specific “sidecar” ROM chips to tell the BIOS that there were “boot disks” on these cards.

These BIOS setups were only able to boot to one operating environment or, in some cases, to an alternative operating environment such as a BASIC interpreter that used a cassette recorder as secondary storage. If a user wanted to work with a choice of operating environments, the computer had to boot to a multi-choice “bootloader” program, which was a miniature operating system in itself and presented a menu of operating environments to boot into. This approach was later extended to the lightweight Web browsers, email clients and media players that are used in some of the newer laptops for “there-and-then” computing tasks.

The needs of a current computer, with its newer peripheral types and connection methods, were too demanding on this old code and typically required the computer to take a significant amount of time from switch-on to when the operating system could start. In some cases, there were reliability problems as the BIOS had to cope with existing peripheral types being connected through newer connection methods, such as Bluetooth wireless keyboards or keyboards that connect via the USB bus.

The Unified Extensible Firmware Interface improvement

This is a new improvement that will replace the BIOS as the bootstrap software that runs just after you turn on the computer in order to start the operating system. The way this aspect of a computer’s operation is designed has been radically improved, with the software being programmed in C rather than assembly language.

Optimised for today’s computers rather than yesterday’s computers

All of the computer’s peripherals are identified by function rather than by where they are connected. This will allow console devices such as the keyboard and the mouse to work properly if they are connected via a link like the USB bus or wireless connectivity. It also allows for different scenarios like “headless” boxes which are managed by a Web front-end, a Remote Desktop Protocol session or a similar network-driven remote-management setup. That ability has appealed to businesses that have large racks of servers in a “data room” or wiring closet, where the IT staff want to manage these servers from their desk or their home network.

Another, more obvious benefit is that computer devices have a quicker boot time, because of the new functions that UEFI allows for and because the UEFI code is optimised for today’s computer devices rather than 1979-81-era computer devices. It is also designed to work with future connection methods and peripheral types, which means that there won’t be a need for “sidecar” BIOS or bootstrap chips on interface cards.

Other operational advantages

There is support in the UEFI standard for the bootstrap firmware to provide a multi-boot setup for systems that have multiple operating environments, thus avoiding the need to provide a “bootloader” menu program on the boot disk to allow the user to select the operating environment. It will also yield the same improvements for those computers that allow the user to boot to a lightweight task-specific operating environment.

When will this be available

This technology has been implemented in some newer laptops and a lot of business-class servers, but from 2011 onwards it will become available in most desktop and laptop computers that appeal to home users and small-business operators. People who have their computers built by an independent reseller or who build their own PCs are likely to find this function integrated in motherboards released from this model year onwards.
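If you want to find out whether a particular machine is already booting through UEFI, a quick check is possible. The following is a minimal sketch assuming a Linux system, where the kernel exposes a special directory only when the machine was started by UEFI firmware; Windows exposes the same information through its own system-information tools.

```python
# Minimal sketch: tell whether a Linux machine was started by UEFI firmware
# or by a traditional BIOS (or in BIOS-compatibility mode).
import os

def booted_via_uefi() -> bool:
    # The kernel exposes this directory only when the system was booted by UEFI
    return os.path.isdir("/sys/firmware/efi")

if __name__ == "__main__":
    if booted_via_uefi():
        print("This system booted through UEFI firmware")
    else:
        print("This system booted through a traditional BIOS (or in BIOS-compatibility mode)")
```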

Processor Chipsets with built-in Graphics

Article

BBC News – Intel to launch chipsets with built-in graphics

My comments

With Intel now showing interest in supplying a processor chip with an integrated graphics processor, this will raise the stakes when it comes to supplying single-chip CPU / GPU solutions.

Why supply a single-chip CPU/GPU solution

There is the obvious benefit in design size that it would yield. This would allow for more compact applications and, of course, the bill-of-materials costs would be reduced, thus allowing for cheaper devices. Another key benefit would be that the single-chip solution would have reduced power needs, which is important for battery-operated devices like laptops, tablet computers and, especially, smartphones.

There is also the reality that most consumer-electronics devices like electronic picture frames, digital cameras, TVs / video peripherals and hi-fi equipment are being designed like general-purpose computers, and most of them will also benefit from these CPU/GPU chips. This has become evident with most of these devices offering network and Internet connectivity to augment their primary function or go beyond it. These chips will also yield the reduced bill-of-materials costs and reduced power demands for this class of device, which will become a market requirement.

Similarly, an increasing number of office-equipment / computer-peripheral devices, home appliances and “backbone” devices (HVAC / domestic hot water, building safety / security, etc) are becoming increasingly sophisticated and offering a plethora of functions. I have noticed this more so with the multifunction printers that I have reviewed on this site, where most of them use a colour bitmap LCD display and a D-toggle control as part of their user interfaces.

Therefore manufacturers who design these devices can benefit from these single-chip CPU/graphics solutions in order to support these requirements through reduced supporting-power requirements or design costs. In the case of “backbone” devices, which typically require users to operate them from remotely-located user-interface panels such as programmable thermostats or codepads, there isn’t the need to draw too much power from the host device to support one or more of these panels, even if a panel is to provide access to extended functions.

The market situation

The Intel Sandy Bridge, which is just about to be launched at the time of publication, would provide improved graphics. This is in a market which AMD has just entered with their Zacate CPU / graphics chip, and which has been dominated by ARM designs in the smartphone scene. ARM’s designs were in fact used as part of the Apple A4 chip used in the iPhone 4 and iPad.

With three companies in the market, this could yield a highly-competitive environment with a run for high-quality quickly-drawn graphics, quick CPU response, power conservation / long battery runtime and small circuit size / reduced bill-of-materials. This “run for the best” could also see desirable functionality being available at prices that most people can afford.

The only limitation with this concept is that the single-chip design may shrink the market for discrete graphics chipsets and cards to people who value extreme-performance graphics.

Conclusion

The reduced size of these new single-chip CPU/GPU setups could replicate the success that came with the arrival of the 80486 processor and its integrated floating-point coprocessor. It could then make for longer battery runtime in portable applications and lead to smaller, cooler-running computers for most applications.

Graphics chipsets: ATI is no more, AMD is now the brand

 AMD jettisons ATI brand name, makes Radeon its own – The Tech Report

My comments

Some of us have observed the ATI graphics-chipset name being taken over by AMD and were wondering what would happen with this name and the graphics-chipset scene.

Now that AMD has changed the brand for the ATI chipsets to their own brand, who knows what could happen next, especially when it comes to computer display solutions, particularly integrated-display setups like those in laptops, all-in-one PCs and low-profile desktop computers.

One way that the situation could evolve would be for AMD to end up making motherboard or chipset solutions centred around an AMD CPU and GPU setup. This may be in a similar vein to the Intel Centrino solutions which include an Intel Wi-Fi chipset as well as the Intel CPU.

The worst thing that could affect high-end graphics and gaming users is for AMD to pull out of the plug-in display-card scene, thus reducing competition in the aftermarket when it comes to performance graphics. This is because the ATI brand has been positioned as an alternative to NVIDIA when it came to aftermarket and OEM plug-in display cards pitched at the gaming, multimedia and performance-graphics scene.

Once we see the disappearance of brands that are part of a competitive market, there have to be others who will provide competing products, or a nasty monopoly or cartel can start to exist.

Hitachi-LG optical-reader / solid-state drive combo for laptops

Articles

Hitachi-LG teases HyDrive: an optical reader with loads of NAND (video) – Engadget

Web site

http://www.mysterydrive.net

My comments

The main thing that impressed me about this was that both the tray-load optical drive and the solid-state drive were integrated into the same low-profile chassis that would suit installation into a laptop. There are many benefits that I see with this.

One would be that you could have a laptop specification with both a large-capacity hard disk used for data and a lower-capacity solid-state drive used for the operating system and applications. This could allow for battery economy and quick starts, while the high-capacity hard disk holds the user’s data and is only spun up when the user’s files need to be loaded or saved.

As well, if Hitachi and LG move towards higher solid-state capacities, this could allow for low-profile laptops like the “thin-and-light” segment to have the SSD as the main system drive while supporting an optical drive.

SSD drives now available for IDE-based computers

News articles

Du SSD pour les “vieilles” bécanes | Le Journal du Geek (France – French language)

My comments

You may be keeping an older IDE-based computer going, or have a computer which has an IDE bus but no spare SATA connectors on the motherboard. Hey, you may think of adding a solid-state drive to this computer in order to benefit from high operating speeds and low energy consumption, but the fact that the only vacant secondary-storage interface is IDE-based throws your plans haywire.

What Buffalo has now done is to provide an SSD which connects to the IDE bus on these computers. The main limitation is that they only come in a 2.5” chassis, which means that you may have to use a 2.5” mounting kit and adaptor plugs if the computer you plan to upgrade is your desktop rig. They have a 64MB cache and come in capacities of 32GB, 64GB and 128GB at prices of US$250, US$360 and US$630 respectively. This may be a steep premium to pay if you want that quicker boot time for your older computer.

In my opinion, I would place the 64GB drive as a drop-in replacement for the system drive (operating system, program files) in a multi-drive computer while keeping the “data” drives as regular rotary drives. Here, this could lead to quick boots and application starts without much power being used. The 128GB drive may be useful as a drop-in replacement for the hard drives in older laptops that have a fair bit of life in them, so as to keep them running longer on their batteries.