TDK was once known for storage media, especially tapes and, subsequently, floppy and optical discs. During that time, when any of us wanted a high-quality audio or video recording, we chose this name as one of our preferred brands. In fact, one idea they were known for in the 1970s was "Super Avilyn", a cost-effective high-bias magnetic tape formula that yielded as good an audio or video recording as traditional chrome-based high-bias tape.
Now that we have moved to MP3 players, smartphones and hard-disk-based storage of audio and video content, the company has diversified into cheaper audio equipment for the open market and reduced its presence in storage media. Meanwhile, Hitachi and others have been improving the data capacity of hard disks over the years, with TDK disappearing into the background in this field.
But they have not left this storage-medium expertise behind in the hard-disk-based data-storage era. Here, they have raised the data-density bar for hard disks further, allowing for 1.5 terabits per square inch. The article raised possibilities of 15” laptops coming with single 2.5” hard disks greater than 1TB, or desktop computers being equipped with 3.5” hard disks greater than 2TB. This would also appeal to the current trend for low-profile and “all-in-one” desktops to have the same storage as was acceptable for larger designs.
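As a rough back-of-envelope check on what that density implies, here is a short Python sketch. The platter dimensions are my own assumptions for illustration, not TDK's figures:

```python
import math

# Rough capacity estimate for one platter at the claimed areal density.
# The radii are assumed values for a 2.5" drive platter (roughly 65 mm
# across, with a hub/landing-zone cut-out in the middle).
AREAL_DENSITY_TBIT_PER_SQIN = 1.5   # claimed density, terabits per square inch
OUTER_RADIUS_IN = 1.26              # assumed usable outer radius (inches)
INNER_RADIUS_IN = 0.35              # assumed hub/landing-zone radius (inches)

usable_area = math.pi * (OUTER_RADIUS_IN**2 - INNER_RADIUS_IN**2)  # square inches
capacity_tbit = usable_area * AREAL_DENSITY_TBIT_PER_SQIN          # per surface
capacity_tb = capacity_tbit / 8                                    # terabits -> terabytes

print(f"Usable area per surface: {usable_area:.2f} sq in")
print(f"Capacity per surface:    {capacity_tb:.2f} TB")
print(f"Per platter (2 sides):   {2 * capacity_tb:.2f} TB")
```

Even with these conservative assumptions, a single two-sided platter comes out well over a terabyte, which is consistent with the single-disk capacities mentioned above.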
For the home or small-business network, I see possibilities like NAS units offering at least 10TB. This goes hand-in-hand with PVRs and similar home-entertainment equipment able to work with many hours of ultra-high-definition video material, especially as the 4K and 8K ultra-high-definition video technologies, which yield cinema-quality video, come closer.
Personally, I would expect this technology to materialise in the form of hard disks within the next two years, once TDK have proven it in a form that manufacturers can use. It also happens to coincide with the South-East-Asian hard-disk factories coming back on stream after the Thailand floods, making it feasible to see the return of “dime-a-dozen” hard-disk storage.
Previously, a computer that had serious gaming credibility, commonly described as a “gaming rig”, was a full-size tower-style PC that was decked out with “hotted-up” processors, highly-strung graphics-card circuitry and other components. These setups needed intense cooling and, in some applications, used elaborate cooling systems as part of some wild case designs. They were typically connected to large displays and gaming-optimised input devices as well as intense surround-sound systems.
Now Maingear have redefined how a gaming computer should be designed by releasing the Alpha 24 Super. This is an “all-in-one” computer that is able to take a full-size PCI Express graphics card and use it to drive the main screen. It has a similar kind of expandability as the HP Z1 all-in-one workstation which, although pitched as a CAD or graphics-arts workstation, can be built out as an intense gaming rig.
It can support a 256GB mSATA SSD and a 3TB regular hard disk as its main secondary storage, as well as having two miniPCI Express slots for further expandability. Maingear are offering it with the NVIDIA GeForce GTX-650 or GTX-680, which have Optimus automatically-selectable graphics “overdrive”. This means that it can save on energy costs and cooling needs when undertaking regular Web browsing or office work. As for the display, this unit supports a 24” HD touchscreen for Windows 8 and has an HDMI input so it can work as a display for video peripherals. North-American users can have this computer equipped with a CableCARD-compliant TV tuner for use as the “all-in-one” bachelor-pad entertainment setup covering regular computing use, games, TV, DVD and online video.
What impresses me about this computer is that it is another “all-in-one” that allows you to upgrade, expand or repair it yourself, allowing the computer to have a very long useful life. I would also reckon that it could be considered a “poor man’s” alternative to the HP Z1 Workstation.
Those of you who have heard a lot about Apple’s latest iPhones and other devices will know about them being equipped with the “Retina” display. This is primarily Apple’s take on a high-pixel-density display, and this will become an increasing trend over the next few years for most computing environments.
What is the high-pixel-density display
These are displays that have a pixel density of at least 200-250 pixels per inch and are represented by devices like the 3rd-generation Apple iPad, the Apple iPhone 4S or the Sony PlayStation Vita. This compares to most desktop and notebook computers offering a pixel density of 120-150 pixels per inch.
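To put figures on this, pixel density can be worked out from a screen's resolution and its diagonal size. A quick Python illustration; the resolutions quoted are published specs of the era, and the "typical laptop panel" is just one representative example:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(pixels_per_inch(2048, 1536, 9.7)))   # 3rd-generation iPad: 264
print(round(pixels_per_inch(960, 640, 3.5)))     # Apple iPhone 4S: 330
print(round(pixels_per_inch(1440, 900, 13.3)))   # typical laptop panel: 128
```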
The main benefit is to see an image on the display that looks like what you would normally see with the naked eye. For text, this would appear as though you are reading a regular book; this would come into its own with e-book applications but can also apply to regular word-processing or similar work.
Similarly, photos and computer graphics would acquire the smoothness of a photo taken using ordinary film or a painting created by an artist.
As well as the display surface having this kind of resolution and the display subsystem being able to show images at this resolution, the operating system would have to paint the user interface at a regular viewable pitch. This is no mean feat for current operating systems like Apple’s desktop and mobile operating systems, Windows 7 or the Android mobile operating system.
This has been aided through the use of vector images for the text and shapes that form the user interface, along with the ability to determine the display’s viewable pitch. These features can then be rendered with a more natural look that takes advantage of the higher pixel density.
At the moment, the high-pixel-density display will be limited to smartphone, tablet and laptop applications up to 16”. This is until LCD and OLED display fabricators can supply display subsystems with these pixel densities at a cost that allows the construction of larger displays at prices that are tolerable to the mainstream computing market.
What I would observe is that the Windows 8 platform, with its increasingly powerful display subsystems, would make the idea of the high-pixel-density display common for most Windows-based regular computing platforms rather than just the Apple platforms. This same idea could be taken further with the next or subsequent generation of mobile and dedicated-purpose computing devices.
From my observations, Android has been known to offer an open-frame computing platform for the smartphone and tablet. This has included access to independent content services as well as access to third-party browsers, independent content-transfer paths, and standards-based setup.
Now the Kickstarter project has assisted the OUYA Android-based gaming platform, which has been described as an effort to “open up” the last “closed” gaming environment, i.e. the television. This environment has been effectively controlled by Microsoft, Sony and Nintendo through the sale of loss-leading consoles, with developers finding it hard to get onto one of these console platforms without having to pony up large sums of money or satisfy onerous requirements.
The OUYA gaming platform could be seen as an effort to take Android’s values of openness to this class of device, especially by allowing independent games authors and distributors to have access to a large-screen console gaming platform. One of the main requirements is to provide free-to-play parts for a game title like what has successfully happened with games for regular computers, mobile devices and Web-driven online / social play. This is where games were available with demo levels or with optional subscriptions, microcurrency trading or paid add-on content.
Other companies have stood behind OUYA as an IPTV set-top box platform with TuneIn Radio (an Internet-radio directory for mobile phones) and VeVo (an online music-video service with access to most of the 1980s-era classics) giving support for this platform.
The proof-of-concept console uses the latest technology options: a Tegra 3 ARM processor, 1GB RAM and 8GB of secondary flash storage, 802.11g/n Wi-Fi and Ethernet networking, as well as a Bluetooth 4.0 Smart-Ready wireless peripheral interface. The controllers have analogue joysticks, a D-pad and a trackpad, and link via this Bluetooth interface. They are also a lightweight statement of industrial design.
But I would like to see some support for additional local storage, such as the ability to work with a USB hard disk or a NAS for local games storage. This could allow one to “draw down” extras for a game that they are playing.
What is possible for the OUYA gaming platform
Hardware development and integration
But what I would like to see out of this is the OUYA platform being available as an “open-source” integration platform. This could mean that someone who was to build a smart TV, an IPTV set-top box or a PVR could integrate the OUYA platform into their product in the same vein as has successfully happened with the Android platform. For example, Philips or B&O could design a smart TV that uses the OUYA platform for gaming, or a French ISP like Free, SFR or Bouygues Télécom offering a “triple-play” service could have the OUYA platform in an iteration of the “décodeur” that they supply to their customers.
Similarly the specification that was called out in the proof-of-concept can be varied to provide different levels of functionality like different storage and memory allowances or different hardware connections.
Software development and distribution
For software development, the OUYA platform can be seen as an open platform for mainstream and independent games studios to take large-screen console gaming further without having to risk big sums of money.
Examples of this could include the development and distribution of values-based games titles which respect desired values like less emphasis on sex or violence; as well as allowing regions that haven’t built up a strong electronic-games presence, like Europe, to build this presence up. There is also the ability to explore different game types that you may not have had a chance to explore on the big screen.
The OUYA platform could satisfy and extend vertical markets like venue-specific gaming / entertainment systems such as airline or hotel entertainment subsystems or arcade gaming; and could work well for education and therapy applications due to this open-frame platform.
What needs to happen is that there be greater industry and consumer awareness of the OUYA open-source large-screen gaming platform so that it is placed on the same level as the three established platforms. This could open up a path to the kind of open-frame computing-platform success that the Android platform has benefited from.
Most USB memory keys and similar devices present themselves to your computer as a single volume or “logical disk”. In Windows, this would be represented as one drive letter and volume name for the device, and a Macintosh would show one extra drive icon on the Desktop when you plug the device in. These devices work well with specific-function USB host devices like printers or audio/video equipment.
Multiple-volume USB devices
Single USB socket on Kingston Wi-Drive to connect two logical volumes
But there are devices out there that don’t present themselves as a single logical volume. These can range from a memory key or external hard disk that has been formatted as two logical volumes to USB memory-card drives that have multiple slots for the different card types and devices that have fixed storage and a memory-card drive. It can also include mobile phones and MP3 players that have internal storage but also have a microSD card slot.
The former situation is best represented by the Kingston Wi-Drive which I recently reviewed here. It presented itself as two logical volumes – one being a read-only volume for the Wi-Fi access point user interface and another for users to store their data on.
How different hosts handle multiple-volume USB devices
This class of device would show up as two or more different drive letters and volume names in Windows or show up as two or more drive icons on the Macintosh desktop. You may have to make sure each volume is safely dismounted in the operating system before you disconnect the device from the computer.
USB memory key used to play music in a NAD C446 Media Tuner
But an increasing number of specific-purpose devices are being equipped with USB ports for connecting USB storage devices. This typically allows you to print documents or photos held on the USB storage device, or play or show audio-video content through the screen and/or speakers attached to or integrated in the host device. In fact, this setup is used in cars as a preferred alternative to the multi-disc CD stackers that used to exist in the boot (trunk) or dash.
Some devices even write to the USB storage device, typically to store configurations, recorded audio / video content or locally-cached BD-Live online data.
The main problem is that most of these specific-purpose devices cannot successfully mount USB storage devices that present themselves as multiple logical volumes at all.
Typically, they would give up the ghost at such attempts, as I noticed with the Kogan WiFi Digital Radio when I tried to connect the Kingston Wi-Drive which had some music on it to the radio. As well, the host-device manufacturers stipulate that you cannot try to use such storage devices with their devices. One person I talked to tonight mentioned that he had to be careful about how he formatted the USB memory key he used for storing music to play in his car’s stereo system.
What can be done
The idea of mounting multiple volumes of the common file systems could be investigated with these dedicated-purpose devices. Here, it could allow the volumes in the device to be presented as multiple “disks” if multiple suitable volumes exist. They could then be listed using a generic “USB+number” name for unlabelled volumes and the volume name for labelled volumes. Most applications would need to mount and use one volume at a time whereas some applications may allow for concurrent multiple-volume access.
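The naming scheme suggested above could look something like this Python sketch. It is purely illustrative – the function and the device layout are hypothetical, not any host device's actual firmware:

```python
def name_volumes(labels):
    """Assign display names to the volumes found on one USB storage device.

    `labels` holds the volume label of each mountable volume, with ""
    standing for an unlabelled volume. Labelled volumes keep their
    label; unlabelled ones get a generic "USB<n>" name, as suggested
    in the article.
    """
    names = []
    unlabelled_count = 0
    for label in labels:
        if label:
            names.append(label)
        else:
            unlabelled_count += 1
            names.append(f"USB{unlabelled_count}")
    return names

# A Wi-Drive-style device: a labelled read-only volume plus two
# unlabelled user-data volumes (hypothetical layout)
print(name_volumes(["WIDRIVE", "", ""]))  # ['WIDRIVE', 'USB1', 'USB2']
```

A host device could then offer these names as separate “sources” in its file or media browser, mounting one at a time.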
The volume-selection option could be provided as part of selecting the files or folders to work with or, in the case of audio-video applications, the USB port used by the multi-volume storage device could be “split” as extra logical sources for each eligible volume.
This may require a small amount of extra code so that different volumes at a physical interface can be enumerated and made available but the idea of supporting multi-volume USB storage devices by dedicated-purpose host devices could be worth investigating.
Acer Aspire S3 Ultrabook – suits air travel very well
You expect to use a laptop, especially an Ultrabook, through most of the day on a mixture of tasks ranging from basic data entry and content creation, through playing audio and video content, to even playing games. Often these tasks require you to be online all the time, causing you to use the built-in Wi-Fi network adaptor or a wireless-broadband adaptor, whether built in or plugged in to the machine’s USB port or ExpressCard slot. Here, you expect the battery to last around 6-10 hours on this activity mix, with you just plugging the laptop in to its charger and having it on charge for up to 8 hours while you eat and sleep.
This is compared to you having to make sure that you have the charger in your laptop bag when you are out for the day and looking for power outlets all the time after a significant amount of battery-only activity.
A common reality with laptop battery life is that this factor can be easily assessed, measured and reviewed on new machines where the battery is performing at its best. But as the battery pack ages, the runtime will typically shorten through the repeated charge and discharge cycles. Similarly, the kind of usage a customer throws at a laptop or similar device may not be typical of what a manufacturer or reviewer would observe, due to the mixed nature of this use, the laptop’s configuration which may differ from what was assessed, and the peripherals connected to the machine.
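As a crude illustration of how cycling eats into runtime, here is a simple linear-fade model in Python. The fade rate is an assumed figure chosen only for illustration – real lithium-ion wear is non-linear and depends heavily on temperature and usage patterns:

```python
def runtime_after_cycles(new_runtime_h, cycles, fade_per_cycle=0.0004):
    """Estimate remaining runtime after some charge/discharge cycles.

    Assumes each full cycle permanently costs a fixed fraction of
    battery capacity (0.04% here, an illustrative guess), so runtime
    falls linearly with cycle count.
    """
    remaining_fraction = max(0.0, 1.0 - fade_per_cycle * cycles)
    return new_runtime_h * remaining_fraction

# A laptop rated at 8 hours when new, after roughly two years of
# near-daily cycling (500 cycles):
print(round(runtime_after_cycles(8.0, 500), 1))  # 6.4 hours
```

Even this simple model shows why a machine that comfortably lasted a working day when new may be hunting for power outlets two years later.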
The article talked of the idea of a laptop having the horsepower of a “thin-and-light” like an Ultrabook or the Toshiba Portege R830, but as thick as a regular “standard” laptop. But this extra thickness is taken up with a larger high-capacity battery pack that facilitates a longer runtime. It is a practice that has been tried before with some portable-computer implementations including the Apple iPad.
As well, Sony have implemented this concept with the add-on external battery packs that are available for the VAIO S Series and VAIO Z Series laptops I have reviewed. From the various comments and images that I have come across regarding these accessory batteries and their use with the VAIO laptops, the laptops don’t become obnoxiously weighty or thick when the battery packs are installed.
Nowadays the issue of battery life in the ultra-slim laptops is becoming more important as Intel releases the Ivy Bridge processor range which provides integrated graphics for these computers that can yield high performance for most video and gaming needs. Of course, it may not satisfy the needs of competitive LAN-party gamers or high-end video-editing / graphics needs.
Here, intense graphics-rich activity on these computers, such as a marathon session of Civilization V on a long-haul flight or a long run of video editing and transcoding for that “on-the-road” vidcast, may put a strain on the Ultrabook’s battery. If the concept of a laptop equipped with the abilities of a “thin-and-light” but using a high-capacity battery, or a “thin-and-light” working with a high-capacity add-on battery pack, was followed through, you could then provide more headroom in the battery even after this kind of activity takes place.
Of course, I would still ensure that there is the ability to run these computers on external power as a way of not compromising battery runtime when you are using the computer at your “base”, whether it be your office, hotel room or car.
It is then still worth factoring long-run battery life into a laptop’s design, whether as part of the computer itself or as an optional accessory, especially for those of us who may consider working online and away from power for significant periods.
A major change has been happening for the average desktop computer system. This class of computer isn’t just the large box that sat under the screen or the large tower that sat beside the desk anymore. In some ways, these desktop computers are now being welcomed back in to the main living areas of the house rather than being shut out to the den.
There are two major directions that are being made available for this class of computer.
“All-in-one” computers

This style integrates the computer circuitry, the screen and all of the secondary storage in one box about the size of a small flatscreen TV. The keyboard and mouse appear as separate devices that connect to this unit. An example of this style is the Sony VAIO J Series “all-in-one” that I have recently reviewed.
The style was inspired by the Apple Macintosh, the most popular computer of this form factor, but was preceded by the “transportable” computers that appeared at various times through the 1980s. Compaq also tried to bring this style into being in the mid-1990s but with little success.
There are also all-in-one variants where the computer circuitry and secondary storage are built into the keyboard housing, a style that has been and could be targeted at the “retro 80s” market. This is because most of the computers that were popular in the early days of hobby and home computing, from the late 70s to the late 80s – like the Commodore 64, the Apple IIe or the Sinclair ZX computers – were based on that design layout, even though, in a lot of the early designs, the secondary storage was outside of the system.
Most of these machines now have a touchscreen built in so as to make them appeal as interactive terminals. But HP have raised the stakes in this form factor by developing the Z1, a high-powered 27” workstation that implements a modular design so that it can be upgraded or repaired more easily.
Low-profile system units
Another direction for the desktop computer is for a traditional “three-piece” system to be equipped with a low-profile system unit. In earlier times, a low-profile system unit was a box about the size of a typical video recorder or hi-fi CD player of the late 80s, and these units were very unreliable due to intense heat build-up.
Now these are units that appear in different sizes ranging from a small book to a loaf of bread to an ordinary two-slice toaster and some may be mistaken for a typical consumer network-attached-storage unit. This may include “pizza-box” designs that are so slim that you don’t know they are there; and the highly-powerful heavy-duty servers that are as big as the classic desktop computer designs.
The common features with these newer desktop-computer designs include a thermal design that relies less on a constantly-running fan to keep the system cool. In some cases, any system cooling fans that are used in these computers may operate in an “on-demand” manner where they come on if the system is running hot. This then leads to a reduced noise output from these computers compared to the traditional desktop computer.
Similarly, some of these computers will even use an outboard power supply that looks like the kind that would come with a laptop computer. Of course these would be designed to work without the use of a cooling fan.
Depending on the configuration, you may have new-design desktop computers that may suit average desktop computing tasks whereas you may have highly-compact systems like the HP Z1 that can perform heavy multimedia, graphics or intense gaming tasks.
On the other hand, most of these systems may not be as adaptable to newer needs as a classic desktop system. This may be due to a lot of the systems being built around integrated rather than standards-driven modular architecture.
Choosing the right form factor for your needs
If you want to place value on a touchscreen on a desktop setup, you could go for a large-screen all-in-one that has this feature. Similarly, the all-in-one can come in handy for a brand-new computer system where you are starting from scratch.
On the other hand, if you have a display type, size or arrangement in mind, you could value a low-profile desktop unit. This same situation can come in handy if you have a screen, keyboard and mouse that are still in good order. In some cases, you could easily hide the system unit behind the screen or a peripheral if you don’t like the look of it.
It is also worth knowing that some of the larger low-profile desktop units may have room for expansion, with the ability to add one or two expansion cards such as a discrete graphics card, or to upgrade secondary storage to your needs.
The traditional “tower-style” desktop is still a sure bet if you place your emphasis on expandability, ultra performance or a system that has to suit your computing needs exactly. Here, these should be purchased from a quality independent computer store that can build them “to order” or have one or more systems available “off the peg” at a cost-effective price to start from.
At least the improvements in the new desktop-computer designs have allowed the desktop computer system to be considered a system option for most computing tasks in environments where aesthetics or noise issues do matter.
I have previously talked on this site about the concept of standards-common expansion modules for use with laptops, especially Ultrabooks. These devices, also known as docking stations, would have connections for peripherals that you would typically use at your desk, like larger displays, Ethernet network connections or work-specific peripherals.
In fact, one of these devices was part of an ultraportable laptop that I had reviewed, namely the Sony VAIO Z Series unit; this one included a slot-load optical-disc drive that reads Blu-Ray Discs.
Now Lenovo have presented the ThinkPad USB 3.0 Dock, which connects to the host laptop using a USB 3.0 connection, already common on most laptops including higher-priced Ultrabooks. But it exploits the higher data throughput of USB 3.0 to allow for more than what one would typically expect from these devices.
For example, the expansion module is a network adaptor for Cat5 Gigabit Ethernet networks and an external sound module as well as a self-powered USB 3.0 hub for five peripherals. The self-powered USB hub also has the advantage of supplying power to USB peripherals independently of the host computer so that you could charge up smartphones and other gadgets or use it as a power supply for USB-driven gadgets.
But it uses DisplayLink technology to drive external displays over the USB 3.0 connection while using the host computer’s graphics subsystem. This can encourage us to use large displays with these laptops without needing to connect them to the computer itself.
What I like about this expansion module, and any expansion modules designed along this line, is that it isn’t dependent on the laptop being a Lenovo ThinkPad model, or even a Lenovo unit at all. Compared to the Sony solution, which exploited a proprietary “Light Path” setup over USB 3.0, this could be used with any computer that has a USB 3.0 port.
This is more so as the next generation of Ultrabooks come with USB 3.0 ports integrated into them, but may have only two or three of these ports as well as fewer connections for other wired peripherals. In fact, the more of these devices that exist, the better it would be for people who use “work-home” laptops or 13” ultraportables as travel/desk computers.
I had heard about Corning’s new series of videos about glass being more than just windows, mirrors and drinks containers. Their vision in these videos was to have windows, mirrors and similar objects as display surfaces for computer-hosted data; as well as for other applications like photovoltaic (solar) cells or electrochromic uses like tinting or frosting on demand.
Some of these visions include windows that are clear but become frosted “on demand” for privacy or show images or text such as a themed photo cluster or a diagram, with some being touchscreens for interacting with the display or being a control surface for lighting for example. The applications were being extended to automotive use like the glass displays being part of a dashboard for example.
This has been made feasible through efforts like the “Gorilla Glass” technology that is now being implemented in smartphones, tablets and large displays like TVs. Here, this glass offers an increasingly-tough surface, or a thinner glass surface for an LCD or OLED display application (including a touchscreen) that is as tough as a glass surface of regular thickness.
It is even worth noting that Philips was also involved in “taking glass further” with mirrors that are displays and lately with an OLED light / solar-cell combination which is transparent one moment and a light-source another moment while supplying extra power during the day. This latter application was pitched again at cars with a way of bringing more light in to the car but also working as an interior light when it is darker.
At least this shows that there will be many different game-changers when it comes to the design of display and similar technologies.
A question that many people will be pondering nowadays when they consider a secondary computing device is whether to get a small laptop computer like a netbook or Ultrabook or a tablet computer like the iPad along with an accessory keyboard. There will be the tradeoffs of each platform such as software availability and user-interface requirements.
This will become more so with Windows 8, whose Metro touch user interface is part of the operating system and becomes another full-bore competitor to the Apple iOS platform.
But ASUS have answered with an Ultrabook that can bridge between the notebook / laptop and tablet form factors in the cost-effective and power-efficient way that has been required of the Ultrabook. This machine will be the first “convertible” Ultrabook that has the “swivel-head” screen design like what I have experienced with the Fujitsu TH550M convertible notebook.
This will work tightly with the integrated touchscreen interface that Windows 8 provides rather than the previous practice where the manufacturers fabricated their own touch-optimised shell for these computers.
The ASUS convertible Ultrabook could offer a tablet-style user interface for casual computing needs while having the full proper keyboard that would appeal to us when working on emails or documents; it will also have the benefits that tablets like the iPad offer, like quick start-up and long battery runtimes.
The main question is whether other manufacturers will adopt the convertible Ultrabook form factor and make these computers cost-effective and widely available, or whether they will settle for supplying tablets as a distinct touchscreen product class.