Current and Future Trends Archive

Sony’s move to the high-end is a sign that Japan is becoming like Europe in the 1970s

Article

Sony steps into high-end home audio, marks move away from mass market – PC World Australia

My Comments

After reading this article about Sony focusing on the high-end audio and video market, and the press coverage of Sharp suffering deep losses, I have noticed that what is happening with Japan is very similar to what happened with consumer electronics in Europe and, to some extent, America through the 1970s and 1980s.

During the 1960s, Europe was replete with home-grown consumer-electronics brands like Blaupunkt, Grundig, Bang & Olufsen, Nordmende and Philips. These brands had product ranges that, in some cases, covered the whole market. At the same time, Japan and the rest of East Asia were cutting into the consumer-electronics and photography markets through that decade, and there was a popular consensus that Japanese products were of inferior quality to the European-sourced products of that period.

But during the 1970s and 1980s, the Japanese names were busily turning out equipment that did the job very capably at a cheaper price than the European names. As well, most of the Japanese manufacturers were busily innovating while producing products that appealed across the whole market. So, while some European names walked away from the consumer-electronics scene, most of the Europeans took steps to focus on the high-end aspirational market, thus keeping their space in that market reserved and having these names considered special.

What is now about to happen with Sony and some other Japanese brands is that they will end up like the European brands, possessing a rarefied status. Here, they turn out premium equipment at a premium price while leaving the loss-leading popular equipment ranges out of their lineup. I would suspect that the equipment will be like some of the British names such as Wharfedale, where the emphasis is on the quality of the experience. As well, some of these companies would be working towards innovation and, in some cases, component building, where they supply components to other electronics names.

The article made references to Korean companies targeting the mass market, but I would reckon that LG and Samsung will focus on the high-value end of this market and work towards good-quality equipment, in some ways drifting towards the high-end market. Similarly, the pressure by Chinese workers to see their labour valued properly could migrate China towards better-quality goods.


Defining parameters for 4K and 8K ultra-high-resolution displays

Article

ITU meets to define 4K and 8K UHDTV parameters – Engadget

My comments

We are starting to see ultra-high-definition video displays become available for general-purpose computing requirements. These yield a cinema-quality viewing experience that approaches what the eye sees directly.

But the concept has existed in a general form: a well-bred current-generation digital still camera can take an image at that resolution, and some screens used in particular industries like medical imaging already implement this kind of pixel-dense display. Similarly, some video setups, like the recent practice of exhibiting performances of opera or classic plays in cinemas via video links, use ultra-high-definition equipment.

The technology is also being assisted by the availability of pixel-dense display technology in computer devices. Examples include Apple’s “Retina” technology used in the latest iPhone and iPad devices and now appearing in some of Apple’s 13” MacBook computers. This could be implemented in larger display areas like flatscreen TVs and desktop monitors.

Here, a particular resolution and aspect ratio needs to be defined for both the 4K and 8K displays. This may be the point to bring in the 21:9 aspect ratio used for cinema applications, which could provide an improved video experience for the films that were shot to showcase CinemaScope or Panavision.

But just as 1080p (1920×1080) was settled as a standard for HDTV displays, giving that application a point of reference, there needs to be a standard for this kind of ultra-high-definition display. This would allow the displays to be marketed properly, such as with a standard logo applied to equipment that meets one or more of the criteria.
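To put the jump in numbers, here is a quick comparison of pixel counts, assuming the 16:9 resolutions generally associated with each tier (the exact figures were what the ITU meeting set out to settle):

```python
# Pixel counts for the display classes under discussion, assuming the
# 16:9 resolutions commonly associated with each tier.
RESOLUTIONS = {
    "1080p HDTV": (1920, 1080),
    "4K UHDTV": (3840, 2160),
    "8K UHDTV": (7680, 4320),
}

def pixel_count(width, height):
    """Total pixels for a given resolution."""
    return width * height

for name, (w, h) in RESOLUTIONS.items():
    megapixels = pixel_count(w, h) / 1_000_000
    print(f"{name}: {w}x{h} = {megapixels:.1f} megapixels")
```

On these figures, a 4K display carries four times the pixels of 1080p, and an 8K display sixteen times, which is why a standard logo and clear criteria matter for marketing.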

This may also affect how visual layouts are worked on, so we can think more of physical display sizes and application classes rather than particular resolutions. It will also mean the use of vector-based user interfaces or graphics assets that suit particular display densities, as is being put forward for Windows 8 software design.


Holden to add smartphone-linked network audio to their cars

Article

Holden Adds Stitcher To Its Infotainment Systems, Pandora And TuneIn On The Way | Gizmodo Australia

My Comments

Previously I have covered the issue of Internet radio and networked audio in the automotive context and raised the possible scenarios for this application: a smartphone or MiFi device acting as a network router between a mobile-broadband service and a Wi-Fi segment in the car, with the car radio being an Internet radio; a car infotainment system with an integrated mobile-broadband router; or a smartphone or tablet with the appropriate app working as an Internet radio or network-audio endpoint and connected to the car stereo, typically via USB, Bluetooth or a line-level connection.

Vehicle builders and, to some extent, car-audio manufacturers are implementing a two-way setup which integrates the smartphone with the car infotainment system. In most cases, the link would be fulfilled by a Bluetooth wireless connection for control, communications audio and entertainment audio and, depending on the setup, an interface app installed on the iOS or Android smartphone that works with particular information, music and other apps.

Holden, like most of the GM nameplates around the world, has followed this path for its infotainment by introducing the MyLink system to the Barina CDX small car. This requires an iOS or Android smartphone with a bridge app, linked by Bluetooth to the car, with the phone being controlled from the touchscreen on the dashboard. Initially the Holden solution works with the Stitcher Internet-radio platform, but they intend to have it work with Pandora and TuneIn Radio as well.

There is also an intent to let you run your smartphone platform’s navigation function on the dash using the “BringGo” software, so you don’t need to have the phone on a “cobra mount” to use Google Maps or Apple Maps.

What I see in this is that vehicle builders are integrating your smartphone or tablet as part of the vehicle, not just for communications but for information, entertainment and navigation.


Sony Vaio Tap 20–a new class of personal computer

Article

Sony Vaio Tap 20 Review – Watch CNET’s Video Review

My Comments

We have seen desktop-replacement laptops with 17” displays as the pinnacle of the laptop class, but Sony has introduced a new device class that bridges two other computer classes. This is part of an increased run of touch-enabled computers that take advantage of the Windows 8 touch shell.

This computer, known as the VAIO Tap 20, is a bridge between the tablet computer and the all-in-one desktop of the ilk of the VAIO J Series that I reviewed. Here, it is a Windows 8 tablet with a multi-touch user interface, but it can rest on a stand and link to a keyboard and mouse for regular all-in-one use.

It has 4GB of RAM and a 750GB hard disk, but doesn’t have an integrated optical drive or HDMI video input. The screen comes in at 20” with a 1600×900 resolution, and it is powered by a third-generation Intel Core i5 processor.

The CNET review still found this computer to have what they considered dubious performance for the all-in-one class, and concluded that, although it runs the Windows 8 operating system and has NFC abilities, it is not worth the money. This is even as the HP Z1 Workstation and the Maingear Alpha 24 Super show up as highly-capable all-in-one computers that can handle advanced graphics for work and play.

But what I see in this is that it could be a proving ground for this computer class as more all-in-one computers come on the market in response to Windows 8, in the form of a large tablet computer that can work as a desk-based computer. Once Sony or someone else issues a “follow-up” model with better specifications and features, this could be a chance to legitimise the all-in-one tablet hybrid as a credible computing device.


TDK does amazing things to my home network by increasing the hard-disk data density further

Article

TDK breaks the Hard Drive density limit, could go on to develop super-sized storage — Engadget

My Comments

TDK was once known in its earlier years for storage media, especially tapes and, subsequently, floppy and optical discs. During that time, when any of us wanted high-quality audio or video recording, we chose this name as one of our preferred brands. In fact, one idea they were known for in the 1970s was a cost-effective high-bias magnetic tape formulation known as “Super Avilyn”, which yielded as good an audio or video recording result as traditional chrome-based high-bias magnetic tape.

Now that we have moved to MP3 players, smartphones and hard-disk-based storage of audio and video content, the company has diversified into cheaper audio equipment for the open market and reduced its presence in storage media for that market. Meanwhile, Hitachi and others have been improving the data capacity of hard disks over the years, with TDK disappearing into the background in this field.

But they have not left their storage-medium expertise behind in this hard-disk-based data-storage era. Here, they have raised the data-density bar for hard disks further, allowing for 1.5 terabits per square inch. The article raised possibilities of 15” laptops coming with single 2.5” hard disks greater than 1TB, or desktop computers equipped with 3.5” hard disks greater than 2TB. This would also appeal to the current trend for low-profile and all-in-one desktops to have the same storage as was acceptable for larger designs.
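As a rough back-of-envelope check on those capacity figures, the sketch below estimates single-platter capacity at the claimed density. The platter dimensions (about a 65mm outer diameter and 20mm spindle hole) are my own illustrative assumptions for a 2.5” drive, not figures from the article:

```python
import math

# Rough capacity estimate for one 2.5-inch platter at the claimed
# areal density of 1.5 terabits per square inch.
AREAL_DENSITY_TBIT_PER_SQIN = 1.5
OUTER_RADIUS_IN = (65 / 25.4) / 2   # assumed ~65 mm platter diameter
INNER_RADIUS_IN = (20 / 25.4) / 2   # assumed ~20 mm spindle hole

def platter_capacity_tb(density_tbit, r_out, r_in, surfaces=2):
    """Approximate capacity in terabytes for one two-sided platter."""
    recordable_area = math.pi * (r_out**2 - r_in**2)   # square inches
    terabits = density_tbit * recordable_area * surfaces
    return terabits / 8                                # bits -> bytes

capacity = platter_capacity_tb(AREAL_DENSITY_TBIT_PER_SQIN,
                               OUTER_RADIUS_IN, INNER_RADIUS_IN)
print(f"~{capacity:.1f} TB per platter")
```

On these assumptions a single platter lands around 1.7TB, which squares with the article’s talk of single 2.5” drives well past the 1TB mark.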

For the home or small-business network, I would see possibilities like NAS units in the order of at least 10TB. This is in conjunction with PVRs and similar home-entertainment equipment able to hold many hours of ultra-high-definition video material, especially as the 4K and 8K ultra-high-definition video technologies that yield cinema-quality video come closer.

Personally, I would expect this technology to materialise in shipping hard disks within the next two years, once TDK has it proven in a form for manufacturers to use. It also happens to coincide with the South-East-Asian hard-disk factories coming back on stream after the Thailand floods, making it feasible to see the return of “dime-a-dozen” hard-disk storage.


Creating a small data cloud–why couldn’t it be done

The small data-replication cloud

Netgear ReadyNAS

These network-attached storage devices could be part of a personal data-replication cloud

Whenever the “personal cloud” is talked of, we think of a network-attached storage device where you gain access to the data on the road. Here, the cloud aspect is fulfilled by a manufacturer-provided data centre that “discovers” your NAS using a form of dynamic DNS and creates a “data path” or VPN to it. Users typically gain access to the files by logging in to an SSL-secured Web page or using a client-side file-manager program.

But another kind of small data cloud is often forgotten about in this class of device, except in the case of some Iomega devices. This is a handful of consumer or small-business NAS units located in geographically-different areas that are linked to each other via the Internet. Here, they could synchronise the same data, or a subset of that data, between each other.

This could extend to applications like replicating music and other media held on a NAS to a hard disk installed in a car, whether the vehicle is at home, at the office or even being driven. The latter example may be where you purchase or order a song or album via the wireless-broadband infrastructure, with the content ending up on your car’s media hard disk so it plays through the sound system. Then you find it has been synchronised to your home NAS, so you can play that album on your home theatre when you arrive home.

What could it achieve?

An example of this need could be a small business backing up its business data to network-attached storage devices located at both its shop or office and the owner’s home, no matter where the data is created.

Similarly, one could copy music and video material held on the main NAS out to a NAS at a holiday home. This allows speedy local access to the multimedia files, and you could add new multimedia files to the NAS at the holiday home and have the new collection reflected back to your main home.

Here, one could exploit a larger-capacity unit with better resilience, like the business-grade NAS units pitched at small businesses, as a master data store, while maintaining less-expensive single-disk or dual-disk consumer NAS units as local data stores at other locations. This setup may appeal to businesses where one location is the primary office while the other is a shopfront or secondary office.

This kind of setup could also allow a NAS to act as a local “staging post” for newly-handled or regularly-worked data, providing a resilient arrangement that can survive a link failure. In some cases, this could even allow near-line operation for a business’s computing needs should the link to a cloud service fail.

User interface and software requirements

This concept can be built on the existing remote-access “personal cloud” infrastructure and software, so there is no need to “reinvent the wheel” for a multi-NAS cloud.

Similarly, users would use the NAS’s existing management Web page to nominate the locations of the remote NAS devices and the data sets they wish to replicate. This can include how each data set is to be replicated, such as keeping a mirror copy, contributing new and changed data to a designated master data set, or a combination of both. The data set could be a particular NAS volume or share, a folder or group of folders, or simply files of a particular kind.
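The difference between those replication modes can be sketched in a few lines. The names and data structures here are hypothetical, for illustration only, and not any vendor’s actual API:

```python
# Minimal sketch of the two replication modes: "mirror" makes the remote
# an exact copy (copying updates and deleting strays), while "contribute"
# pushes new and changed files but never deletes anything remotely.
def plan_sync(local, remote, mode):
    """Decide which files to copy to, or delete from, a remote NAS.

    local/remote: dicts of {filename: last-modified timestamp}
    mode: "mirror" or "contribute"
    """
    to_copy = [f for f, ts in local.items()
               if f not in remote or remote[f] < ts]
    to_delete = [f for f in remote if f not in local] if mode == "mirror" else []
    return to_copy, to_delete

local = {"invoices.db": 200, "photos/a.jpg": 150}
remote = {"invoices.db": 100, "old-report.doc": 50}
print(plan_sync(local, remote, "mirror"))
print(plan_sync(local, remote, "contribute"))
```

A real implementation would also have to handle conflicts when both ends change the same file, which is where a designated master data set earns its keep.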

The recently-determined UPnP RemoteAccess v2 standard, along with the UPnP ContentSync standards, could simplify the setup of these data-synchronisation clouds. This could also make it easier to provide heterogeneous data clouds for this requirement.

But one main requirement that needs to be thought of is that the computer systems at both ends must not collapse or underperform if the link fails. There also has to be some form of scalability so that regular small-business servers can be party to the cloud, which may benefit the small-business owner who wants to integrate that hardware and the home-network hardware as parts of one data-replication cloud.

Hardware requirements

A small data cloud needs to support cost-effective hardware that allows for system growth. This means it could start with two or more consumer or SME NAS devices of a known software configuration, yet increase in capacity and resilience as the user adds or improves storage equipment at any location or rents storage services at a later stage.

This could mean starting with one single-disk NAS unit at each location, then purchasing a small-business NAS equipped with a multi-disk RAID setup for the business premises. The extra single-disk unit could then be shifted to another location as a staging-post disk or extra personal backup.

Conclusion

What NAS manufacturers need to think of is supporting easy-to-manage multi-device data-replication “personal clouds” using these devices, alongside the current marketing idea of the remote-access “personal cloud” offered for them.


Other manufacturers can yield more cool devices now

Click to view: Samsung’s latest video / TV ad for the Galaxy S 3

My Comments

Just lately, as Apple was launching the iPhone 5 and the fanbois were lining up outside the Apple Stores and mobile-carrier outlets to be the first to get the phone, Samsung has been running a video campaign about how much more advanced their phones are compared to the Apple product.

Previously, I touched on Android’s competitive-environment abilities, such as the use of other browsers or the ability to shift content to the phone using the computer’s file system. This has also underscored the platform’s ability to provide paths to innovation, which we are seeing in the devices that work to it. The commercial I am referring to, along with other Samsung TV commercials for the Galaxy S3, even emphasised near-field communication as a content-transfer technology rather than just an authentication technology, thanks to Android Beam.

Similarly, the latest crop of Windows-based computers that appeared over the last few years shows that this operating environment is still a breeding ground for innovation. One key feature we will see more of is the touchscreen, which on most of these computers works alongside a supplied or standards-compliant optional keyboard. I got a taste of things to come when I reviewed the Sony VAIO J Series all-in-one desktop. This was augmented when I heard of a Toshiba Ultrabook that was to come with NFC, which could support file transfer in the Android Beam manner.

This shows that there are other companies and IT operating platforms out there who can make and improve the technology that maintains the “cool factor” in its use, rather than only one company and its platforms. It is the sign of healthy competition when this kind of innovation takes place.


Symantec Symposium 2012–My observations from this event

Introduction

Yesterday, I attended the Symantec Symposium 2012 conference, which was a chance for Symantec to demonstrate the computing technologies it is developing and selling that are becoming important to big-business computing.

Relevance to this site’s readership

Most solutions exhibited at this conference are pitched at big business with a fleet of 200 or more computers. But there were resellers and IT contractors at this event who buy these large-quantity solutions to sell on to small-business sites that typically have ten to 100 computers.

I even raised an issue in one of the breakout sessions about how manageability would be assured in a franchised business model such as most fast-food or service-industry chains. Here, this goal could be achieved through the use of thin-client computers or pre-configured equipment bought or leased through the franchisor.

As well, the issues and solution types shown at this Symposium tend to cross over between small sites and the “big end of town”, just as a lot of office technology, including the telephone and the fax machine, has done.

Key issues being focused on were achieving a secure computing environment, supporting the BYOD device-management model, and the trend towards cloud computing for systems-support tasks.

Secure computing

As part of the keynote speech, a guest speaker from the Australian Federal Police touched on the realities of cybercrime and how it affects the whole computing ecosystem. As was raised in the previous interview with Alastair MacGibbon and Brahman Thiyagalingham about secure computing in the cloud-computing environment, cybercrime is now moving towards organised crime, such as Eastern European mafia groups, alongside nation states engaging in espionage or sabotage. He also raised that it’s not just regular computers that are at risk: mobile devices (smartphones and tablets), point-of-sale equipment like EFTPOS terminals and other dedicated-purpose computing devices are also at risk. He emphasised keeping regular and other computer systems up to date with the latest patches for the operating environment and application software.

One exhibited solution was a cloud-driven email and Website verification system that implements a proxy-server setup. This is designed for the real world of business computing, where equipment is likely to be taken out of the office and used with the home network or public networks like hotel or café hotspots. It moves away from the classic site-based corporate firewall and VPN arrangement to provide controlled Internet access for roaming computers. It also addressed real Internet-usage needs like operating a company’s Social-Web presence, along with personal Internet services like Internet banking or home monitoring, so as to cater for the ever-increasing workday. Yet it can still allow an organisation to have control over the resources to prevent cyberslacking or the viewing of inappropriate material.

Another technique I observed is the ability to facilitate two-factor authentication for business resources or customer-facing Websites. This is where the username and password are further protected by something else, in a similar way to how your bank account is protected at the ATM by your card and your PIN. It was initially achieved through hardware tokens – those key fobs or card-like devices that show a random number on their display which you enter at your VPN login – or a smart card or SIM that required a hardware reader. Instead, Symantec developed a software token that works with most desktop or mobile operating systems and generates this random code. It can even exploit integrated hardware security setups, such as what is part of the Intel Ivy Bridge chipset in second-generation Ultrabooks, to make this more robust.
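To illustrate how a software token produces that “random” code, here is a minimal sketch of the standard HOTP/TOTP scheme (RFC 4226/6238) that software tokens of this kind commonly build on. This is a generic example of the technique, not Symantec’s actual implementation:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 one-time password: HMAC-SHA1 over a moving counter."""
    msg = struct.pack(">Q", counter)                      # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, interval: int = 30) -> str:
    """RFC 6238 variant: the counter is the current 30-second time step."""
    return hotp(secret, int(time.time()) // interval)

# Both the token and the server share the secret, so both can compute
# the same short-lived code without any network round trip.
print(totp(b"shared-secret-key"))
```

Because the code is derived from a shared secret and the current time step, it changes every 30 seconds and is useless to an attacker who only captures one login.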

Advanced machine learning has also played a stronger part in two more secure-computing solutions. For example, there is a risk-assessment setup where the environment for a connection or transaction can be assessed against what is normal for a user’s operating environment and practices. It is similar to the fraud-detection mechanisms most payment-card companies implement, where they detect and alert customers to abnormal transactions that are about to occur, like ANZ’s Falcon. This can trigger verification requirements for the connection or transaction, such as entering a one-time password from a software token or an out-of-band voice or SMS confirmation sequence.

The other area where advanced machine learning plays a role in secure computing is data-loss prevention. As we hear of information being leaked to the press or, at worst, laptops, mobile computing devices and removable storage full of confidential information disappearing and falling into the wrong hands, this field of information security is becoming more important across the board. Here, they demonstrated the ability to “fingerprint” confidential data like payment-card information and apply handling rules to it. This includes on-the-fly encryption of the data, establishment of secure-access Web portals, and sandboxing of the data. The rules can be applied at different levels and affect the different ways the data is transferred between computers, such as shared folders, public-hosted storage services (Dropbox, Evernote, GMail, etc), email (both client-based and Webmail) and removable media (USB memory keys, optical discs). The demonstration focused on payment-card numbers, but I raised questions regarding information like customer/patient/guest lists or similar reports, and the system supports creating the necessary fingerprint of such information to the requirements desired.
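One common building block of payment-card fingerprinting, though not necessarily what Symantec’s product uses, is the Luhn checksum, which lets a DLP system tell plausible card numbers apart from arbitrary 16-digit strings before applying handling rules:

```python
# Luhn checksum: every valid payment-card number passes this check,
# so it is a cheap first filter when scanning documents for card data.
def luhn_valid(number: str) -> bool:
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9        # same as summing the two digits
        checksum += d
    return checksum % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))   # well-known test card number
print(luhn_valid("4111 1111 1111 1112"))   # one digit off
```

A production DLP system layers much more on top, such as issuer prefixes, context around the number and proximity to expiry dates, but the principle of matching structure rather than exact strings is the same.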

Cloud-focused computing support

The abovementioned secure-computing application makes use of cloud-computing technology, relying on many data centres scattered around the world.

And the Norton 360 online backup solution typically packaged with some newer laptops is the basis for cloud-driven data backup, which could support endpoint backup as well as backup for servers, virtual machines and the like.

Mobile computing and BYOD

Symantec has approached the mobile-computing and BYOD issues in two different ways. They have catered for fully-managed devices, which may appeal to businesses running fleets of devices they own or using tablets as interactive customer displays. But they also allow for “object-specific” management, where particular objects (apps, files, etc) can be managed or run to particular policies.

This includes the ability to provide a corporate app store offering in-house apps, Web links or commercial apps, so users know what to “pick up” on their devices. These apps are then set up to run to the policies that affect how that user runs them, including control of data transfer. This setup may also please the big businesses who provide those services that small businesses deliver as an agent or reseller, such as Interflora. Here, they could run a business-specific app store with line-of-business apps, like a flower-delivery-list app that runs on a smartphone. There is also the ability to remotely vary and revoke permissions concerning the apps, which could come in handy when a device’s owner walks out of the organisation.

Conclusion

What this conference showed at least is the direction that business computing is taking, and it was also a chance to see core trends affecting this class of computing whether you are at the “big end of town” or not.


Toshiba to introduce the first NFC-capable Ultrabook

Article

Toshiba Satellite U925T is First NFC-Enabled Ultrabook

My Comments

From this article, I reckon that Toshiba has used the Satellite U925T Ultrabook to push ahead of the game by integrating “touch-and-go” near-field-communications technology into a portable computer.

One key advantage I see is exploiting mobile-wallet systems like MasterCard PayPass and, perhaps, Google Wallet to allow NFC-compliant payment cards to facilitate an online transaction without the fraud risks associated with “card-not-present” transactions. This would be facilitated by software that interlinks with the NFC reader, plus merchant-side software that runs the transaction as if you were paying for the goods at a store using your card and the shop’s card terminal.

Similarly, the Android and Windows Phone ecosystems would benefit from this feature through access to the mobile wallets hosted on NFC-capable smartphones. This can extend to device-to-device file-transfer functions like Android Beam, where users could upload pictures, sync contacts and pass QR-discovered Websites to the notebook from the smartphone.

In addition, the setup routines associated with commissioning Bluetooth or Wi-Fi wireless devices with this notebook can be simplified to a “touch-and-go” procedure if those devices support the functionality. This can then lead to the ability to transfer “extended-functionality” files to the host computer, opening up advanced feature sets like sound-optimisation functions for headsets and microphones.

What I see more in this is that this Toshiba Windows 8 hybrid Ultrabook is an example of using NFC to create synergy between open-platform computing devices, which simply leads to a breeding ground for innovation.


Bluetooth Smart Ready product announcements piling up

Article – from the horse’s mouth

Bluetooth Smart Ready product announcements piling up

My Comments

I have given some coverage to the new Bluetooth 4.0 “Smart” and “Smart Ready” technologies. These are improvements to the Bluetooth specification that allow Bluetooth sensor and control devices to run on low battery requirements – think two or three AA or AAA Duracells, or a “watch” battery – for six months or more.

This has opened up paths for health-and-wellness devices like blood-pressure monitors, glucose monitors and pedometers. Even the old ’80s-style digital watch is coming back with a vengeance as a smartphone accessory thanks to this technology.

Most of the Bluetooth-equipped tablets and smartphones issued over the past model year or so are fully equipped with this technology, including software support. An increasing number of newer laptops also have Bluetooth 4.0 Smart Ready functionality, at least at a hardware level underpinned by OEM software. An example is the recently-reviewed Fujitsu LifeBook LH772, which has this interface.

These units will have full inherent implementation when they run Windows 8, and this opens up questions about how the Bluetooth 4.0 Smart technology could be relevant to the laptop or desktop “regular-computer” device class.

One way I see it being relevant to this class is the availability of Bluetooth wireless keyboards, mice and game controllers that don’t need special rechargeable batteries to operate. Here, they could run for a long time on just two or three AA batteries.
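Some back-of-envelope arithmetic shows why those battery-life claims are plausible. The figures below, a pair of AA cells at around 2000mAh and an average current draw of 0.2mA across sleep and short radio bursts, are my own illustrative assumptions rather than measurements from any particular device:

```python
# Rough battery-life estimate for a Bluetooth Smart peripheral.
# Assumed figures: ~2000 mAh of AA capacity, ~0.2 mA average draw
# (the radio sleeps most of the time between short bursts).
def battery_life_months(capacity_mah=2000.0, avg_current_ma=0.2):
    """Approximate runtime in months at a constant average current."""
    hours = capacity_mah / avg_current_ma
    return hours / (24 * 30)   # hours -> ~30-day months

print(f"~{battery_life_months():.0f} months on one set of AA cells")
```

Even if the average draw doubles in practice, the result comfortably clears the six-month mark, which is the whole point of the low-energy profile.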

Sensor devices like temperature or humidity sensors that are important to particular profession or hobby groups, like refrigeration/HVAC engineers or gardeners, could benefit from this technology, especially when used with a laptop or tablet. Here, these computers could run data-logging software to record trends or monitor for abnormal conditions.

At least what is being proven with the current crop of Bluetooth Smart Ready-capable regular and mobile computing devices is that the world of innovation with this low-power wireless network is being opened up.
