Category: Industry Comments

What is the sound-tuning that is now implemented in laptops all about?

HP Pavilion dv7-6013TX laptop – keyboard highlighted

A trend that I have seen with laptop computers and some all-in-one desktop computers is for them to have their sound output “tuned” by a company involved in the recording or reproduction of music. In a similar vein to how a motor-racing team will work on a car destined for street use to improve its performance, these firms, such as Harman (JBL), Bang & Olufsen or Dr. Dre’s Beats Audio, will work on the sound-reproduction systems to improve the computer’s sound reproduction, whether through its integrated speakers or through headphones attached to the computer.

The main issue that these efforts are trying to conquer is the tinny sound that emanates from typical laptop speakers. Previously, these computers used just a pair of small speakers installed in their small chassis that didn’t yield good bass or midrange reproduction, and these were driven by a low-power stereo amplifier in the computer. The setup was just good enough for audio prompts and, in some cases, speech from people without strong accents, yet did a horrendous job at reproducing music or the sound effects in video and game content. Compare this with the way even a cheaper portable radio or tape player equipped with the traditional 3” cone speaker can reproduce most frequencies “across the board”, something made easier by those sets having a larger cabinet that isn’t crammed full of circuitry and by reproducing sound through a larger speaker with a deeper cone. End-users are asking a lot more of their computers as they use them as personal jukeboxes, movie players and games machines, while businesses make heavy use of them as voice and video telephony endpoints.

HP Pavilion dm4 Beats Audio Edition laptop at a Wi-Fi hotspot

The challenge is to keep these computers slim yet yield a proper and desirable sound across the audio spectrum. Typically the modifications will focus on the sound-reproduction and amplification circuitry as well as the integrated speakers. For example, there will be digital-signal-processing circuitry that works as a sophisticated tone control for the computer, with the ability to improve the tone delivered through the integrated speakers.
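To give a feel for what that digital signal processing involves, here is a minimal sketch of a peaking equaliser of the kind such tuning might use, written in Python using the well-known RBJ “Audio EQ Cookbook” biquad formulas. The centre frequency, gain and Q values are illustrative assumptions of mine, not any vendor’s actual tuning curve, and a real product would layer on much more, such as dynamic-range compression and psychoacoustic bass enhancement.

```python
import math

def peaking_eq_coeffs(fs, f0, gain_db, q):
    """Biquad peaking-EQ coefficients (RBJ Audio EQ Cookbook)."""
    a = 10 ** (gain_db / 40)          # amplitude from dB gain
    w0 = 2 * math.pi * f0 / fs        # centre frequency in radians/sample
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a]
    a_coeffs = [1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a]
    a0 = a_coeffs[0]
    # Normalise so the filter can be applied as a simple difference equation
    return [c / a0 for c in b], [c / a0 for c in a_coeffs]

def biquad(samples, b, a):
    """Apply one biquad section (direct form I) to a list of samples."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

# Illustrative "tuning": a gentle 4 dB lift around 250 Hz to thicken up
# the thin lower-midrange of small laptop speakers (values assumed).
b, a = peaking_eq_coeffs(fs=48000, f0=250, gain_db=4.0, q=1.0)
shaped = biquad([0.0, 1.0, 0.0, 0.0], b, a)  # an impulse through the filter
print(shaped[:4])
```

In a shipping laptop, several such sections would be cascaded and run on the audio codec’s DSP rather than the host CPU.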

There will also be Class-D power-amplifier circuitry designed by people in the audio industry, with the sound emanating from a multi-way speaker system; an example of this is the ASUS Ultrabooks implementing Bang & Olufsen ICEpower audio amplification. Most systems will use a 2.1 speaker setup with a separate bass driver that may be separately amplified, but some may use a multi-way speaker setup with many speaker units to approach the sound of larger traditional speakers. As well, there would be some work on planning out the speaker-enclosure area to allow the sound to come out of the system properly.

From reviewing many laptops, I have come across some setups where the speakers can easily be muffled when you rest your hands on the palmrest, and some computers sound better when placed on a harder surface. I have also noticed that the screen surround isn’t used on most laptops as a place to locate speakers, even though speakers positioned there can improve stereo separation and sound localisation.

There are still many challenges ahead for these sound-tuning projects, especially the expectation of punchy bass from the built-in speakers; this is usually the kind of thing the marketers hype when they promote computers equipped with these sound-tuning efforts. Other than that, these efforts have succeeded in putting the life back into sound reproduction from the larger “new-computing-environment” laptop computers.

Acer–to stay with the netbook

Articles

Acer will stop making cheap crap, but keep selling netbooks. Discuss. — Engadget

Acer VP: ‘We’re never gonna give netbooks up, let them down, run around and desert them’ | Engadget

My Comments

These articles outlined the direction that portable-computer development has taken and the way Acer has stood by the netbook computer even though other companies are dumping this product class to focus on ultrabooks and tablets. This has been emphasised with their classy Aspire One series of netbooks, some of which also use Android as an alternative operating system. Here, Acer has worked on this product class and refined it so that it isn’t an ordinary product anymore.

On the other hand, Windows 8 and its “Metro” touchscreen user interface may legitimise the convertible notebook form factor, where the notebook has a touchscreen on a swivel so it can be turned into a tablet; an example is the Fujitsu TH550M which I reviewed previously. If Acer developed a convertible netbook that had a touchscreen and ran Windows 8, they could create a perfect “bridge” product.

This is where one could benefit from a proper keyboard for text entry while having a 10” touchscreen like all the good tablets have. It is similar to how camera manufacturers established the “bridge” cameras, which could work as point-and-shoot cameras but had increased levels of configurability for advanced photographers, with some such cameras able to work with accessory lenses or flashguns.

What Sony has to say about entering a “new form of television”

Article

Sony Hopes To Debut “A New Form Of Television” | TechCrunch

My Comments

Here, Sony is raising an issue about entering TV’s new direction. This includes coping with the current marketplace conditions.

In the article, Sony’s CEO, Howard Stringer, underlined his company’s ability to ride through rough times and smooth times. He cited the fact that the TV industry was going through a rough time due to the economic crisis, with customers preferring to buy budget brands or smaller sets if they were in the market for a TV. As I have mentioned before on this site, TVs have a long service life and are typically “pushed down” when a newer and better set is acquired.

But I would affirm that video peripherals matter as much as TVs when it comes to developing a video platform. Here, one could replace a DVD player with a Blu-Ray player that supports an interactive-TV platform. Similarly, Sony has integrated their interactive-TV platform into the PlayStation 3 games console through firmware upgrades.

It would also include the idea of using “other screens” such as the computer, smartphone or tablet as complementary or competing display surfaces. Personally, I see the other screens working in both roles, such as personal viewing of video material during a long train ride, or finding supporting information about the TV show you are watching on the big screen.

Sony are also in a position to use open standards to build out their video platform rather than reinventing the wheel as they have previously done. This is accomplished through their support for DLNA home media networks and their implementation of Android in their tablet and smartphone devices. Even the VAIO computers run the Windows desktop operating systems, and Sony has been trialling the Google TV platform in the TV and Blu-Ray player form factors.

They have also contributed to other efforts through the supply of subsystems to technology manufacturers on an OEM basis. Examples range from the early supply of colour Trinitron CRTs to Apple for their Macintosh colour monitors, through to the current supply of LCD screens to other TV manufacturers and even the camera subsystem in the iPhone 4S.

What do you really do if you are trying to establish an integrated video-services platform that uses the many screens the customer has? Do you make it highly integrated in the way Apple has done, or build a platform that can work across devices and designs offered by other manufacturers?

In some ways it depends on the kind of customer you are targeting. Some concepts like what Apple offers would appeal to those who are sold on brand alone whereas other concepts would appeal more to those customers who “know what they are after”.

Interview and Presentation–Security Issues associated with cloud-based computing

Introduction

Alastair MacGibbon – Centre For Internet Safety (University of Canberra)

I was invited to do an interview with Alastair MacGibbon of the Centre For Internet Safety (University of Canberra) and Brahman Thiyagalingham of SAI Global, who is involved in auditing computing service providers for data-security compliance.

This interview, and the presentation delivered by Alastair which I subsequently attended, are about the issue of data security in the cloud-driven “computing-as-a-service” world of information technology.

Cloud-based computing

We often hear the term “cloud computing” used to describe newer outsourced computing setups, especially those which use multiple data centres and servers. But, for the context of this interview, we use the term to cover all “computing-as-a-service” models that are in place.

Brahman Thiyagalingham – SAI Global

These “cloud-based computing” setups are in use by every consumer and business owner or manager as they go through their online and offline lives. Examples include client-based and Web-based email services, the Social Web (Facebook, Twitter, etc.), photo-sharing services and online-gaming services. But the term also encompasses systems that are part of our everyday lives, like payment for goods and services; the use of public transport including air travel; as well as private and public medical services.

This trend is growing as more companies offer information solutions for our work or play life that depend on some form of “computing-as-a-service” backend. It also encompasses building control, security and energy management, as well as telehealth, with these services offered through outsourced backend servers.

Factors concerning cloud-based computing and data security

Risks to data

There are many risks that can affect data in cloud-based computing and other “computing-as-a-service” setups.

Data theft

The most obvious and highly-publicised risk is the threat to data security. This can range from the computing infrastructure being hacked, including malware attacks on client or other computers in the infrastructure, through to social-engineering attacks on the service’s participants.

A clear example of this was the recent attacks on Sony’s online gaming systems like the PlayStation Network. Here, there was a successful break-in in April which caused Sony to shut down the PlayStation Network and Qriocity for a month. Then, a break-in attempt on many PlayStation Network accounts took place in the week ending 13 October 2011.

Attacks on data aren’t just by lonely script kiddies anymore. They are being performed by organised crime, competitors engaging in industrial espionage, and nation states engaging in economic or political espionage. The data being stolen includes end-users’ identities; personal and business financial data; and business intellectual property like customer information, the “secret sauce” and details about the brand and image.

Other risks

Other situations can occur that compromise the integrity of the data. For example, a computing service provider could become insolvent or change ownership. This can affect the continuity of the computing service and the availability of the data on its systems. It can also affect who owns the actual data held in these systems.

Another situation can occur if there is a system or network breakdown or a drop in performance. This may be caused by a security breach, but can also be caused by ageing hardware and software or, as I have seen more recently, an oversubscribed service where there is more demand than the service can handle. I have mentioned this last scenario on HomeNetworking01.info in relation to Web-based email providers like Gmail becoming oversubscribed and performing too slowly for their users.

Common rhetoric delivered to end-users of computing services

The industry has focused the responsibility for data security in these services onto the end-users of the services.

Typically the mantra is to keep software on end computers (including firmware on dedicated devices) up-to-date; develop good password habits by using strong passwords that are regularly changed and not visible to others; and make backup copies of the data.
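As a small aside on the “strong passwords” part of that mantra, here is a minimal Python sketch of generating one with the standard library’s cryptographically-secure secrets module; the length and character set are my own illustrative choices, and some services restrict which punctuation they accept.

```python
import secrets
import string

def strong_password(length: int = 16) -> str:
    """Generate a random password from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(strong_password())  # a different hard-to-guess string every run
```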

New trends brought on by the Social Web

But some of these factors are being undone by the use of the Social Web. One is the use of password-reset questions and procedures based on facts known to the end-user. Here, those facts can be discovered by crawling data left available on social-networking sites, blogs and similar services.

Similarly, consumer sites like forums and comment trees are implementing single-sign-on setups that use credential pools hosted by services popular with consumers, namely Google, Facebook and Windows Live. This also extends to “account-tying” by popular services, so that you are logged on to one service if you are logged on to another. These practices can create a weaker security environment and aren’t valued by companies like banks which hold high-stakes data.

The new direction

As well, it has previously been very easy for a service provider to absolve themselves of the responsibility they have to their users and the data those users create. This has been through the use of complex legalese in the service agreements that users have to assent to before they sign up to the service.

Now the weight of data security is being placed primarily on the service providers who offer these services rather than on the end-users themselves. Even if the service provider is providing technology to facilitate another organisation’s operations, they will have to be responsible for that organisation’s data and the data stream created by the organisation’s customers.

Handling a data break-in or similar incident

Common procedures taken by service providers

A typical procedure in handling a compromised user account is for the account to be locked down by the service provider. The user is then forced to set a new password for that account. In the case of banking and other cards that are compromised, the compromised cards would be voided so that retailers or ATMs seize them, and the customer would be issued with a new card and have to choose a new PIN.
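A rough sketch of that lockdown-and-reset workflow, as I understand it from the discussion, might look like the following Python; the Account class and its field names are hypothetical, purely to illustrate the sequence of states.

```python
from dataclasses import dataclass

@dataclass
class Account:
    user_id: str
    locked: bool = False
    must_reset_password: bool = False

def handle_compromise(account: Account) -> None:
    """Provider's first response: lock the account and force a reset."""
    account.locked = True
    account.must_reset_password = True

def complete_password_reset(account: Account) -> None:
    """Once the user has set a new password, restore access."""
    account.must_reset_password = False
    account.locked = False

acct = Account("user@example.com")
handle_compromise(acct)        # account is now locked
complete_password_reset(acct)  # user sets a new password, access restored
```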

The question raised in the interview and presentation was what was placed at risk during the recent Sony break-ins. The typical report was that the customers’ login credentials were compromised, with some doubt about whether the customers’ credit-card and stored-value-wallet data were also at risk.

Inconsistent data-protection laws

One issue raised today was the inconsistent data-protection laws in place across the globe. An example of this is Australia – the “She’ll Be Right” nation. Compared to the USA and the UK, Australians don’t benefit from data-protection laws that require disclosure of data compromises.

What is needed in a robust data-compromise-disclosure law or regulation is for data-security incidents to be disclosed properly and promptly to the law-enforcement authorities and the end-users.

This should cover what data was affected, which end-users were placed at risk by the security breach, when the incident took place and where it took place.

International issues

We also raised the issue of what happens when an incident crosses national borders. Here, nations would have to set out practices for handling these incidents.

It may be an issue that has to evolve in a similar way to other areas of international law, like extradition, international child-custody/access, and money-laundering.

Use of industry standards

Customers place trust in the brands associated with products and services. The example we talked about with the Sony data breach was that the Sony name has been well-respected for audio-visual electronics since the 1960s. As well, the PlayStation name was a respected brand associated with a highly-innovative electronic gaming experience. But these names were compromised in the recent security incidents.

There is a demand for standards that prove a computing service provider’s ability to deliver a stable, proper and secure computing service. Analogies we raised were the standards in place to assure the provision of safe goods, like those concerning vehicle parts such as windscreens, or those affecting the fire-safety rating of the upholstered furniture and soft furnishings in the hotel we were in that afternoon.

Examples of these are the nationally-recognised standards bodies like Standards Australia, the British Standards Institution and Underwriters Laboratories. As well, there are internationally-recognised standards bodies like the International Organisation for Standardisation (ISO), and industry-driven standards groups like DLNA.

The standards we focused on today were ISO 27001, which covers information security, and ISO 20000, which covers IT service management.

Regulation of standards

Here, government regulators need to “have teeth” when it comes to assuring proper compliance. This includes the ability to issue severe fines against companies that don’t handle data breaches responsibly, as well as mitigation of these fines for companies that had an incident but had audited compliance with the standards. Such compliance would be demonstrated with evidence of compliant workflow through their procedures, especially through the data incident.

As well, Brahman underscored the need for regular auditing of “computing-as-a-service” providers so they can prove to customers and end-users that they have procedures in place to deal with data incidents.

I would augment this with a customer-recognisable, distinct “Trusted Computing Service Provider” logo that can only be used if the company’s processes are compliant with the standards. The logo would be promoted with a customer-facing advertising campaign that promotes the virtues of buying serviced computing from a compliant provider. This would be the “computing-as-a-service” equivalent of the classic “Good Housekeeping Seal” that was used for food and kitchen equipment in the USA.

Conclusion

What I have taken from this event is that the effort for maintaining a secure computing service is now moving away from the customer who uses the service towards the provider who provides the service. As well, there is a requirement to establish and enforce industry-recognised standards concerning the provision of these services.

Telephone Interview–Gigaclear UK (Matthew Hare)

In response to the latest news about Gigaclear and Rutland Telecom in relation to the Hambleton fibre-to-the-premises rollout, I offered to organise an email exchange with a representative from the company about this broadband access network.

Matthew Hare replied to my email, offering to do a short Skype-based telephone interview rather than an email interview. This allowed him and me to talk more freely about the Hambleton and Lyddington rollouts which I have been covering on HomeNetworking01.info.

Real interest in rural-broadband improvements

There are the usual naysayers who doubt that country-village residents need real broadband, and I have heard these arguments through the planning and execution of Australia’s National Broadband Network.

But what Matthew told me through this interview would prove them wrong. In the Lyddington VDSL-based fibre-to-the-cabinet rollout, a third of the village had become paying subscribers to the service at the time of publication. In the Hambleton fibre-to-the-premises rollout, two-thirds of that village had “pre-contracted” to the service, meaning they had signed agreements to have the service installed and commissioned on their premises and had paid deposits towards its provision.

Satisfying the business reality

Both villages have hospitality businesses, in the form of hotels, pubs and restaurants, that need real broadband. For example, Matthew cited a large “country-house” hotel in Hambleton that appeals to business traffic; this hotel would be on a better footing with that market if it could provide Wi-Fi Internet service to its guests. Similarly, these businesses would benefit from innovative cloud-based software that requires a proper Internet connection.

As well, most of the households in these villages do some sort of income-generating work from their homes. This can be in the form of telecommuting to one’s employer or simply running a business from home.

The reality of a proper Internet service for business was demonstrated during the Skype call with Matthew. The Skype session died during the interview, and when he came back on, he told me that the fault had occurred at his end. He mentioned that he was working from home in another village that had second-rate Internet service, and this affirmed the need for a proper broadband service that can handle the traffic and allow you to be competitive in business.

A commercial effort in a competitive market

Matthew also underlined the fact that this activity is a proper commercial venture rather than the philanthropic exercise that characterises most other rural-broadband efforts. He also highlighted that there were other rural-broadband improvements occurring around the UK, including the BT Openreach deployments, so this wasn’t the only one to think of.

But what I would see is that an Internet market operating under a government-assured, pro-consumer, pro-competition business mandate is a breeding ground for service improvement, especially when it comes to rural Internet service.

Conclusion

From what Matthew Hare said to me through the Skype telephone interview, there is a real case for the countryside not missing out on the broadband Internet that city dwellers take for granted.

Now it’s firm – Steve Jobs to resign as chief executive at Apple

Articles

Steve Jobs resigns as Apple Chief Executive | SmartCompany.com.au

Steve Jobs steps down from Apple | CNet

Steve Jobs quits as Apple CEO | The Age (Australia)

My comments

There has been a lot of press about Steve Jobs intending to resign from Apple’s chief-executive position due to ill health. Now it has happened: he is resigning. He is still able to maintain his position on Apple’s board of directors, both as a director and as the chairman of the board.

I see it as something that had to happen for another of personal computing’s “old dogs”. These are the people who founded companies that were very instrumental to the development and marketing of commercially-viable personal computers. A few years ago, Bill Gates stepped back from Microsoft, which he had co-founded.

This is more about a “changing of the guard” at the top of these “pillar companies” as the technology behind these computers leads to highly-capable equipment for the home and business. This includes affordable mobile tablet computers operated by one’s touch, and the smartphone which becomes a “jack of all trades”, working as a phone, personal stereo, handheld email terminal, handheld Web browser and more.

It is so easy to cast doubt over a company once a figurehead relinquishes the reins, but I have seen many companies keep the same spirit alive and continue demonstrating their prowess at their core competencies.

As well, even though people may criticise him for how he manages the iTunes App Store and the Apple platforms, as in keeping them closed, Steve Jobs and Apple are in essence milestones to the connected lifestyle.

HomePlug as part of a home-vehicle network for electric and hybrid vehicles

Articles

Your BMW wants email; the Merc wants Netflix | ITworld

HomePlug GP Networking Specification | The Tech Journal

My comments

The HomePlug Powerline Alliance has set in stone the “Green PHY” standard for energy-efficient powerline networking and energy management.

Now the major German vehicle builders have defined a power connection standard to connect their electric or plug-in-hybrid vehicles to the mains power supply for charging. This includes using these HomePlug standards for transferring required data between the vehicle and the host power supply for charging-process control, metering and other similar applications.

The core benefit is to achieve a level playing field for connecting these vehicles to the “smart grid” for overnight and rapid charging. This also covers particular requirements like costing of energy used by “guest vehicles” and road-tax implications, as well as grid integration such as off-peak charging or vehicle-to-grid setups for offsetting energy peaks.
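As a simple illustration of the off-peak-charging side of that grid integration, the sketch below gates charging to an overnight tariff window; the window times are assumptions of mine and are not part of the HomePlug Green PHY specification itself.

```python
from datetime import datetime, time

# Assumed overnight off-peak tariff window (illustrative only).
OFF_PEAK_START = time(23, 0)
OFF_PEAK_END = time(7, 0)

def charging_allowed(now: time) -> bool:
    """True when the clock time falls inside the off-peak window,
    handling windows that wrap past midnight."""
    if OFF_PEAK_START <= OFF_PEAK_END:
        return OFF_PEAK_START <= now < OFF_PEAK_END
    return now >= OFF_PEAK_START or now < OFF_PEAK_END

print(charging_allowed(datetime.now().time()))
print(charging_allowed(time(2, 30)))   # True - middle of the night
print(charging_allowed(time(18, 0)))   # False - evening peak
```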

This also facilitates IP linking to the Internet service via this connection, thus allowing for some possibilities beyond the “obvious Internet applications”. One application I have often thought of in this context is the ability to integrate the vehicle’s infotainment system into the home network.

Here, it could lead to synchronisation of maps, contact lists and media files between the home network and the vehicle, or the ability to simply benefit in the home network from the data held on the vehicle’s infotainment system. This would be the networked equivalent of bringing a tape or CD that was in the vehicle’s glovebox or sound system into your home so you can play it on your music system there.

At least there is an attempt to achieve a level playing field across the vehicle industry to support electric vehicles while catering for flexible setups.

Blu-Ray players–they could give more life to older and cheaper TVs

Article

Smart TV – why are Blu-ray players second-class citizens?

My comments

I agree with the principal argument that this article put forward concerning the availability of “smart-TV functionality” in video peripherals like Blu-Ray players or network-media adaptors. This is due to a reality that most of the consumer-electronics industry has been missing concerning how people purchase and own TV sets; something I, like most of you, have seen for myself.

The reality with TV purchasing and ownership

Since the 1970s, the typical colour television set has enjoyed a very long and reliable service life, thanks to transistorisation. This was underscored by the gradual introduction of electronic tuner subsystems that were more reliable than older mechanical tuning systems, like the “click-click-click” tuning knobs common in most markets or the “push to select, twist to tune” button arrays common on TV sets sold in the UK in the 1960s.

This long service life allowed a “push-down” upgrade path to exist, in a similar manner to what happens with the household refrigerator. Here, one could buy a nicer, newer fridge and place it in the kitchen, while the older fridge it replaced could go in the garage or laundry and act as extra cold storage space for food and drink, such as the typical “beer fridge”. In the case of the TV, this would mean that one would buy a newer, better TV, most likely with a larger screen, and place it in the main lounge area. The original set it replaced typically ended up in another room like a secondary lounge area or a bedroom, or even in a holiday house.

Usually the only reason most households would scrap a TV set would be if it failed beyond repair or was damaged. Even if a set was surplus to one’s needs, it would be pushed on to another household that could benefit.

Some people may think that this practice has stopped with the arrival of the LCD or plasma flatscreen TV, but it still goes on.

Not all TVs are likely to be “smart TVs”

Not all manufacturers are likely to offer network-enabled TVs across their product range. This may be due to a focus on picture quality or on building lower-end products to a popular price point.

It also includes sets like TV-DVD combo units or small-size models that are offered at bargain-basement prices. As well, home-theatre enthusiasts will be interested in buying the latest projector rather than the latest “smart TV”.

Addition of extra functionality to existing televisions with video peripheral devices

The consumer-electronics industry has had success with extending the usability of existing television receivers through the use of well-equipped multi-function video peripherals.

The video recorder as a TV-enablement device

The best example of a device enabling older and cheaper TV sets was the video cassette recorder as it evolved through the 1980s. This wasn’t just in the form of recording TV shows and playing back content held on videocassettes.

It was also in the form of improved television viewing due to the TV tuners integrated in these devices. By model-year 1981 in all markets, the typical video recorder was equipped with a reliable electronic TV tuner. As well, all VHS and Betamax video recorders that implemented logic-controlled tape transports also implemented a “source-monitor” function when the machine wasn’t playing tapes. This typically made the currently-selected channel on the machine’s tuner available at the machine’s output jacks, including on the RF output channel that the TV was tuned to.

Here, this setup gave old TVs a new lease of life by providing them with a highly-reliable TV signal from the VCR’s tuner. In some cases, users could tune to more broadcasts than were available on the TV set itself. Examples included cable channels received on an older “non-cable” TV in the USA or Germany; channels broadcasting on the UHF band viewed through a mid-70s VHF-only TV in Australia and New Zealand; or access to Channel 4 on a “4-button” TV in the UK thanks to the extra channel positions.

The ability to change channels using the video recorder’s remote control also allowed a person who had a cheaper or older TV to change channels from the comfort of their armchair, something they couldn’t previously do with those sets.

Similarly, some households would run a connection from the video recorder’s AUDIO OUT to their hi-fi system’s amplifier and have TV sound through their better-sounding hi-fi speakers. This was exploited more with stereo video recorders, especially those units that had a stereo TV tuner integrated in them, a feature that gradually appeared as TV broadcasters started to transmit in stereo sound through the 80s and 90s.

How the Blu-Ray player is able to do this

The typical well-bred Blu-Ray Disc player has the ability to connect to the home network via Ethernet or, in some cases, Wi-Fi wireless. This is typically to support “BD-Live” functionality, where a user can download and view extra content held on the disc publisher’s servers in addition to viewing content held on the disc. As well, the Blu-Ray Disc player can connect to ordinary TV sets as well as the HDMI-equipped flat-screen TVs currently in circulation.

Some Blu-Ray players, especially recent Samsung, Sony and LG models, can also pull down media from the DLNA Home Media Network and show it on these TVs. As well, some manufacturers are rolling out Internet-based services to these players.
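For the curious, a player finds those DLNA media servers on the home network using UPnP’s SSDP discovery protocol, sketched below in Python as a generic multicast search. This is standard UPnP discovery rather than any particular player’s firmware:

```python
import socket

# SSDP multicast search for DLNA/UPnP media servers on the local network.
SSDP_ADDR = ("239.255.255.250", 1900)
SEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:schemas-upnp-org:device:MediaServer:1",
    "", "",
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(SEARCH, SSDP_ADDR)
try:
    while True:
        data, addr = sock.recvfrom(1024)
        # Each responding media server replies with an HTTP-style message
        print(addr[0], data.decode(errors="replace").splitlines()[0])
except socket.timeout:
    pass
```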

In the same way as the video recorder was able to extend the functionality of the cheaper or older TV set by offering extended tuner coverage, remote control or access to better sound, the Blu-Ray player or network media adaptor can open the world of Internet-based entertainment to these sets.

What the industry should do

The industry could work towards achieving similar interactive functionality for network-enabled video peripherals as for network-enabled TVs. They could achieve this through the establishment of a “platform design” with similar applications and capabilities across a consumer-video product lineup. This is in fact what Sony is doing for their consumer-video products at the moment, with very little difference in interactive-service lineup between their TVs and their Blu-Ray players.

Here, the interactive-TV software is consistent across the whole lineup of TVs, Blu-Ray players, Blu-Ray-equipped home-theatre systems and other video peripherals. The manufacturer may vary the software according to the device’s function by omitting functions that relate to particular hardware requirements like screens, optical drives or broadcast tuners, in order to make it relevant to the device class. Of course, there could be support for user-attached peripheral devices like USB Webcams, Bluetooth-enabled mobile phones, UPnP-compliant printers and the like to extend functionality for particular software applications like video-conferencing.
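One way a manufacturer could express that “omit what the hardware can’t do” idea is with a simple capability check, sketched below in Python; the device classes, capability names and applications are hypothetical examples of mine, not Sony’s actual platform design.

```python
# Hypothetical capability flags for a shared interactive-TV software platform.
DEVICE_CAPABILITIES = {
    "tv":            {"screen", "broadcast_tuner"},
    "bluray_player": {"optical_drive"},
    "media_adaptor": set(),
}

# Which hardware each interactive application needs in order to run.
APP_REQUIREMENTS = {
    "catch_up_tv":     {"broadcast_tuner"},  # channel tie-ins need a tuner
    "disc_extras":     {"optical_drive"},    # BD-Live-style disc-linked content
    "video_on_demand": set(),                # runs on any network-enabled device
}

def apps_for(device: str) -> list[str]:
    """Return the interactive apps a given device class can offer."""
    caps = DEVICE_CAPABILITIES[device]
    return [app for app, needs in APP_REQUIREMENTS.items() if needs <= caps]

print(apps_for("bluray_player"))  # ['disc_extras', 'video_on_demand']
```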

The software may be fully revised every few years to build in new functionality and accommodate better hardware. This would also be a chance to improve the operating experience of the software concerned, while maintaining the branding and skinning that the manufacturer and software partners desire.

Conclusion

There is a different reality that exists when buying TV equipment, and this reality should be catered for equally in video peripheral equipment like Blu-Ray players and network media adaptors as in TV sets.

IBM have now passed 100 years in a different direction

Article

BBC News – IBM at 100: From typewriters to the cloud

My Comments

When International Business Machines (IBM) came on to the scene as an office-technology company, there weren’t many technologies around that made office life more productive. Since then, this company has built up a steady path of innovation in this field, culminating in the development and refinement of the mainframe computer through the 1960s and 1970s, and the establishment of a highly-desirable office electric typewriter equipped with an interchangeable “golf-ball” typehead, known as the “Selectric”.

But this company also had a strong hand in the personal-computing scene with the arrival of the IBM PC. This desktop computer, which was based on Intel electronics and a Microsoft operating system, set the benchmark for an affordable desktop computer for small businesses. Through the 1980s, this computer was refined through the use of colour graphics, hard disks and faster processors. Australian readers may know that a lot of these computers sold in that market were built in a factory in Wangaratta, Victoria.

In a similar vein, another company called Lotus had developed the quintessential desktop spreadsheet application known as Lotus 1-2-3. Due to its flexibility and capability, this program became the preferred spreadsheet application to be run on an IBM PC.

But these computers effectively brought the desktop computer out of the realm of the hobbyist and into the hands of business. This was initially into the hands of bookkeepers and similar employees but, in the late 80s and early 90s with the arrival of cost-effective computer networks, ended up in the hands of most office workers from the top floor to the bottom.

The PS/2 era was markedly different, with an attempt by IBM to develop their own operating system and graphical user interface, known as OS/2. These computers also used a high-speed interface bus, known as the Micro Channel bus, that was different from the ISA bus then used by the rest of the industry. The main benefits these computers provided for the industry-standard Intel-based computing environment included the use of mini-DIN keyboard and mouse ports, including a standard interface for the mouse; a small power-supply reference design which allowed the power switch to be located on the front panel; and the use of 1.44Mb 3.5” diskettes on the Intel-based PC platform.

Through the late 90s, IBM shifted away from its hardware roots and moved towards its role as a hardware-software “solutions provider” for big business. This was evident with them devolving their main hardware lines to other companies: Lexmark for printing and imaging, Hitachi for data storage, and Lenovo for personal computer systems. They also bought out Lotus, which had shifted its focus to “Notes” as an information-management system, and implemented it in their solutions. Here, it has led to them being able to work on “cloud-based” computing projects that can help these businesses manage their information across many locations.

In fact, I would consider the existence of IBM to be a “milestone to the connected lifestyle” in itself, due to its development and refinement of both the “back-end” and desktop computing equipment central to this lifestyle.

Happy 100th Birthday, IBM

Consumer Reports–the first independent consumer publication to give support to DLNA

Article

DLNA and why it matters | Consumer Reports

My Comments

There are those of you who use magazines like “Which?”, “Consumer Reports” or “Choice” to assess the calibre of the consumer products you buy. This is because the organisations behind these magazines assess products on the basis of how a consumer would experience them, and want to stay at arm’s length from the suppliers’ public-relations efforts. Similarly, these same organisations work in their own territories as general consumer advocacy organisations on topics like junk-food consumption and the like.

Now Consumers Union, the American consumer information and advocacy organisation, have used their “Consumer Reports” platform to identify consumer-electronics devices that work with the DLNA Home Media Network, by listing this feature as a distinct attribute in their product attribute lists. The main reason I support this is the level of interoperability that this standard provides for media distribution over the home network.

Here, it could be a good idea for other organisations of the same calibre as Consumers Union, like the Australian Consumers’ Association (“Choice”), to use their reviewing platforms to support this standard. One of the reasons is that this standard isn’t controlled by one product vendor but is set up for cross-vendor compatibility; this is in fact the reason HomeNetworking01.info stands by this technology as a preferred platform for media management via the home or small-business network.