Category: Industry Comments

Choosing a Brother small-business printer or HP inkjet printer could become like choosing a car

Recently, I reviewed a few Brother printers and observed a particular trend in how the products are being positioned. It is becoming more akin to how a typical vehicle manufacturer positions a particular vehicle model or series of vehicles.

A similar pattern is emerging with Hewlett-Packard’s Photosmart and OfficeJet inkjet printer ranges, where a few common mechanisms are implemented across the products. But, in HP’s case, the different models have differing cosmetic designs so as to integrate different feature sets and make the more expensive machines look classier.

A lineup of models with varying feature sets and throughput speeds but with the same design

In the vehicle world, an example of this was Holden’s large family cars sold through the 1960s and 1970s. These vehicles had different model names depending on their level of luxury and/or their powertrain, with the “Premier” representing the top-of-the-line standard-wheelbase vehicle. Entry-level vehicles were initially badged “Standard” or “Belmont”, while “step-up” or “mid-tier” vehicles had model names like “Special” or “Kingswood”; from the early-1970s “HQ” series onwards, the entry-level and mid-tier vehicles were instead differentiated by “package” suffixes.

For example, I noticed that the HL-2240D direct-connect duplex monochrome laser printer was part of a series of laser printers based around a new printer design and print engine. The lineup included a low-end model, the HL-2130, which couldn’t print on both sides of the paper; the HL-2250DN, which was equipped with Ethernet networking; and the HL-2280DW, which was equipped with Wi-Fi networking. The more expensive models in the lineup also benefit from higher page throughput due to more powerful components in the design.

A model range derived from another model range

But the practice becomes very similar to how vehicle builders derive one model range from another concurrently-running model range. An example of this would be designing a longer-wheelbase luxury “executive” car as a derivative of a standard large family car, as Ford did when it derived the Fairlane and LTD designs from the Falcon designs.

This is reflected in how the designs in Brother’s laser-printer lineup are used. I observed that the multifunction series including the MFC-7360N that I reviewed was derived from the previously-mentioned dedicated laser printer series that the HL-2240D was part of; all the units in both printer lineups use the same print engine and the same replacement parts.

Benefits for product choice

This allows for a granular range of products in a product class, where a person can choose or specify the right kind of printer based on their needs and budget, without the manufacturer needing to create new designs to satisfy the different market segments. It also allows the manufacturer to keep product prices affordable because parts can be reused across the different models, and it gives a salesperson room to upsell customers to better products or make deals that offer better value.

In most cases, the mid-tier product will offer the best value for most users. For example, in these two printer lineups, the mid-tier models (the HL-2250DN dedicated printer and the MFC-7460DN) offer the two currently-desirable features: double-sided printing, which saves paper, and network connectivity. In some other cases, like the dedicated colour laser printers based on Brother’s latest high-throughput colour-laser print engine, the HL-4150CDN, which has just Ethernet network connectivity and reduced-time-penalty colour duplex printing, would suit most users.

Conclusion

The creation of a granular product range with incremental functionality, built on a few common design bases and/or descendant product classes, allows manufacturers to keep consistent value for money as they build out a product range.

With two new standards in the works, we could be approaching the Gigabit wireless network

Articles

Understanding gigabit Wireless LAN: 802.11ac and 802.11ad

My comments

What is it all about

At the moment, 802.11n on both the 2.4GHz and 5GHz wavebands is the current link standard for the Wi-Fi wireless network. But the IEEE have decided to work on standards for providing increased-bandwidth wireless networks.

The two standards are 802.11ac, which will work on the 5GHz radio band and be seen as a migration path from the current 802.11n technology; and 802.11ad, which works on the 60GHz waveband and has a very short range. The latter technology would be considered best for peer-to-peer applications like short-range wireless backhaul.

Both of these systems will use MIMO (Multiple Input Multiple Output) radio technology, the multiple-transceiver “front-end diversity” approach that the 802.11n network already uses. But this technology will work with more “front-ends”; a configuration with four signals coming in and four going out is known as “4×4”.

Dedicated bandwidth options

One major benefit that I see these technologies providing is dedicated-bandwidth wireless networking from each access point compliant with these standards. This is brought on through the use of MU-MIMO (Multi-User Multiple-Input Multiple-Output). It extends the “transmit beamforming” technology that provides improved signal quality in an 802.11n network, allowing the access point to provide “switched” Wi-Fi with dedicated bandwidth to stations, similar to the way the typical wired Ethernet switch works.

It may be an improvement for network setups with many SSIDs per access point, like so-called “guest / hotspot” plus “private” networks, shared hotspot access points or many university networks, by allowing full bandwidth to each SSID.

The realities

Of course, the actual throughput that a network link achieves will typically be less than the headline link speed due to overheads associated with the link’s transmission requirements. The oft-quoted 867Mbps figure, for example, is the link rate of a two-stream 80MHz 802.11ac connection rather than a measured real-world throughput, and the figures quoted for first-generation equipment will differ from those for mature-generation equipment.
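As a back-of-the-envelope illustration of where that headline figure comes from, here is the link-rate arithmetic in a short Python sketch, assuming the 802.11ac 80MHz channel numerology (234 data subcarriers, 256-QAM modulation, 5/6 coding rate, short guard interval):

```python
# Rough 802.11ac (VHT) PHY link-rate arithmetic -- a sketch only;
# real-world TCP throughput lands well below these figures.
DATA_SUBCARRIERS_80MHZ = 234
BITS_PER_SUBCARRIER = 8        # 256-QAM carries 8 bits per subcarrier
CODING_RATE = 5 / 6            # highest 802.11ac coding rate
SYMBOL_TIME_S = 3.6e-6         # 3.2us symbol + 0.4us short guard interval

def vht_link_rate_mbps(spatial_streams):
    bits_per_symbol = DATA_SUBCARRIERS_80MHZ * BITS_PER_SUBCARRIER * CODING_RATE
    return spatial_streams * bits_per_symbol / SYMBOL_TIME_S / 1e6

print(vht_link_rate_mbps(2))   # ~866.7 -- the oft-quoted "867Mbps"
print(vht_link_rate_mbps(4))   # ~1733 for a 4x4 link
```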

How it affects my small network

What will be asked of a small network like a home network would be a 5GHz segment that provides the 802.11ac service.

It may provide dedicated throughput to client devices like laptops or tablet computers. For those networks that run as dual networks, like hotspots or guest networks that share the same wireless router as the private network, the dedicated throughput for each wireless-network segment will be a bonus.

Of course, the 2.4GHz band will still be used as an 802.11n segment for existing devices, and there may be a compatibility mode so that existing 802.11n devices can operate on the same segment.

Other issues

If the 802.11ad technology is to be used as a wireless backhaul for many 802.11ac access points, there will have to be work on a complementary mesh-network technology. This would provide a level of fault-tolerance in the wireless backhaul as well as a chance for each station to have, and pass on, full-bandwidth networking. This is something that the IEEE standards body is working on with the 802.11s draft standard.

Conclusion

It shows that when there is a standard in place, there will be a chance to “raise the bar” with the technology that it covers. This will mean that a Wi-Fi wireless network could come close to the goal of a switched Gigabit network.

ARM-based microarchitecture — now a game-changer for general-purpose computing

Article:

ARM The Next Big Thing In Personal Computing | eHomeUpgrade

My comments

I have previously mentioned NVIDIA developing an ARM-based CPU/GPU chipset, and have noticed that this class of RISC chipset is about to resurface on the desktop and laptop computer scene.

What is ARM and how it came about

Initially, Acorn, a British computer company well known for the BBC Model B computer which was used as part of the BBC’s computer-education program in the UK, pushed on with a RISC processor-based computer in the late 1980s. This became a commercial disaster due to the dominance of the IBM-PC and Apple Macintosh computer platforms as general-purpose computing platforms, even though Acorn were trying to push the computer as a multimedia computer for the classroom. This is despite the Apple Macintosh and the Commodore Amiga, the multimedia computer platforms of that time, being based on Motorola’s 68000-series CISC processors.

Luckily they didn’t give up on the RISC microprocessor, and this class of processor was pushed into dedicated-purpose computer setups like set-top boxes, games consoles, mobile phones and PDAs. This chipset and class of microarchitecture became known as ARM (originally “Acorn RISC Machine”).

The benefit of the RISC (Reduced Instruction Set Computing) class of microarchitecture was an efficient instruction set that suited the task-intensive requirements of graphics-rich multimedia computing, compared to the CISC (Complex Instruction Set Computing) microarchitecture practised primarily in Intel 80x86-based chipsets.

Interest in RISC chipsets waned from the mid-2000s when Motorola pulled out of the processor game and ceased manufacturing PowerPC processors. Apple then had to build the Macintosh platform for the Intel Architecture, which offered RISC-class performance at a cheaper cost to Apple, and started selling Intel-based Macintosh computers.

How is this coming about

An increasing number of processor makers who have made ARM-based microprocessors have pushed for these processors to return to general-purpose computing as a way of achieving power-efficient yet highly-capable computer systems.

This has come along with Microsoft offering a Windows build for the ARM microarchitecture as well as for the Intel microarchitecture. Similarly, Apple bought a chipset-design firm and developed its own ARM-based chipsets.

What will this mean for software development

There will be a requirement for software to be built for the ARM microarchitecture as well as for the Intel microarchitecture, because these work on totally different instruction sets. This may be easier for Apple and Macintosh software developers because, when the Intel-based Macintosh computers came along, they had to work out a way of packaging software for both the PowerPC and Intel processor families. Apple marketed these software builds as “Universal” builds because of the need to suit the two main processor types.

Windows developers will need to head down this same path, especially if they work with orthodox code where they fully compile the programs to machine code themselves. This may not be as limiting for people who work with managed code like the Microsoft .NET platform, because the runtime packages can simply be prepared for the instruction set that the host computer uses.
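A quick way to see the distinction is that interpreted or managed code runs unchanged while only the runtime underneath is architecture-specific. As a minimal sketch, this Python script reports the host instruction-set architecture without itself needing an ARM build or an Intel build:

```python
# The same script runs on either architecture; only the Python
# interpreter underneath it is compiled per instruction set.
import platform

arch = platform.machine()   # e.g. 'x86_64' on Intel, 'armv7l' or 'aarch64' on ARM
print(f"Host architecture: {arch}; no architecture-specific build of this script needed")
```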

Of course, Java programmers won’t need to face this challenge because the language was designed around a “write once, run anywhere” scheme, with “virtual machines” that sit between the computer and the compiled Java code.

For the consumer

People who run desktop or laptop computers that use ARM processors may need to look for packaged or downloadable software that is distributed as an ARM build rather than an Intel build. This may be made easier through the use of “universal” packages as part of the software-distribution requirements.

It may not worry people who run Java or similar programs, because Oracle and others who stand behind these programming environments will need to port the runtime environments to these ARM systems.

Conclusion

This has certainly shown that the technology behind the chipsets that powered the more exciting computing environments of the late 1980s is now relevant in today’s computing life. It will even provide a competitive development field for the next generation of computer systems.


IPTV now being featured on mainstream TV media

Articles

Smart TVs (A Current Affair article) – NineMSN VIDEO

My Comments

From the recent “A Current Affair” broadcast on the Nine Network, it seems to me that the “Smart TV” or “Internet TV” concept is now ready for prime time.

What is this trend all about?

This is where functionality like access to IPTV channels, “catch-up” TV and video-on-demand is now being integrated into most of the big-name TV sets that are to be sold at the likes of Harvey Norman. It will also include an “app-store” interface so that users can add functions to these sets in a similar way to how they add functions to a smartphone or tablet computer.

Some of the sets will come with an integrated hard disk which will provide PVR functionality. But what wasn’t mentioned was that most of the sets from the big brands, especially LG, Samsung and Sony, will support integration with the DLNA Home Media Network. This means that these sets could play content held on a computer or network-attached storage device that uses this standards-based technology.

Typically, these functions will be pitched at TVs targeted at the main viewing area, i.e. the main lounge room or family room. But this kind of function may be added to existing sets through the use of some of the current crop of Blu-Ray players and network-media adaptors like the Sony SN-M1000P network media adaptor.

A few key questions that I have

“TV plus Apps” or IPTV and interactive-TV content?

There could be a fear that this could turn out as “TV plus apps” with the same old TV content plus some apps such as clients for the popular social networks, photo-sharing sites and YouTube-type sites thrown in.

But some providers are making ties with the various manufacturers to set up free and pay-TV front-ends through the IPTVs. Examples of this include Samsung establishing a tie with BigPond TV to provide direct access to that content or most of the manufacturers running ABC iView through their TV sets. It may also open up opportunities like video-on-demand or boutique content services. As well, once there is a level playing field for adding TV services, this could lead to the addition of extra TV content.

If there is a desire to provide new live or on-demand IPTV services, there needs to be support for adding the newer services to existing IPTV equipment. This could be achieved through an always-live app store on these sets. Similarly, existing broadcast content, both editorial and advertising, must be able to support links to apps and interactive front-ends through interactive-TV content-delivery standards, accessible to the average viewer with one click of a particular button.

This can include applications ranging from interactive games and competitions as part of children’s TV, through “play-along” quiz shows, to polls run in conjunction with current-affairs shows that offer the option of viewing “extended-version” interviews.

Equipment Useability

A key issue that I have raised on this site is the useability of services like the Social Web on this class of equipment. Typically, the “smart TV” concept prides itself on connection with social-network services like Twitter and Facebook; but there will also be the desire to gain access to photo-sharing sites like Flickr and Picasa, or to gain full benefit from sites like YouTube. These factors can make using “smart-TV” services more daunting for someone who doesn’t consider themselves competent or isn’t experienced with technology.

An example of this was when I mentioned to a friend of mine the Pixel Eyes app on the TiVo platform, through which they could view their Picasa albums on the lounge-room TV connected to the TiVo PVR. I mentioned that they would have to log in to their Google account using the “pick-pick” method of entering their credentials letter-by-letter from an onscreen keyboard in order to view their pictures on this service, and this idea frightened them off it.

The main problem is that different users will want to log in to this common terminal or, in the case of the Social Web, leave comments on what they are viewing. Typically, this requires a fair bit of text entry, and most remote controls won’t be fully engineered to cater to this requirement. The user will typically have to work a D-pad or wave a Wii-style “magic remote” around to pick letters from an onscreen keyboard, and may have to switch between logical keyboards to use different character sets like numbers, different-case characters or punctuation. Try entering a Facebook / Twitter / Google username and password that way, or “knocking out” a Tweet that way. As well, I have raised in that same article methods by which logging in to these services from devices like TVs and set-top boxes can be simplified, and referenced how Facebook achieved a login experience suitable for these devices with their HP ePrint app. This includes being able to change the active user associated with a TV or set-top box to another user.

Similarly, I would look at issues like keyboard support for IPTVs, whether or not a TV comes with a QWERTY-enabled remote. The best method for add-on keyboard support would be Bluetooth HID connectivity, so that a Bluetooth-based wireless keyboard can be used as a text-entry tool. Similarly, the ability to plug a standard USB computer keyboard into the USB port usually reserved for USB memory keys and use it for text entry may make things easier. This would also work well with those wireless-keyboard sets that plug in to the computer’s USB port.

A remote that doesn’t have a QWERTY keyboard but uses a numeric keypad for direct channel selection or parental-code entry could use this keypad as an “SMS-style” text-entry interface, something many nimble-fingered teenagers are used to. This would work better if it used the character-set-selection practices of popular mobile phones.
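For illustration, here is a minimal Python sketch of that classic multi-tap scheme, assuming the standard ITU E.161 letter layout found on phone keypads; the function name and input format are my own for the example:

```python
# Classic "SMS-style" multi-tap entry: pressing a key repeatedly cycles
# through its letters, and a pause commits the current letter.
KEYPAD = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}

def decode_multitap(key_groups):
    """Each group is a run of the same key, e.g. '44' decodes to 'h'."""
    text = []
    for group in key_groups:
        letters = KEYPAD[group[0]]
        text.append(letters[(len(group) - 1) % len(letters)])
    return ''.join(text)

print(decode_multitap('44 33 555 555 666'.split()))   # prints "hello"
```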

Other methods that can be looked at include the use of smartphone apps as virtual remote controls, as Samsung has done for their Android smartphones. Here, a user could download an app to their Galaxy S phone and have it become the TV remote control. This could be extended to ideas like multi-control for interactive applications, such as “own-account” operation for Social Web and similar applications with the TV screen becoming a “common monitor”.

What to consider when choosing or using your network-enabled TV

DLNA functionality

The TVs or set-top devices should support DLNA Media Player functionality at least, preferably with support for DLNA 1.5 Media Renderer functionality. Initially this would give you access to content held on your computer’s or network-attached-storage device’s hard disk.

The Media Renderer functionality allows the TV to be controlled by a UPnP AV / DLNA control point such as TwonkyMobile, PlugPlayer or Andromote on your smartphone or tablet computer, or TwonkyManager on your netbook. In the case of Blu-Ray players and set-top devices, you may even be able to play music from your network storage through your favourite stereo without needing to have the TV on to select the music.
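As a technical aside, control points find these Media Renderers using UPnP’s SSDP discovery mechanism. The following Python sketch, using only the standard socket library, multicasts the standard M-SEARCH request and prints whichever renderers answer on your network:

```python
# SSDP discovery of DLNA / UPnP Media Renderers on the local network.
import socket

MSEARCH = '\r\n'.join([
    'M-SEARCH * HTTP/1.1',
    'HOST: 239.255.255.250:1900',
    'MAN: "ssdp:discover"',
    'MX: 2',
    'ST: urn:schemas-upnp-org:device:MediaRenderer:1',
    '', '',
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH, ('239.255.255.250', 1900))
try:
    while True:
        data, addr = sock.recvfrom(4096)
        # Each compliant renderer replies with an HTTP/1.1 200 OK message
        print(addr[0], data.split(b'\r\n')[0].decode())
except socket.timeout:
    pass
```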

If the TV or set-top box offers integrated PVR functionality, look for DLNA Media Server compatibility because this may allow you to play recorded TV shows on other TVs in the house without them needing to be of the same brand.

It is also worth noting that some DLNA functions like DLNA server or Media Renderer may not be enabled by default even though the set has these functions. Here, you may have to go to the setup menus and look for “DLNA control”, “Media Server” or similar options and enable them to benefit fully from these functions.

For further information, it is also worth reading the DLNA Networked Media articles that I have written on this site.

Connecting the set to your home network

When you connect one of these TVs to your home network, I would suggest that you avoid Wi-Fi wireless connectivity, especially if the TV or set-top box uses a dongle for this connectivity rather than integrated Wi-Fi circuitry. This is because Wi-Fi is radio-based, and if anything between the Wi-Fi router and the TV is shifted slightly, you may have service-reliability issues.

Instead, I would recommend a wired method such as Ethernet cable or a HomePlug AV powerline-network setup. The Ethernet-cable solution works well if the router and TV are in the same room, if you have wired your home for Ethernet, or if you can get away with snaking Ethernet wiring through windows. On the other hand, the HomePlug solution works well for most users who don’t want to, or can’t, lay new wiring through their homes, because it uses the house’s existing AC wiring.

In fact, if you are renovating or rewiring your home, it may be worth considering wiring the house for Ethernet and making sure you have an Ethernet connection in the main TV-viewing areas. This is achievable if the electrician doing the job is competent with communications or data work, or knows someone who is.

Conclusion

This site will have regular coverage of home-media-network issues, which will become important as we head down the path towards online home entertainment.

The printer-initiated scan-to-computer feature for network applications could be standardised and implemented at operating-system level

Most, if not all, of the network-capable all-in-one printers that I have reviewed on this site support network-based scanning. This includes the ability to start a scan job from the printer’s control surface and have the job sent to the computer and handled in a preferred way. But this function isn’t handled in a smooth and reliable manner, judging from my experience connecting the many different printers to my computer.

The current situation

This function is typically managed by a manufacturer-supplied “scan-monitor” program that is part of the “printer solutions package” and has to be up and running before you start your scan job from the device.

What typically happens is that this functionality ends up dependent on the way this “scan-monitor” program behaves. Here, you may end up not being able to scan via the network, or not being able to start the scan job at the printer’s control surface. In some cases, you may be able to use the operating system’s scanning infrastructure, such as Windows Image Acquisition, rather than the manufacturer’s scan tools to do a scan job.

Why integrate device-initiated scanning for networked hardware into the operating system

The operating systems could support device-initiated scanning by offering functionality like “scan paths” that are available to each of the devices. Here, the devices could expose the “scan paths” available to them based on their capabilities, like colour scanning or an automatic document feeder. This means that if two scanners have the same capabilities, they would present the same scan paths for each computer endpoint.
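No operating system currently exposes such an interface, so the following Python sketch is purely hypothetical; every name in it is invented for illustration. It shows the shape the idea could take: computers register the destinations they offer, and a printer’s control panel lists only those its scanner hardware can satisfy:

```python
# Hypothetical "scan path" registry -- none of these names exist in any
# real operating-system API; this only illustrates the proposed idea.
from dataclasses import dataclass, field

@dataclass
class ScanPath:
    name: str                  # label shown on the printer's control panel
    handler: object            # application endpoint that receives the pages
    needs: set = field(default_factory=set)   # e.g. {'colour', 'adf'}

REGISTRY = []

def register_scan_path(path):
    REGISTRY.append(path)

def paths_for_device(device_capabilities):
    # Only advertise paths that the scanner's hardware can satisfy
    return [p for p in REGISTRY if p.needs <= device_capabilities]

register_scan_path(ScanPath('Scan to OCR', handler=print, needs={'adf'}))
register_scan_path(ScanPath('Scan photo', handler=print, needs={'colour'}))
print([p.name for p in paths_for_device({'colour', 'adf'})])   # both paths appear
```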

Multiple-machine environments

This could include the ability to identify a particular computer as a destination for the scanned files, as well as allowing any application, rather than just the manufacturer’s own applications, to be an endpoint. This could allow applications like OCR, bookkeeping and raster-to-vector conversion to simply become “available” at the printer’s control panel, rather than the user having to work the application’s user interface or find image files left by the scan monitor in order to benefit from the scanned work.

Here, it may cater for realities associated with the home or small-business network where there are many computers and, in some cases, two or more multifunction printers. This may be brought on by the use of a premium-level machine with all the bells and whistles like the HP Photosmart Premium Fax C410a or the Canon PiXMA MX-870 being installed in the home office and an economy-level machine like the HP B110a Wireless-E installed in the study, kitchen or bungalow and used as a “second” printer.

Efficient operation

Another obvious benefit of the scan-monitor function being integrated into the operating system is efficient operation. This will free up memory and other resources and allow a quick response from the destination computer, compared with the significant time delay that occurs when one instigates a scan job from the multifunction printer’s control surface and the scan monitor has to start up and handle the job.

Points of innovation

The operating system working as a scan monitor can open up paths of innovation when it comes to imaging-driven applications. An example could be the use of the multifunction printer’s control surface for entering job-specific information, more so as these printers come equipped with D-pads, numeric keypads and touchscreens, as well as graphical screens and menu-driven operation. Applications of this could include entering the file name for “scan-to-file” operations, determining the nature and amount of an expense when scanning receipts into a bookkeeping program, or entering photograph-specific information when scanning a photograph.

It can also open up another path of innovation in having network-attached-storage devices become scan destinations without the need to remember FTP or other file-path locations for these devices. This can help with activities like archiving of paper documents or scanning of pictures to be made available on the DLNA Home Media Network.

Conclusion

Once we move the workload of device-initiated scanning to the Windows, Macintosh or Linux operating system, it can yield many improvements for people who scan hard-copy material using the current crop of multifunction printers.

What about having IMAP4 as a standard email protocol

Introduction

Most email services, especially those offered by consumer ISPs, use the old POP3 / SMTP protocols as the backbone of their email services. This works properly when only one computer acts as the email client, because the expectation is that the email is downloaded off the mail server to that one computer.

Now the reality has changed due to Moore’s Law allowing for the ISP to offer email storage capacity to their customers in the order of gigabytes. As well, the computing paradigm has shifted towards people viewing their email from multiple devices. This has been brought about with small business owners having an office computer and a home computer, as well as the increasing popularity of smartphones, tablet computers and secondary-tier notebook computers like netbooks and 13”-14” ultraportables.

What does IMAP4 offer over POP3?

The IMAP4 technology keeps email stored on the server while allowing a copy of the mail to exist on the client devices. When the email client connects to the IMAP4 server, it simply synchronises the mail between the client and the server. Outgoing messages are still handed to an SMTP server for sending, but the sent copies are kept in a server-side folder so every client sees them.

An IMAP4 setup can support “header-only” downloading, which is important for people who use portable devices or low-bandwidth connections. As well, an IMAP4 setup allows the user to operate in “offline” mode, where synchronisation happens when the user explicitly goes online, so users can prepare their email where Internet access is unavailable and synchronise when it is available.
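As a small demonstration using Python’s standard imaplib module, a client can pull down just the headers of unread messages while leaving the full messages, and their unread status, on the server; the server name and credentials here are placeholders:

```python
# Header-only download over IMAP4 -- the message bodies stay on the server.
import imaplib

with imaplib.IMAP4_SSL('mail.example.com') as imap:
    imap.login('user@example.com', 'password')
    imap.select('INBOX', readonly=True)
    status, data = imap.search(None, 'UNSEEN')
    for num in data[0].split():
        # BODY.PEEK[HEADER] fetches only the headers and leaves the
        # message flagged as unread on the server
        status, msg = imap.fetch(num, '(BODY.PEEK[HEADER])')
        print(msg[0][1].decode(errors='replace').splitlines()[0])
```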

Compared to POP3 / SMTP, this allows increased flexibility when it comes to maintaining a mailbox from different email clients. Primarily, the contents of the same mailbox appear on all client devices that can access that mailbox. One example of this benefit is that the Sent folder contains the messages sent from all of the clients rather than just one particular client. Similarly, one could “rough out” an email on a smartphone or other portable device, then “finish it off” on the desktop, because the email is held in the Drafts mailbox folder.

It also supports the ability to create mailbox folders which will allow you to file the email in a manner that suits you, yet see the same filing arrangement across all your client devices.

It is also worth knowing that IMAP4 is the base email protocol that the OMTP has called up as part of its standard for mobile “visual voicemail” services. These services allow users to manage the voicemail they receive on their mobile phone in a similar manner to how they manage email on their computer or smartphone.

The status quo with IMAP4

IMAP4 is an open standard that is independent of any licensing requirements, and nearly all email clients for desktop and mobile operating environments offer IMAP4 support as standard.

Even so, most consumer ISPs don’t offer it as an email protocol to their customers. This is despite an increasing number of these providers now offering gigabyte-capacity mailboxes to new customers and upsizing existing customers’ mailboxes to these capacities, and despite the current range of data-centre mail-server equipment handling IMAP4 easily.

Some of these providers would rather offer a “hosted Exchange” service which would require the user to use Microsoft Outlook in Exchange mode. These services are more expensive to provide and may cost more for most personal and small-business users.

What could be done

An Internet service provider could offer IMAP4 mailboxes as a standard option for new customers or customers opening up new mailboxes. As well, they could offer it as a free upgrade option to existing customers, with information on how to convert from POP3 / SMTP to IMAP4.

The kind of setup that IMAP4 offers can allow telcos who offer Internet service and telephony as a bundle, or as triple-play services, to provide a unified-messaging environment where customers can manage their voicemail, fax and email from the same terminal. It also opens up ways for these companies to add value to their telephony and Internet services.

It is also a way of supporting today’s Internet-usage reality, driven by multiple-computer setups and portable computing.

FCC to set the first yardstick for Net Neutrality

News articles

HP Blogs – FCC does define rules on net neutrality – The HP Blog Hub

FCC Approves First Net Neutrality Rules | Datamation

From the horse’s mouth

FCC Website

Report and Order concerning Net Neutrality (PDF) – FCC

Press Release (PDF) – FCC

My comments

Through this action, the FCC have become the first national-government telecommunications regulator in a major English-speaking country to use their executive power to “set in stone” a minimum standard for “Net Neutrality”.

Basically, their standard requires wireline services (cable Internet, ADSL, optical fibre) to pass all lawful Internet content and to allow users to connect non-harmful devices to their Internet services. This would therefore prohibit the limiting of access to “over-the-top” Internet video, VoIP and similar services. Similarly, it requires wireless services (3G, WiMAX, etc) not to block sites that compete with their own business offerings, like VoIP services.

There is still a problem with the wireless services in that they could block access to competing app stores on platforms that permit such stores, set up “walled gardens” when it comes to mobile content or provide “preferential tariffs” for particular services. This can be of concern to those of us who, for example, use client-side applications and commonly-known URLs to gain access to the Social Web rather than the carrier’s preferred “entry point” bookmarks or URLs. Similarly, the carrier could gouge people who go to favourite media Websites rather than the ones that the carrier has a partnership with. This last point may be of concern when mass-media outlets and wireless-broadband carriers see the “mobile screen” as another point of influence over the populace and establish partnerships or mergers based on this premise.

Net Neutrality will also have to be considered an important issue as part of defining the basic Internet-service standard for the country, so that service providers or governments can’t provide it only to people who purchase upper-tier service, for example.

A good next step would be for other national-government or trading-bloc communications authorities to tune this definition further, so that where Net Neutrality is the goal, the standard becomes harder to avoid.

Interview Series–Network audio and video

Introduction

Between the end of October and the beginning of November, I had a chance to interview people from two different companies that work in the consumer audio-video market, and noticed some trends concerning this market and its relevance to the online world.

One main trend was an increased focus by consumer-audio manufacturers in the popular marketplace on delivering DAB+ digital-radio equipment rather than network-connected audio equipment to the Australian market. This may be because some of these firms need to see this technology become more popular here and want to have “every base covered”.

Sony

From my interview with Kate Winney, I observed that Sony has a strong presence in the connected-TV scene. This is concentrated in their newer “main-lounge-area” TVs, but they are providing the functionality on some of their video peripherals, namely their BD-Live Blu-Ray players.

We agreed that Sony has no Internet radio in its product lineup, although they implement Shoutcast on their high-end home-theatre receivers like the STR-DA5500ES. But we also agreed that they need to make DAB+ available on their stationary “big sets” like hi-fi tuners, receivers, home-theatre-in-box systems and bookshelf audio systems. They are releasing a few DAB+ sets, but most likely as stereo systems rather than as portables or components.

I stressed to Kate the case for Sony implementing vTuner or a similar directory-driven service of the kind implemented in most Internet radios. This is because most of these services offer access to the simulcast streams of the government, commercial and community radio stations broadcasting in countries around the world, as well as Internet-only streams of the kind that Shoutcast offers. It is also because most people interested in Internet radio are likely to want to use it to enjoy the “local flavour” of another country as provided by that country’s regular broadcasters, rather than just looking for offbeat content.

Kate also reckoned that, for the technology to become popular, DAB+ digital radio needs to be available in the dashboards of new-fleet cars, preferably as standard equipment or as a “deal-broker” option offered by car dealers. I was also thinking about whether Sony should offer DAB+ technology as part of the XPLOD aftermarket car-audio lineup.

Bush Australia

From my interview with Jacqueline Hickman, I noticed that Bush are still focused on implementing DAB+ digital radio in Australia, but are using Internet radio as a product differentiator for their high-end “new-look” sets that are to appeal to young users.

Their market focus for consumer audio is on “small sets” like table / clock radios, portable radios and small-form stereo systems, but I suggested implementing or trying some value-priced “big sets” as product ideas. This is even though they run some “main-lounge-area” TVs and digital-TV set-top boxes in their consumer-video lineup.

The ideas I put forward were a DAB+ or DAB+ / Internet-radio tuner for use with existing audio equipment, and an FM / DAB+ (or FM / DAB+ / Internet-radio) CD receiver with optional speakers. A market that I cited is the mature-aged people who own “classic hi-fi speakers” from the 1960s-1980s that they like the look and sound of, but who may want to run them with a simpler, cost-effective component. I made a reference to the “casseivers” of the late 70s and early 80s, which housed a receiver and cassette deck in one chassis, and what those units offered. Jacqui reckoned that companies like B&O and Bose filled this market, but I said that some of those companies have gone to active speakers rather than integrating power amplifiers in the equipment. As far as the DAB+ tuner is concerned, she suggested that a person could use a portable DAB+ set and connect it to the amplifier with an appropriate cable.

I raised the topic of IPTV, but Jacqui was not sure whether it will be implemented in any of their TV sets or set-top boxes at the moment. This sounds like a product class that hasn’t yet been properly defined around a particular standard and platform, especially in this market.

Conclusion

It therefore seems to me that consumer-electronics companies are more interested in nurturing the DAB+ digital-radio system and the DVB-T digital-TV system because these are based on established technology and established metaphors, and appeal more to “Joe Six-Pack” than the Internet-based technologies.

I have also noticed that it takes a long time for all equipment classes to benefit from a new technology. This is more so with DAB+ digital radio and, to some extent, Internet radio, where the mains-operated stationary “large sets” like hi-fi equipment and stereo systems are under-represented.

Processor Chipsets with built-in Graphics

Article:

BBC News – Intel to launch chipsets with built-in graphics

My comments

With Intel now showing interest in supplying a processor chip with an integrated graphics processor, the stakes are being raised when it comes to supplying single-chip CPU / GPU solutions.

Why supply a single-chip CPU/GPU solution

There is the obvious benefit in design size that it would yield. This would allow for more compact applications and, of course, reduced bill-of-materials costs, thus allowing for cheaper devices. Another key benefit is that the single-chip solution has reduced power needs, which is important for battery-operated devices like laptops, tablet computers and, especially, smartphones.

There is also the reality that most consumer-electronics devices like electronic picture frames, digital cameras, TVs / video peripherals and hi-fi equipment are being designed like general-purpose computers, and most of them will also benefit from these CPU/GPU chips. This has become evident with most of these devices offering network and Internet connectivity to augment their primary function or go beyond it. They will also gain the reduced bill-of-materials costs and reduced power demands that will become a market requirement for this class of device.

Similarly, an increasing number of office-equipment / computer-peripheral, home-appliance and “backbone” devices (HVAC / domestic hot water, building safety / security, etc) are becoming increasingly sophisticated and offering a plethora of functions. I have noticed this more so with the multifunction printers that I have reviewed on this site, where most of them use a colour bitmap LCD display and a D-toggle control as part of their user interfaces.

Therefore, manufacturers who design these devices can benefit from these single-chip CPU/graphics solutions to support these requirements through reduced supporting-power requirements and design costs. In the case of “backbone” devices, which typically require users to operate them from remotely-located user-interface panels, i.e. programmable thermostats or codepads, the host device doesn’t need to supply much power to support one or more of these panels, even if a panel is to provide access to extended functions.

The market situation

The Intel Sandy Bridge, just about to be launched at the time of publication, will provide improved graphics. It arrives in a market which AMD has just entered with their Zacate CPU / graphics chip, and which has been dominated by ARM, who have long been involved in the smartphone scene. An ARM design was in fact used as part of the Apple A4 chip in the iPhone 4 and iPad.

With three companies in the market, this could yield a highly-competitive environment with a run for high-quality quickly-drawn graphics, quick CPU response, power conservation / long battery runtime and small circuit size / reduced bill-of-materials. This “run for the best” could also see desirable functionality become available at prices that most people can afford.

The only limitation with this concept is that the single-chip design may shrink the market for discrete graphics chipsets and cards to only those people who value extreme-performance graphics.

Conclusion

The reduced size of these new single-chip CPU/GPU setups could replicate the success that came with the arrival of the 80486 processor and its integrated floating-point coprocessor. It could make for longer battery runtime in portable applications and lead to smaller, cooler-running computers for most applications.

Graphics chipsets: ATI is no more, AMD is now the brand

AMD jettisons ATI brand name, makes Radeon its own – The Tech Report

My comments

Some of us observed what happened when the ATI graphics-chipset name was taken over by AMD, and wondered what would happen to this name and the graphics-chipset scene.

Now that AMD has changed the brand of the ATI chipsets to its own, who knows what could happen next, especially when it comes to computer display solutions like the integrated-display setups in laptops, all-in-one PCs and low-profile desktop computers.

One way the situation could evolve would be for AMD to make motherboard or chipset solutions centred around an AMD CPU-and-GPU combination. This may be in a similar vein to the Intel Centrino solutions, which include an Intel Wi-Fi chipset alongside the Intel CPU.

The worst outcome for high-end graphics and gaming users would be for AMD to pull out of the plug-in display-card scene, thus weakening a competitive aftermarket for performance graphics. This is because the ATI brand has been positioned as the alternative to NVIDIA for aftermarket and OEM plug-in display cards pitched at the gaming, multimedia and performance-graphics scene.

Once brands disappear from a competitive market, others have to step in with competing products, or a nasty monopoly or cartel can start to exist.