Author: simonmackay

Raising the bar with MiFi router design

Articles

AT&T’s new MiFi Liberate is LTE-capable, ‘world’s first’ with touchscreen display – Engadget

AT&T Shoved a Touchscreen in Its Latest LTE MiFi Hotspot Because, Hey, Why Not! | Gizmodo

AT&T’s MiFi Liberate LTE is first touch-screen hot spot | CNet

From the horse’s mouth

AT&T – Press Release

My Comments

AT&T have released a new “MiFi” router for 4G wireless broadband networks in the form of the AT&T-Novatel MiFi Liberate. Here, this device is not your “father’s old station wagon”.

The device borrows its P-shaped design cues from the Apple Magic Trackpad and some door handles rather than from Microsoft’s newer input devices. Users can manage their connection using a colour LCD touchscreen rather than the typical Web user interface paired, in some cases, with a monochrome LCD or OLED status display.

It can connect up to 10 concurrent Wi-Fi devices to the 4G LTE wireless-broadband connection and can do this for 11 hours on its own battery. What also impresses me about this MiFi is that, like a few recent AT&T MiFis, it can share files off a microSD card, including sharing media to UPnP AV / DLNA devices like Internet radios. This function could be taken further if the MiFi could mount microSDXC cards of 64GB or 128GB capacity.
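As an aside on how the DLNA side of this works: a client finds media servers like this MiFi’s microSD share by multicasting an SSDP search on the local network. The sketch below is a minimal Python illustration of that discovery step, not anything from Novatel’s firmware; a real control point uses a full UPnP stack.

    # Minimal SSDP discovery sketch: find UPnP AV / DLNA media servers
    # (such as a MiFi sharing its microSD card) on the local network.
    import socket

    MSEARCH = "\r\n".join([
        "M-SEARCH * HTTP/1.1",
        "HOST: 239.255.255.250:1900",
        'MAN: "ssdp:discover"',
        "MX: 2",
        "ST: urn:schemas-upnp-org:device:MediaServer:1",
        "", "",
    ]).encode()

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(3)
    sock.sendto(MSEARCH, ("239.255.255.250", 1900))
    try:
        while True:
            data, addr = sock.recvfrom(2048)
            # print each responding server's address and status line
            print(addr[0], data.decode(errors="replace").splitlines()[0])
    except socket.timeout:
        pass  # no more responses within the listening window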

At the moment it is only available through AT&T, but I would like to see more carriers who run LTE-compliant 4G networks offer this device in their mobile-broadband router lineup. The firmware that each of these carriers supplies with the device should support all of its functions, including the file-sharing / DLNA functionality.

Creating a small data cloud–why couldn’t it be done

The small data-replication cloud

Netgear ReadyNAS

These network-attached storage devices could be part of a personal data-replication cloud

Whenever the “personal cloud” is talked of, we think of a network-attached storage device whose data you can gain access to while on the road. Here, the cloud aspect is fulfilled by a manufacturer-provided data centre that “discovers” your NAS using a form of dynamic DNS and creates a “data path” or VPN to your NAS. Users typically gain access to the files by logging in to an SSL-secured Web page or using a client-side file-manager program.
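As a rough idea of the “discovery” half of that arrangement, the sketch below shows a NAS periodically reporting itself to the manufacturer’s data centre, much like a dynamic-DNS update. The endpoint URL, device name and token are hypothetical placeholders, not any vendor’s real API.

    # Sketch of a NAS "phoning home" so the vendor's data centre can
    # build a data path back to it. All names here are made up.
    import json
    import time
    import urllib.request

    RELAY_URL = "https://cloud.example-nas-vendor.com/register"  # hypothetical

    def register(device_id, token):
        body = json.dumps({"device": device_id, "token": token}).encode()
        req = urllib.request.Request(
            RELAY_URL, data=body,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            print("relay acknowledged:", resp.status)

    while True:
        register("nas-01", "shared-secret-token")
        time.sleep(300)  # re-register every five minutes, dynamic-DNS style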

But another kind of small data cloud is often forgotten about in this class of device, except in the case of some Iomega devices. This is a handful of consumer or small-business NAS units located in geographically-different areas that are linked to each other via the Internet. Here, they could synchronise the same data, or a subset of that data, between each other.

This could extend to applications like replicating music and other media held on a NAS to a hard disk installed in a car whether the vehicle is at home, at the office or even while driving. The latter example may be where you purchase or place an order for a song or album via the wireless broadband infrastructure with the content ending up on your car’s media hard disk so it plays through its sound system. Then you find that it has been synchronised to your home’s NAS so you can play that album on your home theatre when you arrive at home.

What could it achieve?

An example of this need could be a small business backing up their business data to network-attached storage devices located at both their shop or office and the owner’s home, no matter where the data is created.

Similarly, one could copy music and video material held on the main NAS device out to a NAS at a holiday home. This can lead to speedy local access to the multimedia files, and you could add new multimedia files to the NAS at your holiday home and have the new material reflected back to the NAS at your main home.

Here, one could exploit a larger-capacity unit with better resiliency, like the business-grade NAS units pitched at small businesses, as a master data store while maintaining less-expensive single-disk or dual-disk consumer NAS units as local data stores at other locations. This setup may appeal to businesses where one location is seen as a primary “office” while the other location is seen as either a shopfront or secondary office.

This kind of setup could allow the creation of a NAS as a local “staging post” for newly-handled or regularly-worked data so as to provide a resilient setup that can survive a link failure. In some cases this could even allow for near-line operation for a business’s computing needs should the link to a cloud service fail.

User interface and software requirements

This concept can be built on the existing remote-access “personal cloud” infrastructure and software, so there is no need to “reinvent the wheel” for a multi-NAS cloud.

Similarly, users would use the NAS’s existing management Web page to specify the location of the remote NAS devices and the data sets they wish to replicate. This would include how each data set is to be replicated, such as keeping a mirror copy of the data set, contributing new and changed data to a designated master data set, or a combination of both. The data set could be a particular NAS volume or share, a folder or group of folders, or simply files of a particular kind.
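To make the idea concrete, here is a minimal sketch of how such replication policies might be modelled. The data-set names, NAS names and modes are illustrative, not any manufacturer’s actual schema.

    # Illustrative data model for a multi-NAS replication policy: each
    # data set names a master, the remote replicas and a replication mode.
    from dataclasses import dataclass

    @dataclass
    class ReplicationJob:
        data_set: str   # volume, share, folder or file kind, e.g. "Music"
        master: str     # NAS holding the authoritative copy
        replicas: list  # NAS units holding local copies
        mode: str       # "mirror", "contribute" or "two-way"

    jobs = [
        ReplicationJob("Music", "office-nas",
                       ["home-nas", "holiday-nas"], "two-way"),
        ReplicationJob("BusinessData", "office-nas", ["home-nas"], "mirror"),
    ]

    for job in jobs:
        print(f"{job.data_set}: {job.master} -> {job.replicas} ({job.mode})")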

The recently-finalised UPnP RemoteAccess v2 standard, along with the UPnP ContentSync standards, could simplify the setup of these data-synchronisation clouds. This could also make it easier to provide heterogeneous, mixed-vendor data clouds for this requirement.

But one main requirement that needs to be thought of is that the computer systems at both ends must not collapse or underperform when the link fails. There also has to be some form of scalability so that regular small-business servers can be party to the cloud, which may benefit the small-business owner who wants to integrate that hardware and their home-network hardware as part of a data-replication cloud.

Hardware requirements

A small data cloud needs to work with cost-effective hardware while allowing for system growth. This means that it could start with two or more consumer or SME NAS devices of a known software configuration, yet increase in capacity and resilience as the user adds or improves storage equipment at any location or rents storage services at a later stage.

This could mean starting with one single-disk NAS unit at each location, then purchasing a small-business NAS equipped with a multi-disk RAID setup for the business premises. The spare single-disk unit could then be shifted to another location as a staging-post disk or extra personal backup.

Conclusion

What NAS manufacturers need to think of is the idea of supporting easy-to-manage multi-device data-replication “personal clouds” using these devices. This is alongside the current marketing idea of the remote-access “personal cloud” offered for these devices.

Product Review–Toshiba AT300 10” Android tablet computer

Introduction

I am reviewing the Toshiba AT300, which is Toshiba’s current-model 10” Android consumer tablet. Unlike most other tablet computers, it is available in only one configuration: a 16GB unit that works only with Wi-Fi wireless networks.

Toshiba AT300 10" Android tablet computer

Price (reviewed configuration): RRP AUD$539
Processor: NVIDIA Tegra 3
RAM: 1GB
Screen: 10” widescreen (1280×800) LED-backlit LCD
User memory: 16GB; SDXC card reader
Operating environment: Android 4.0 Ice Cream Sandwich
Connectivity: Wi-Fi 802.11g/n; Bluetooth 3.0
USB: 1 x MicroUSB 2.0; USB 2.0 via proprietary docking plug
Audio: 3.5mm audio input-output (headset) jack; audio output via proprietary docking plug; digital audio via HDMI
Video: microHDMI; HDMI via proprietary docking plug
Performance index: Quadrant 3985 (just below ASUS Transformer Prime TF201)

The unit itself

Aesthetics and build quality

The Toshiba AT300 was well built for a good-quality tablet and had a metal-mesh backing. It was also well finished, even though the glossy touchscreen picked up fingerprints too easily.

As for temperature control, this unit was able to keep its cool thanks to the mesh backing. This may become important if we see Android apps that work the Tegra 3 ARM processor very hard.

Display

The Toshiba AT300’s display was very responsive to touchscreen input, showing results quickly and rendering the animations that Android Ice Cream Sandwich puts up without lag. Still pictures come through very crisply on this tablet, making it suitable as a photo viewer or digital photo frame for home or business.

For video playback, the display subsystem shone, with smooth playback even when fed video from an on-demand video service. As I have said before, though, the glossy display is still prone to being too reflective in broad daylight.

Audio

The Toshiba AT300 played some music files from my network-attached storage device as well as Internet-hosted audio like Internet radio, and this worked very smoothly. The sound quality was very good when I used the device with headphones but, as with a lot of slim devices, the audio quality doesn’t make it with the integrated speakers.

The Toshiba Media Player app that comes with this tablet is no crapware – it works properly with DLNA media servers as well as content hosted locally on the tablet. I tried this out with music and photos held on a WD NAS that uses the TwonkyMedia Server software as its DLNA media server.

Connectivity and Expandability

Toshiba AT300 10" Android tablet docking connector

The docking connector that the tablet uses for charging and data transfer

The Toshiba AT300 used a MicroUSB data port but also a proprietary docking connector for its power supply. This connector primarily works with a tablet dock that Toshiba supplies as an optional extra, which has an audio output, standard USB connectors and a standard HDMI connector.

But the MicroUSB connector also serves as a data / power port, so you can use a standard MicroUSB cable with a charger rather than worrying about whether you have the Toshiba cable with you. There is also a microHDMI connector that you can use with a suitable cable to connect to HDMI-equipped external displays.

Toshiba AT300 10" Android tablet side connections - SD card, microUSB, microHDMI and headphone jack

Side connections – SD card slot, microUSB port, microHDMI port and headphone/microphone jack

Like most Android tablets, the Toshiba has an integrated SD card reader which you can use to effectively expand the tablet’s memory. This is also handy if you want to use the tablet to review and edit images you have just taken with your digital camera.

Performance

The Toshiba AT300 10” Android tablet performed as expected for a good-quality Android tablet using the NVIDIA Tegra chipset.

The network performance was very smooth for most activities, including video streaming. The unit was also very sensitive with its Wi-Fi reception.

I ran the Quadrant Android performance test and found that this unit comes in at a benchmark of 3985, which places it just under the ASUS Transformer Prime TF201 hybrid tablet. This shows that it can keep up with its peers as far as computing and graphics performance go.

As far as battery life was concerned, the battery had 80% charge left after I watched one hour of on-demand video via the home network. It was also very frugal with the battery for most other activities.

Limitations and Points Of Improvement

If Toshiba were to create a tablet that is a viable iPad alternative, they could supply a variant with an integrated wireless broadband modem. On the other hand, this tablet could just be used with the home Wi-Fi network, a public-access Wi-Fi network, a “Mi-Fi” router or an Android phone that supports Wi-Fi tethering.

Toshiba AT300 10" Android tablet

As well, I would like to see this tablet put in the queue for the Android 4.1 Jelly Bean update in order to satisfy newer expectations of this platform.

Conclusion

This may be a hard call, but I would recommend the Toshiba AT300 as a highly-capable 10” consumer-grade alternative to the Apple iPad. This is more so if you value a tablet that is just about up to date on the operating environment and expect to use it for multimedia, games, email-reading and Web-surfing while in bed or on the couch. Business users could value it as part of digital visual merchandising efforts, as a large-screen reference book or as a quick-view information terminal.

Improving the way mobile operating environments are managed

Lately, Apple pulled the pin on Google’s YouTube app by not including it with the iOS 6 distribution. As well, they worked on their own maps platform for this distribution rather than continuing to use Google’s mapping platform. In some cases, there have been functionality or security weaknesses in the Android or iOS operating platforms that required Apple or Google to furnish a new “point-level” distribution of the operating system. This also applies when they want to bring the operating platform up to a new requirement.

These updates typically require the host device to be restarted as part of the update process. As well, downloading the complete package to fix one problem could put the device at risk of being put out of action if the connection failed mid-download. In some cases, the mobile operating-platform vendor puts off rolling out a needed patch until they can add enough extra key functionality to make a major update worthwhile.

Compare this to how an app for these platforms is kept up to date. Once you download an app from your platform’s app store, it is always checked for the latest version. Once a new version of the app is available, the software is placed on the “Updates” list so you can start a bulk update or, depending on the platform and app store, you could set up automatic app updates so that the software is updated in the background.
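A toy sketch of that update-check loop, with made-up version data standing in for a real store catalogue:

    # Compare installed app versions against the store's latest versions
    # and queue anything stale for the "Updates" list. Data is invented.
    installed = {"music-player": "2.1.0", "maps": "5.0.2", "mail": "4.3.1"}
    store_latest = {"music-player": "2.2.0", "maps": "5.0.2", "mail": "4.4.0"}

    def as_tuple(version):
        # "2.1.0" -> (2, 1, 0) so versions compare numerically
        return tuple(int(part) for part in version.split("."))

    updates = [app for app, ver in installed.items()
               if as_tuple(store_latest[app]) > as_tuple(ver)]
    print("Updates list:", updates)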

Personally, I would like to see baseline functions of mobile computing devices made available as separately-updatable apps. This practice, which Google has followed in keeping YouTube and Google Maps up to date on Android, allows functions like the music / media player and email / messaging apps to be kept up to date in the same manner as the apps you download from the platform’s app store.

Here, the platform developer could keep a mapping program up to date and behaving properly, or add functionality to and improve the quality of the music player, without having to wait for the next operating-system update. The user then has the mapping program and music player kept up to date and working properly simply by calling in to the app store and checking the update panel.

For the developer, teams can work on maintaining these apps and rolling out updates as they are signed off and ready, while another team hones the baseline operating system through its lifecycle. In some cases, it could allow the developer to do things like prepare peripheral-interface code for new peripheral-device types and have it delivered as needed to the devices.

In this case, if Apple used this practice for keeping their Maps function up to date, they could be sure that their fanbois can update the function without having to download a “point” version of iOS 6 to their devices. Similarly, Google could promptly “fix up” Android function apps that are misbehaving and give their users a stable Android device.

Wi-Fi login problems with iOS 6 devices

Article

What went wrong with iOS 6 Wi-Fi | ZDNet

My Comments

You may have upgraded your iPhone or iPad to iOS 6. But after your Apple device shuts down and restarts as part of applying the update, you may find that you are not on your home or business Wi-Fi network, even though you downloaded the update through that same network.

The problem is not necessarily a flawed network configuration, but part of the iOS Wi-Fi automatic troubleshooting routine. Here, the software attempts to load a “Success” stub page from the Apple servers. This logic is intended to cause the iOS device to present the login or “assent” page that is part of a public-access or guest-access Wi-Fi network’s user experience. The stub page was deleted by a former Apple employee before he left, without him realising it was part of the iOS 6 troubleshooting logic.

The computer press have realised that this logic is flawed because it can place the servers at risk of denial-of-service attacks, thus crippling iOS 6 devices. Similarly, someone could use a “man-in-the-middle” or “evil-twin” attack to point the device to a malevolent site. If “show particular Webpage” logic is to be implemented in a network troubleshooting routine, it could work with a list of commonly-available Websites, like Web portals or Web resource pages, which the device chooses from at random.

It could be a chance for software developers to create network-test logic that relies less on loading one particular Web site as proof of function. This could be through simplified randomised test routines that work with locations chosen at random from a list of commonly-known highly-available Internet locations. This could be augmented by government standards bodies and similar organisations, like NIST or BSI, adding basic-HTML “Internet Success” pages to their Websites and making the URLs available to the IT industry.
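To show what such a routine could look like, here is a minimal Python sketch. The test URLs are hypothetical stand-ins for a list of well-known highly-available sites; a production version would also handle captive-portal redirects by failing over to a login-page flow.

    # Randomised connectivity check: pick one test URL at random so no
    # single server becomes a target or a single point of failure.
    import random
    import urllib.request

    TEST_URLS = [  # placeholders for well-known, highly-available pages
        "http://www.example.com/",
        "http://www.example.net/",
        "http://www.example.org/",
    ]

    def internet_reachable():
        url = random.choice(TEST_URLS)
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return 200 <= resp.status < 400
        except OSError:
            return False  # covers DNS failures, timeouts and refusals

    print("online" if internet_reachable() else "offline")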

Sometimes an NTP or similar time-fetch routine, which obtains the time from one of many atomic-clock time servers to synchronise a device’s internal clock, can double as a simplified Internet-functionality test. If the time server supports HTTP access, where the UTC time is obtained as an HTML or text string, the fetch could be performed over HTTP so as to test Web-access functionality as well.
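For example, a simplified SNTP fetch like the Python sketch below, querying the public pool.ntp.org servers, confirms basic Internet reachability as a side effect of setting the clock:

    # Minimal SNTP client: a sane timestamp back from a pool server
    # doubles as proof of Internet reachability.
    import socket
    import struct
    import time

    NTP_EPOCH_OFFSET = 2208988800  # seconds between the 1900 and 1970 epochs

    def fetch_ntp_time(server="pool.ntp.org"):
        packet = b"\x1b" + 47 * b"\0"  # LI=0, version 3, client mode
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.settimeout(5)
            s.sendto(packet, (server, 123))
            data, _ = s.recvfrom(48)
        # transmit timestamp: seconds since 1900 at bytes 40-43
        transmit = struct.unpack("!I", data[40:44])[0]
        return transmit - NTP_EPOCH_OFFSET

    print(time.ctime(fetch_ntp_time()))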

By not relying on one particular server as a proof-of-functionality test for Internet access and integrating a “login-page load” failover routine for public-access networks, we can achieve a safe and sure network setup experience.

Other manufacturers can yield more cool devices now

Click to view: Samsung’s latest video / TV ad for the Galaxy S 3

My Comments

Just lately, as Apple were launching the iPhone 5 and the fanbois were lining up outside the Apple Stores and mobile-carrier outlets to be the first to get the phone, Samsung have been running a video campaign about how much more advanced their phones are compared to the Apple product.

Previously, I touched on Android’s competitive-environment abilities, such as the use of alternative browsers or the ability to shift content to the phone using the computer’s file system. This has also underscored the paths to innovation that we are seeing in devices that work on this platform. The commercial that I am referring to, along with other Samsung TV commercials for the Galaxy S3, even emphasised near-field communication as a content-transfer technology rather than just an authentication technology, thanks to Android Beam.

Similarly, the latest crop of Windows-based computers that have appeared over the last few years show that this operating environment is still a breeding ground for innovation. One key feature we will be seeing more of is the touchscreen, which on most of these computers will work alongside a supplied or standards-compliant optional keyboard. I saw a taste of things to come when I reviewed the Sony VAIO J Series all-in-one desktop. This was augmented when I heard of a Toshiba Ultrabook that was to come with NFC, which could support file transfer in the Android Beam manner.

This shows that there are other companies and IT operating platforms out there that can make and improve the technology that maintains the “cool factor” in its use, rather than just one company and its platforms. It is the sign of healthy competition when this kind of innovation takes place.

Symantec Symposium 2012–My observations from this event

Introduction

Yesterday, I attended the Symantec Symposium 2012 conference, which was a chance for Symantec to demonstrate the computing technologies it was involved in developing and selling that are becoming important to big-business computing.

Relevance to this site’s readership

Most solutions exhibited at this conference are pitched at big business with a fleet of 200 or more computers. But there were resellers and IT contractors at this event who buy these large-quantity solutions to sell on to small-business sites that typically have ten to 100 computers.

I even raised an issue in one of the breakout sessions about how manageability would be assured in a franchised business model such as most fast-food or service-industry chains. Here, this goal could be achieved through the use of thin-client computers or pre-configured equipment bought or leased through the franchisor.

As well, the issues and solution types shown at this Symposium tend to cross over between small sites and the “big end of town”, just as a lot of office technology, including the telephone and the fax machine, has done.

Key issues in focus were achieving a secure computing environment, supporting the BYOD device-management model and the trend towards cloud computing for systems-support tasks.

Secure computing

As part of the keynote speech, a guest speaker from the Australian Federal Police touched on the realities of cybercrime and how it affects the whole computing ecosystem. As was raised in my previous interview with Alastair MacGibbon and Brahman Thiyagalingham about secure computing in the cloud-computing environment, cybercrime is now shifting towards organised crime, like the East-European mafia, alongside nation states engaging in espionage or sabotage. He also raised that it’s not just regular computers that are at risk: mobile devices (smartphones and tablets), point-of-sale equipment like EFTPOS terminals and other dedicated-purpose computing devices are also at risk. He emphasised issues like keeping regular and other computer systems up to date with the latest patches for the operating environment and application software.

This encompassed the availability of a cloud-driven email and Website verification system that implements a proxy-server setup. This is designed for the real world of business computing, where computer equipment is likely to be taken out of the office and used with the home network or public networks like hotel or café hotspots. It moves away from the classic site-based corporate firewall and VPN arrangement to provide controlled Internet access for roaming computers. It also addresses real Internet-usage needs, like operating a company’s Social-Web presence and personal Internet services like Internet banking or home monitoring, so as to cater for the ever-increasing workday. Yet it still allows an organisation to control access to resources to prevent cyberslacking or the viewing of inappropriate material.

Another technique that I observed is the ability to facilitate two-factor authentication for business resources or customer-facing Websites. This is where the username and password are further protected by something else, in a similar way to how your bank account is protected at the ATM by your card and your PIN. It was initially achieved through hardware tokens (key fobs or card-like devices that show a random number which you type in at your VPN login) or through a smart card or SIM that required a hardware reader. Instead, Symantec developed a software token that works with most desktop and mobile operating systems and generates this random code. It can even exploit integrated hardware security setups to make this more robust, such as what is part of the Intel Ivy Bridge chipset in second-generation Ultrabooks.
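The maths behind such a software token is standardised as the time-based one-time password, TOTP (RFC 6238). The Python sketch below shows the idea; Symantec’s actual implementation is their own.

    # Time-based one-time password (RFC 6238): a 6-digit code derived
    # from a shared secret and the current 30-second time step.
    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32, step=30, digits=6):
        key = base64.b32decode(secret_b32, casefold=True)
        counter = struct.pack(">Q", int(time.time()) // step)
        mac = hmac.new(key, counter, hashlib.sha1).digest()
        offset = mac[-1] & 0x0F  # dynamic truncation per the RFC
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # example shared secret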

Advanced machine learning has also played a stronger part in two more secure-computing solutions. For example, there is a risk-assessment setup being made available where the environment from which a connection or transaction is made can be assessed against what is normal for the user’s operating environment and practices. It is similar to the fraud-detection mechanisms that most payment-card companies implement, where they detect and alert customers to abnormal transactions that are about to occur, like ANZ Falcon. This can trigger verification requirements for the connection or transaction, like the requirement to enter a one-time password from a software token, or an out-of-band voice or SMS confirmation sequence.
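A toy illustration of this kind of risk assessment, with made-up features and thresholds rather than Symantec’s real models:

    # Score a login attempt against what is "normal" for the user and
    # trigger step-up authentication when the context looks unusual.
    def risk_score(attempt, profile):
        score = 0
        if attempt["country"] != profile["usual_country"]:
            score += 2  # logging in from an unusual country
        if attempt["device_id"] not in profile["known_devices"]:
            score += 2  # previously unseen device
        start, end = profile["usual_hours"]
        if not start <= attempt["hour"] <= end:
            score += 1  # outside the user's usual working hours
        return score

    profile = {"usual_country": "AU", "known_devices": {"laptop-1"},
               "usual_hours": (7, 22)}
    attempt = {"country": "RO", "device_id": "unknown-pc", "hour": 3}

    if risk_score(attempt, profile) >= 3:
        print("challenge with one-time password or out-of-band confirmation")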

The other area where advanced machine learning plays a role in secure computing is data-loss prevention. As we hear of information being leaked to the press or, at worst, laptops, mobile computing devices and removable storage full of confidential information disappearing and falling into the wrong hands, this field of information security is becoming more important across the board. Here, Symantec demonstrated the ability to “fingerprint” confidential data like payment-card information and apply handling rules to that information. This includes on-the-fly encryption of the data, establishment of secure-access Web portals, and sandboxing of the data. The rules can be applied at different levels and affect the different ways the data is transferred between computers, such as shared folders, public-hosted storage services (Dropbox, Evernote, GMail, etc), email (both client-based and Webmail) and removable media (USB memory keys, optical discs). The demonstration focused more on payment-card numbers, but I raised questions regarding information like customer/patient/guest lists or similar reports, and this system supports the ability to create the necessary fingerprint of the information to the desired requirements.
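The simplest form of such a “fingerprint” for payment-card data can be sketched as a pattern match plus a Luhn checksum; real DLP engines go much further than this Python example.

    # Spot candidate payment-card numbers in outbound text and validate
    # them with the Luhn checksum before applying handling rules.
    import re

    def luhn_ok(number):
        digits = [int(d) for d in number][::-1]
        total = sum(digits[0::2])
        total += sum(sum(divmod(d * 2, 10)) for d in digits[1::2])
        return total % 10 == 0

    def find_card_numbers(text):
        candidates = re.findall(r"\b(?:\d[ -]?){13,16}\b", text)
        cleaned = [re.sub(r"[ -]", "", c) for c in candidates]
        return [c for c in cleaned if luhn_ok(c)]

    print(find_card_numbers("Order ref 4111 1111 1111 1111 must not leave the LAN"))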

Cloud-focused computing support

The abovementioned secure-computing application makes use of cloud-computing technology, relying on many data centres scattered around the world.

And the Norton 360 online backup solution that is typically packaged with some newer laptops is the basis for cloud-driven data backup. This could support endpoint backup as well as backup for servers, virtual machines and the like.

Mobile computing and BYOD

Symantec have approached the mobile-computing and BYOD issues along two different paths. They have catered for fully-managed devices, which may appeal to businesses running fleets of devices they own or using tablets as interactive customer displays. But they have also allowed for “object-specific” management, where particular objects (apps, files, etc) can be managed or run to particular policies.

This includes the ability to provide a corporate app store with in-house apps, Web links or commercial apps, so users know what to “pick up” on their devices. These apps are then set up to run to the policies that govern how each user runs them, including control of data transfer. This setup may also please the big businesses who provide services that small businesses deliver as an agent or reseller, such as Interflora. Here, they could run a business-specific app store with line-of-business apps, like a flower-delivery-list app that runs on a smartphone. There is also the ability to remotely vary and revoke permissions concerning the apps, which could come in handy when a device’s owner walks out of the organisation.

Conclusion

What this conference showed at least is the direction that business computing is taking, and it was also a chance to see core trends affecting this class of computing whether you are at the “big end of town” or not.

Is this what the new super slim PlayStation 3 is all about

Articles

Sony unveils super slim PlayStation 3 | Crave – CNET

Sony PlayStation 3 2012 up close and personal eyes on | Engadget

From the horse’s mouth

US Press Release

European Press Release

My Comments

The press have been abuzz with the news about Sony’s latest PlayStation 3 games console. This one is a major redesign to take advantage of the smaller space that newer consolidated electronics can occupy. This has yielded a smaller console that is significantly lighter and doesn’t use as much power as the existing units.

One main difference is that it has a top-loading Blu-Ray drive for your games and movies. This uses a sliding lid in a similar vein to some CD players like the B&O Beocenter 9000 series music systems rather than the hinged lid that, in my opinion, is asking for problems. 

There are two main design variants: one with a 500GB hard disk, and a cheaper variant with 12GB of flash memory and the ability to add an optional 250GB hard disk. The American market would have the console come with the 250GB hard disk in the box. The cheaper version may suit occasional gamers and those of us who use the PS3 more as a network media client than as a full-on games console.

Of course, there will be access to the PlayStation Network and the local video-on-demand services that have allowed the PS3 to earn its keep as a network multimedia terminal rather than just a games console for teenagers and young men. It will also have the same performance expectations as the current-generation PS3.

But could these variants be a way to bring the PlayStation experience to more households, or make the multi-player multi-machine gaming this console offers more feasible?

A CCTV hacking incident could be a lesson in system lifecycle issues

Article

How A Prison Had Its CCTV Hacked | Lifehacker Australia

My Comments

In this article, it was found that a prison’s video-surveillance system was compromised. The security team checked the network and found that it wasn’t the institution’s main back-office network that was compromised, but a Windows Server 2003 machine. This box had to be kept at a particular operating environment so it could work properly with particular surveillance cameras.

The reality with “business-durable” hardware and systems

Here, the problem centres on “business-durable” hardware like video-surveillance cameras, point-of-sale receipt printers and similar hardware that is expected to have a very long lifespan, usually in the order of five to ten years. But computer software works to a different reality, where it evolves every year. In most cases, this includes the frequent delivery of software patches to improve performance, remedy security problems or keep the system compliant with new operating requirements.

Newer software environments and unsupported hardware

The main problem is that if a computer is running a newer operating environment, some peripherals will work with reduced functionality or won’t work at all. This can come about very easily if a manufacturer has declared “end of life” on the device and won’t update the firmware or driver set for it. It also applies if a manufacturer has abandoned their product base in one or more of their markets and left their customers high and dry.

Requirement to “freeze” software environments

Then the sites that depend on these devices end up running servers and other computer equipment frozen at a particular operating environment in order to assure compatibility and stability for the system. This can compromise the security of the system because the equipment cannot run newly-patched software that answers the latest threats. Similarly, the system cannot perform at its best or support the installation of new hardware due to the use of “old code”.

In some cases, this can force contractors to deploy the chosen updates using removable media, which can be a security risk in itself.

Design and lifecycle issues

Use standards as much as possible

One way to tackle this issue is to support standard hardware-software interfaces through the device’s and the software’s lifecycle. Examples include UPnP Device Control Protocols, USB Device Classes, Bluetooth Profiles and the like. It also includes industry-specific standards like ONVIF for video surveillance and DLNA for audio-video reproduction.

If a standard is ratified during the device’s lifespan, I would suggest that it be implemented. Similarly, the operating environment and application software would also have to support the core functionality, such as through device-class drivers.

Provide a field-updatable software ecosystem

Similarly, a device would have to be designed to support field-updatable software, and any software-update program would have to cover the expected lifespan of these devices. If a manufacturer wanted to declare “end of life” on a device, they could make sure that the last major update is one that enshrines all the industry-specific standards and device classes, then encompass the device in a “software roll-up” program that covers compliance, safety and security issues only.

As well, a “last driver update” could then be sent to operating-system vendors like Microsoft so that the device can work with newer iterations of the operating systems that they release. This is more so if the operating-system vendor is responsible for curating driver sets and other software for their customers.

The device firmware has to work in such a way to permit newer software to run on servers and workstations without impairing the device’s functionality.

As well, the field-updating infrastructure should work in a similar way to how regular and mobile computer setups are updated in most cases. This is where the software is sourced from the developers or manufacturers via the Internet, whether this involves a staging server or not. It should also include secure verification of the software, such as code-signing and server verification where applicable.
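The code-signing step could look like the sketch below, which verifies a downloaded image against a detached RSA signature using the third-party Python “cryptography” package. The file names are placeholders, and a real updater would also pin the key and check version metadata.

    # Verify a firmware image against a vendor signature before staging it.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    with open("vendor_pub.pem", "rb") as f:
        pub_key = serialization.load_pem_public_key(f.read())
    with open("firmware.bin", "rb") as f:
        image = f.read()
    with open("firmware.sig", "rb") as f:
        signature = f.read()

    try:
        pub_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        print("signature good: safe to stage the update")
    except InvalidSignature:
        print("signature bad: reject the image")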

Conclusion

What this hacking situation revealed is that manufacturers and software designers need to look seriously at the “business-durable” product classes and pay better attention to having them work to current expectations. This then allows us to keep the computer systems associated with them up to date and to current security expectations.

WiFi Direct–Another way to share files between Android devices

Article

WiFi Shoot: Sharing files over Wi-Fi Direct | Android Authority

My Comments

The Android mobile phone platform has provided many options for “throwing” files between devices.

Firstly, there was the Bluetooth “Object Push” profile, where you can share material between devices that implement this profile and are set up for it. This includes Android and Symbian-based mobile phones and some devices like a few Bluetooth printers and printing kiosks.

Then came the “Bump” ecosystem, which allowed you to transfer files via the Internet after you “bumped” the phones against each other. This implemented a “recognised bump” pattern to register users with the system.

Next, the Android platform integrated Near-Field Communication as part of the Ice Cream Sandwich iteration and implemented file transfer as a specific function called “Android Beam”. This was exemplified in the TV advertising that Samsung did for the popular Galaxy S II phone and Samsung’s “super variant” of that function, where two people touched their phones together.

Now that most newer Android devices come with Wi-Fi Direct, a new app has been launched to enable one to “throw” files between these devices using this method. The app, called WiFi Shoot and currently in beta, exposes itself as a “share” option for images and videos and can transmit or receive these files.

There are plans to open it up to a larger array of content types once the bugs are ironed out. Similarly, it could support “throwing” files to and from non-Android devices that use Wi-Fi Direct as a file-transfer or object-transfer method, such as printers that print photos or Windows PCs that have the appropriate software.
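WiFi Shoot’s wire protocol isn’t public, but once a Wi-Fi Direct link is up the devices simply share an IP network, so a transfer can be as plain as the hypothetical socket push sketched below. The address and file name are placeholders; 192.168.49.1 is merely the typical Android Wi-Fi Direct group-owner address.

    # Push a file over an established peer-to-peer IP link. Illustrative
    # only: no handshake, metadata or integrity check as a real app needs.
    import socket

    PEER = ("192.168.49.1", 8988)  # placeholder peer address and port

    def throw_file(path):
        with socket.create_connection(PEER, timeout=10) as conn, \
             open(path, "rb") as f:
            while chunk := f.read(65536):
                conn.sendall(chunk)  # stream the file in 64KB chunks

    throw_file("photo.jpg")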

I see this as another way that the Android platform is working towards a level and competitive playing field for activities involving mobile computing.