Tag: cloud computing

Telephone Interview–UPnP Forum (Wouter van der Beek)

Introduction

UPnP Forum logo courtesy of UPnP Forum

I have had the chance to interview Wouter van der Beek, Vice President of the UPnP Forum, which defines the standards and specifications associated with UPnP technology. This interview is primarily about the direction that the UPnP Forum and this technology are heading in the face of current personal-computing trends like cloud computing and the Internet Of Things.

What is UPnP

This is a collection of standards for interlinking network-connected devices at an application level. It facilitates discovery of the devices by other devices on that network, along with the ability to benefit from what each device offers. The idea was seeded 15 years ago when the home network was becoming commonplace thanks to affordable but powerful computers and affordable broadband Internet services, but there needed to be foolproof ways for most people to set up, manage and benefit from these networks without requiring extensive computer skills.

Freebox Révolution - courtesy Iliad.fr

Freebox Révolution – an example of equipment designed with UPnP in mind

This was facilitated initially with the Internet Gateway Device specification, which simplified management of Internet access for devices on a home network. If you use a UPnP-capable router and have its UPnP IGD function enabled, you don’t have to meddle with different settings to get an online game or Skype to work via the Internet.
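
To illustrate what the router does behind the scenes, here is a rough sketch of the SOAP request a program sends to a UPnP IGD router to open a port automatically. The port numbers and description are made-up examples; the control URL and exact service version vary per router.

```python
# Sketch: the SOAP body for the standard AddPortMapping action that a
# UPnP IGD-capable router accepts, so apps can open ports without the
# user touching router settings. Values below are illustrative.

SERVICE = "urn:schemas-upnp-org:service:WANIPConnection:1"

def add_port_mapping_request(external_port, internal_port, internal_ip,
                             protocol="TCP", description="Example app"):
    """Build the SOAP envelope for the AddPortMapping action."""
    return f"""<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:AddPortMapping xmlns:u="{SERVICE}">
      <NewRemoteHost></NewRemoteHost>
      <NewExternalPort>{external_port}</NewExternalPort>
      <NewProtocol>{protocol}</NewProtocol>
      <NewInternalPort>{internal_port}</NewInternalPort>
      <NewInternalClient>{internal_ip}</NewInternalClient>
      <NewEnabled>1</NewEnabled>
      <NewPortMappingDescription>{description}</NewPortMappingDescription>
      <NewLeaseDuration>0</NewLeaseDuration>
    </u:AddPortMapping>
  </s:Body>
</s:Envelope>"""

body = add_port_mapping_request(6881, 6881, "192.168.1.50")
```

In practice this body would be POSTed to the router’s control URL, which the app discovers via SSDP; the point is that the user never sees any of it.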

DLNA collections listed as sources on the TV

DLNA content collections listed as sources on a Samsung Smart TV

It has also been facilitated with DLNA-capable media devices which use the UPnP AV MediaServer or MediaRenderer device control protocols. This is where you could use a smart TV or a Blu-Ray player to discover photos or videos kept on your computer or network-attached storage device, or “push” music from a Windows computer, NAS or Android smartphone to a Wi-Fi-enabled wireless speaker. It has reached the point where UPnP and DLNA have become an expectation for anything that uses the home network to provide or play / show multimedia content, in a similar way that Dolby noise reduction was an expected feature for good-quality cassette players.
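
The discovery step that lets a smart TV list those content collections is SSDP, a simple multicast search. A minimal sketch of the search message a control point sends, assuming the standard MediaServer device type:

```python
import socket  # only needed if you actually send the message

# Sketch: the SSDP M-SEARCH message a UPnP control point multicasts to
# discover MediaServer devices (what a smart TV does when it lists
# DLNA sources). Sending it requires a live network, so the send is
# shown as a comment.

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900
SEARCH_TARGET = "urn:schemas-upnp-org:device:MediaServer:1"

def build_msearch(search_target, mx=2):
    """Compose the HTTP-over-UDP discovery request defined by SSDP."""
    return ("M-SEARCH * HTTP/1.1\r\n"
            f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
            'MAN: "ssdp:discover"\r\n'
            f"MX: {mx}\r\n"
            f"ST: {search_target}\r\n\r\n")

message = build_msearch(SEARCH_TARGET)
# To send: socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(
#     message.encode(), (SSDP_ADDR, SSDP_PORT))
```

Each media server on the network replies with the location of its device description, and the TV takes it from there — no configuration required from the user.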

The foolproof way of setting up and using UPnP-capable network equipment has, for that matter, had me look for devices that support these specifications when I am involved in buying or specifying network equipment.

New Directions for UPnP

UPnP’s New Zones of Relevance

Previously, the Universal Plug And Play technology was confined to the home network which encompassed computers and related devices that existed in one’s home and connected to a router which served as the network’s Internet “edge”.

Thanks to trends like highly-mobile devices such as smartphones, tablets and laptops; online services and cloud computing; and the increasing role of social media in our lives, the UPnP technology and, to some extent, the home network have broadened their zone of relevance. This encompasses the following zones of relevance:

  • Personal, which would encompass the devices we take with us or have on ourselves like smartphones, tablets, smartwatches and fitness bands
  • Home, which would encompass what we have at home such as computers, routers, NAS units, home AV, appliances and the like, even encompassing devices associated with comfort, energy management and security
  • Car, which encompasses the technology associated with or integrated in our vehicles, like infotainment systems or powertrain-management systems
  • Workplace / Business, which encompasses the computing and communications technologies used in the office and would also encompass devices associated with comfort, energy management and security
  • Industry, which would encompass systems that provide the backbone of modern life.

It also encompasses the Internet Of Things, where devices can serve as sensors or actuators for other devices and services in a universal manner.

An example of this was the establishment of Device Control Protocols like the Telephony DCPs with a view to addressing the many zones of relevance and increasing the UPnP ecosystem’s relevance to more users.

Cloud and Remote Access now part of UPnP

One major change is to integrate cloud computing, remote access and online services in to the UPnP ecosystem. Previously, a UPnP ecosystem encompassed just one network, typically your home network, and required each endpoint to be on that same network.

Different zones of relevance

UPnP is now about online services and remote access

Now situations have arisen such as the desire to gain access to your content held at your home from your friend’s home or a hotel, or to exhibit pictures held on Facebook or Dropbox on your smart TV at home. Similarly, even in the same home, not all devices are connected to the same home network, such as portable devices drifting in to Wi-Fi “dark spots” where there is very little reception, or devices that are connected to a “guest network” on your router.

Cloud and remote access have been written as an annex to the UPnP Device Architecture, and support for them is a requirement for UPnP+ certification. This is to factor in the ability for a UPnP “realm” to transcend multiple logical networks.

One of the key additions was to integrate XMPP in to UPnP as part of the Cloud initiative in order to provide a level, open playing field for cloud-driven applications. This will also provide for secure transport of the necessary data. It is centred around the concept of creating virtual rooms which UPnP devices and services are invited in to as needed, with these rooms spanning different logical networks or IP subnets.
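
In XMPP terms, those “virtual rooms” map naturally onto multi-user chat rooms, with devices invited into a room as needed. As a hedged illustration (the room and device addresses below are hypothetical, not drawn from the UPnP Cloud annex), here is what a mediated room invitation for a device could look like:

```python
# Sketch: inviting a device into an XMPP multi-user chat room, in the
# spirit of the UPnP Cloud "virtual room" concept. The JIDs and room
# name are invented for illustration.

def invite_device(room_jid, device_jid, reason="Join the media realm"):
    """Compose a mediated MUC invitation stanza for a device."""
    return (f'<message to="{room_jid}">'
            '<x xmlns="http://jabber.org/protocol/muc#user">'
            f'<invite to="{device_jid}">'
            f"<reason>{reason}</reason>"
            "</invite></x></message>")

stanza = invite_device("media-realm@conference.example.net",
                       "tv@home.example.net/living-room")
```

Because the room lives on an XMPP server, the TV and the inviting control point can sit on entirely different IP subnets — which is exactly the property the Cloud annex is after.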

Making UPnP “safe for business”

Empire State Building picture courtesy of araswami and New York Pictures

UPnP – to be safe for business

You may also wonder whether there are steps to make UPnP technologies “safe for business”. Some steps have taken place to assure this goal, because the different zones of relevance like workplace / business and industry place a key emphasis on security.

One of these is the DeviceProtection DCP which allows the creation of a “network of trust” amongst UPnP Devices and Control Points. This will be mandatory as part of UPnP+ certification whereas it was simply an optional feature for UPnP networks. Other DCPs that will become mandatory for UPnP+ certification include the “management” DCPs: DeviceManagement, ConfigurationManagement and SoftwareManagement which look after how a device is set up and updated.

Of course, these are considered “retrofit” solutions which assure secure links and setups, and any security concept is primarily about “buying time” against hackers.

As well, DLNA has integrated various content-protection measures in to the VIDIPATH specification, which encompasses UPnP AV standards to assure secure delivery of premium content like Hollywood films and big-league sports.

The Internet Of Things

Rethinking Device Control Protocols

Previously the UPnP Forum placed emphasis on the Device Control Protocol as being the way to describe a UPnP device and what it can do. This ended up with each of these protocols taking a long time to develop, whether at the initial stages or as they were being revised.

Examples of these were the UPnP Internet Gateway Device, which described the functions of a modem or router and was shaped by telcos and network-equipment vendors; and the AV Device, which described media storage, playback and control and was shaped by most of the main consumer-electronics names.

As well as the long time it took to develop a Device Control Protocol, there was the risk of focusing these protocols on an application-specific vertical plane with functionality being duplicated amongst multiple protocols.

The new direction is enshrined in the “Internet Of Things Management And Control” DCP, which is focused around the particular tasks a sensor or actuator device can do. This also enshrines language and data models that can be used to define applications, and it allows any sensor or actuator which does the same thing to be described the same way.

There were two examples we talked of: a temperature sensor, and a lamp used as part of a home-automation or building-automation setup. A temperature sensor measures temperature and could be part of a room thermostat, a weather station or a fridge, but it does the same job by measuring and reporting the current temperature. A lamp is turned on and off or has its brightness increased or decreased, but this could work as part of a “smart home” setup or as part of a building-automation setup for an office building or an apartment block.

As well, the data models can be evolved for particular applications and there is a short turnaround time required to set a data model in stone. This could allow one to define an application-level device class based on a collection of sensors and the kind of measurements to be used.
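
A minimal sketch of that task-focused idea: any temperature sensor is described the same way regardless of where it lives, so a control point can consume readings uniformly. The field names here are illustrative assumptions, not from a published UPnP data model.

```python
# Sketch: a task-focused description of a temperature sensor, where a
# fridge sensor and a weather-station sensor answer the same query
# identically. Field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class TemperatureSensor:
    sensor_id: str
    location: str          # e.g. "living-room", "fridge", "outdoors"
    unit: str = "celsius"
    reading: float = 0.0

    def report(self):
        """Report in a uniform shape any control point can consume."""
        return {"id": self.sensor_id, "type": "temperature",
                "unit": self.unit, "value": self.reading}

fridge = TemperatureSensor("s1", "fridge", reading=4.0)
weather = TemperatureSensor("s2", "outdoors", reading=21.5)
# Both devices expose the same task in the same shape:
readings = [s.report() for s in (fridge, weather)]
```

An application-level device class — a weather station, say — then becomes a named collection of such sensors rather than a monolithic protocol of its own.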

Network Bridges

Another reality that UPnP faces is devices based on other standards. This encompasses sensor and similar devices that work on networks like Zigbee, Z-Wave and Bluetooth that don’t work on an IP/Ethernet-based structure, or Ethernet-based technology that doesn’t implement IP as a way to liaise with devices at higher levels. In a lot of cases, these networks have come about because battery-operated sensor and similar devices are expected to run for six months or more on a single set of commodity “dry-cell” batteries like AA-size Duracells or CR2032 “button-size” cells.

The UPnP Internet Of Things effort also includes Device Control Protocols to address Network Bridges so they can work in a UPnP or UPnP+ ecosystem. This should solve a very common problem with “smart-home” devices such as smart locks and central-heating controls, where the Internet-connectivity bridges for these devices are supplied by the manufacturer and designed to work only with that manufacturer’s devices.

Achieving vendor universality

The UPnP Forum has made big strides in achieving vendor universality, but it still relied on logo programs like DLNA or Designed For Windows, or on potential buyers poring through specifications, to achieve this goal when buying or specifying devices. Some competing ecosystems typically required one physical device, such as a wireless speaker, to have physical and logical support for each of them, hence the row of logos that adorn the top edge of a device.

The Forum would like to use concepts like Network Bridges to provide support across different logical ecosystems and have UPnP as the “glue” between them.

Conclusion

By stripping the UPnP platform down to elementary functions, the ecosystem can evolve to meet newer requirements across any computing zone of relevance, independent of where the data source or destination is.

Consumer Electronics Show 2015–Part 2

Previously, in Part 1, I covered the trends that are affecting personal computing, which encompasses laptops / notebooks, tablets including the “2-in-1” convertible or detachable units, and smartphones.

As I continue coverage of the trends shown at this year’s Consumer Electronics Show, I am highlighting what is on show when we think of the connected world and the Internet Of Things. This is where devices we have on ourselves or use in the home, or the cars we drive, connect to each other and the Internet to acquire a range of impressive capabilities.

Wearable technology

There is an increasing number of smartwatches and other wearables being launched at this year’s Consumer Electronics Show, based on the Android Wear platform along with Tizen and other proprietary wearable platforms. This is while Apple has their smartwatch close to launch as part of their iOS ecosystem. A question that often came to mind is whether the smartwatch is to be seen as a bridge device between your smartphone and other wearable devices.

Sony raised the bar for Android Wear by integrating a GPS in to the metal-look variant of their Smartwatch 3 Android Wear watch. It may be seen as a way to provide standalone navigation and distance measurement for this watch or to serve as a secondary GPS sensor for your smartphone.

LG had headed towards smartwatches by putting forward one that is to run WebOS. This is part of having their devices run the descendant of the Palm operating system which HP refashioned as WebOS.

Lenovo had jumped on the wearable bandwagon by offering the Vibe lineup of wearable products. At the moment, the first of these products is the Vibe Band which is a water-resistant fitness band that uses an e-ink display, allowing for this device to run longer on a single battery charge.

There have been a few weirdly wonderful wearable devices, like some snowboard bindings that help you plough through the powder better. These bindings measure the forces you apply on your feet as you slide down the slope, and an app uses your smartphone’s GPS and these sensors to assess your snowboarding prowess. There is the Misfit LED which works alongside the Misfit range of activity trackers to show how you are performing. But the weirdest device is the Emiota Belty, a men’s dress belt that records your waistline and reports it back to your smartphone.

Hyundai Blue Link smartwatch app press photo courtesy of Hyundai America

Hyundai Blue Link smartwatch app – your smartwatch is your keyfob

The smartwatch is becoming part of the “connected car” ecosystem thanks to some vehicle builders. As I will mention below, BMW uses the smartwatch as a key fob that is to be part of the self-parking setup they are working on. Hyundai, meanwhile, has presented the Blue Link app for the Apple Watch and Android Wear platforms so you can use this watch like the typical button-equipped car keyfob. Think of being able to touch your watch to start your Veloster from afar, open its doors or have that coupe flash its headlights so you can locate it in the car park.

The connected car

Speaking of which, the car that links to the home network and the Internet is being given a fair bit of airtime by most of the vehicle manufacturers. This was promoted by Mercedes-Benz, who were exhibiting a capsule-style self-driving concept car; Ford, demonstrating their idea of a self-driving car; and other vehicle builders talking about the self-driving idea for cars.

Smartwatch control surface for car press picture courtesy of BMW America

Smartwatch as control element of BMW car

BMW took the modest path by demonstrating a self-parking variant of the i3 car. This smartwatch-controlled car looks for a parking spot by itself and implements a map-based setup where it has pre-loaded maps of car parks. This is very like a valet-parking setup but without the car-park attendant parking your car for you in that car park.

BMW self-parking car press picture courtesy of BMW America

It parks itself

Ford launched the third iteration of their Sync connected-car technology, which will implement a touchscreen as part of its control surface and use BlackBerry QNX technology. This is intended to be offered for 2016 model-year vehicles.

Even the chipset manufacturers have dipped their fingers in to the connected-car scene, with NVIDIA announcing that they are purposing Tegra and similar processors to power connected-car dashboards.

Next generation VW infotainment setup press picture courtesy of VW America

Next generation VW infotainment works with Apple CarPlay, Android Auto or MirrorLink

As for infotainment, there is a trend to support both Android Auto and Apple CarPlay in both factory-supply and aftermarket infotainment setups. This means that the advanced abilities of these systems can work in a system-native manner for both iPhone and Android users. The Volkswagen Group have put this forward in their latest factory-spec infotainment setups and were even involved in the level-playing-field idea of MirrorLink when it was first put forward.

Parrot have premiered the RNB6, a 2-DIN media unit which runs both Android Auto and Apple CarPlay and offers 55 watts per channel across all of its channels, along with more options. Pioneer have brought this functionality to some of their newer 2-DIN car radios. These efforts satisfy realities that exist in countries like Australia, where people are likely to keep their cars on the road for a very long time.

Internet Of Everything

The Internet Of Everything has become a key feature of this show, with companies either showcasing new gadgets that link with the Internet or showcasing improvements for existing gadgets with this kind of ability. Most of these devices are still pitched as a “system” of devices, cloud services and apps supplied by the same vendor that are dependent on each other, and there haven’t been any devices pitched in a manner where they can work with other manufacturers’ devices, services or apps.

There have been some devices that are targeted at your baby’s health, such as a smart baby-bottle holder that measures food intake. Another of these is a Bluetooth-connected infant thermometer that uses your smartphone as its display, developed by the company behind Moto’s smart temporary tattoo.

Parrot has launched houseplant water monitors that link to the home network. One is the H2O, a sensor and automated watering system that you can use in-situ with your plants, and the other is the Parrot Pot which you put your plant in to.

D-Link DCH-S160 myDLink water sensor press picture courtesy of D-Link America

D-Link myDLink water detector alerts you via your smartphone if your washing machine leaks or the bath overflows

BeeWi and D-Link are snapping at the heels of Belkin’s WeMo home-automation technology with their own offerings. The latter have packaged theirs as the myDLink package, which is dependent on a home-automation hub even for the Wi-Fi devices. They have Z-Wave motion sensors and door magnet/reed sensors which interlink with this hub and also work as ambient temperature sensors.

They also have a Wi-Fi-based water-leak sensor that uses a wire to sense water leaking from that dribbling washing machine, along with a Wi-Fi siren unit and smart plugs. This system is managed on your mobile device through an app that D-Link supplies.

TRENDnet are running a HomePlug-based home-automation package that links their TPL-406E HomePlug AV500 adaptor with the THA-102PL appliance controller, with both devices using the AC wiring to communicate with each other. They also have the THA-103AC, a Wi-Fi-managed appliance controller that works as an AC750 Wi-Fi range extender. Both of these systems are controlled using an app for the iOS and Android platforms.

Kwikset Kevo cylindrical deadbolt in use - Kwikset press image

Kwikset Kevo Plus extends online monitoring and control to this Kwikset Kevo smart deadbolt

Two companies that are known for the common door lock have fielded some “smart-lock” products, focused around the “bore-through” cylindrical deadbolt form-factor that is common on many American front doors. Firstly, Kwikset have provided an IP bridge and online service for their Kevo smart deadbolt. Here, the Bluetooth-IP bridge and online service allow for functions such as “remote unlock”. This suits situations like having a friend or relative who doesn’t have a smartphone with the Kwikset Kevo app come to your house to do some caretaking or fetch something for you, or having a repair technician visit your house to work on an appliance while you are at work. The service is offered on an annually-billed basis. August, who offer a similar Bluetooth-driven smart lock, have come up this path using their own IP bridge to provide “remote check / remote release” functionality.

Yale Real Living NFC-capable smart deadbolt - outside view (brass finish) press picture courtesy of Yale America

Yale Real Living smart deadbolt – enter using the code on the keypad or touch your open-frame smartphone to it

As well, Yale have launched an NFC-based smart lock that works with the Seos NFC-based smart-locking platform that ASSA Abloy, the “Electrolux” of the door-hardware industry, have established. This one comes in the same form factor as the Kwikset Kevo but doesn’t use a key outside as a failover method. Instead, it requires you to touch your NFC-capable Android smartphone to the outside keypad to unlock your door.

Tagg are working with Alarm.com to implement a tracker system for your pets. This will be based around a collar attachment that uses GPS to locate the pet and 3G as a “report-back” mechanism.

The CES tech fair has given Roost a boost with their “smart battery” for existing smoke alarms. Here, they were able to demonstrate this battery in action as a monitoring device for the common smoke alarm.

Appliances

Unlike the Internationale Funkausstellung, where a home-appliance trade show was merged with the consumer-electronics trade show, there has been an increasing de-facto presence of home appliances at the Consumer Electronics Show. This has been brought on by some of the Korean and Japanese consumer-electronics manufacturers wanting to show their appliances, both major-class “white goods” and countertop “small goods”, at this trade show, and it demonstrates that home appliances are increasingly becoming part of the “Internet Of Things”.

Dacor used this show to premiere their Android-controlled ovens, which use an “app-cessory” approach to control. This goes alongside the use of a touchscreen as a local control surface and is representative of what is to come for premium “white goods”.

LG Twin Wash System press photo courtesy of LG America

LG Twin Wash System – two washing machines in one

LG have fielded some interesting “white goods” at this show. The show-stopper for them in this department was the Twin Wash “drawer-load” second washing machine, which is installed underneath their recent front-load washing machines. It works in a manner where you can wash a small load while the main machine is processing another load. The example often cited was for ladies to wash a change of delicate underwear on the delicate-wash cycle while the main machine runs a lot of normal-cycle washing. Another example from my experience would be to turn around two white shirts by themselves while a large quantity of coloured clothes is being washed, with everything ready to dry at the same time. They also fielded a “double door-in-door” fridge for easier organisation of food. Samsung were fielding some interesting appliances like a dual-cavity oven and their “ActiveWash” washing machine, which implements an advanced wash action.

The coffee-making scene moves closer to the home network, with Smarter running a “bean-to-cup” espresso machine for the US market which uses Wi-Fi technology to facilitate its app-cessory control surface.

In the next part of this series, I will be looking at what the Consumer Electronics Show 2015 is representing for entertainment in the connected home.

Internet Of Things connectivity issues

Article

Don’t get sidetracked: Connecting the residential IoE | The Beacon (Wi-Fi.org)

My Comments

Saeco GranBaristo Avanti espresso machine press picture courtesy of Philips

Appliances like this coffee machine are now working with dedicated mobile platform apps.

As the “Internet Of Things” or “Internet Of Everything” becomes ubiquitous in our lifestyles, there will always be some key issues with implementing this concept. It doesn’t matter whether it is for our health, wellbeing, convenient living or security that these issues come in to play.

The core issue behind the initial complexities is the use of network transports that don’t work on Internet-Protocol methodologies, some of which were established well before the Internet came to fruition in the mid-1990s. Rather, these implement an industry-specific data transport that requires the use of a so-called “bridge” between the non-IP transport and the IP transport.

Current implementation issues

Filling your computing devices with apps for each device and cloud service

Kwikset Kevo cylindrical deadbolt in use - Kwikset press image

The Internet Of Things should be about allowing these smart locks to work with other home-automation devices

At the moment, a lot of devices that offer control by smartphone require the use of vendor-developed apps, and as you add more devices with this capability to your network, you end up filling your mobile device with many different apps. This leads to user confusion because you end up having to work out which app you use to work with which device.

The same issue also affects cloud-based services, where each vendor impresses on users to use the vendor’s supplied apps to benefit from these services. Again this leads to operator confusion, which we would typically have noticed when we use a separate social-media, over-the-top messaging or cloud-storage front-end on our computing devices for each such service.

This kind of situation makes it harder to develop software that makes best use of a device’s functions, because developers have to engineer software to work specifically with a particular vendor’s devices. It brings us back to the days of DOS-based software, where games vendors had to write the driver software to allow their software to interface with the computer system’s peripherals. This made it harder for customers to determine whether the program they were after was compatible with their computer hardware.

Home-control systems and the home network

One issue that was highlighted was linking devices that use non-IP networks like Zigbee, Z-Wave or Bluetooth to the IP-based home network which works on Cat5 Ethernet, Wi-Fi and/or HomePlug. Typically this requires the use of a network-bridge device or module that connects to one of the Ethernet ports on the home-network router to link these devices to your home network, the Internet and your mobile devices.

Multiple bridge devices being needed

Nest Learning Thermostat courtesy of Nest Labs

… such as this room thermostat

The main question that was raised was whether we would end up with multiple bridge devices because each non-IP sensor or controller system was working in a proprietary manner, typically bound to a particular vendor’s devices or, in some cases, a subset of the devices offered by that vendor.

The worst-case scenario is a vendor who implements a Zigbee-based distributed heating-control system for a UK-style hydronic central-heating system that has thermostatic radiator valves on each radiator. In this scenario, the system’s components will only link to the Internet and home network using the network bridge supplied by that vendor, even though they work on the Zigbee network. But if you introduce a lighting system from another vendor that uses Zigbee technology, this system may require another bridge, supplied by that vendor, for network-based lighting control.

Support for gradual system evolution

Also there is the issue of installation woes creeping up when you install or evolve your home-automation system. Some of us like the idea of “starting small” with local control of a few devices, then, as funds and needs change, moving towards a larger, more capable system with Internet and mobile-device connectivity. The issue raised here is that a vendor could impress upon us to buy and install the network bridge before we start installing the home-automation devices, rather than enrolling the network bridge in to an established control system at a later date. In some cases, you may have to perform a reset operation on all of the existing components and re-configure your system when you install that network bridge.

This also underscores the situation where a vendor may allow in-place upgrading and integration of a device known to have a long service life, like most major appliances, HVAC or building-security devices. This is typically achieved through an expansion module that the user or a technician installs in the device to gain the extra functionality. Here, it should be possible for the device to be integrated in to the “Internet Of Things” network without you having to reset your network or do other difficult tasks.

To the same extent, one could easily start a system around one or more older devices, then install newer devices in to the system. For example, you have a UK-style central-heating system based around an existing boiler that supports an advanced heating-control system if you choose to have a control module retrofitted to that unit; this module has an LCD touchscreen as its user interface.

You purchase this module and ask the central-heating technician to install it in your boiler so you can save money on your fuel bills. This system uses a room thermostat which you start out with, but it can also work with thermostatic radiator valves. You buy and attach these valves to the radiators around the house to improve the heating efficiency, and these devices work together properly, showing the results on the module’s LCD touchscreen.

Subsequently the boiler reaches the end of its useful life and you replace it with a newer more efficient model that has integrated support for the heating-control system that you implemented but in a newer form. Here, you don’t want to lose the functionality that the room thermostat or the thermostatic radiator valves offered, but want to fully benefit from what the new unit offers such as its inherent support for modulated output.

Needs

Task-focused application-level standards

The needs highlighted here are to implement task-focused application-level standards that work for the purpose of the device and support a simplified installation routine. As well, the role of any bridge device implemented in an “Internet Of Things” setup is to provide a proper application-level bridge between different medium types independent of device vendor.

But what are these task-focused application-level standards? These are IT standards that are focused on what a class of device does, rather than on the device as a particular model from a particular vendor. An “Internet Of Things” example would be a smart thermostat that is known to other devices as a “HVAC thermostat”, with attributes like current temperature, setpoint (desired-comfort-level) temperature, setpoint schedules and other comfort-control factors. This makes it easier for other devices to interact with it — for example, to set up a situation-specific “preferred” room temperature for your heating when you use a particular user-code with your building alarm system, or to have a weather-forecast service cause the temperature to be adjusted to suit an upcoming situation.
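
A minimal sketch of that “HVAC thermostat” idea, with the attribute and scene names invented for illustration rather than drawn from any published standard:

```python
# Sketch: a thermostat exposed by task rather than by vendor model.
# Any other device (say, the alarm panel reacting to a user code) can
# set a "preferred" temperature without knowing the make or model.
# Attribute and scene names are illustrative assumptions.

class HVACThermostat:
    def __init__(self, current=18.0, setpoint=20.0):
        self.current_temperature = current
        self.setpoint_temperature = setpoint
        self.scenes = {}       # scene name -> preferred setpoint

    def apply_scene(self, name):
        """Another device picks a comfort scene by name."""
        if name in self.scenes:
            self.setpoint_temperature = self.scenes[name]

stat = HVACThermostat()
stat.scenes["arrive-home"] = 21.5   # tied to a user's alarm code
stat.apply_scene("arrive-home")     # alarm disarm triggers this
```

The point is that the alarm panel only needs to know the task-level contract ("set a scene on a HVAC thermostat"), not the thermostat vendor's private API.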

Some good examples of application-level standards are the UPnP Device Control Protocols for IP networks, or the Bluetooth application profiles. In one case, the Bluetooth Human Interface Device profile used for Bluetooth keyboards, mice and remote controls was based on the USB Human Interface Device standard used for similar devices. This simplified host operating system design, because interoperability with Bluetooth and USB input devices could share the same code.

Ability for a fail-safe network

An issue that is starting to crop up regarding the Internet Of Everything is being sure of a fail-safe network. This is in the form of each device in the network always discovering each other, control devices controlling their targets every time, and sensor devices consistently providing up-to-date, accurate data to their target devices. As well, a device that has a “standalone” function must be able to perform that function without being dependent on other devices.

Some devices such as smart locks have to be able to perform their essential functionality in a standalone manner if they lose connectivity with the rest of the network. This can easily happen due to a power cut or a network bridge or Internet router breaking down.

Network bridges that work with multiple non-IP standards

As well, manufacturers could be challenged to design network bridges that work with more than two connection types such as a bridge that links Zigbee and Z-Wave home-automation devices to the one IP network using the one Ethernet connection.

This would include the ability to translate between the different non-IP standards on a task-based level so that each network isn’t its own silo. Rather, each device could expose what it can do, or the data it provides, to other devices in the same logical network.
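The translation idea could be sketched like this. The field names are invented for illustration and are not the real Zigbee cluster or Z-Wave command-class formats:

```python
# Hypothetical sketch: a bridge normalises medium-specific device reports
# into one task-level representation so neither network is a silo.
def normalise(report: dict) -> dict:
    """Map a raw Zigbee- or Z-Wave-style report (field names invented
    for illustration) into a common task-level form."""
    if report.get("protocol") == "zigbee":
        return {"class": report["cluster"], "value": report["attr_value"]}
    if report.get("protocol") == "zwave":
        return {"class": report["command_class"], "value": report["level"]}
    raise ValueError("unknown protocol")

print(normalise({"protocol": "zwave", "command_class": "thermostat", "level": 21}))
# {'class': 'thermostat', 'value': 21}
```

Once both networks’ reports land in the same task-level form, a controller on the IP side can treat a Zigbee sensor and a Z-Wave sensor identically.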

This may come to the fore with the concept of “meshing” which some standards like Zigbee and Z-Wave support. Here, a network can be created with each node being part of a logical mesh so that the nodes carry the signals further or provide a fail-safe transmission path for the signals. The “bridges” could work in a way to create a logical mesh with IP networks and networks that work on other media to use these other paths to create a totally fail-safe path.

Conclusion

It will take a long time for the “Internet Of Everything” to mature to a level playing field, just as it has taken desktop and mobile computing a long time to evolve towards that goal. This will involve a lot of steps and place pressure on device manufacturers to implement these upgrades through the long working life of these devices.

Cloud routers–the current hot feature for the home network

Increasingly, every home-networking equipment vendor is pitching a mid-range or high-end router range that offers “cloud” abilities and features. This kind of functionality was previously offered simply as a remote-access feature, but it is now being marketed under the “cloud” term as a way to make the devices look cool to customers.

These features are more about simplifying the process of providing authorised users remote access to the control functionality and similar features on these devices and providing this kind of access to someone who is using a smartphone or tablet. It also extends to file access for those of us who connect an external hard disk to these devices to purpose them as network storage.

What benefits does this offer for the home network router

The key feature offered by these devices is the ability to manage them from any Internet connection. This may be about troubleshooting your connection or locking down the Internet connection for rarely-occupied premises like a holiday home or city apartment.

If you connect an external hard disk to your cloud-capable router, you would have the same remote-access functionality as a cloud-capable NAS. This means that you could put and get data while you are on the road using your regular or mobile computing device and an Internet connection.

Some vendors integrate an application-level gateway to their cloud-assisted network services like video surveillance as part of this cloud functionality. This allows you to gain access to these services from the same point of entry as you are provided for your router.

How is this achieved

Like the cloud NAS, this involves the vendor providing a dynamic DNS service to aid in discovery of your router along with the use of SSL and other technologies to create a secure path to your router’s management dashboard.
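The discovery side could be sketched as a dynamic-DNS update like the following. The service hostname and parameter names are invented for illustration, as each vendor runs its own update protocol over HTTPS:

```python
from urllib.parse import urlencode

# Hedged sketch: the router periodically tells the vendor's dynamic-DNS
# service its current public address over HTTPS, so the vendor's data
# centre can always "find" the router. The host and parameters below are
# invented; real vendors each define their own update call.
def build_update_url(hostname: str, token: str, public_ip: str) -> str:
    query = urlencode({"host": hostname, "token": token, "ip": public_ip})
    return f"https://ddns.example.com/update?{query}"

url = build_update_url("myrouter.example.net", "s3cret", "203.0.113.7")
print(url)
# https://ddns.example.com/update?host=myrouter.example.net&token=s3cret&ip=203.0.113.7
```

The SSL (HTTPS) transport is what gives the secure path to the management dashboard once the router has been discovered.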

It is also assisted with a client-side app for the mobile computing platforms so as to provide an integrated operational experience for your smartphone or tablet. This caters for items like access to the notification list, use of the interface style that is distinctive for the platform as well as the ability to get and put files according to what the platform allows.

Vendors who offer other cloud-based services would provide an application-level gateway in the router that ties in with these services and the devices that benefit from them. This is to provide a tight and finished user experience across all of their devices on your network, and is a way to keep you “vendor-loyal”.

Current limitations with this setup and what can be done

As we head towards cloud-capable network devices and add more of these devices to our networks, we will end up with a situation where we have to remember multiple Web addresses and user logins for each of these destinations. The manufacturers like D-Link would exploit this by integrating the cloud functionality for all of their devices or, more likely, devices within certain product ranges so that a user comes in to one entry point to benefit from the cloud functionality for that manufacturer’s device universe.

But the reality is that most of us would create a heterogeneous network with devices supplied by different manufacturers and of different product classes. Here, one would have to keep a list of usernames, passwords and Web entry points, or install multiple apps on a mobile device, to benefit from every device’s cloud functionality.

Similarly, a manufacturer would be interested in evolving the “cloud-side” part of the equation for newer products but could place older products at risk of being shut out. Here, they should maintain the same functionality by keeping the remote-access functionality alive and passing stability and security improvements to those of us who maintain the older devices.

Of course, working on systems that are true to industry standards and specifications like TR-069 for remote management can allow for pure interoperability and a future-proof environment. It can also allow for increased flexibility and the ability for third parties to provide the “cloud router” services with their own functionality and branding.

Three-way data storage–the way to go for the home network

There is a new trend that is affecting how we store data on our home computers and in our home network.

The three-island trend

This is the existence of three data islands:

  • The main secondary storage that is part of our regular or mobile computing devices
  • A network-attached storage system or a removable hard disk

The NAS would serve as a network-wide common storage destination as well as having the ability to serve media to network-capable media playback devices without the need for a PC to be on all the time. On the other hand, the removable hard disk is simply used as an auxiliary storage destination for a particular regular computer.

  • A Cloud or remote storage service

The remote storage services like Dropbox or SkyDrive are typically used either for offsite data backup or as a data drop-off point that exists across the Internet. Most of these services work on a “freemium” business model where a small storage capacity is available for free but you are able to rent more capacity as you need it. Some of these providers may work alongside hardware or software partners in opening up increased storage space for users of the hardware or software sold by these partners. At the same time, the remote storage services are increasingly offering business-focused packages that are optimised for reliability and security, either on a similar freemium model or simply as a saleable service.

The role of file-management and backup software

Previously, backup software was charged with regularly sending copies of data that existed on a computer’s main secondary storage to removable storage, a network-attached storage system or, in some cases, remote storage services.

New requirements

Tiered data storage

Now this software is charged not just with backing up to removable or local-network storage, but with being able to set up storage tiers across this storage and remote storage. This is a practice familiar in large-business computing, where high-cost high-availability storage is used for the data that is needed most, cheaper medium-availability storage for data that isn’t needed as often, like untouched accounts, and the cheapest, slowest storage media for archival purposes or for data that doesn’t change.

The remote storage and the NAS or removable storage can each serve as one of these tiers depending on the capacity that the device or service offers.
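A minimal sketch of how such tiering might decide where a file belongs, assuming access recency as the criterion (the thresholds are arbitrary examples, not anything a particular backup product uses):

```python
import time

# Hedged sketch of tier selection: frequently-touched data stays on the
# computer's own storage, less-used data goes to the NAS, and long-
# untouched archives go to the remote (cloud) service.
def choose_tier(last_access_epoch: float, now: float) -> str:
    days_idle = (now - last_access_epoch) / 86400
    if days_idle < 30:
        return "local"     # main secondary storage
    if days_idle < 365:
        return "nas"       # network-attached storage
    return "remote"        # cloud / remote archive

now = time.time()
print(choose_tier(now - 5 * 86400, now))    # local
print(choose_tier(now - 400 * 86400, now))  # remote
```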

Remote storage serves as temporary data location

In some cases, the remote storage may exist simply as a data drop-off point between a backup client on a portable computer and a backup agent on a network-attached storage device as part of a remote backup routine. Here, a user may back up the portable computer to a particular share in something like Dropbox. Then an agent program built in to a small-business or high-end consumer NAS would check that share and move or copy the data from Dropbox to the NAS.
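The NAS-side agent in such a routine could be sketched like this, assuming the drop-off share is synchronised to a local folder on the NAS (the folder layout is hypothetical):

```python
from pathlib import Path
import shutil

# Hedged sketch of the NAS-side agent: it sweeps a synced drop-off folder
# (e.g. the local copy of a Dropbox share) and moves anything new into
# the NAS's own storage, emptying the drop-off point.
def sweep_dropoff(dropoff: Path, nas_store: Path) -> list[str]:
    moved = []
    nas_store.mkdir(parents=True, exist_ok=True)
    for item in sorted(dropoff.iterdir()):
        if item.is_file():
            shutil.move(str(item), str(nas_store / item.name))
            moved.append(item.name)
    return moved
```

Run on a schedule, this keeps the remote service as a small buffer rather than long-term storage, which suits the “freemium” capacity limits.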

Similarly, a remote storage service could work alongside a locally-installed network-attached-storage and another NAS installed at another premises for asynchronous data transfer between these devices. This can be useful if one of these devices isn’t always accessible due to unreliable power or Internet service.

In the case of that small business that starts to add branches, this concept can work well with sharing business data such as price lists or customer information between the branches. Businesses that work on the “storefront-plus-home-office” model could benefit this way by allowing changes to be propagated between locations, again using the remote storage service as a buffer.

Remote storage serves as a share-point

In some cases, a remote-storage service like Dropbox can permit you to share data like a huge image / video album between multiple people. Here, they can have access to the content via a Web page or simply download the content to local storage. In some cases, this could be about copying that image / video collection of a wedding to the “DLNA” folders on a NAS so they can view these pictures on that Samsung Smart TV anytime.

What does the software need

Backup software needs to identify file collections that exist in a backup job and make the extra copies that appear at different locations, whether as different folders on the same target drive or at a different target location. Similarly, a timed backup job could also encompass synchronisation or “shifting” of other file collections to one or more target locations.

Similarly, the backup routine isn’t just about copying and compressing files into a large metafile before transferring it to the backup destination. It is about working through the collection file-by-file according to the destination.

You could do this with most software by adding extra backup jobs with different parameters, but that typically means creating more large metafiles. Here, file-synchronisation software could perform the job better by working at the file level.
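A minimal sketch of this file-by-file approach, using modification times to decide what needs copying; real backup software would also track deletions and typically verify with checksums:

```python
from pathlib import Path
import shutil

# Hedged sketch of file-level synchronisation: instead of packing a whole
# job into one large metafile, each file is copied only when it is new or
# has changed since the last run.
def sync_tree(src: Path, dest: Path) -> list[str]:
    copied = []
    for f in sorted(src.rglob("*")):
        if not f.is_file():
            continue
        target = dest / f.relative_to(src)
        if not target.exists() or target.stat().st_mtime < f.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves the timestamp
            copied.append(str(f.relative_to(src)))
    return copied
```

Because unchanged files are skipped, a second run over the same tree copies nothing, which is what makes this workable over a slow Internet link.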

Support for remote data storage in a NAS

Some network-attached-storage devices, especially those that work on an application-driven platform, work as clients to remote storage services. Here, this can cater for off-site file replication or “data-fetching” setups without a desktop or laptop computer having to be on all the time.

In some setups where portability is considered paramount, the idea of a NAS using remote data storage can allow a user to temporarily hold files destined for the remote data storage service on a NAS that is offline as far as the Internet is concerned. Then the NAS is just connected to the Internet to synchronise the files with a remote storage service.

Similarly, a media file collection that is shared via a remote data storage service like Dropbox may then end up on a NAS primarily to be made available to DLNA client devices at all times as well as not occupying precious disk space on the computer. This may be relevant for one or more large video files or a collection of many photos from that special occasion.

Conclusion

As we start to see the concept of the “three-island” data storage arrangement in our home and small-business networks, we will have to be able to work with these arrangements, whether by copying or moving the data between the different storage “islands”.

There is room for the next-generation broadband service

A common remark that I hear about next-generation broadband is that it is a service we don’t need. Here the image that is underscored is that current-generation broadband services like ADSL2 or cable-modem Internet are good enough for email and Web browsing with a dash of multimedia communication thrown in.

But the next-generation Internet services are providing for newer realities especially as we increasingly do some of our work from home or increase the use of multimedia that is available online.

Video and entertainment applications

A major driver for the next-generation broadband technology is its role in delivering entertainment content to customers. This has been underscored through the availability of network-enabled AV equipment that can also draw down this content from the Internet.

High-resolution video

One major application class that I see with next-generation broadband is the distribution of video with very high resolution. This will become the norm as more display devices acquire high pixel-density displays. For example, devices from a 10” tablet to a 21” personal-display screen will acquire something like 1080p resolution, while the 32”-55” group-viewing displays will acquire the resolution of a 4K UHDTV picture.

This year, the 4K ultra-high-definition TV technology is being premiered by the likes of Sony, with the idea of the content currently being delivered on to hard disk media players connected to these displays.

Similarly, more new video content is being turned out with 1080p full-high-definition images. This also covers older content, especially material originally mastered on 16mm or 35mm film, being remastered to 1080p full high definition video.
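Some rough arithmetic, using ballpark bitrates that are my own illustrative assumptions rather than vendor specifications, shows why such streams stress a current-generation link:

```python
# Illustrative comparison of approximate stream bitrates against a
# typical ADSL2 downstream figure (real-world speeds are often lower).
stream_mbps = {"1080p stream": 8, "4K UHD stream": 25}
adsl2_typical_mbps = 12

for name, mbps in stream_mbps.items():
    verdict = "fits within" if mbps <= adsl2_typical_mbps else "exceeds"
    print(f"{name} at ~{mbps} Mbps {verdict} a {adsl2_typical_mbps} Mbps link")
```

Even one 4K stream on these figures outruns the link before email, Web browsing or a second TV are counted, which is the case for next-generation bandwidth.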

IPTV and video on demand

Another key application is the provision of Internet-based video services. These could be in the form of scheduled IPTV broadcasts or video-on-demand services where you can pull in video content to view. The video-on-demand services could be offered as a streaming service where the server streams down the content as you view it or as a download service where the content is downloaded to local mass-storage for you to view from that location.

The cost of entry is being reduced significantly at both the service provider’s end and the consumer’s end. In the latter case, this is enabled through various smart-TV platforms offering this service through TV sets and video peripherals like Blu-Ray players, games consoles and network media receivers. The former case is underscored by the arrival of an “action sport movie” channel that is running movie and TV content themed around high-action sports and making use of IPTV due to its low cost of entry.

It also appeals to the different business models like advertising-supported, pay-per-view, content rental, time-based subscription and download-to-own, with the operators being able to offer a mix of models to suit the content and the audience.

Telecommuting and small-business enablement

Another key application that the next-generation broadband will provide is various communications and business-enablement services. This can cater for people who telecommute (work from home for an employer) on a full-time or ad-hoc basis, people who maintain a shopfront for their business but do their office work at home or those of us who run professional or other business services from our homes.

Videoconferencing and IP communications

With the success of Skype in the consumer space, the concept of IP-based communications is likely to drive the need for next-generation broadband.

For example, the videocalls currently offered through Skype allow for 720p video resolution through the current generation of Webcams in the field. Similarly, HD voice communications, which allows one’s voice to come through in FM-radio quality, is being supported by Viber and most over-the-top telecommunications services. This latter ability can benefit people who have a distinct accent in that they can be heard more easily.

In some cases, this could extend to “real-business” telecommunications like PABX functionality or telepresence / teleconference being made available to the small-business crowd. For example, a small-business owner who sets up shop in another area could benefit from VoIP tie-lines that link both locations or a professional services provider could engage in videocalls with clients using Skype or better services.

Cloud computing

Another key driver for next-generation broadband is the idea of “cloud computing”. This can extend from email, social-networks and Internet banking through to file-drop, media-sharing and online-backup services. Even businesses and multiple-premises home networks are or will be implementing “small private cloud” setups which interlink computer systems that are at multiple locations, whether on a remote-access or peer-to-peer basis.

But what is common with these services is that they require the ability to transfer large amounts of data between the home network and the service provider. This will cause a demand for the bandwidth offered by the next-generation broadband services.

Conclusion

Although it is so easy to say that there isn’t a need for next-generation broadband, as the new applications come on to the scene, these applications could ultimately underscore the desire and need for this technology.

MacOS X users can now consolidate multiple cloud-based notes storage services in one app

Article

Notesdeck Consolidates Evernote, Dropbox And iCloud Notes Into One App | Lifehacker Australia

My Comments

Some of us may start using the cloud-driven notes storage services like Evernote or Dropbox. This is due to the ability for us to gain access to the material we create on these services from any regular or mobile computing device.

But we can be encouraged to create accounts with more than one of these services, such as through a service provider having a presence on our new computers; or our colleagues, relatives or friends recommending a particular service to us. In some cases, we may exploit a particular service as a data pool for a particular project.

Subsequently we end up with multiple “front-end” links to different cloud-based storage services on our computers and end up not knowing where a particular piece of data is held – is it on Dropbox, is it on Evernote, is it on iCloud or whatever.

Now someone has written a MacOS X app that provides the same kind of interface and usability to these cloud-based services that an email client provides to most email services. In the Apple Macintosh context, Apple Mail allows you to set up multiple email accounts, such as the SMTP/POP3 mailbox your ISP gave you, the Exchange account that work gave you, as well as the GMail account that you set up as a personal account.

At the moment, the software, called NotesDeck, sells for $11.49, but according to the review, a few improvements are needed. One issue that was raised was that entries are listed for services that you aren’t set up with, unlike the typical email client, which doesn’t list service types you don’t have a presence with. This could be rectified properly if the software used a provisioning user experience similar to the typical email client, where you click on “Add Account” to enter details about the mailbox you are integrating into your client.

Of course, I would like to be sure that this program does allow you to transfer notes between accounts and also between local resources such as your word-processing documents. This may be important if you intend to consolidate your cloud-based notes services towards fewer services or copy the notes out to the magnum opus that you are working on.

Similarly, the program could be ported to the Windows platform or to the mobile platforms (iOS, Android, Windows Phone 8) so that users who use these platforms can have the ability to work the multiple accounts on their devices using one program.

Creating a small data cloud–why couldn’t it be done

The small data-replication cloud

Netgear ReadyNAS

These network-attached storage devices could be part of a personal data-replication cloud

Whenever the “personal cloud” is talked of, we think of a network-attached storage device where you gain access to the data on the road. Here, the cloud aspect is fulfilled by a manufacturer-provided data centre that “discovers” your NAS using a form of dynamic DNS and creates a “data path” or VPN to your NAS. Users typically gain access to the files by logging in to a SSL-secured Web page or using a client-side file-manager program.

But another small data cloud is often forgotten about in this class of device, except in the case of some Iomega devices. This represents a handful of consumer or small-business NAS units located at geographically-different areas that are linked to each other via the Internet. Here, they could synchronise the same data or a subset of that data between each other.

This could extend to applications like replicating music and other media held on a NAS to a hard disk installed in a car whether the vehicle is at home, at the office or even while driving. The latter example may be where you purchase or place an order for a song or album via the wireless broadband infrastructure with the content ending up on your car’s media hard disk so it plays through its sound system. Then you find that it has been synchronised to your home’s NAS so you can play that album on your home theatre when you arrive at home.

What could it achieve?

An example of this need could be for a small business to back up their business data to the network-attached storage device located at their shop or office as well as their owner’s home no matter where the data is created.

Similarly, one could copy music and video material held on the main NAS device out to a NAS that is at a holiday home. This can lead to speedy local access to the multimedia files, and you could add new multimedia files to the NAS at your holiday home and have this new collection reflected back to your main home.

Here, one could exploit a larger-capacity unit with better resiliency, like the business-grade NAS units pitched at small businesses, as a master data store while maintaining less-expensive single-disk or dual-disk consumer NAS units as local data stores at other locations. This setup may appeal to businesses where one location is seen as a primary “office” while the other location is seen as either a shopfront or secondary office.

This kind of setup could allow the creation of a NAS as a local “staging post” for newly-handled or regularly-worked data so as to provide a resilient setup that can survive a link failure. In some cases this could even allow for near-line operation for a business’s computing needs should the link to a cloud service fail.

User interface and software requirements

This same context can be built on the existing remote-access “personal cloud” infrastructure and software so there is no need to “reinvent the wheel” for a multi-NAS cloud.

Similarly, users would have to use the NAS’s existing management Web page to determine the location of the remote NAS devices and the data sets they wish to replicate. This can include how the data set is to be replicated such as keeping a mirror-copy of the data set, or contributing new and changed data to a designated master data set or a combination of both. The data set could be the copy of a particular NAS volume or share, a folder or group of folders or simply files of a kind.
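The replication options described above might be expressed like this; the job fields and mode names are invented for illustration, not taken from any NAS vendor’s interface:

```python
# Hedged sketch of how a replication job between two NAS units might be
# described from the management page; field names are hypothetical.
replication_job = {
    "remote_nas": "office-nas.example.net",
    "data_set": "/volume1/shared/accounts",  # a share, folder or file type
    "mode": "mirror",                        # "mirror" | "contribute" | "two-way"
}

def describe(job: dict) -> str:
    modes = {
        "mirror": "keep an exact copy of the master data set",
        "contribute": "push new and changed data to a master data set",
        "two-way": "merge changes made at either end",
    }
    return modes[job["mode"]]

print(describe(replication_job))  # keep an exact copy of the master data set
```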

The recently-determined UPnP RemoteAccess v2 standard, along with the UPnP ContentSync standards, could simplify the setup of these data-synchronisation clouds. This could also make it easier to provide heterogeneous data clouds that exist for this requirement.

But one main requirement that needs to be thought of is that the computer systems at both ends cannot collapse or underperform because the link fails. There has to be some form of scalability so that regular small-business servers can be party to the cloud, which may benefit the small-business owner who wants to integrate this hardware and the home-network hardware as part of a data-replication cloud.

Hardware requirements

A small data cloud needs to support cost-effective hardware requirements that allow for system growth. This means that it could start with two or more consumer or SME NAS devices of a known software configuration yet increase in capacity and resilience as the user adds or improves storage equipment at any location or rents storage services at a later stage.

This could mean that one could start with one single-disk NAS unit at each location, then purchase a small-business NAS equipped with a multi-disk RAID setup, setting this up at the business. The extra single-disk unit could then be shifted to another location as a staging-post disk or extra personal backup.

Conclusion

What NAS manufacturers need to think of is the idea of supporting easy-to-manage multi-device data-replication “personal clouds” using these devices. This is alongside the current marketing idea of the remote-access “personal cloud” offered for these devices.

Symantec Symposium 2012–My observations from this event

Introduction

Yesterday, I attended the Symantec Symposium 2012 conference which was a chance to demonstrate the computing technologies Symantec was involved in developing and selling that were becoming important to big business computing.

Relevance to this site’s readership

Most solutions exhibited at this conference are pitched at big business with a fleet of 200 or more computers. But there were resellers and IT contractors at this event who buy these large-quantity solutions to sell on to small-business sites who will typically have ten to 100 computers.

I even raised an issue in one of the breakout sessions about how manageability would be assured in a franchised business model such as most fast-food or service-industry chains. Here, this goal could be achieved through the use of thin-client computers or pre-configured equipment bought or leased through the franchisor.

As well, the issues and solution types of the kind shown at this Symposium tend to cross over between small sites and the “big end of town”, just as a lot of office technology, including the telephone and the fax machine, has done.

Key issues in focus were achieving a secure computing environment, supporting the BYOD device-management model and the trend towards cloud computing for systems-support tasks.

Secure computing

As part of the Keynote speech, we had a guest speaker from the Australian Federal Police touch on the realities of cybercrime and how it affects the whole of the computing ecosystem. Like what was raised in the previous interview with Alastair MacGibbon and Brahman Thiyagalingham about secure computing in the cloud-computing environment, the kind of people committing cybercrime is now moving towards organised crime like East-European mafia alongside nation states engaging in espionage or sabotage. He also raised that it’s not just regular computers that are at risk, but mobile devices (smartphones and tablets), point-of-sale equipment like EFTPOS terminals and other dedicated-purpose computing devices that are also at risk. He emphasised issues like keeping regular and other computer systems up to date with the latest patches for the operating environment and the application software.

This encompassed the availability of a cloud-driven email and Website verification system that implements a proxy-server setup. This is designed to cater for the real world of business computing, where computer equipment is likely to be taken out of the office and used with the home network or public networks like hotel or café hotspots. It steps away from the classic site-based corporate firewall and VPN arrangement to provide controlled Internet access for roaming computers. It also catered for real Internet-usage needs like operating a company’s Social-Web presence and personal Internet services like Internet banking or home monitoring, so as to cater for the ever-increasing workday. Yet this can still allow an organisation to keep control over the resources to prevent cyberslacking or viewing of inappropriate material.

Another technique that I observed is the ability to facilitate two-factor authentication for business resources or customer-facing Websites. This is where the username and password are further protected by something else, in a similar way that your bank account is protected at the ATM using your card and your PIN. It was initially achieved through the use of hardware tokens – those key fobs or card-like devices that showed a random number on their display which you had to enter at your VPN login; or a smart card or SIM that required the use of a hardware reader. Instead, Symantec developed a software token that works with most desktop or mobile operating systems and generates this random code. It even exploits integrated hardware security setups to make this more robust, such as what is part of the Intel Ivy Bridge chipset in second-generation Ultrabooks.
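Many software tokens of this kind follow the time-based one-time-password scheme (TOTP, RFC 6238). The sketch below illustrates that general scheme, not Symantec’s actual implementation:

```python
import base64
import hashlib
import hmac
import struct

# Illustrative TOTP (RFC 6238) generator: the shared secret plus the
# current 30-second time window yields a short-lived numeric code.
def totp(secret_b32: str, at: float, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(at // step))
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The RFC test secret "12345678901234567890" in base32, at t = 59s:
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # 287082
```

Because both the token and the server derive the code from the same secret and clock, nothing secret travels over the wire at login time.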

Advanced machine-learning has also played a stronger part in two more secure-computing solutions. For example, there is a risk-assessment setup being made available where the environment surrounding a connection or transaction can be assessed against what is normal for a user’s operating environment and practices. It is similar to the fraud-detection mechanisms that most payment-card companies implement, like ANZ Falcon, where they can detect and alert customers to abnormal transactions that are about to occur. This can trigger verification requirements for the connection or transaction, like the requirement to enter a one-time password from a software token or an out-of-band voice or SMS confirmation sequence.

The other area where advanced machine-learning plays a role in secure computing is data loss prevention. As we hear of information being leaked to the press or, at worst, laptops, mobile computing devices and removable storage full of confidential information disappearing and falling into the wrong hands, this field of information security is becoming more important across the board. Here, they used the ability to “fingerprint” confidential data like payment-card information and apply handling rules to this information. This includes implementation of on-the-fly encryption for the data, establishment of secure-access Web portals, and sandboxing of the data. The rules can be applied at different levels and affect the different ways the data is transferred between computers, such as shared folders, public-hosted storage services (Dropbox, Evernote, GMail, etc), email (both client-based and Webmail) and removable media (USB memory keys, optical disks). The demonstration focused more on payment-card numbers, but I raised questions regarding information like customer/patient/guest lists or similar reports, and this system supports the ability to create the necessary fingerprint of the information to the requirements desired.
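A heavily simplified sketch of the payment-card case: candidate digit runs are confirmed with the Luhn checksum so ordinary long numbers don’t trigger the handling rules. Real DLP engines do far more than this, and this is my own illustration rather than Symantec’s method:

```python
import re

# Hedged sketch of "fingerprinting" payment-card numbers in outbound data.
def luhn_ok(number: str) -> bool:
    """Luhn checksum: valid card numbers pass, most random digit runs fail."""
    digits = [int(d) for d in number][::-1]
    total = sum(digits[0::2])
    total += sum(sum(divmod(d * 2, 10)) for d in digits[1::2])
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    candidates = re.findall(r"\b\d{13,19}\b", text)
    return [c for c in candidates if luhn_ok(c)]

print(find_card_numbers("order 4111111111111111 ref 1234567890123"))
# ['4111111111111111']
```

A handling rule would then act on any match, for example blocking the email or forcing encryption, rather than on every long number in the document.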

Cloud-focused computing support

The abovementioned secure-computing application makes use of the cloud-computing technology which relies on many of the data centres scattered around the world.

But the Norton 360 online backup solution that is typically packaged with some newer laptops is the basis for cloud-driven data backup. This could support endpoint backup as well as backup for servers, virtual machines and the like.

Mobile computing and BYOD

Symantec have approached the mobile computing and BYOD issues along two different paths. They have catered for fully-managed devices, which may appeal to businesses running fleets of devices that they own or using tablets as interactive customer displays. But they have also allowed for “object-specific” management where particular objects (apps, files, etc) can be managed or run to particular policies.

This includes the ability to provide a corporate app store stocked with in-house apps, Web links or commercial apps so users know what to “pick up” on their devices. These apps are then set up to run to the policies that affect how that user runs them, including control of data transfer. This setup may also please big businesses, such as Interflora, that provide services which small businesses deliver as an agent or reseller. Here, they could run a business-specific app store with line-of-business apps like a flower-delivery-list app that runs on a smartphone. There is also the ability to remotely vary and revoke permissions concerning the apps, which could come in handy when a device’s owner walks out of the organisation.

Conclusion

What this conference shows is the direction that business computing is taking, and it was also a chance to see the core trends affecting this class of computing whether you are at the “big end of town” or not.

What small businesses need to be aware of when considering cloud solutions

Article

What smaller businesses should look for in cloud software • The Register

My comments

Businesses, especially small businesses, are very frequently being sold the idea of using cloud-based computer setups rather than site-based setups for their computing needs.

The selling points used for these cloud-based systems include reduced hardware costs to run the system; a shift from capital expenditure to operational expenditure; scalability and flexibility; as well as increased security, resilience and uptime for the computer system. They are pitched as being especially suitable for small businesses because the business doesn’t need an IT team always on hand.

In some cases, the cloud technology is used as a way of offering the small business some of that “big-business” IT functionality, including up-to-date line-of-business computerisation. I had learnt about this through a Skype interview with Matthew Hare when we were talking about the FTTP fibre-optic next-generation broadband setup in Hambledon in the UK. He referred to the businesses in that area, especially hotels, being tempted to use cloud-based IT solutions to provide the “big-business” services they want to offer.

But there are some caveats that one has to be careful of.

A cloud-based computer setup has a lot of its processing and storage performed on back-end computers held at one or more remote data centres reached via the Internet.

Here, you have to make sure that you have a business-grade Internet service if you are relying on cloud technology. This is more so if the cloud-based technology is driving your line-of-business IT needs such as point-of-sale or hotel property management.

This situation happened with a hotel that was experiencing trouble with a half-performing POS/property-management system. I was aware of transactions taking exceedingly long to process and some terminals being unable to complete business-essential transactions like some food/beverage cash sales. What I found out later was that the communications link was down, so certain online components couldn’t work properly. They had the communications equipment fixed the next morning and the system was back to working normally.

There also has to be some form of fault tolerance where essential parts of the system can be used if the connection or the backend goes down. This factor is important if cloud technology is to drive a transaction-driven line-of-business setup; and it should be feasible to perform the essential transactions in a “data-capture” manner without the need to be online as a continuity measure.

One possible indicator of a cloud-based system’s level of fault tolerance is whether a mobile or tablet implementation runs a platform-native app rather than a totally Web-based interface. A platform-native app, if designed properly, has the opportunity to capture transaction data to the device’s storage while the device is offline.
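To illustrate the “data-capture” fallback described above, a platform-native app could queue transactions in local storage whenever the back end is unreachable and replay them once the link returns. This is a minimal sketch of that pattern, not any particular vendor’s design; the `send` callback is a hypothetical stand-in for an HTTPS call to the cloud service that raises `ConnectionError` when offline:

```python
import json
import sqlite3

class TransactionQueue:
    """Capture sales locally when the back end is unreachable, sync later."""

    def __init__(self, path: str = "offline_queue.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS pending (id INTEGER PRIMARY KEY, payload TEXT)"
        )

    def record(self, transaction: dict, send) -> None:
        """Try the back end first; fall back to local capture if offline."""
        try:
            send(transaction)  # e.g. an HTTPS POST to the cloud service
        except ConnectionError:
            self.db.execute(
                "INSERT INTO pending (payload) VALUES (?)",
                (json.dumps(transaction),),
            )
            self.db.commit()

    def flush(self, send) -> int:
        """Replay captured transactions once the connection is back."""
        rows = self.db.execute("SELECT id, payload FROM pending").fetchall()
        for row_id, payload in rows:
            send(json.loads(payload))
            self.db.execute("DELETE FROM pending WHERE id = ?", (row_id,))
        self.db.commit()
        return len(rows)
```

The point of the design is that the essential transaction is never lost: it is either delivered immediately or captured durably for later replay.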

The cloud-based setup also has to provide for secure data transfer, and you have to raise the issue of what happens if the service provider goes bust or changes hands. Here, you have to protect the ownership and integrity of your data as well as the continuity of your service.

This is more so if cloud-based services follow the same path as Internet-based services did in the late 1990s and early 2000s with the dot-com bubble, where the bubble “burst” and companies either collapsed or were taken over by other companies; these kinds of situations could easily sort the wheat from the chaff.

Similarly, your needs may change and you may come across another cloud-based or site-based solution provider who suits your newer requirements. This may encompass situations like the establishment of a branch or increased business traffic.

Here, I would make sure that the business data describing your operation can be exported to, and imported or exchanged with, other cloud or on-site software using data formats and conventions accepted by your industry. Then, part of the on-site backup routine should include exporting your data to the industry-standard format so you can handle these changes better.
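The export step of that backup routine can be as simple as writing a portable, plain-text snapshot of your records. A minimal sketch, assuming hypothetical field names (a real export would follow whatever column layout your industry actually accepts, such as a guest-list or stock-list format):

```python
import csv

# Hypothetical field names for illustration only; substitute the
# layout your industry's software expects for import/export.
FIELDS = ["customer", "item", "quantity", "unit_price"]

def export_snapshot(records: list[dict], path: str) -> None:
    """Write a portable CSV snapshot of business data for backup or migration."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(records)
```

Because the snapshot is plain CSV, it can be re-imported by a replacement cloud or on-site system, or simply kept with the regular backups as insurance against a provider folding.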

What to look for when planning cloud services for your business

  • A guaranteed minimum standard of service quality, reliability and security from the cloud service provider
  • A guaranteed level of service availability and throughput from your Internet service provider
  • A level of fault-tolerance that allows for essential business continuity if a cloud-based system fails.
  • The ability for the business owner or manager to troubleshoot or make good communications equipment that has failed
  • The ability to export and import the data to industry-specific standards to facilitate movement between site-based or cloud-based systems or use as a snapshot backup.

Conclusion

A well-thought-out cloud-based business computing solution that provides a level of resilience can allow a business to save money and, especially, allow a small business to be able to “think big”.

Update: 1 April 2018

Indicator of fault tolerance in a cloud-based transaction system.