Tag: cloud computing

European businesses still value data protection for their online services

Article

Map of Europe By User:mjchael by using preliminary work of maix¿? [CC-BY-SA-2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons

Europäische Cloud-Anbieter profitieren von Datenschutzbedenken (European cloud providers profit from data-protection concerns) | Netzwoche.ch (German language / Deutsche Sprache)

My Comments

I am following the scene as far as European online services and cloud computing for both business and consumer use are concerned. This is based on how I see that Europe could compete with the US establishment in offering online services while ensuring those services respect European values.

I have just read a Swiss article about the US and Chinese hyperscale cloud platforms that dominate the European cloud-computing scene. The article states that European cloud-computing and online-service providers are catching up with these behemoths, using data protection as a selling point in response to the data-protection and user-privacy concerns held by European businesses and government authorities.

An example I saw of this is Germany and France working towards creating public-cloud computing services with the goal of being able to compete against the public-cloud services offered by the USA and Asia.

A recent survey completed by the French IT consultancy Capgemini highlighted that the German-speaking part of Europe (Germany, Austria and Switzerland) was buying minimal European IT services. But the same Capgemini survey found that 45 percent of the respondents wanted to move to European providers in the future thanks to data-protection and data-sovereignty issues.

Data security is being given increasing importance due to recent cyber attacks and the increased digitalisation of production processes. But the Europeans have very strong data protection and end-user privacy mandates at national and EU level thanks to a strong respect for privacy and confidentiality within modern Europe.

COVID-19 placed a lot of European IT projects on ice, but there has been a constant push to assure business continuity even under the various public-health restrictions mandated during this plague. This includes support for distributed working, whether that be home-office working or remote working.

But how is this relevant to European households, small businesses and community organisations? I do see this as being relevant because various online and cloud IT services are part of our personal lives thanks to the likes of search engines, email / messaging, the Social Web, online entertainment, and voice-driven assistants. As well, small businesses and community organisations show interest in online and cloud-based computing as a means of benefiting from what may be seen as “big-time” IT without needing much in the way of capital expenditure.

It will be a slow and steady effort for Europe to have online and cloud computing on a par with the US and Asian establishment but this will be about services that respect European privacy, security and data-sovereignty values.

Why do I defend Europe creating their own tech platforms?

Previous Coverage on HomeNetworking01.info

Map of Europe By User:mjchael by using preliminary work of maix¿? [CC-BY-SA-2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons

Europeans could compete with Silicon Valley when offering online services

How about encouraging computer and video games development in Europe, Oceania and other areas

My Comments

Regularly I keep an eye out for information regarding efforts within Europe to increase their prowess when it comes to business and personal IT services. This is more so as Europe is having to face competition from the USA’s Silicon Valley and from China in these fields.

But what do Europeans stand for?

Airbus A380 superjumbo jet wet-leased by HiFly at Paris Air Show press picture courtesy of Airbus

Airbus have proven that they are a valid European competitor to Boeing in the aerospace field

What Europeans hold dear to their heart when it comes to personal, business and public life are their values. These core values encompass freedom, privacy and diversity and have been built upon their experience of history, especially since the Great Depression.

They have had to deal with the Hitler, Mussolini and Stalin dictatorships, especially with Hitler’s Nazis taking over European nations like France and Austria, along with the Cold War era when Eastern Europe was under communist dictatorships loyal to the Soviet Union. All of the affected countries were run as police states, with national security forces conducting mass surveillance of the populace at the behest of the dictators.

The EU’s European Parliament summed this up succinctly on their website, with Europeans placing value on human dignity, human rights, freedom, democracy, equality and the rule of law. It is underscored by a pluralistic approach with respect for minority groups.

I also see this in the context of business through a desire to have access to a properly-functioning competitive market driven by publicly-available standards and specifications. It includes a strong deprecation of bribery, corruption and fraud within European business culture, whether this involves the public sector or not. This is compared to an “at-any-cost” approach valued by the USA and China when it comes to doing business.

As well, the European definition of a competitive market is the availability of goods or services for best value for money. This includes people who are on a very limited budget gaining access to these services in a useable manner that underscores the pluralistic European attitude.

How is this relevant to business and consumer IT?

Nowadays, business and consumer IT is more “service-focused” through the use of online services, whether totally free, complimentary with the purchase of a device, paid for through advertising or paid for through regular subscription payments. Increasingly these services are being driven by the mass collection of data about the service’s customers or end-users, with people describing this data as the “new oil”.

Examples of this include Web search engines, content hosting providers like YouTube or SoundCloud, subscription content providers, online and mobile gaming services, and voice-driven assistants. It also includes business IT services like cloud-computing services and general hosting providers that facilitate these services.

Europeans see this very differently due to their heritage. Here, they want control over their data along with the ability to participate in a competitive market that works to proper social expectations. This is compared to business models operated by the USA and China that disrespect the “Old World’s” approach to personal and business values.

The European Union has defended these goals, but primarily with the “stick” approach, typically by passing regulations like the GDPR data-protection rules or by taking legal action against dominant US-based players within this space.

But what needs to happen and what is happening?

What I want to see happen is European companies building up credible alternatives to what businesses in China and the USA are offering, with the hardware, software and services that Europe offers respecting European personal and business culture and values. They also need to offer this same technology to individuals, organisations and jurisdictions who believe in the European values of stable government that respects human rights, including citizen privacy and the rule of law.

What is being done within Europe?

Spotify Windows 10 Store port

Spotify – one of Europe’s success stories

There are some European success stories like Spotify, the “go-to” online music subscription service based in Sweden, its viable French competitor Deezer, and SoundCloud, an audio-streaming service based in Germany.

Candy Crush Saga gameplay on Windows 10

Candy Crush Saga – a European example of what can be done in the mobile game space

A few of the popular mobile “guilty-pleasure” games like Candy Crush Saga and Angry Birds were developed in Europe. Let’s not forget Ubisoft, a significant French video-games publisher that has set up studios around the world and is one of the most significant household names in video games. Think of franchises like Assassin’s Creed or Far Cry, some of the big-time games that this publisher has put out.

Then Qwant appeared as a European-based search engine that creates its own index and stores it within Europe. This is compared to some other European-based search engines which are really “metasearch engines” that concatenate data from multiple search engines including Google and Bing.

There have been a few Web-based email platforms like ProtonMail surfacing out of Switzerland that focus on security and privacy for the end-user. This is thanks to Switzerland’s strong respect for business and citizen privacy especially in the financial world.

Freebox Delta press photo courtesy of Iliad (Free.fr)

The Freebox Delta is an example of a European product running a European voice assistant

There are some European voice assistants surfacing, with BMW developing the Intelligent Personal Assistant for in-vehicle use while the highly-competitive telecommunications market in France has yielded some voice assistants of French origin thanks to Orange and Free. Spain got in on the act with Movistar offering their own voice assistant. I see growth in this aspect of European IT thanks to the Amazon Voice Interoperability Initiative, which allows a single hardware device like a smart speaker to provide access to multiple voice-assistant platforms.

AVM FritzBox 7530 press image courtesy of AVM GmbH

The AVM FRITZ!Box 7530 is a German example of home network hardware with European heritage

Technicolor, AVM and a few other European companies are creating home-network hardware, typically in the form of carrier-supplied home-network routers. AVM, though, offers its FRITZ! lineup of home-network hardware through the retail channel, with one of these devices being the first home-network router to automatically update itself with the latest patches. In the case of Free.fr, their Freebox products are even heading towards the same kind of user interface expected of a recent Synology or QNAP NAS thanks to the continual effort to add more capabilities to these devices.

But Europe is putting the pedal to the metal when it comes to cloud computing, especially with the goal of assuring European sovereignty over data handled this way. Qarnot, a French company, has engaged in the idea of computers that are part of a distributed-computing setup, using the waste heat from data processing to keep you warm or to give you a warm shower at home. Now Germany is heading in the direction of a European-based public cloud for European data sovereignty.

There has been significant research conducted by various European institutions that has impacted our online lives. One example is the Fraunhofer Institute in Germany, which contributed to the development of file-based digital audio in both the MP3 and AAC formats. Another group of examples represents efforts by various European public-service broadcasters to bring about “smart radio”, with “flagging” of traffic announcements, smart automatic station following, selection of broadcasters by genre or area, and display of broadcast-content metadata, through the ARI and RDS standards for FM radio and the evolution of DAB+ digital radio.

But what needs to happen, and may well be happening, is to establish and maintain Europe as a significantly strong third force for consumer and business IT. As well, Europe needs to offer its technology and services to people and organisations in other countries rather than focusing them towards the European, Middle Eastern and Northern African territories.

European technology companies would need to offer the potential worldwide customer base something that differentiates themselves from what American and Chinese vendors are offering. Here, they need to focus their products and services towards those customers who place importance on what European personal and business values are about.

What needs to be done at the national and EU level

Some countries like France and Germany implement campaigns that underscore products that are made within these countries. Here, they could take these “made in” campaigns further by promoting services that are built up in those countries and have most of their customers’ data within those countries. Similarly the European Union’s organs of power in Brussels could then create logos for use by IT hardware and software companies that are chartered in Europe and uphold European values.

Switzerland has taken a proactive step towards cultivating local software-development talent by running a “Best of Swiss Apps” contest, which recognises Swiss app developers who have turned out excellent software for regular or mobile computing platforms. At the moment, this seems to focus on apps which primarily have Switzerland-specific appeal, typically front-ends to services offered by the Swiss public sector or by companies serving Swiss users.

Conclusion

One goal for Europe to achieve is a particular hardware, software or IT-services platform that can do what Airbus and Arianespace have done with aerospace. This is to raise some extraordinary products that place themselves on the world stage as a viable alternative to what the USA and China offer. As well, it puts the establishment on notice that they have to raise the bar for their products and services.

More companies participate in Confidential Computing Consortium

Article

Facebook, AMD, Nvidia Join Confidential Computing Consortium | SDx Central

AMD, Facebook et Nvidia rejoignent une initiative qui veut protéger la mémoire vive de nos équipements (AMD, Facebook and NVIDIA join an initiative that wants to protect the RAM in our equipment) | O1Net.com (France – French language / Langue française)

From the horse’s mouth

Confidential Computing Consortium

Web site

My Comments

Some of online life’s household names are becoming part of the Confidential Computing Consortium. AMD, Facebook and NVIDIA have joined this consortium, which is a driver towards the kind of secure computing that is becoming more of a requirement these days.

What is the Confidential Computing Consortium?

This is an industry consortium driven by the Linux Foundation to provide open standards for secure computing in all use cases.

It is about creating standard software-development kits for secure software execution. This is to allow software to run in a hardware-based Trusted Execution Environment that is completely secure. It is also about writing this code to work independently of the system’s silicon manufacturer and across the common microarchitectures like ARM, RISC-V and x86.

This is becoming of importance nowadays with malware being written to take advantage of data being held within a computing device’s volatile random-access memory. One example of this is RAM-scraping malware targeted at point-of-sale / property-management systems, which steals customers’ payment-card data while a transaction is in progress. Another example is the recent discovery by Apple that a significant number of familiar iOS apps are snooping on the user’s iPhone or iPad Clipboard without the knowledge and consent of the user.

As well, in this day and age, most software implements various forms of “memory-to-memory” data transfer for many common activities like cutting and pasting. There is also the fact that an increasing number of apps are implementing context-sensitive functionality like conversion or translation for content that a user selects or even for something a user has loaded in to their device.

In most secure-computing setups, data is encrypted “in transit” while it moves between computer systems and “at rest” while it exists on non-volatile secondary storage like mechanical hard disks or solid-state storage. But it isn’t encrypted while it is in use by a piece of computer software to fulfil that program’s purposes. This gap is what leads to exploits like RAM-scraping malware.
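
To illustrate that gap, here is a minimal Python sketch (using the widely-available cryptography library; the file name and card number are placeholders). The data is protected on disk, but it has to be decrypted into ordinary RAM before the program can work on it, which is exactly the stage confidential computing wants to protect.

```python
# Sketch: data protected "at rest" still sits in plain form in RAM while "in use".
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # in practice this would live in a key-management service
cipher = Fernet(key)

# "At rest": the card number is only ever written to disk in encrypted form.
card_number = b"4111 1111 1111 1111"   # placeholder payment-card data
with open("card.enc", "wb") as f:
    f.write(cipher.encrypt(card_number))

# "In use": to process the card, the program must decrypt it, leaving the plain
# value in ordinary RAM where RAM-scraping malware could read it.
with open("card.enc", "rb") as f:
    plain_in_ram = cipher.decrypt(f.read())
print("Processing card ending in", plain_in_ram[-4:].decode())

# Confidential computing aims to keep even this working copy inside an encrypted,
# hardware-protected memory region (a Trusted Execution Environment).
```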

The Confidential Computing Consortium is about encrypting the data that is held within RAM and allowing the user to grant software that they trust access to that encrypted data. Primarily it will be about consent-driven relevance-focused secure data use for the end-users.

But the idea is to assure not just the security and privacy of a user’s data but allow multiple applications on a server-class computer to run in a secure manner. This is increasingly important with the use of online services and cloud computing where data belonging to multiple users is being processed concurrently on the same physical computer.

This is even relevant to home and personal computing, including the use of online services and the Internet of Things. It is highly relevant when authenticating with online services or facilitating online transactions, as well as for assuring end-users and consumers of data privacy. As well, most of us are heading towards telehealth and at-home care, which involves handling more personally-sensitive information relating to our health through the use of common personal-computing devices.

Facebook is on board because the social network’s users make use of social sign-on by that platform to sign up with or log in to various online services. In this case, it would be about protecting the user-authentication tokens that move between Facebook and the online service during the sign-up or log-in phase.

As well, Facebook has two products in the consumer online-messaging space in the form of Facebook Messenger and WhatsApp, and both of these services feature end-to-end encryption, with WhatsApp having this feature enabled by default. Here, they want users to be sure that the messages in, say, a WhatsApp session stay encrypted even in the device’s RAM rather than just between devices and within the device’s non-volatile storage.

I see the Confidential Computing Consortium as underscoring a new vector within the data security concept with this vector representing the data that is in the computer’s memory while it is being processed. Here, it could be about establishing secure consent-driven access to data worked on during a computing session, including increased protection of highly-sensitive business and personal data.

Germany to instigate the creation of a European public cloud service

Article

Map of Europe By User:mjchael by using preliminary work of maix¿? [CC-BY-SA-2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons

Europe to have one or more public cloud services that respect European sovereignty and values

Germany to Unveil European Cloud to Rival Amazon, Alibaba | ITPro Today

France, Germany want more homegrown clouds to pick from | ITNews (Premium)

My Comments

Germany is instigating a Europe-wide project to create a public cloud-computing service. As well, France is registering interest in the same idea by creating another of these services.

Both countries’ intention is to rival what the USA and Asia are offering regarding public-cloud data-processing solutions. But, as I have said before, it is about having public data infrastructure that is sovereign to European laws and values. This also includes the management and dissemination of such data in a broad and secure manner.

Freebox Delta press photo courtesy of Iliad (Free.fr)

… which could also facilitate European software and data services like what is offered through the Freebox Delta

The issue of data sovereignty has become of concern in Europe due to the USA and China pushing legislation to enable their governments to gain access to data held by data-service providers that are based in those countries. This is even if the data is held on behalf of a third-party company or hosted on servers that are installed in other countries. The situation has been underscored by a variety of geopolitical tensions involving those countries, such as the recent USA-China trade spat.

It is also driven by some European countries being dissatisfied with Silicon Valley’s dominance in the world of “as-a-service” computing. This is more so with France where there are goals to detach from and tax “GAFA” (Google, Apple, Facebook and Amazon) due to their inordinate influence in consumer and business computing worlds.

or BMW’s voice-driven assistant for in-car infotainment

Let’s not forget that Qarnot in France has designed computers that put their waste heat to use for heating rooms or creating hot water in buildings. This will appeal to a widely-distributed data-processing setup that could be part of public cloud-computing efforts.

Questions that will crop up with the Brexit agenda when Europe establishes this public cloud service include British data sovereignty if data is held on the European public cloud, and whether Britain will have any access to or input into this public cloud.

Airbus A380 superjumbo jet wet-leased by HiFly at Paris Air Show press picture courtesy of Airbus

… just like this Airbus A380 superjumbo jet shows European prowess in aerospace

Personally I could see this as facilitating the wider creation of online services by European companies especially with the view to respecting European personal and business values. It could encompass ideas like voice-driven assistant services, search engines, mapping and similar services for consumers or to encourage European IT development.

Could this effort that Germany and France put forward be the Airbus or Arianespace of public-cloud data services?

Different kinds of cloud IT systems–what to be aware of

Apple iPad Pro 9.7 inch press picture courtesy of Apple

The iPad is seen as part of the cloud-based mobile computing idea that Silicon Valley promotes

Very often “cloud” is used as a Silicon-Valley-based buzzword when describing information-technology systems that have any sort of online data-handling abilities.

This is more so if the IT system is sold to the customer “as a service” where the customer pays a subscription to maintain use of the system. It also is used where the user’s data is stored at an online service with minimal data-processing and storage abilities at the user’s premises.

Small-business users are being sold on these systems typically due to reduced capital expenditure or reduced involvement in maintaining the necessary software. It also allows the small business to “think big” when it comes to its IT systems without paying a prince’s ransom.

What is strictly a cloud system

Single Server online system

But, strictly speaking, a cloud-based system relies on multiple online locations to store and/or process data. Such a system would have multiple computers at multiple data centres processing or storing the data, whether in one geopolitical jurisdiction or many depending on the service contract.

This is compared to the single-server online IT system sold as a service, which implements at least a Web-based “thin client” where you work the data through a Web page and, perhaps, a native mobile-platform app on a smartphone or tablet. Typically, the data would be held on one system under the control of the service provider, with this system existing at a data centre. It works in a similar vein to common Internet services like email or Web hosting, with the data held on a server provided by the Web host or ISP.

Hybrid cloud systems

Hybrid Cloud online system

Hybrid Cloud online system with primary data kept on premises

One type of cloud system is what could best be described as a “hybrid” system that works with data stored primarily on the user’s premises. This is typically to provide a small private data cloud that replicates data across the branches of a small business; to provide online and mobile functionality, such as letting you manage the data through a Web page or native mobile-platform app anywhere around the world; or to provide messaging abilities through a mobile-messaging platform.

For example, a lot of NAS units are marketed as “cloud” NAS units but these devices keep the user’s data on their own storage media. Here, they use the “cloud” functionality to improve discovery of that device from the Internet when the user enables remote-access functionality or data-syncing between two NAS devices via the Internet. This is due to the reality that most residential and some small-business Internet connections use public-facing IP addresses that change frequently.
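
A minimal sketch of how that discovery can work, assuming a hypothetical relay endpoint and device token run by the NAS vendor: the NAS periodically reports its current public IP address so the vendor’s relay can direct the owner’s remote-access app to it.

```python
# Sketch: a NAS "phoning home" so it stays reachable despite a changing public IP address.
# The relay URL and device token are hypothetical; real vendors use their own protocols.
import time
import requests

RELAY_URL = "https://relay.example-nas-vendor.com/register"    # hypothetical endpoint
DEVICE_TOKEN = "nas-owner-token"                                # hypothetical credential

def report_public_ip():
    # api.ipify.org is a public "what is my IP address" service
    public_ip = requests.get("https://api.ipify.org", timeout=10).text
    requests.post(RELAY_URL, json={"token": DEVICE_TOKEN, "ip": public_ip}, timeout=10)

if __name__ == "__main__":
    while True:
        try:
            report_public_ip()
        except requests.RequestException:
            pass            # the link is down; try again on the next cycle
        time.sleep(300)     # re-register every five minutes
```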

WD MyCloud EX4100 NAS press image courtesy of Western Digital

WD MyCloud EX4100 NAS – one of the kind of NAS units that uses cloud functionality for online access

Or a small medical practice that keeps its data on-premises is sold a “cloud-based” messaging and self-service appointment-management add-on to its IT system. Here, the core data is what is held on-premises, but the messaging functionality, the Web-based user interface and the necessary “hooks” enabling the native mobile-platform app for the self-service booking function are hosted on a cloud service built up by the add-on’s vendor. When a patient uses the mobile app or Web front-end to book or change an appointment, they alter the data on the on-premises system through the cloud-hosted service.

It may also be used with something like an on-premises accounting system to give business functionality like point-of-sale abilities to a mobile-platform device like an iPad through the use of a cloud-based framework. But the core data in the on-premises system is altered by the cloud-based mobile-platform setup as each transaction is completed.
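
As a rough illustration of this hybrid pattern, using the medical-practice booking example above, here is a hedged sketch (Flask is used for brevity; the endpoint path and on-premises address are invented for the example) of a cloud-hosted front-end that holds no core data itself and simply forwards each appointment change to the on-premises system of record.

```python
# Sketch: a cloud-hosted booking front-end that relays changes to the on-premises system.
# Endpoint names and the on-premises URL are illustrative only.
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)

# Secure tunnel / VPN address of the practice's on-premises server (hypothetical)
ON_PREMISES_API = "https://practice-onprem.example.com/api/appointments"

@app.route("/appointments", methods=["POST"])
def book_appointment():
    booking = request.get_json()    # sent by the patient's mobile app or the Web page
    # The cloud layer keeps no patient records of its own; it passes the change
    # to the on-premises system, which remains the system of record.
    reply = requests.post(ON_PREMISES_API, json=booking, timeout=15)
    return jsonify(reply.json()), reply.status_code

if __name__ == "__main__":
    app.run(port=8080)
```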

Full-cloud systems

Full Cloud online system

Full Cloud online system with data processing and storage across multiple different computers

On the other hand, a full-cloud system has the user’s primary data held online across one or more server computers, with minimal local hardware or software to work the user’s data. There may be some on-premises data-caching to support offline operation, such as to provide transaction capture if the link is down, or simply to improve the system’s performance.

The on-premises data-caching can be extended to a system that has concurrent data storage both in the cloud and on the premises. This may help businesses who have invested in on-premises data-storage by providing them with the security and flexibility of a cloud setup with the fail-safe and responsive operation of the on-premises setup. In some cases, it may also be about facilitating secure payment-card transaction processing with that taking place in a separate compliant network hosted by the IT solution provider.
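
The on-premises caching mentioned here can be as simple as a local queue. Below is a hedged Python sketch (the cloud endpoint is a placeholder, and a real service would add authentication and proper retry logic) that records transactions to a local file while the link is down and replays them once the cloud service is reachable again.

```python
# Sketch: capture transactions locally while offline, replay them when the cloud link returns.
# The cloud endpoint is a placeholder; a real service would also need authentication.
import json
import os
import requests

CLOUD_ENDPOINT = "https://pos.example-cloud.com/transactions"   # hypothetical
QUEUE_FILE = "pending_transactions.jsonl"

def record_transaction(transaction):
    try:
        requests.post(CLOUD_ENDPOINT, json=transaction, timeout=5).raise_for_status()
    except requests.RequestException:
        # The link is down or the service is unreachable: keep the sale locally for later.
        with open(QUEUE_FILE, "a") as f:
            f.write(json.dumps(transaction) + "\n")

def flush_pending():
    if not os.path.exists(QUEUE_FILE):
        return
    remaining = []
    with open(QUEUE_FILE) as f:
        for line in f:
            try:
                requests.post(CLOUD_ENDPOINT, json=json.loads(line), timeout=5).raise_for_status()
            except requests.RequestException:
                remaining.append(line)          # still offline, keep it queued
    with open(QUEUE_FILE, "w") as f:
        f.writelines(remaining)

record_transaction({"item": "coffee", "amount": 4.50})
flush_pending()
```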

The IT infrastructure for a full-cloud system will have some measure of scalability to allow for an increasing customer base, typically with the service provider annexing more computer power as the customer base increases. Such a service will have tiered pricing where you pay more for increased capacity.

Client software types

The user interface for an online or cloud IT system would primarily be Web-driven, where you work the data with a Web browser. On the other hand, it could use native client software that works tightly with the client computer’s operating system, whether as a “thick” client with a significant amount of local data processing or storage on the endpoint computing device, or a “thin” client that simply provides a window to the data in much the same way as a Web browser does.

Public vs private cloud

Another concept regarding cloud-based IT is the difference between a public cloud and a private cloud. The public cloud has the computing power managed by another firm like Microsoft Azure or Amazon Web Services while the private cloud has all its computing power managed by the service provider or client company and effectively isolated from public access through a separate private network.

This can be a regular server-grade computer installed at each of the business’s branches, described as an internal cloud. Or it can be multiple high-grade server computers installed at data centres managed by someone else but available exclusively to the business, known as a hosted private cloud.

Data Privacy, Security and Sovereignty

Another factor that comes in to question regarding cloud and online computing is the issue of data privacy, security and sovereignty.

This covers how the data is handled to assure privacy relating to end-users whom the data is about; and assurance of security over data confidential to the IT system’s customer and its end-users. It will call out issues like encryption of data “in transit” (while moved between systems) and “at rest” (while stored on the systems) along with policies and procedures regarding who has access to the data when and for what reason.

It is becoming a key issue with online services thanks to the European GDPR regulation and similar laws being passed in other jurisdictions, which are about protecting end-users’ privacy in a data-driven world.

The issue of data sovereignty includes who has effective legal control over the data created and managed by the end-user of the online service, along with which geopolitical area’s rules the data is subject to. Some users pay attention to this, especially in the continental-European countries that value end-user privacy and similar goals heavily.

There is also the issue of what happens to this data if the user wants to move to a service that suits their needs better or if the online service collapses or is taken over by another business.

Cloudlets, Fog Computing and Edge Computing

Edge Computing setup

Edge computing setup where local computing power is used for some of the data handling and storage

This leads me to the concept of “edge computing”, which uses terminology like “fog computing” or “cloudlets”. This involves computing devices relatively local to the data-creation or data-consumption endpoints that store or process data for the benefit of these endpoints.

An example can be a small desktop NAS, especially a high-end unit, on a business premises that handles data coming in from or going out to a cloud-based online service on behalf of endpoint devices installed on that premises. Or it could be a server installed in the equipment rack at a telephone exchange that works as part of a content-delivery system for customers who live in the neighbourhood served by that exchange.
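
As a hedged illustration of the idea (the sensor name and cloud endpoint are invented for the example), an edge node can digest chatty raw readings locally and only pass a compact summary up to the cloud service:

```python
# Sketch: an edge node summarises raw sensor readings locally and only uploads the digest.
# The cloud endpoint and sensor identifier are illustrative.
import statistics
import requests

CLOUD_ENDPOINT = "https://iot.example-cloud.com/summaries"   # hypothetical

def summarise_and_upload(sensor_id, readings):
    # The heavy, chatty raw data stays on the premises; the cloud only sees the summary.
    summary = {
        "sensor": sensor_id,
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }
    requests.post(CLOUD_ENDPOINT, json=summary, timeout=10)
    return summary

# e.g. a minute's worth of temperature samples collected on the local network
print(summarise_and_upload("office-temp-1", [21.4, 21.6, 21.5, 22.0, 21.8]))
```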

Qarnot Q.Rad press image courtesy of Qarnot

Qarnot Q.Rad room heater that is a server computer for edge-computing setups

Similarly, the Qarnot approach, which uses servers that put their waste heat towards heating rooms or creating domestic hot water, implements the principle of edge computing. Even the idea of a sensor drone or intelligent video-surveillance camera that processes the data it collects before it is uploaded to a cloud-based system is also about edge computing.

It is being touted as a way to decentralise data processing and overcome the throughput and latency limitations associated with public Internet links. As well, this concept is being underscored with the Internet of Things as a way to quickly handle data created by sensors and turn it into a form able to be used anywhere.

Conclusion

Here, the issue for those of us who buy service-based IT, whether for our own needs or for a workplace, is to know what kind of system we are dealing with. This includes whether the data is to exist in multiple locations, at the premises or at one location.

Originally published in July 2019. Updated in January 2020 in order to encompass cloud-based IT solutions that keep business data both in the cloud and on the premises as well as using cloud-based technology to facilitate secure transaction processing.

The NAS as an on-premises edge-computing device for cloud services

QNAP TS-251 2-bay NAS

QNAP TS-251 2-bay NAS – units like this could become a capable edge-computing device

The high-end network-attached storage system is a device able to augment the cloud computing trend in various forms. This is by becoming a local “edge processor” for the cloud-computing ecosystem and handling the data that is created or used by end-users local to it.

High-end network-attached-storage systems

We are seeing the rise of network-attached-storage subsystems that are capable of running as computers in their own right. These are typically high-end consumer or small-business devices offered by the likes of QNAP, Synology or NETGEAR ReadyNAS that have a large app store or software-developer community.

The desktop variants range in size from half a loaf of bread to a full bread loaf, with some rack-mounted units about the size of one or two pizza boxes. This is compared to servers that were the size of a traditional tower computer.

But some of the apps work alongside public cloud-driven online services as a client or “on-ramp” between these services and your local network. A typical use case is to synchronise files held on an online storage service with the local storage on the NAS unit.
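
A minimal sketch of that kind of “on-ramp” role, assuming a hypothetical upload endpoint and a typical NAS share path: the NAS remembers a checksum for each file it has already sent and only uploads files that have changed since the last pass.

```python
# Sketch: one-way sync from a NAS share to an online storage service.
# The upload endpoint is hypothetical; real NAS apps speak the storage provider's own API.
import hashlib
import json
import os
import requests

SHARE_PATH = "/volume1/shared"                               # typical NAS share path
STATE_FILE = os.path.join(SHARE_PATH, ".sync_state.json")
UPLOAD_URL = "https://storage.example-cloud.com/upload"      # hypothetical

def file_hash(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def sync_once():
    state = {}
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            state = json.load(f)
    for name in os.listdir(SHARE_PATH):
        path = os.path.join(SHARE_PATH, name)
        if not os.path.isfile(path) or name.startswith("."):
            continue
        digest = file_hash(path)
        if state.get(name) == digest:
            continue                     # unchanged since the last sync pass
        with open(path, "rb") as f:
            requests.post(UPLOAD_URL, files={"file": (name, f)}, timeout=60)
        state[name] = digest
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)

sync_once()
```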

These high-end network-attached-storage devices are effectively desktop computers in their own right, with some of them using silicon that wouldn’t look out of place with a traditional desktop computer. Some of these machines even support a “local console” with a display connection and USB connections that support keyboards and mice.

Cloud computing

Cloud computing takes an online-service approach to computing needs and, in a lot of cases, uses multiple computers in multiple data centres to perform the same computing task. This is typically to host the data in or close to the end-user’s country or to provide a level of scalability and fault-tolerance in the online service approach.

Lot 3 Ripponlea café

A cafe like this could benefit from big-business technology without paying a king’s ransom thanks to cloud computing

Small businesses are showing an interest in cloud-driven computing solutions as a way to come on board with the same things as the “big end of town” without paying a king’s ransom for hardware necessary for an on-premises computing solution. In some cases, it is also about using different endpoint types like mobile-platform tablets for daily use or as a management tool, underscoring such concepts as low cost or portability that some endpoints may offer.

Typically, this kind of computing is offered “as a service” where you subscribe to the service on a regular, usually monthly or annual, basis rather than you spending big on capital expenses to get it going.

But, due to its nature as an always-online service, cloud computing can cause reliability and service-availability issues if the Internet connection isn’t reliable or the service ends up being oversubscribed. This can range from real-time services suffering latency to the cloud-computing experience becoming unresponsive or unavailable.

Then there are the issues of privacy, data security, service continuity and data sovereignty, which can crop up if you change to a different service or the service you use collapses or changes hands. This can easily happen as cloud computing faces points of reckoning and the industry goes into consolidation.

Edge computing

But trends being investigated in relation to the “Internet of Things” and “Big Data” are the concepts of “edge” and “fog” computing. These are based around the idea of computing devices local to the source or sink of the data that work with the locally-generated or locally-used data as part of submitting it to or fetching it from the cloud network.

It may allow a level of fault-tolerance for applications that demand high availability or permit scalability at the local level for the cloud-computing application. Some systems may even allow for packaging locally-used data in a locally-relevant form such as for online games to support local tournaments or an online movie service to provide a local storage of what is popular in the neighbourhood.

The ideas associated with “edge” and “fog” computing allow for the use of lightweight computer systems to do the localised or distributed processing, effectively aggregating these systems into what amounts to a heavyweight computer system. This approach echoes early distributed-computing projects like SETI@home and Folding@Home, which used personal computers to solve scientific problems.

What is serving the edge-computing needs

Qarnot Q.Rad press image courtesy of Qarnot

This Qarnot Q.Rad heater is actually a computer that is part of edge computing

Some applications like drones are using on-device processing to do the local workload. Or we are seeing the likes of Qarnot developing edge-computing servers that heat your room or your hot water with the waste heat these computing devices produce. But Amazon and QNAP are working on an approach to use a small-office NAS as an edge-computing device, especially for Internet-of-Things applications.

The NAS serving this role

Here, it is about making use of these ubiquitous and commonly-available NAS units for this purpose as well as for storing and serving data that a network needs. In some cases, it can be about the local processing and storing of this locally-generated / locally-used data, then integrating the data with what is available on the cloud “backbone”.

For some applications, it could be about keeping enough data for local needs on the NAS to assure high availability. Or it could be about providing scalability by allowing the NAS to do some of the cloud workload associated with the locally-generated data before submitting it to the cloud.

Netgear ReadyNAS

The NETGEAR ReadyNAS on the right is an example of a NAS that is capable of being an edge-computing node

This may be of importance with IT systems that are moving from a totally on-premises approach towards the use of cloud-computing infrastructure with data being stored or processed online. It is where the focus of the cloud infrastructure is to make business-wide data available across a multi-site business or to provide “portable access” to business data. Here, a NAS could simply be equipped with the necessary software to be a smart “on-ramp” for this data.

For small and medium businesses that are moving towards multiple locations, such as when a successful business buys another business in another area to increase its footprint, this technology may have some appeal. Here, it could be about doing some pre-processing of data local to the premises before submitting it to the cloud as part of an online management-information system for that small effort. As well, it could be about keeping the business-wide data “in sync” across the multiple locations, something that may be important with price lists or business-wide ledgers.

This kind of approach works well with the high-end NAS units if these units’ operating platforms allow third-party software developers to write software for these devices. It can then open up the possibilities for hybrid and “edge” computing applications that involve these devices and the network connectivity and on-device storage that they have.

Conclusion

What needs to happen is that the high-end network-attached-storage systems of the Synology or QNAP kind need to be considered as a hardware base for localised “edge computing” in an online “cloud-computing” setup.

This can be facilitated by the vendors encouraging software development in this kind of context and by encouraging people involved in online and cloud computing to support this goal, especially for small-business computing.

WD cracks the 14 Terabyte barrier for a standard desktop hard disk

Article

HGST UltraStar HS14 14TB hard disk press image courtesy of Western Digital

Western Digital 14TB hard drive sets storage record | CNet

From the horse’s mouth

HGST by Western Digital

Ultrastar HS14 14TB hard disk

Product Page

Press Release

My Comments

Western Digital has broken the record for data stored on a 3.5” hard disk by offering the HGST by WD UltraStar HS14 hard disk.

This 3.5” hard disk is capable of storing 14TB of data, which has been seen as a significant increase in data density for disk-based mechanical data storage. It implements HelioSeal construction technology, which yields a hermetically-sealed enclosure filled with helium; this allows thinner disks and also permits reduced cost, cooling requirements and power consumption.

At the moment, this hard disk is being pitched at heavy-duty enterprise, cloud and data-center computing applications rather than regular desktop or small-NAS applications. In this use case, I see that where these ultra-high-capacity hard disks earn their keep would be localised data-processing applications where non-volatile secondary storage is an important part of the equation.

Such situations would include content-distribution networks, such as those serving the Netflix application, or edge / fog computing applications where data has to be processed and held locally. These applications depend on relatively small devices that can be installed close to where the data is created or consumed, like telephone exchanges, street cabinets, or telecommunications rooms.

I would expect that this level of data density will impact other hard disks and devices based on these hard disks. For example, applying it to the 2.5” hard-disk form factor could see these hard disks approaching 8TB or more, yielding highly capacious compact storage devices. Or this same storage capacity could be made available in hard drives that suit regular desktop computers and NAS units.

The home-network gateway device to become advanced

D-Link Covr router and wireless extender package press image courtesy of D-Link

Expect a lot more out of the router that comes with your Internet service when Technicolor gets its way

The device that represents the network-Internet “edge” for your home network, i.e. the router, won’t just be serving that function in a standalone way anymore. Here, it will work in tandem with other Internet-side and network-side computing devices to become a highly-sophisticated “hub” for your home network.

One of these drivers is to provide a simplified customer-support process, especially for those of us who use carrier-provided equipment at the edge. Here, the support and provisioning process can be fulfilled by the router supplying information to your carrier or ISP regarding your Internet service’s and home network’s performance without wasting time requiring the customer to supply this information during a support call. This may be considered controversial but has value regarding the support and troubleshooting process, which can perplex those of us who aren’t competent with technology, such as a lot of older people.
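
To make the idea concrete, this is the sort of health report such a gateway might pass to its carrier’s support platform. The field names and figures are invented for illustration, not any vendor’s actual telemetry format.

```python
# Illustrative only: the kind of health report a carrier-supplied gateway might send
# to the ISP's support platform. All field names and values are invented.
import json

health_report = {
    "gateway_serial": "EXAMPLE-SERIAL-0001",
    "wan": {"sync_rate_mbps": 48.2, "uptime_hours": 312, "retrains_last_24h": 1},
    "wifi": {"clients": 14, "mesh_satellites_online": 2, "weakest_satellite_rssi_dbm": -71},
    "firmware": {"version": "1.2.3", "auto_update_enabled": True},
}

print(json.dumps(health_report, indent=2))
```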

It also encompasses the fact that distributed Wi-Fi will be the “new norm” for the home network, whether through multiple access points connected to a wired or dedicated-wireless backbone, the use of one or more wireless range extenders or a mesh-driven distributed wireless network. Here, it may be about simplifying the process of commissioning the “satellite” wireless devices and making sure that they are performing as expected to assure maximum Wi-Fi coverage across your premises.

The other factor is a call to provide always-maintained software in these devices, thanks to issues being raised regarding the security of our home networks and the Internet. This was underscored by the recent distributed denial-of-service attacks against various Internet services and blogs using the Mirai botnet, which ran on routers, network cameras and the like that hosted poorly-maintained, compromised software.

Let’s not forget that the home-network gateway device will be expected to do more in conjunction with cloud services. Here, they want to provide this kind of service in the same context as the “app-store” commonly associated with mobile computing platforms but increasingly associated with regular computing platforms, and an increasing number of dedicated-purpose devices like printers. It is where a customer can add on extra functionality to their home-network router after they have bought and installed that device rather than buying and installing a new device to achieve this goal.

I learnt about this thanks to a news release offered to me by Diego Gastaldi from Technicolor Connected Home regarding this topic. Technicolor came in on this game by buying in to Thomson, which supplies a lot of the customer-premises equipment provisioned by telcos and ISPs for their broadband Internet services, especially the triple-play services. The company presented at Mobile World Congress some of its new concepts for home-network gateway devices that will be pitched to the likes of Telstra or Bouygues Télécom for their services, along with how these devices can add that extra value.

This is in conjunction with Technicolor announcing their solutions for managed distributed Wi-Fi setups along with devices supporting wireline broadband and mobile wireless broadband on the Internet (WAN) side. The latter trend existed mainly with small-business equipment, but its appeal for the home network is being underscored by the “quick-to-provide” goal of an interim wireless service before a wireline service is rolled out; a “fatter pipe” for broadband service by aggregating wireline and mobile broadband services; and always-available broadband for business, e-health / ageing-at-home and the smart home’s security.

The typical applications that will be called out would be to provide business-style “unified threat management” for the home network as a network security measure. Or they could be about joining a “community wireless” platform like Fon where they can share Wi-Fi bandwidth with guests or customers.

But they are also highlighting applications like monitoring elderly loved ones at home to be sure they are OK. Back in 2010, I had a conversation with a representative from Ekahau regarding their Wi-Fi-based Real Time Location System in a residential or small-business environment. This was more so with their T301BD Wi-Fi Pager Tag, pitched primarily as a name tag with duress-alert abilities for healthcare and similar enterprise-level applications, being used as part of an “ageing at home” or similar home-based care scenario. At the time I noticed initial doubt about this kind of application in the home, but such setups could be made real with distributed Wi-Fi and with them being offered on a cloud-driven “as-a-service” model.

By using a multiple-computer “cloud” approach, there isn’t a requirement to overload a router device with extra processing circuitry which would require a large device to be designed. Typically this would be fulfilled by the use of one or more data centers connected to the Internet like the Amazon Web Services approach Technicolor are using. But, as the compact network-attached-storage maintains its appeal as an on-premises network storage hub with most of these devices offering “remote access” or “personal cloud” functionality, this kind of “cloud” approach could encompass these devices along with other “function-specific” hubs like smart meters or security systems.

But what is happening is that there will be more expectations out of the router device that sits between the home network and the Internet with it being a “gateway” to more online services.

Finnish building-management systems cop the brunt of cyberattacks

Article

There needs to be a level of cyber-security awareness regarding the design and maintenance of building-automation systems

Finns chilling as DDoS knocks out building control system | The Register

My Comments

Two apartment buildings in Finland became victims of distributed denial-of-service attacks which nobbled their building-management systems. This caused the buildings’ central heating and domestic hot-water systems to enter a “safety shutdown” mode because the remote management systems were in an endless loop of rebooting and both of these systems couldn’t communicate with each other. The residents ended up living in cold apartments and having cold showers because of this failure.

What is being realised is that, as part of the Internet of Things, building-management equipment is being seen to be vulnerable due to factors like poor software maintenance and a reluctance to harden these systems against cyber-attacks. Then there is the issue of what level of degraded-but-safe functionality should exist for these systems if they can’t communicate with a remote management computer. This also includes the ability for the systems themselves to pass alarm information to whoever is in charge.

This situation has called out data-security issues with design and implementation of dedicated-purpose “backbone devices” connected to the Internet; along with the data-security and service-continuity risks associated with cloud-based computing. It is also an issue that is often raised with essential services like electricity, gas and water services or road-traffic management being managed by Internet-connected computers with these computers being vulnerable to cyberattack.

One of the measures raised was the use of firewalls that run up-to-date software and configurations to protect these systems from cyberattack.

I would also look at a level of fail-safe operation for building management systems that can be implemented if the Internet link to remote management computers dies; along with the ability to use cellular-telephony SMS or similar technology to send alarm messages to building management during a link-fail condition. The fail-safe mode could be set up for a goal of “safe, secure, comfortable” quasi-normal operation if the building-local system identifies itself as operating in a safe manner.
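
A hedged sketch of that kind of fail-safe behaviour is shown below. The management URL is a placeholder and the SMS function is a stand-in for a real cellular modem or SMS gateway; the point is simply that the local controller keeps running a safe program and raises an alarm when the remote link disappears.

```python
# Sketch of the suggested fail-safe behaviour: if the remote management link is lost,
# keep running a safe local heating program and raise an alarm over SMS.
# The management URL and the SMS function are hypothetical placeholders.
import time
import requests

MANAGEMENT_URL = "https://bms.example-operator.com/heartbeat"   # hypothetical
SAFE_SETPOINT_C = 20.0                                          # "safe, secure, comfortable"

def send_sms_alarm(message):
    # Placeholder: a real system would drive a cellular modem or an SMS gateway here.
    print("SMS to building manager:", message)

def management_link_up():
    try:
        return requests.get(MANAGEMENT_URL, timeout=5).ok
    except requests.RequestException:
        return False

link_was_up = True
while True:
    if management_link_up():
        link_was_up = True
    else:
        if link_was_up:
            send_sms_alarm(f"Management link lost; holding local safe mode at {SAFE_SETPOINT_C} C")
        link_was_up = False
        # The building-local controller keeps heating and hot water at the safe setpoint here.
    time.sleep(60)
```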

Qarnot uses computers to provide free room heat for buildings

Qarnot Q.Rad press image courtesy of Qarnot

Qarnot Q.rad heater is actually a computer

One of the common ways of using electricity to provide room heat in a building is to use a panel or column heater that has a material like oil heated by an electric element. A variant that existed in the UK and, to some extent, Australia was the “storage heater” or “heat bank”, which used a heavier material like bricks that stored more heat and was heated overnight when the power was cheaper. This material then diffuses the heat into the room. These kinds of heaters are able to provide diffused heat to take the chill off a room but are expensive to run.

But Qarnot, a French cloud-computing firm, have looked at the issue of using the waste heat from a computer integrated in this heater to heat a room or building. Here, they have designed the Q.Rad which connects to your home network and electrical power and works as a data-server for their distributed-computing effort while using the waste heat to heat a room.

It also implements an integrated power meter so that you can be reimbursed for the power that it uses as part of the cloud-computing network, effectively providing “free heat”. But a question that can be raised for implementation in markets like Australia, New Zealand or, increasingly, the USA is the requirement to calculate transferred data and establish a mechanism to refund users’ bandwidth charges for this data. This is because of the practice where ISPs are either charging for data transferred or throttling users’ bandwidth if they transfer more than an allotted amount of data.
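
As a simple worked example of how such reimbursement could be calculated (all figures are illustrative, not Qarnot’s actual rates), the heater’s metered consumption and, where relevant, the metered data are multiplied by the local tariffs:

```python
# Worked example (illustrative figures only) of reimbursing a householder for the
# electricity and metered data a heater-server consumes in a month.
kwh_used = 120                # read from the heater's integrated power meter
electricity_tariff = 0.25     # currency units per kWh
data_transferred_gb = 80
data_charge_per_gb = 0.10     # only relevant where the ISP bills for data by the gigabyte

reimbursement = kwh_used * electricity_tariff + data_transferred_gb * data_charge_per_gb
print(f"Monthly reimbursement: {reimbursement:.2f}")   # 120*0.25 + 80*0.10 = 38.00
```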

Qarnot Q.Rad exploded view press image courtesy of Qarnot

Processing power inside this heater – the waste heat from that goes to keeping you warm

The data that Qarnot processes using these heaters is typically for the likes of research labs, banks and animation studios, which “offload” calculations to this cloud-computing array. They also have the ability to seek out distributed-computing research projects of the SETI or Folding@Home kind to keep the network alive and generating heat where needed. For data security, these heaters don’t implement any storage for the distributed-computing client’s data while implementing end-to-end encryption for this data.

Qarnot will implement an “upgrade and replace” program so that higher-speed processors are used in the Q.Rad computing heaters and there is the ability to deal with failed equipment quickly and easily to assure high availability.

Householders are still able to adjust the heater to their preferred comfort level and make it reflect their lifestyle by using a smartphone app or the controls on the heater. This kind of thermostatic control is achieved by deflecting some of the workload away from the heater that is not needed when there isn’t the need for heat output.
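
A hedged sketch of that thermostat principle (the temperature source and thresholds are invented for the example): the heater only accepts compute work while the room is below the setpoint, and deflects it once the setpoint is reached.

```python
# Sketch of the thermostat principle: accept distributed-computing work only while heat
# is wanted, otherwise deflect it to other nodes. Thresholds and readings are illustrative.
import random
import time

SETPOINT_C = 21.0
HYSTERESIS_C = 0.5

def room_temperature():
    # Placeholder for the heater's built-in temperature sensor.
    return 20.0 + random.uniform(-2.0, 2.0)

def run_compute_jobs():
    print("Accepting distributed-computing work; the waste heat warms the room")

def deflect_workload():
    print("Setpoint reached; deflecting work to other nodes in the network")

for _ in range(5):                      # a few control cycles for the example
    temperature = room_temperature()
    if temperature < SETPOINT_C - HYSTERESIS_C:
        run_compute_jobs()
    elif temperature > SETPOINT_C + HYSTERESIS_C:
        deflect_workload()
    time.sleep(1)
```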

They rate the output of a single unit at around 500 watts, which would cover a 150-300 square-foot area in an insulated building. Qarnot are also pitching these heaters as part of the smart-building concept by having them able to be equipped with sensors and be programmable for any IoT / building-automation application. Similarly, Qarnot have added functionality like USB or Qi wireless charging to these heaters so users can charge mobile devices on them.

At the moment, these heaters are being issued to large buildings in Europe and the USA where 20 units or more need to be deployed. But in 2017, Qarnot wants to release these heaters to individuals who want to take advantage of this heating concept. For householders, this may be seen as being advantageous for “always-needed low-output” heating applications such as kitchens, downstairs areas in split-level houses and similar areas.

In some cases, Qarnot could make it feasible to have the Q.Rad heaters provide services to a network, whether as a router, NAS, home-automation hub or something similar. This could be achieved through the use of extra hardware or software to fulfil these tasks.

What Qarnot has done is to harvest waste heat from computing processes and use this for heating rooms in buildings with little cost to the building owner.