edge computing Archive

Google Nest Mini uses edge computing to improve search performance

Articles

Google Nest Mini smart speaker press picture courtesy of Google

The Google Nest Mini smart speaker – a follow-on from the Home Mini smart speaker, with its own local processing to improve Google Assistant’s responsiveness

Google Nest Mini gets louder and gains onboard Assistant processing | SlashGear

Google debuts Nest Mini with wall mount and dedicated ML chip | VentureBeat

From the horse’s mouth

Google

Nest Mini

Nest Mini brings twice the bass and an upgraded Assistant (Product Blog Post)

My Comments

The Google Nest Mini smart speaker, the successor to the Google Home Mini smart speaker, brings a significant number of improvements including a richer sound. But it also introduces the idea of locally processing your voice commands for better Google Assistant performance.

The traditional approach to processing voice commands that are said to a smart speaker or similar device is for that device to send them out as a voice recording to the cloud servers that are part of the voice-driven assistant platform. These servers then implement their artificial-intelligence and machine-learning technology to strip background noise, interpret the commands and supply the appropriate replies or actions back to that device.

But Google has improved on this by taking a leaf out of the edge-computing book. This is where some of the data storage or processing is performed local to the user before the data is sent to a cloud computing system. Here, Google uses a dedicated machine-learning processor chip in the Nest Mini smart speaker to do some of the command processing before sending data about the user’s command to the Google Assistant cloud system.

This moves the Google Nest Mini beyond being a simple conduit between your home network and the Google Assistant cloud. The key benefit is a quicker response from the Google Assistant via that device. You also reduce the Internet bandwidth associated with handling the voice-driven home-assistant activity, avoiding reduced performance for online gaming or multimedia streaming.
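
The local-versus-cloud split can be sketched in a few lines of Python. Everything here is an illustrative assumption: the command table, the function names and the cloud-lookup callable stand in for whatever Google actually runs on the device and in its cloud, which isn't public.

```python
# Hypothetical sketch of the edge-processing idea: handle common commands
# on the device itself and only fall back to the cloud for anything else.
# LOCAL_COMMANDS and handle_command are illustrative names, not Google's
# actual implementation.

LOCAL_COMMANDS = {
    "volume up": "volume_up",
    "volume down": "volume_down",
    "stop": "stop_playback",
}

def handle_command(utterance, cloud_lookup):
    """Return (action, handled_locally) for a recognised utterance.

    cloud_lookup is a callable standing in for the round trip to the
    assistant's cloud service; it is only invoked on a local miss.
    """
    key = utterance.strip().lower()
    if key in LOCAL_COMMANDS:
        # Local hit: no network round trip, so the response is immediate.
        return LOCAL_COMMANDS[key], True
    # Local miss: hand the utterance off to the cloud service instead.
    return cloud_lookup(utterance), False
```

The bandwidth saving in the sketch comes from the local hit never touching `cloud_lookup` at all, which is the same reasoning behind the latency improvement described above.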

Google is working on taking this further by having Google-Assistant-based devices with this kind of local processing handle the logic associated with user requests and programmable actions locally. This also includes keeping the logic for the Assistant liaising with other smart devices local to your home network, allowing for improvements to performance, user privacy and data security.

Amazon and others could see this as a way to improve the performance of their voice-driven home-assistant platforms, more so as the competition between these platforms becomes keener. As well, there could be a chance for third-party Google Assistant (Home) implementations to look towards using local processing to improve the Assistant’s response.

An issue that will crop up is how multiple devices with this kind of local processing on the same home network can help each other to increase the voice-driven assistant’s performance. This could also include a software approach where the devices equipped with the local processing provide improved performance for those that don’t have it. It will matter for the likes of the Google Nest Mini and similar entry-level devices that appeal to the idea of having many installed around the house, along with the idea of equipping smart displays with this kind of local processing.

What I see of this is that the use of edge-computing technology is coming to the fore as far as improving responsiveness in the common voice-driven home-assistant platforms.

Different kinds of cloud IT systems – what to be aware of

Apple iPad Pro 9.7 inch press picture courtesy of Apple

The iPad is seen as part of the cloud-based mobile computing idea that Silicon Valley promotes

Very often “cloud” is used as a Silicon-Valley-based buzzword when describing information-technology systems that have any sort of online data-handling abilities.

This is more so if the IT system is sold to the customer “as a service” where the customer pays a subscription to maintain use of the system. It also is used where the user’s data is stored at an online service with minimal data-processing and storage abilities at the user’s premises.

Small-business users are being sold on these systems typically due to reduced capital expenditure or reduced involvement in maintaining the necessary software. It also allows a small business to “think big” when it comes to their IT systems without paying a king’s ransom.

What is strictly a cloud system

Single Server online system

But, strictly speaking, a cloud-based system relies on multiple online locations to store and/or process data. Such a system would have multiple computers at multiple data centres processing or storing the data, whether in one geopolitical jurisdiction or many depending on the service contract.

This is compared to the single-server online IT system sold as a service, which implements at least a Web-based “thin client” where you work the data through a Web page and, perhaps, a mobile-platform native app to work your data on a smartphone or tablet. Typically, the data would be held on one system under the control of the service provider, with this system existing at a data centre. It works in a similar vein to common Internet services like email or Web hosting, with the data held on a server provided by the Web host or ISP.

Hybrid cloud systems

Hybrid Cloud online system

Hybrid Cloud online system with primary data kept on premises

One type of cloud system is what could best be described as a “hybrid” system that works with data stored primarily on the user’s premises. This is typically to provide a small private data cloud that replicates data across branches of a small business; to provide online and mobile functionality, such as allowing you to manage the data through a Web page or native mobile-platform app anywhere in the world; or to provide messaging abilities through a mobile-messaging platform.

For example, a lot of NAS units are marketed as “cloud” NAS units even though these devices keep the user’s data on their own storage media. Here, they use the “cloud” functionality to improve discovery of the device from the Internet when the user enables remote access or data-syncing between two NAS devices via the Internet. This is due to the reality that most residential and some small-business Internet connections use outside IP addresses that change frequently.

WD MyCloud EX4100 NAS press image courtesy of Western Digital

WD MyCloud EX4100 NAS – one of the kind of NAS units that uses cloud functionality for online access

Or a small medical practice that keeps its data on-premises is sold a “cloud-based” messaging and self-service appointment-management add-on to its IT system. Here, the core data is held on-premises, but the messaging functionality, the Web-based user interface and the necessary “hooks” enabling the mobile-platform native app for the self-service booking function are hosted on a cloud service built up by the add-on’s vendor. When a patient uses the mobile-platform app or Web front-end to book or change an appointment, they alter the data on the on-premises system through the cloud-hosted service.

It may also be used with something like an on-premises accounting system to give business functionality like point-of-sale abilities to a mobile-platform device like an iPad through the use of a cloud-based framework. But the core data in the on-premises system is altered by the cloud-based mobile-platform setup as each transaction is completed.

Full-cloud systems

Full Cloud online system

Full Cloud online system with data processing and storage across multiple different computers

On the other hand, a full-cloud system has the user’s primary data held online across one or more server computers with minimal local hardware or software to work the user’s data. There may be some on-premises data-caching to support offline operation, such as to provide transaction capture if the link is down, or simply to improve the system’s performance.

The on-premises data-caching can be extended to a system that has concurrent data storage both in the cloud and on the premises. This may help businesses who have invested in on-premises data-storage by providing them with the security and flexibility of a cloud setup with the fail-safe and responsive operation of the on-premises setup. In some cases, it may also be about facilitating secure payment-card transaction processing with that taking place in a separate compliant network hosted by the IT solution provider.
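
The “transaction capture if the link is down” idea can be sketched as a simple store-and-forward ledger. This is an illustrative Python sketch under my own assumptions, not any vendor’s actual product: transactions always land in local storage first, and a queue of unconfirmed transactions is flushed to the cloud whenever the link is available.

```python
# Store-and-forward sketch of offline transaction capture: the local
# write always succeeds, and pending transactions are pushed to the
# cloud service when the link comes back. send_to_cloud is a stand-in
# for whatever API the real cloud service exposes.

class OfflineCapableLedger:
    def __init__(self, send_to_cloud):
        self._pending = []        # transactions not yet confirmed by the cloud
        self._local_store = []    # on-premises copy, kept regardless
        self._send = send_to_cloud

    def record(self, transaction):
        # Local capture always succeeds, even with the link down.
        self._local_store.append(transaction)
        self._pending.append(transaction)

    def flush(self):
        """Try to push pending transactions; stop at the first failure."""
        while self._pending:
            try:
                self._send(self._pending[0])
            except ConnectionError:
                break  # link is down; retry on the next flush
            self._pending.pop(0)
        return len(self._pending)  # how many are still waiting
```

The point of the design is that the customer-facing operation (`record`) never depends on the Internet link, which is exactly the fail-safe behaviour the concurrent cloud-and-premises setup is sold on.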

The IT infrastructure for a full-cloud system will have some measure of scalability to allow for an increasing customer base, typically with the service provider annexing more computer power as the customer base increases. Such a service will have tiered pricing where you pay more for increased capacity.

Client software types

The user interface for an online or cloud IT system would primarily be Web-driven, where you work the data with a Web browser. On the other hand, it could use native client software that works tightly with the client computer’s operating system, whether as a “thick” client with a significant amount of local data-processing or storage on the endpoint computing device, or a “thin” client which simply provides a window to the data.

Public vs private cloud

Another concept regarding cloud-based IT is the difference between a public cloud and a private cloud. The public cloud has its computing power managed on a shared platform like Microsoft Azure or Amazon Web Services, while the private cloud has all its computing power managed by the service provider or client company and effectively isolated from public access through a separate private network.

This can be a regular server-grade computer installed at each of the business’s branches, described as an internal cloud. Or it can be multiple high-grade server computers installed at data centres managed by someone else but available exclusively to the business, known as a hosted private cloud.

Data Privacy, Security and Sovereignty

Another factor that comes into question regarding cloud and online computing is the issue of data privacy, security and sovereignty.

This covers how the data is handled to assure privacy relating to end-users whom the data is about; and assurance of security over data confidential to the IT system’s customer and its end-users. It will call out issues like encryption of data “in transit” (while moved between systems) and “at rest” (while stored on the systems) along with policies and procedures regarding who has access to the data when and for what reason.

It is becoming a key issue with online services thanks to the European Union’s GDPR and similar laws being passed in other jurisdictions to protect end-users’ privacy in a data-driven world.

The issue of data sovereignty covers who has effective legal control over the data created and managed by the end-user of the online service, along with which geopolitical area’s rules the data is subject to. Some users pay close attention to this thanks to jurisdictions like the continental-European countries that value end-user privacy and similar goals heavily.

There is also the issue of what happens to this data if the user wants to move to a service that suits their needs better or if the online service collapses or is taken over by another business.

Cloudlets, Fog Computing and Edge Computing

Edge Computing setup

Edge computing setup where local computing power is used for some of the data handling and storage

This leads me to the concept of “edge computing”, which also goes by names like “fog computing” or “cloudlets”. It involves computing devices relatively local to the data-creation or data-consumption endpoints that store or process data for the benefit of these endpoints.

An example could be a small desktop NAS, especially a high-end unit, on a business premises that handles data coming into or going out of a cloud-based online service from endpoint devices installed on that premises. Or it could be a server installed in the equipment rack at a telephone exchange that works as part of a content-delivery system for customers who live in the neighbourhood served by that exchange.

Qarnot Q.Rad press image courtesy of Qarnot

Qarnot Q.Rad room heater that is a server computer for edge-computing setups

Similarly, the Qarnot approach, which uses servers that put their waste heat towards heating rooms or creating domestic hot water, implements the principle of edge computing. Even the idea of a sensor drone or intelligent video-surveillance camera that processes the data it collects before it is uploaded to a cloud-based system is about edge computing.

Edge computing is being touted as a form of decentralised data processing able to overcome the latency associated with public Internet links. As well, the concept is being underscored with the Internet of Things as a way to quickly handle data created by sensors and turn it into a form able to be used anywhere.

Conclusion

Here, the issue for those of us who buy service-based IT, whether for our own needs or for a workplace, is to know what kind of system we are dealing with. This includes whether the data is to exist in multiple locations, at the premises or at one location.

Originally published in July 2019. Updated in January 2020 in order to encompass cloud-based IT solutions that keep business data both in the cloud and on the premises as well as using cloud-based technology to facilitate secure transaction processing.

The NAS as an on-premises edge-computing device for cloud services

QNAP TS-251 2-bay NAS

QNAP TS-251 2-bay NAS – units like this could become a capable edge-computing device

The high-end network-attached storage system is a device able to augment the cloud-computing trend in various forms, by becoming a local “edge processor” for the cloud-computing ecosystem and handling the data that is created or used by end-users local to it.

High-end network-attached-storage systems

We are seeing the rise of network-attached-storage subsystems that are capable of running as computers in their own right. These are typically high-end consumer or small-business devices offered by the likes of QNAP, Synology or NETGEAR ReadyNAS that have a large app store or software-developer community.

The desktop variants range in size from half a loaf of bread to a full bread loaf, with some rack-mounted units about the size of one or two pizza boxes. This is compared to traditional servers that were the size of a tower computer.

But some of the apps work alongside public cloud-driven online services as a client or “on-ramp” between these services and your local network. A typical use case is to synchronise files held on an online storage service with the local storage on the NAS unit.

These high-end network-attached-storage devices are effectively desktop computers in their own right, with some of them using silicon that wouldn’t look out of place with a traditional desktop computer. Some of these machines even support a “local console” with a display connection and USB connections that support keyboards and mice.

Cloud computing

Cloud computing takes an online-service approach to computing needs and, in a lot of cases, uses multiple computers in multiple data centres to perform the same computing task. This is typically to host the data in or close to the end-user’s country or to provide a level of scalability and fault-tolerance in the online service approach.

Lot 3 Ripponlea café

A cafe like this could benefit from big-business technology without paying a king’s ransom thanks to cloud computing

Small businesses are showing an interest in cloud-driven computing solutions as a way to come on board with the same things as the “big end of town” without paying a king’s ransom for hardware necessary for an on-premises computing solution. In some cases, it is also about using different endpoint types like mobile-platform tablets for daily use or as a management tool, underscoring such concepts as low cost or portability that some endpoints may offer.

Typically, this kind of computing is offered “as a service” where you subscribe to the service on a regular, usually monthly or annual, basis rather than you spending big on capital expenses to get it going.

But, due to its nature as an always-online service, cloud computing can cause reliability and service-availability issues if the Internet connection isn’t reliable or the service ends up being oversubscribed. This can range from real-time services suffering latency to a cloud-computing experience becoming unresponsive or unavailable.

Then there are the issues of privacy, data security, service continuity and data sovereignty, which can crop up if you change to a different service or the service you use collapses or changes hands. This can easily happen as cloud computing faces points of reckoning and the industry goes into consolidation.

Edge computing

But trends being investigated in relation to the “Internet of Things” and “Big Data” are the concepts of “edge” and “fog” computing. These are based around the idea of computing devices local to the source or sink of the data that work with the locally-generated or locally-used data as part of submitting it to or fetching it from the cloud network.

It may allow a level of fault-tolerance for applications that demand high availability or permit scalability at the local level for the cloud-computing application. Some systems may even allow for packaging locally-used data in a locally-relevant form such as for online games to support local tournaments or an online movie service to provide a local storage of what is popular in the neighbourhood.

The ideas associated with “edge” and “fog” computing allow for the use of lightweight computer systems to do the localised or distributed processing, effectively aggregating these systems into what is a heavyweight computer system. The idea echoes early distributed-computing projects like SETI@home and Folding@home that used personal computers to solve scientific problems.
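
The aggregation idea can be illustrated with a bare-bones Python sketch: split one big job into chunks, let several lightweight nodes each process a chunk, and combine the partial results. The “nodes” here are plain functions standing in for real networked machines, so this is a sketch of the principle rather than a real distributed-computing framework.

```python
# Toy scatter-gather sketch: deal work items out to n lightweight nodes,
# run each node's share, then reduce the partial results into one answer.

def split_work(items, n_nodes):
    """Deal items round-robin into n_nodes chunks."""
    chunks = [[] for _ in range(n_nodes)]
    for i, item in enumerate(items):
        chunks[i % n_nodes].append(item)
    return chunks

def run_distributed(items, n_nodes, node_fn, combine):
    """Run node_fn on each chunk, then combine the partial results."""
    partials = [node_fn(chunk) for chunk in split_work(items, n_nodes)]
    return combine(partials)
```

For example, summing the numbers 1 to 10 across three “nodes” gives the same answer as doing it on one machine; the heavyweight result emerges from the lightweight parts, which is the SETI@home / Folding@home pattern in miniature.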

What is serving the edge-computing needs

Qarnot Q.Rad press image courtesy of Qarnot

This Qarnot Q.Rad heater is actually a computer that is part of edge computing

Some applications like drones are using on-device processing to do the local workload. Or we are seeing the likes of Qarnot developing edge-computing servers that heat your room or hot water with the waste heat these computing devices produce. But Amazon and QNAP are working on an approach to use a small-office NAS as an edge-computing device, especially for Internet-of-Things applications.

The NAS serving this role

Here, it is about making use of these ubiquitous and commonly-available NAS units for this purpose as well as storing and serving data that a network needs. In some cases, it can be about the local processing and storing of this locally-generated / locally-used data then integrating the data with what is available on the cloud “backbone”.

For some applications, it could be about keeping enough data for local needs on the NAS to assure high availability. Or it could be about providing scalability by allowing the NAS to do some of the cloud workload associated with the locally-generated data before submitting it to the cloud.
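
One way a NAS could do some of that cloud workload locally is by summarising raw sensor readings before upload. The sketch below is purely illustrative and assumes nothing about any vendor’s actual software: `summarise_readings` and `submit_batch` are hypothetical names for a pre-processing step that sends a compact summary rather than every raw reading.

```python
# Illustrative edge pre-processing on a NAS: keep the raw readings local,
# reduce each batch to a small summary, and upload only the summary.

def summarise_readings(readings):
    """Reduce a batch of raw numeric readings to a small summary dict."""
    if not readings:
        return {"count": 0}
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def submit_batch(readings, upload):
    """Summarise locally, then hand only the summary to the cloud link."""
    summary = summarise_readings(readings)
    upload(summary)   # far less bandwidth than uploading every reading
    return summary
```

However many readings a batch holds, only a four-field summary crosses the Internet link, which is where the bandwidth and scalability benefit described above comes from.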

Netgear ReadyNAS

The NETGEAR ReadyNAS on the right is an example of a NAS that is capable of being an edge-computing node

This may be of importance with IT systems that are moving from a totally on-premises approach towards the use of cloud-computing infrastructure with data being stored or processed online. It is where the focus of the cloud infrastructure is to make business-wide data available across a multi-site business or to provide “portable access” to business data. Here, a NAS could simply be equipped with the necessary software to be a smart “on-ramp” for this data.

For small and medium businesses moving towards multiple locations, such as when a successful business buys another business in another area to increase its footprint, this technology may have some appeal. Here, it could be about doing some pre-processing for data local to the premises before submitting it to the cloud as part of an online management-information system for that small operation. As well, it could be about keeping the business-wide data in sync across the multiple locations, something that may be important with price lists or business-wide ledgers.

This kind of approach works well with the high-end NAS units if these units’ operating platforms allow third-party software developers to write software for these devices. It can then open up the possibilities for hybrid and “edge” computing applications that involve these devices and the network connectivity and on-device storage that they have.

Conclusion

What needs to happen is that the high-end network-attached-storage systems of the Synology or QNAP kind need to be considered as a hardware base for localised “edge computing” in an online “cloud-computing” setup.

This can be facilitated by the vendors encouraging software development in this kind of context and by encouraging people involved in online and cloud computing to support this goal, especially for small-business computing.

WD cracks the 14 Terabyte barrier for a standard desktop hard disk

Article

HGST UltraStar HS14 14TB hard disk press image courtesy of Western Digital

Western Digital 14TB hard drive sets storage record | CNet

From the horse’s mouth

HGST by Western Digital

Ultrastar HS14 14TB hard disk

Product Page

Press Release

My Comments

Western Digital has broken the record for data stored on a 3.5” hard disk by offering the HGST by WD UltraStar HS14 hard disk.

This 3.5” hard disk is capable of storing 14TB of data, which is seen as a significant increase in data density for disk-based mechanical storage. It implements HelioSeal construction technology, which yields a hermetically-sealed enclosure filled with helium; this allows thinner platters while also permitting reduced cost, cooling requirements and power consumption.

At the moment, this hard disk is being pitched at heavy-duty enterprise, cloud and data-center computing applications rather than regular desktop or small-NAS applications. But where I see these ultra-high-capacity hard disks earning their keep is in localised data-processing applications where non-volatile secondary storage is an important part of the equation.

Such situations would include content-distribution networks like the Netflix application, or edge / fog computing applications where data has to be processed and held locally. These applications depend on relatively small devices that can be installed close to where the data is created or consumed, like telephone exchanges, street cabinets or telecommunications rooms.

I would expect this level of data density to flow on to other hard disks and devices based on them. For example, applying it to the 2.5” hard-disk form factor could see those hard disks approaching 8TB or more, yielding highly capacious compact storage devices. Or the same storage capacity could be made available in hard drives that suit regular desktop computers and NAS units.

Qarnot to use edge computing to heat your shower or swimming-pool water

Articles

Yarra's Edge apartment blocks

Qarnot will be satisfying the hot-water needs of residents in buildings like these for free

AMD’s Ryzen Pro is doing double duty as a processor and house heater | PC Gamer

Asperitas and Qarnot Collaborate in Liquid Cooling | Inside HPC

Deal puts cloud computing in boilers and heaters across Europe | EE News Europe

From the horse’s mouth

Qarnot

1500 AMD Ryzen PRO will heat homes and offices next year in Bordeaux, France (Blog post)

Asperitas

Green edge computing partnership (Press Release)

My Comments

Qarnot Q.Rad press image courtesy of Qarnot

using the same kind of technology as the Q.Rad heater

The high-performance computing industry places importance on keeping processors cool, and one way this happens is liquid cooling, which works in a similar manner to how the engine is kept cool in most vehicles. Here, a loop of liquid coolant is passed between the engine block and the radiator and heater core to shift the waste heat away from the engine and, in the typical passenger car or commercial vehicle, keep the passenger cabin at a comfortable temperature when it’s cold outside.

In the computing context, it is implemented in some of the most advanced and tricked-out gaming rigs through the use of a water loop and cooling blocks attached to each processor chip along with a radiator that disperses this heat. Qarnot and Asperitas are implementing it as a way of heating water for free using the distributed-computing “micro data center” concept that Qarnot is known for with the Q.Rad data-processing room heater.

This is also in conjunction with Qarnot implementing AMD Ryzen PRO CPUs in the next generation of the Q.Rad, because these workstation processors perform better than the previously-used Intel CPUs yet yield the same heat output. This is important because Qarnot’s distributed-processing market is focused on 3D rendering and visual effects for the movie and TV industry, and on risk analysis in the financial industry.

Gaming rig

and the same kind of liquid-based cooling technology used in some of these gaming rigs

But the Qarnot / Asperitas approach to harvesting processor heat for water heating is focused primarily on providing domestic hot water for a building’s occupants or heating the water in a swimming pool or spa installed in that building. It would be seen as having a similar environmental and running-cost advantage to a solar hot-water installation used for the same purposes. Let’s not forget that a larger number of these “edge-computing” servers dissipating their heat into the building’s hot-water infrastructure can provide the amount of heated water that a building needs.

It also yields an advantage for Qarnot during the summer season or in areas with warmer climates, because the distributed-processing concept can be maintained there thanks to our year-round need for hot water.

Pokemon Go and other similar situations may underscore the need for local micro-data-centers

Article – French language / Langue Française

A small server in an apartment block's basement communications room could be part of a distributed computing setup

A small server in an apartment block’s basement communications room could be part of a distributed computing setup

Le phénomène Pokémon GO révéle nos besoins en datacenters de proximité | L’Usine Digitale

My Comments

This year has shown a few key situations that are placing traditional data-center technology under stress. This is based around fewer large data centers placed sparsely through a geographic area and used primarily to store or process data for many businesses.

One of these is the popularity of Pokemon GO. As people started playing this augmented-reality game on their smartphones, the app had to be downloaded from the different data centers associated with the mobile platforms’ app stores. Then, during play, the game would be exchanging more data with the servers at the data centers that Niantic Labs controls. In some cases, there were problems with underperformance due to latency associated with this data transfer.

Qarnot Q.Rad press image courtesy of Qarnot

.. as could one of these Qarnot Q.Rad room-heater servers

Then lately, there was an attack, purported to be a denial-of-service attack, against the data centers being used to collect the data for the census taking place in Australia on Tuesday 9 August. This was because the census was being run as an online effort where households fill in Web pages with their data rather than filling out a large booklet that is dropped off and then collected.

Both these situations led to data-center computers associated with these tasks failing, which effectively put a spanner in the works when it came to handling these online activities.

What is being shown is that there needs to be an emphasis on so-called “edge computing” or the use of small localised data centers also known as “cloudlets” to store and process data generated in or called upon by a particular area like a suburb or an apartment block. These data centers would be linked to each other to spread the load and pass data to similar centers that need the data.

One application that Netflix put forward was their “Open Connect Appliance”, which is a storage device that an ISP or telco could install in their equipment rack if they end up with significant Netflix traffic. This box caches the local Netflix library and is updated as newer content comes online and older, locally-untouched content is removed. Such a concept could be taken further with various content-delivery networks like Cloudflare or those implemented by the popular news services or social networks.
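
The caching idea behind such an appliance can be shown with a toy Python sketch. This is my own illustration, not Netflix’s actual mechanism (the real appliance is pre-seeded and managed by the provider): here a size-limited local cache serves repeat requests itself and evicts the least recently requested title when space runs out.

```python
# Toy local content cache in the spirit of a CDN edge node: repeat
# requests are served from local storage; the least recently requested
# title is evicted when the cache is full.

from collections import OrderedDict

class LocalContentCache:
    def __init__(self, capacity, fetch_from_origin):
        self._cache = OrderedDict()   # title -> content, in LRU order
        self._capacity = capacity
        self._fetch = fetch_from_origin

    def get(self, title):
        """Return (content, served_locally) for the requested title."""
        if title in self._cache:
            self._cache.move_to_end(title)   # refresh its LRU position
            return self._cache[title], True
        content = self._fetch(title)         # miss: go back to the origin
        self._cache[title] = content
        if len(self._cache) > self._capacity:
            self._cache.popitem(last=False)  # evict least recently used
        return content, False
```

Only cache misses reach back across the Internet to the origin, so popular local content stops consuming upstream bandwidth after its first fetch, which is exactly the benefit the appliance offers an ISP.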

The trend I would initially see would be suburban telephone exchanges, cable-TV headends or similar facilities being a starting point for local micro data centers that serve a town or suburb. Then this could evolve to street cabinets associated with traffic signals, FTTC/FTTN services and the like, or the basement telecommunications rooms in multi-tenancy buildings being used for this purpose, with these smaller data centers serving their immediate local areas.

Qarnot, with its Q.Rad room heaters that are actually data servers, weighed in on the idea that a cluster of these room heaters in a premises or town could become effectively a local “micro data center”.

As for applications I would see for these “micro data centers” that implement region-focused data processing, these could include: distributed content delivery of the Cloudflare or Netflix kind; localised “store and process” for data loads like a nation’s census; online gaming of the Pokemon GO kind; and distributed-computing applications that demand high fault tolerance. There will still be the need for the “virtual supercomputer” for huge calculation loads like sophisticated financial calculations or 3D animation renderings, which a collection of these “micro data centers” could become.

Similarly, distributed localised computing concepts like edge computing and local “micro data centers” could reduce the need to create large data centers just for handling consumer-facing data.

What could be seen as affecting the direction for cloud-based computing would be the implementation of localised processing and storage in smaller spaces rather than the creation of large data centers.
