Artificial intelligence is taking the event-based “common image pool” further to facilitate the creation of high-quality videos of family and community events.
What happens at most family and community events is that most, if not all, of the attendees take pictures and videos for their own private collections. But the hosts of these events often want to create a common image or video collection representing the occasion.
This may be in addition to engaging a professional or advanced-hobbyist photographer or videographer to cover the event and present their work as the primary keepsake.
This is typically done through the host or someone else setting up a folder on an online file-sharing or similar service and inviting the guests to upload images and videos to that folder. But the host may want something better than a pile of images or videos sitting in an online folder or on removable storage.
A person with good editing skills may want to turn out an impressive audio-visual keepsake that consists of curated image, audio and video content. But they may have to claw through all the material and make sure it represents particular points of the event in the best manner.
… and cameras
There is research taking place regarding the use of artificial intelligence to simplify the content-selection and editing task in order to create the equivalent of a multiple-camera production. It is similar to a Microsoft Research effort to time-align multiple separate audio recordings of the same event to create a higher-quality master or reference recording of the whole event.
Here, it is about recognising what each person has photographed or filmed and applying metadata tags identifying each person appearing in the imagery. Content-creation timestamps are also used to automatically “time-align” audio and video content and assess synchronisation reliability. Analysis of the audio and video content itself could serve the same purpose, especially where overlapping footage of the same moment exists because one person started or stopped filming before or after someone else.
Then there is the fact that someone may photograph or film using media that doesn’t support machine-readable timestamps, such as analog media, and submit that footage to the content pool using a scanner or video-capture setup. Add to this situations where digital cameras don’t have their internal clocks set properly for various reasons, such as occasional use or a recent acquisition.
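As a rough illustration of content-based time alignment, the offset between two recordings of the same moment can be estimated by finding the lag that maximises their cross-correlation. This is a minimal sketch under simplified assumptions (mono audio as plain sample lists, the second recording starting later than the first); it is not code from the research mentioned above.

```python
import random

def estimate_offset(ref, other, sample_rate):
    """Estimate how many seconds 'other' starts after 'ref' by finding
    the non-negative lag that maximises their cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    # Slide 'ref' along 'other' and score each alignment
    for lag in range(len(other) - len(ref) + 1):
        score = sum(r * other[i + lag] for i, r in enumerate(ref))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag / sample_rate

# Toy example: a noise-like clip "recorded" twice, the second copy 0.5s late
random.seed(1)
rate = 100  # samples per second
clip = [random.gauss(0, 1) for _ in range(400)]
delayed = [0.0] * 50 + clip  # starts 50 samples (0.5 seconds) later

print(estimate_offset(clip, delayed, rate))  # prints 0.5
```

In practice this would run against downsampled audio fingerprints rather than raw samples, and the height of the correlation peak itself could serve as the “synchronisation reliability” score mentioned above.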
This will also benefit large-screen views at events like this Polish music festival
The images and video would be analysed for distance between objects and photography attributes like camera direction and orientation. The content would then be classified with respect to objects, scenery, speech, people’s faces, image quality and overall content assessment. This could then drive switching between videos or images to yield the best pictures of key subjects, including creating that multi-camera vision of a key moment in the event, independently of the timestamp issues mentioned above.
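A much-simplified sketch of how such classifications could drive camera switching: once each clip carries metadata about who is visible and how good the picture is, picking the best source at any moment becomes a scoring problem. The field names and ranking here are illustrative assumptions, not from any published system.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    camera: str
    start: float            # seconds from the start of the event
    end: float
    quality: float          # assumed 0..1 sharpness/exposure score
    key_subjects: int       # how many key subjects are visible

def best_source(clips, t):
    """Pick the clip to cut to at time t: favour shots of key subjects
    first, then overall image quality. The ranking is an assumption."""
    candidates = [c for c in clips if c.start <= t < c.end]
    if not candidates:
        return None
    return max(candidates, key=lambda c: (c.key_subjects, c.quality))

clips = [
    Clip("guest-phone", 0, 60, quality=0.4, key_subjects=1),
    Clip("host-camcorder", 10, 90, quality=0.8, key_subjects=2),
]
print(best_source(clips, 30).camera)  # prints host-camcorder
```

Running this selection repeatedly over short time windows would approximate the multi-camera “vision switching” described above.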
The researchers even envisage realtime multicamera video production using AI-driven image recognition along with high-speed low-latency wireless networking technology such as Wi-Fi 6 or 5G mobile broadband setups.
… or a contemporary praise-and-worship service at a church like Neuma Church
I would also see this apply to the big screen vision used at sporting, cultural and similar events where the organisers set up one or more big screens showing vision of the performers or athletes as they perform. In this use case, there would be audience members taking pictures or video of the event for their own collections and they could contribute that kind of vision to the video mix appearing on the big screens.
There would be the ability to automatically edit the content together to create the final content. This would be through selecting video clips and images, mixing the audio and implementing visual themes and special effects to add polish to the content. In some cases, it could be about drafting in content from other sources such as inserting stock imagery or “filling out” music used in a video taken during the event with an album-quality stereo recording of that music like what is done in film and TV.
On the other hand, this kind of editing could be performed on a semi-automatic basis to allow you to have some creative control over the final product. An example of this would be to time-align videos and stills of an event so you can choose the images you want to represent it with and how they should appear. Or there could be the desire to offer a short version of the video for viewers who are short on time as well as a longer, more detailed, version.
This research will also include means to authenticate images that come in for editing and to authenticate the master recording for distribution. Security will be further assured through account-driven operation that implements usernames and passwords or similar user authentication, along with device-level authentication and image watermarking.
As well, it is about being able to repackage the master recording for use cases like the Social Web or home-theatre playback. Here, it may be about offering different aspect ratios or image resolutions or offering a stereo or surround soundmix.
This kind of artificial intelligence for compiling images and videos from different sources into a keepsake video or live video display could appeal to people who don’t have strong editing skills or who simply want to make the job easier.
Traditional TV will soon be able to be delivered via the Internet
A direction we are expecting for broadcast radio and TV technology is to stream it via Internet-based technologies while assuring users of an experience similar to how they have traditionally received broadcast content.
It is about being able to use the agile wired and wireless Internet technologies like 5G mobile broadband, fibre-to-the-premises, fixed-wireless broadband; and Ethernet and Wi-Fi wireless local area networks to deliver this kind of content.
What is the goal here
… with the same kind of familiar user experience like channel surfing with a remote control
The goal here is to provide traditional broadcast radio and TV service through wired or wireless broadband-service-delivery infrastructure in addition to or in lieu of dedicated radio-frequency-based infrastructure.
The traditional radio-frequency approach uses specific RF technologies like FM, DAB+, DVB and ATSC to deliver audio or video content to radio and TV receivers. This can be terrestrial to a rooftop, indoor or set-attached antenna referred to in the UK and most Commonwealth countries as an aerial; via a cable system through a building, campus or community; or via a satellite where it is received using special antennas like satellite dishes.
The typical Internet-Protocol network used for Internet service uses different transport media, whether that be wired or wireless. It can be mobile broadband receivable using a mobile phone; a fixed setup like fibre-to-the-premises, fixed wireless or fibre-copper setups. As well, such networks typically include a local-area network covering a premises or building that is based on Ethernet, Wi-Fi wireless, HomePlug or G.Hn powerline, or similar technologies.
The desirable user experience
… as will radio be delivered that way, offering a very rich user experience
It also is about providing a basic setup and use experience equivalent to what is expected for receiving broadcast radio and TV service using digital RF technologies. This includes “scanning” the wavebands for stations to build up a directory of what’s available locally as part of setting up the equipment; using up/down buttons to change between stations or channels; keying “channel numbers” into a keypad to select TV channels according to a traditional and easy-to-remember channel numbering approach; using a “last-channel” button to flip between two different programmes you are interested in; and allocating regularly-listened-to stations to preset buttons so you have them available at a moment’s notice.
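The behaviours listed above map onto a simple station-directory model that a receiver could maintain regardless of whether the stations arrive by RF or by IP. This sketch is purely illustrative and doesn’t follow any broadcast standard’s data model.

```python
class StationDirectory:
    def __init__(self):
        self.by_number = {}   # channel number -> station name
        self.current = None
        self.previous = None

    def add(self, number, name):
        """'Scanning' would populate the directory with entries like this."""
        self.by_number[number] = name

    def tune(self, number):
        """Direct channel-number entry; returns the station now showing."""
        if number in self.by_number:
            self.previous, self.current = self.current, number
        return self.by_number.get(self.current)

    def last_channel(self):
        """The 'last-channel' button: flip back to the previous channel."""
        return self.tune(self.previous)

    def step(self, direction=+1):
        """Up/down channel surfing through the sorted channel list."""
        numbers = sorted(self.by_number)
        i = numbers.index(self.current)
        return self.tune(numbers[(i + direction) % len(numbers)])

d = StationDirectory()
d.add(2, "ABC"); d.add(7, "Seven"); d.add(9, "Nine")
d.tune(7)
print(d.step(+1))        # prints Nine
print(d.last_channel())  # prints Seven
```

Preset buttons would simply be a second mapping from button number to channel number on top of the same directory.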
This has been extended to a richer user experience for broadcast content in many ways. For TV, it has extended to a grid-like electronic programme guide which lists what is showing now or will be shown in the coming week on all of the channels so you can switch to a show you like or have that show recorded. For radio, it has been about showing more detail about what you are listening to, like the name of the song currently playing. Even ideas like prioritising or recording the news or traffic announcements that matter, or selecting content by type, have become other desirable parts of the broadcast user experience.
Relevance of traditional linear broadcasting today
There are people who cast doubt on the relevance of traditional linear broadcast media and its associated experiences in this day and age.
This is brought about through the use of podcasts, Spotify-like audio streaming services, video-on-demand services and the like, which can offer a wider choice of content than traditional broadcast media.
But some user classes and situations place value upon the traditional broadcast media experience. Firstly, Generation X and prior generations have grown up with broadcast media as part of their life, thanks to affordable sets with a common user experience and an increasing number of stations or channels being available. Here, these users often turn to broadcast media for casual viewing and listening, with a significant number of them recording broadcast material to enjoy again on their own terms.
Then there is the reliance on traditional broadcast media for news and sport. This is due to the ability to receive up-to-date facts without needing to do much. Let’s not forget that some users rely on this media experience for discovery of content curated by someone else like staff at a TV channel or a radio station rather than an online service’s content-recommendation engine. Even the on-air talent is valued by a significant number of listeners or viewers as personalities in their own right because of how they present themselves on radio or TV.
Access without traditional radio-frequency infrastructure
One of the goals here is to allow access to traditional broadcast radio and TV without being dependent on particular radio-frequency infrastructure types and reception conditions. This can allow someone to offer a linear broadcast service with all the trappings of that service without needing access to RF-based broadcast technologies like a transmitter.
To some extent, it could be a method to use the likes of SpaceX Starlink or 5G mobile broadband to deliver radio and TV service to rural and remote areas. This could come into its own where the goal is to provide the full complement of broadcasting services to these areas.
It also is encompassing a situation happening with cable-TV networks in some countries where these networks are being repurposed purely for cable-modem Internet service. As well, some neighbourhoods don’t take kindly to satellite dishes popping up on the roofs or walls of houses, seeing them as a blight. Here, multi-channel pay-TV operators have had to consider using Internet-based delivery methods to bring their services to potential customers without facing these risks.
It can also be about using smartphones, tablets and laptops to watch TV with the full experience
Let’s not forget that IP-based data networks are being seen as a way to extend the reach of traditional broadcast services in to parts of a building that don’t have ready access to a reliable RF signal or traditional RF infrastructure. This may be due to it being seen as costly or otherwise prohibitive to extend a master-antenna TV setup to a particular area or to install a satellite dish, TV aerial or cable-TV connection to a particular house.
In the portable realm, it extends especially to smartphones or mobile-platform tablets even where these devices may have a broadcast-radio or TV tuner. But broadcast reception using these tuners only becomes useful if you plug a wired headset in to the mobile device’s headset jack, because of a long-standing design practice with Walkman-type personal radio devices where the headset cable is the device’s FM or DAB+ antenna. Here, the smartphone could use mobile broadband or Wi-Fi for broadcast-radio reception if you use its speaker or a Bluetooth headset to listen to the radio.
Complementing traditional radio infrastructure
Broadcast content is repackaged in a manner to be delivered via Internet Protocol rather than broadcast means
In the same context, it is also being considered as a different approach to providing “broadcast-to-LAN” services where broadcast signals are received from radio infrastructure via a tuner-server device and streamed in to a local-area network. This could allow the client device to choose the best source available for a particular channel or station.
But even the “broadcast-to-LAN” approach can be improved upon by providing an equivalent user experience to a traditional RF-based broadcast setup. It would benefit buildings or campuses where a traditional aerial or satellite dish is installed at the optimum location but Ethernet cabling, Wi-Fi wireless or similar technologies, including a mixture of such technologies, distribute the broadcast signal around the development.
As well, some of these setups may be about mixing the traditional broadcast channels and IP-delivered content in to a form that can be received with that traditional broadcast user experience. Or it can be about seamlessly switching between a fully-Internet-delivered source and the broadcast stream provided by a broadcast-LAN server to the local network that is providing Internet service. This can cater towards broadcast-LAN setups based around devices that don’t have enough capacity to serve many broadcast streams.
Even a radio or TV device could maintain a traditional user experience while content is delivered over both traditional RF infrastructure and Internet-based infrastructure. This could include a situation where an alternative content stream is offered via the Internet while the main content is offered via the station’s traditional RF means, or independent broadcast content being delivered without the need for access to RF infrastructure or spectrum.
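A hybrid receiver’s source selection could be as simple as preferring the RF tuner while its signal is healthy and falling back to the Internet stream otherwise. This toy sketch assumes a normalised signal-quality figure and an arbitrary threshold; it is not from any broadcast standard.

```python
def choose_source(rf_signal_quality, internet_available, threshold=0.6):
    """Pick where to pull the channel from; same channel, same user
    experience either way. 'threshold' is an arbitrary assumed cut-off."""
    if rf_signal_quality >= threshold:
        return "rf-tuner"
    if internet_available:
        return "internet-stream"
    # Degraded RF is still better than nothing at all
    return "rf-tuner" if rf_signal_quality > 0 else None

print(choose_source(0.9, True))   # prints rf-tuner
print(choose_source(0.3, True))   # prints internet-stream
print(choose_source(0.3, False))  # prints rf-tuner
```

A real receiver would also debounce this decision so the picture doesn’t flap between sources on a marginal signal.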
Similarly, some digital-broadcast operators are wanting to implement networks typically used for Internet service delivery as a backhaul between a broadcaster’s studios and the transmitter. Here, it is seen as a cost-effective approach due to a reduced need to create an expensive pure-play wired or wireless link to the transmitter. Rather they can rely on a business-grade Internet service with guaranteed service quality standards for this purpose.
Even a master-antenna system that is set up to provide a building’s or development’s occupants access to broadcast content via RF coaxial-cable infrastructure could benefit this way. This could be about repackaging broadcasters’ content from Internet-based links offered by the broadcasters in to a form deliverable over the system’s RF cable infrastructure rather than an antenna or satellite dish to bring radio and TV to that system. It could be also seen as a way to insert extra content for that development through this system such as a health TV channel for hospitals or a tourist-information TV channel for hotels.
How is this approach being taken
Here, a broadcast-ready linear content stream or a collection of such streams that would be normally packaged for a radio-frequency transport is repackaged for a data network working to IP-compliant standards. This can be done in addition to packaging that content stream for one or more radio-frequency transports.
This approach is built on the idea of the ISO OSI model of network architecture where top-level classes of protocols can work on many different bottom-level transports, with this concept being applied to broadcast radio and TV.
The IP-based network / Internet transport approach can allow the broadcast stream or stream collection to be repackaged for an RF transport with minimal effort. A use case this would apply to is using a business-standard Internet service as a backhaul for delivering radio or TV service to multiple transmitters.
It is different from the Internet-radio or “TV via app” approach where a collection of broadcasters is streamed via Internet means. Those setups rely primarily on online content directories operated by the broadcasters themselves or third parties like TuneIn Radio or Airable.net, and they don’t typically offer broadcast-like user experiences like channel-surfing or traditional channel-number entry.
At the moment, the DVB Group who have effectively defined the standards for digital TV in Europe, Asia, most of Africa, and Oceania have worked on this approach through the use of DVB-I (previous coverage on this site) and allied standards for television. This is in addition to the DVB Home Broadcast (DVB-HB) standard released in February this year to build upon SAT-IP towards a standardised broadcast-to-LAN setup no matter the RF bearer.
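To illustrate the service-list concept behind this approach, here is a deliberately simplified, hypothetical XML layout rather than the real DVB-I schema (actual DVB-I service lists follow DVB’s A177 specification). The idea is that a receiver looks up a familiar channel number and discovers both RF and IP delivery options for it.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified service list -- NOT real DVB-I XML.
SERVICE_LIST = """
<serviceList>
  <service lcn="7" name="Seven">
    <delivery type="dvb-t" uri="dvbt://ch7-mux-params"/>
    <delivery type="dash" uri="https://example.com/seven/manifest.mpd"/>
  </service>
  <service lcn="9" name="Nine">
    <delivery type="dash" uri="https://example.com/nine/manifest.mpd"/>
  </service>
</serviceList>
"""

def sources_for_channel(xml_text, lcn):
    """Return the delivery options for a logical channel number so a
    receiver can pick RF or IP depending on what it can reach."""
    root = ET.fromstring(xml_text)
    for service in root.iter("service"):
        if service.get("lcn") == str(lcn):
            return [(d.get("type"), d.get("uri")) for d in service.iter("delivery")]
    return []

print([t for t, _ in sources_for_channel(SERVICE_LIST, 7)])  # prints ['dvb-t', 'dash']
```

This is what preserves the channel-number and channel-surfing experience: the directory, not the transport, defines what “Channel 7” means.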
Another advantage that is also being seen is the ability for someone to get “on the air” without needing to have access to radio-frequency spectrum or be accepted by a cable-TV or satellite-TV network. This may appeal to international broadcasters or to those offering niche content that isn’t accepted by the broadcast establishment of a country.
What is it also leading to
This is leading towards hybrid broadcast and broadband content-delivery approaches. That is where content from the same broadcaster is delivered by RF and Internet means with the end user using the same user experience to select the online or RF-broadcast content.
One use case is to gain access to supplementary content from that broadcast via the Internet no matter whether the viewer or listener enjoys the broadcaster through an RF-based means or through the Internet. This could be prior episodes of the same show or further information about a concept put forward in an editorial program or a product advertised on a commercial.
For radio, this would be about showing up-to-date station branding alongside show names and presenter images. If the show is informational, there would be rich visual information like maps, charts, bullet lists and the like to augment the spoken information.
If it is about music, you would see the title and artist of what’s playing, perhaps with album cover art and artist images. For classical music, where people think primarily of a work composed by a particular composer, this may be about the composer and the work, perhaps with a reference to the currently-playing movement. Operas and other musical theatre may have the libretti shown in real time with the performance.
In all music-related cases, there may be the ability to “find out more” on the music and who is behind it or even to buy a recording of that music, whether as physical media like an LP record or CD, or as a download-to-own file.
For TV content, this would be about a rich experience for sports, news, reality and similar shows. For example, the Seven Network created an improved interactive experience for the 2021 Tokyo Olympics and Paralympics by using 7Plus to provide direct access to particular sports types during the Games. A true hybrid setup on equipment with a broadcast tuner would allow a user to select Channel 7 or 7Mate for standard broadcast feeds using the 7Plus user experience with the broadcast feeds supplied by the broadcast tuner or the Internet stream depending on the signal quality.
Issues to consider
There are issues that will be raised where broadcast radio and TV are delivered over Internet infrastructure with the goal of a broadcast-like user experience.
One of these is to assure users don’t pay extra costs for this kind of reception compared to delivery by RF-based means. Here, these Internet-based broadcast setups would have to be “zero-rated” so that users don’t incur data costs on metered Internet services like mobile broadband.
As well, broadband infrastructure providers would need to assure transparent access to Internet-based broadcast setups so that users have access to standard broadcasters without being dependent on service from particular retail ISPs or mobile carriers. It may also be about making sure that one can receive broadcast content with the broadcast user experience anywhere in a typical local network.
Another factor to be considered as far as DVB-I or similar technologies are concerned is whether this impacts on content providers’ liabilities regarding broadcast rights for music and sports content. Here, some sports leagues or music copyright collection bodies consider Internet-based distribution as different from traditional broadcast media and add extra requirements on this distribution approach.
It can be about availability of content beyond the broadcaster’s home country, in a manner that contravenes a blackout requirement or competes with the party holding exclusive rights for that territory. It is also similar to “grey-importing” of music rather than acquiring it through official distribution channels, which also brings in content not normally available in a particular country.
These issues may be answered through a framework of various legal protections and universal-service obligations associated with providing free-to-air broadcast content. It would be driven more so by countries who have a strong public-service and/or commercial free-to-air broadcast lobby.
Internet-based technologies are effectively being seen as a way to extend the reach of, or improve upon, the broadcast-media experience without detracting from its familiar interaction approaches. This is thanks to research into technologies that repackage broadcast signals, normally prepared for an RF transport, into a form suited for Internet use.
Where Wi-Fi HaLow fits in to the Wi-Fi network spectrum
A Wi-Fi network technology that is being put on the map at the moment is Wi-Fi CERTIFIED HaLow, a.k.a. Wi-Fi HaLow.
This network technology is based on IEEE 802.11ah wireless network technology and works on the 900MHz waveband. This waveband has been allocated in a significant number of jurisdictions for a range of low-power wireless applications like baby monitors and cordless phones, although other wavebands have also been opened up for this kind of use. The technology is about long-range operation of approximately 1 kilometre from the access point and very low power operation that allows devices to run for a year on commodity batteries like a single 3V coin-size cell or a pair of AA-size Duracells.
The power requirement may be a non-issue for devices like HVAC thermostats that are wired to the heating system they control. But it may be an issue with devices like movement sensors or smart locks that depend on their own battery power. As well, the low power requirements that Wi-Fi HaLow offers could benefit devices that implement energy-harvesting technology like solar power or kinetic energy.
One use case will be smart locks and other “Internet of Things” devices
This low-bandwidth Wi-Fi specification is intended to complement the other Wi-Fi specifications used with your home or business network. But it is focused towards the Internet of Everything especially where the devices are to be operated across a wide radius like a farm or campus.
The network topology for a Wi-Fi HaLow network segment will be very similar to that of a standard Wi-Fi network. That is where multiple client devices link to an access point, but there should be the ability for a mobile device to roam between access points associated with the same Wi-Fi network.
Compared to the likes of 802.15.4-based Zigbee, Z-Wave, DECT-ULE, Bluetooth LE and similar Internet-of-Things wireless technologies, this is meant to avoid the need for special routers when there is a desire to link the devices to IP-based networks.
This is because this technology effectively uses the same protocol stack as our Wi-Fi networks save for the layers associated with the radio medium. It also means that the same security, connectivity and quality-of-service protocols that are part of Wi-Fi nowadays like EasyConnect and WPA3 can be implemented in Wi-Fi HaLow devices.
At the moment, you would need to use a Wi-Fi HaLow access point to get any Internet-of-Things devices on to your network and the Internet. It may be a small device that plugs in to your existing home network router or network infrastructure. But a subsequent Wi-Fi access point or router design could have built-in support for this standard thus making it more ubiquitous.
The use cases being positioned for Wi-Fi HaLow technology would encompass the smart home, the smart building and the smart city where all sorts of “Internet-of-Things” devices act as controllers or sensors. It also encompasses vertical use cases like agriculture, industry and medicine where sensors come into play.
At the moment, this kind of connectivity will exist as an alternative to Zigbee, Z-Wave and similar technologies especially where IP-level connectivity and functionality is wanted at the device. It may not have ready appeal in use cases where a direct connection to Internet-based technology may not be required.
On the other hand, a use case could allow for a “hub and spoke” approach to the Internet of Things where a device connects to accessory peripheral devices using Zigbee or Bluetooth but links to the home network and Internet via Wi-Fi HaLow. An example of this could be a retrofit-install smart lock which supports the use of accessory input devices like keypads, NFC card/fob readers and contact sensors.
Wi-Fi HaLow could be seen as a direction towards capable low-power long-distance wireless networking for Internet of Things, especially where direct Internet / LAN network connectivity is desired out of the application.
The Puget Sound area of Washington State in the USA now has two actors in the low-earth-orbit satellite broadband game.
This was initially Jeff Bezos’s Project Kuiper effort that is starting to pick up steam, but Boeing, known for well-known airliners you have most likely flown on many times, is now getting the go-ahead to build a constellation of these satellites.
The initial FCC permit will allow Boeing to launch 147 LEO satellites which will be for civil-use cases like residential, commercial and institutional use initially within the USA then globally. The wavebands they will be licensed to work in are part of the V-band radio spectrum for both space-to-ground and inter-satellite communications. They have six years to develop the constellation and launch half of the satellites as part of the licence.
Here, it will be about Boeing joining a relatively-crowded market for LEO satellite broadband which will be a boon for use cases like real broadband in rural and remote areas; alongside broadband Internet within transport services.
Boeing could be more than those commercial airliners we have flown in but extend to satellite broadband Internet for rural communities
But how will Boeing join this market? Could this be through offering a retail service like SpaceX’s Starlink, or through offering a wholesale service in a similar manner to OneWeb, where retail ISPs could resell Boeing’s service to local customers?
There will be the issue of having a retail service licensed for operation in multiple countries, especially where some countries prefer that companies chartered in their jurisdiction offer telecommunications and allied services. A wholesale approach can allow a country’s own telcos and ISPs to resell satellite broadband to all user classes.
There is also the question about Boeing being tempted to vertically integrate this service with their lineup of civil aircraft. This could mean that they could get more airlines who fly the likes of the 737 or the 787 Dreamliner to offer a high-bandwidth Internet service provided by their LEO satellite constellation as a passenger amenity.
If Boeing can get these low-earth-orbit broadband satellites off the ground and yielding a viable service, this could make for a viably competitive satellite-broadband market.
The International Telecommunication Union’s G.Hn HomeGrid standard is expected to become a significant new direction in “wired no-new-wires” network technology. Such technology makes use of wiring infrastructure that is in place within a premises for purposes such as providing AC mains power, providing a telephone service or connecting a TV to an outdoor TV antenna or cable / satellite TV setup.
This is for both the in-premises local-area network and for the Internet / WAN “access” network that brings your Internet service to your home-network router.
This technology primarily works as an alternative powerline / AC-wire network technology to the established HomePlug family of powerline-network technologies. But it is also competing with MoCA for the TV coaxial-cable medium, and the HomeGrid Forum took over the HomePNA standard for phone-line-based on-premises networks.
The HomePlug Alliance has effectively abandoned continued development of the HomePlug series of powerline-network standards. At the moment, the latest standard is HomePlug AV2 MIMO, which can go to 2000Mbps, although a significant number of device manufacturers and IT retailers are continuing to make and sell devices that work to the 1200Mbps bandwidth.
G.Hn HomeGrid has taken the powerline network further by offering the HomeGrid MIMO variant that can move at least 2000Mbps of data. Like the HomePlug AV2 MIMO standards, this uses the active / phase / line, neutral and earth / ground wires of the mains-power plug to carry the data, thus better assuring users of robust data transmission across a building’s general AC wiring infrastructure.
The G.Hn HomeGrid powerline network standards have been refined also to increase data transmission robustness where there are many powerline networks operated together. This would be, perhaps, a situation that takes place within a large multiple-premises building like an apartment block, shopping centre or office block and would suit today’s urban-design expectations of mixed-use multi-premises developments. Some people would also hold this true for a dense neighbourhood of terrace / townhouse, semi-detached or similar homes.
A G.Hn HomeGrid powerline network can co-exist with a HomePlug AV2 powerline network in the same building, but the two aren’t directly compatible with each other. This is similar to first-generation HomePlug powerline networks operating alongside second-generation HomePlug AV / AV2 powerline networks.
Personally I see G.Hn HomeGrid being used to “take the powerline network further” to higher bandwidths, increased robustness, further distances (500 metres compared to 400 metres for HomePlug AV2 MIMO) and other future needs.
At the moment, Devolo are investing in this technology with their Magic series of powerline network products including some Wi-Fi access points and offering some of these devices to consumers.
But some other network-equipment vendors who have retail-market presence are offering at least a powerline-Ethernet adaptor that works to G.Hn HomeGrid standards as part of their powerline-network product ranges. It is a way for them to put a foot in the door for higher-bandwidth powerline network segments.
TV coaxial cable
Another medium type that is supported by G.Hn HomeGrid is the TV coaxial-cable infrastructure. This would be associated with cable TV, an outdoor TV antenna (aerial) or a satellite dish and there may be extra TV coaxial-cable sockets installed around the house so you can have additional TV sets or use an easily-moveable TV in other rooms.
The Multimedia Over Coax Alliance have created a standard for using TV coaxial-cable infrastructure for home networking. But the HomeGrid Forum have signalled their intention to use this same medium for the same purpose, potentially working it to higher capacities or increased robustness.
Traditional telephone cabling
Yet another medium type that is supported by this same standard is traditional telephone cabling. This was worked on by HomePNA, but the HomeGrid Forum took over that concern and embodied it into the G.Hn HomeGrid standard.
This infrastructure would have come about for established homes where there are multiple phone sockets installed through the house’s lifetime. This would be due to the installation of extension phones or to allow one to move a corded phone between different locations easily before cordless phones became a cost-effective approach to flexible landline telephony.
The G.Hn standards implement two relevant use-profile cases.
One is called HomeGrid, which describes connecting network devices within the premises as part of a local area network. The other is an access profile, which is about an access provider using existing wiring to bring a broadband service to each premises.
All of the media connectivity types such as powerline, phone line or TV coaxial cable are able to work in these different use profiles. But there is a question about whether the same medium type could be used for access or in-home connectivity at the same time.
Media-type agnostic approach
The G.Hn HomeGrid standard is being underscored as a media-layer-level standard for “wired no-new-wires” networks, with the goal of making it easier to bridge between different media types.
This could be about the arrival of lower-level bridge devices that link between the different media types with these devices not needing higher-level processing to do so. Most likely such devices will have the bridge functionality but also have a Cat5 Ethernet connection of some sort.
Further evolution of this standard
Currently this standard is being implemented as a “wired no-new-wires” approach to creating a multi-gigabit home network that has a bandwidth of 2.5Gbps or greater. It is to complement multi-Gigabit Ethernet and Wi-Fi 6/6E wireless network technology in raising your home network’s bandwidth making it fit for multi-gigabit broadband Internet services.
But there is further work needed from the HomeGrid Forum on certain issues. For example, there will be a need to support VLAN network setups using the “wired no-new-wires” technologies. This would come into play with routers that support a “guest network” or “community network” in addition to a primary network; or for VLANs that are used as a quality-of-service measure for VoIP or IPTV setups.
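To illustrate what VLAN support involves at the frame level, the sketch below builds the IEEE 802.1Q tag that keeps, say, a guest network’s traffic separate from the primary network’s traffic on the same wire. This is a minimal Python illustration of the general 802.1Q mechanism; the VLAN IDs and priority values are hypothetical, not drawn from any G.hn specification.

```python
import struct

def vlan_tag(vlan_id: int, priority: int = 0) -> bytes:
    """Build the 4-byte IEEE 802.1Q tag that marks an Ethernet frame as
    belonging to one VLAN: TPID 0x8100 followed by priority (PCP), DEI
    and the 12-bit VLAN ID."""
    if not 0 <= vlan_id <= 0x0FFF:
        raise ValueError("VLAN ID must fit in 12 bits")
    tci = (priority << 13) | vlan_id  # PCP in top 3 bits, VID in low 12
    return struct.pack("!HH", 0x8100, tci)

# Hypothetical IDs: a guest network on VLAN 10, and IPTV on VLAN 20 with
# a higher 802.1p priority as a quality-of-service measure.
guest_tag = vlan_tag(10)
iptv_tag = vlan_tag(20, priority=5)
print(guest_tag.hex())  # 8100000a
print(iptv_tag.hex())   # 8100a014
```

A switch or bridge that honours these tags can then carry both networks over one physical segment while keeping their traffic apart.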
They would also have to examine the use of an access network and an in-premises network working on the same media bearer.
This could work with a fibre-optic extension setup that would normally use a G.Hn HomeGrid access network on a particular media type like phone line, power line or TV coaxial cable to bring the WAN (Internet) link from a garage or basement, where the fibre-to-the-premises optical network terminal is installed, to the living area where the home-network router is installed. But the situation would change where that same basement or garage is repurposed as a living space of some sort and a G.Hn HomeGrid in-premises LAN segment is to be created to make that space part of your home network.
Or there is the idea of a multiple-premises building where G.Hn HomeGrid technology is used for the in-building access network to each premises, but there is a desire to extend a premises’ LAN or Wi-Fi to a common living area within the building.
The G.Hn HomeGrid and GigaWire standards are something that we have to watch and consider when it comes to “wired no-new-wires” network links for our home networks. In some cases, this technology may be about a “clean-slate” approach to your “wired no-new-wires” network segments.
SpaceX Starlink have issued a second-generation user terminal for their low-earth-orbit satellite Internet service, which is intended to help the rural broadband situation. This comes as the service heads towards full operation, with 1800 LEO satellites currently in orbit.
Compared to the previous user terminal, which used a dish, this unit uses an 11” x 19” rectangular panel as its antenna. There will be a different set of accessories for this antenna, including a pole you can set up in your yard. As well, there is increased tolerance for heat, which will benefit very hot areas such as deserts.
But you use this new satellite antenna with a different Wi-Fi modem router that doesn’t have an Ethernet port. Here, if you want Ethernet connectivity, you have to use an Ethernet adaptor accessory, a consequence of the design heading towards IP54 outdoor water-resistance requirements.
Some of the computing press reckoned that SpaceX Starlink should forego the IP54 weather-resistance goal and install at least one Ethernet LAN socket on the modem router. This is because the modem router would likely be installed within the house rather than outside.
The Wi-Fi LAN for this device will be a three-stream MU-MIMO Wi-Fi 5 setup rather than the previous design’s two-stream MU-MIMO Wi-Fi 5 approach. This will most likely be about improved bandwidth for the LAN aspect of the setup.
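As a rough back-of-the-envelope check on that bandwidth claim, the sketch below uses the commonly-quoted peak PHY rate of about 433.3 Mbps per spatial stream for Wi-Fi 5 at 80 MHz channel width (MCS 9, short guard interval). This is an illustrative Python calculation based on that assumed figure, not throughput numbers published by SpaceX.

```python
# Assumed per-stream peak PHY rate for 802.11ac (Wi-Fi 5) at 80 MHz,
# MCS 9 with short guard interval - roughly 433.3 Mbps per spatial stream.
RATE_PER_STREAM_MBPS = 433.3

def peak_phy_rate(streams: int) -> float:
    """Theoretical peak PHY rate for a given number of spatial streams."""
    return streams * RATE_PER_STREAM_MBPS

print(f"Two streams (previous design): {peak_phy_rate(2):.0f} Mbps")
print(f"Three streams (new design):    {peak_phy_rate(3):.0f} Mbps")
```

Real-world throughput will of course sit well below these theoretical peaks, but the extra stream explains the improved LAN bandwidth.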
SpaceX are still selling their Starlink user equipment as a loss-leader and this new device is about being more compact and cheaper to build.
But they could offer other modem options for the Starlink user terminals. This would include an indoor-optimised modem router with Wi-Fi and Ethernet support, along with a pure-play modem that works with a broadband router of your own choice. The latter setup could come into play with business-grade routers, dual-WAN routers or where a more flexible or better-performing network is desired.
As well, I would like to see some of these modem options able to work from a DC power supply of 12 to 24 volts. That would come into its own with Starlink setups in caravan / RV or small-craft marine applications, where that voltage range is dominant. It would also suit areas that don’t have, or can’t afford to deploy, a reliable mains-voltage power supply.
By offering a second-generation user terminal for the Starlink satellite Internet service, SpaceX are showing that there is strong interest in the idea of low-earth-orbit satellite Internet.
The recent news about new cybersecurity measures in response to the ransomware menace has shown that Australia is now acting to world standards when it comes to cybersecurity.
An interview and presentation that I wrote up in 2011 about cybersecurity in the cloud-computing era called out Australia’s lax attitude to cybersecurity, very much based on the “She’ll Be Right” laissez-faire attitude traditionally associated with Australian society. This included not having proper data-protection legislation set in stone that complies with international expectations, such as mandating that businesses, governments and other entities disclose to a central authority if they have faced a data-security incident.
But there have been recent high-profile ransomware attacks in Australia, including some that have attacked significant health infrastructure. This has caused the Federal Government to wake up and pass legislation mandating the reporting of ransomware incidents that businesses and similar entities fall victim to.
As well, the proposed Federal-level legislation will criminalise all forms of cyber-extortion, including dealing in data stolen as part of a cyberattack or other offence. It will also have the trade of malware, something that takes place on the Dark Web, dealt with as a criminal offence. There will also be the ability for Australian governments to seize or freeze cryptocurrency transactions, cryptocurrency being the preferred currency of cyberattacks.
What Australia may have to do is take a holistic look at the issues of data-protection, data sovereignty and end-user data privacy for data collected by both the public and private sectors.
Australian governments creating special data-privacy regulations for these state-run QR-code-based contact-tracing systems won’t cut it anymore unless data security is taken holistically and seriously by Australian governments.
Legislation and regulation that is risk-specific or data-collection-specific, like the cyber-extortion laws being tabled by the Federal Government or the recent laws being tabled by state governments to limit law-enforcement access to those QR-code-based contact-tracing platforms, won’t cut it any more if other risks aren’t addressed. These include distributed denial-of-service attacks against IT systems that can be masqueraded as a system facing peak usage by many end-users, or foreign entities raiding data-rich IT systems like the above-mentioned platforms for the movements of exiled dissidents.
Australia then also has the added complexity of being a federation of states and territories, where state and territory governments regulate certain aspects of life while the federal government regulates others. In this case, the federal government regulates cybersecurity issues encompassing private-sector and federal-government public-sector systems, while the state and territory governments regulate cyber issues encompassing their own public-sector systems. Australia could look towards what Canada, Germany or Switzerland are doing in this field because these countries have similar federation structures. This is especially so given that Canada and Switzerland were among the countries that called out the data-security and user-privacy weaknesses within Zoom, Skype and other popular videoconference platforms that came into vogue in 2020 thanks to the COVID-19 plague.
Here, Australia and New Zealand could look at best-practice data-protection, data-sovereignty and end-user privacy legislation and regulation like some of the laws on the UK’s and European Union’s books and build up a strong cybersecurity regime within the Asia Pacific. This will have to encompass both public-sector and private-sector data-processing environments along with all risks that these environments and their end-users will face.
There will also have to be questions raised regarding the role of other jurisdictions when dealing with cyber-security issues and incidents due to such incidents having an international dimension. It will also include data sovereignty and allied issues where data is handled or stored in other countries or by companies based in other countries.
The ransomware issue and, to some extent, the COVID-19-driven data-processing requirements may be the wake-up call that Australia needs when it comes to data security and end-user privacy.
Leicester to have increased infrastructure-level competition for its broadband services
Cultural Quarter, Leicester by Malc McDonald, CC BY-SA 2.0 <https://creativecommons.org/licenses/by-sa/2.0>, via Wikimedia Commons
Leicester, a mid-sized city in the UK’s East Midlands, is to end up with plenty of infrastructure-level broadband competition thanks to Grain Connect starting to set up shop in that city.
Grain Connect expect to cover the UK over the next five years but are targeting mid-sized cities and large towns. They intend to offer retail service from GBP£14.99 per month for a symmetric 50Mbps service to GBP£44.99 per month for a 900Mbps service, with a minimum 12-month contract term. This kind of aggressively-low pricing is there to encourage rapid service takeup.
Leicester already has Virgin Media, Openreach, Hyperoptic, Cityfibre, OFNL and Fibreloop building out Gigabit-capable service networks across the city. For Grain Connect to start setting up shop and building infrastructure there will show how much infrastructure-level competition the city can tolerate.
This kind of competition could lead to some keen-edge pricing for broadband or multiple-play online services and could also see the ISPs all increasing their value for money and trying to retain their customers.
But the question that can easily come about is how sustainable this kind of marketplace will be for a city of Leicester’s size. It may result in some market consolidation taking place, and there will be the issue of how much consolidation can occur before we end up with a highly-concentrated US-style broadband and telecommunications market. This would be more so for small cities and large towns like Leicester.
What is going on in Leicester will show how sustainably a small-to-medium-size city or large town can handle infrastructure-level competition. This includes how competitive the market can be for Internet service.
QNAP TBS-464 NAS – as big as a small book
Key connectivity: 2 x 2.5 Gigabit Ethernet (usable as separate networks or bonded for higher throughput); 3 x USB 2.0 and 1 x USB 3.2 Gen 1 ports; HDMI 2.0 display output.
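On the bonding option for those two Ethernet ports: link aggregation typically keeps each traffic flow on one physical link, chosen by hashing the flow’s identifiers, so individual flows stay in order while many flows share the combined capacity. The sketch below is a simplified Python illustration of that general idea; the port names and hashing scheme are hypothetical, not QNAP’s actual implementation.

```python
import zlib

MEMBER_PORTS = ["lan1", "lan2"]  # hypothetical names for the two 2.5GbE ports

def pick_port(src_mac: str, dst_mac: str) -> str:
    """Choose a member port for a flow by hashing its endpoint addresses,
    so every frame of that flow takes the same link (preserving order)."""
    key = f"{src_mac}->{dst_mac}".encode()
    return MEMBER_PORTS[zlib.crc32(key) % len(MEMBER_PORTS)]

# Two different flows may land on different links, together sharing the
# 5Gbps aggregate capacity of the bonded pair.
flow_a = pick_port("aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02")
flow_b = pick_port("aa:bb:cc:00:00:03", "aa:bb:cc:00:00:04")
```

This is why bonding helps most when many devices hit the NAS at once, rather than speeding up a single transfer.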
QNAP have recently put on the market two highly-portable network-attached-storage units which are either slightly larger than a typical high-end home-network router or similar in size to a small paperback book or VHS videocassette.
This NAS product range, known as the NASBooks, is also designed to operate quietly and even to work on DC voltages between 10 and 20 volts. But how is this so? It is down to these devices using M.2 form-factor solid-state drives as their storage rather than traditional 2.5” or 3.5” storage devices like hard disks.
The first-generation TBS-453 Series, which I had given some coverage to previously, use SATA-spec M.2 SSD sticks while the current generation units use the higher-performance NVMe-spec M.2 SSD sticks. This is in line with offering a wide range of performance improvements on the current-generation TBS-464 NASBook which is effectively a technological refresh of that prior model.
This leads to these QNAP NASBook units being quiet and compact units that can be transported very easily such as within a briefcase. I even came across information that the QNAP TBS-453A had a power voltage range between 10 and 20 volts which could make it feasible to work from the electrical system in most cars and boats.
For these units, there is the HybridDeskStation app which provides either a traditional console user interface with HDMI monitors and USB input devices or a network-linked user interface with a mobile-platform remote control app.
Even when it comes to audio, some of these units have a built-in audio subsystem, but they all can support USB-connected audio input / output devices that follow the USB Audio device class. They should also support Intel Display Audio, so that an HDMI display’s speakers or an HDMI-based audio device can play multimedia audio from these devices over that connection. As well, these units support QNAP IR remote controls so you can have the full set-top experience out of these boxes.
This is leading to QNAP dabbling in ideas like having these NAS boxes work not just as media storage boxes for the home network but as the equivalent of one of those PVR set-top boxes. This is being encouraged with inherent support for the Plex multimedia platform, and QNAP even has dabbled with a “karaoke box” app called OceanKTV.
What is being shown here by QNAP is the idea of highly-compact low-profile NAS boxes that aren’t toys but can stand amongst their lineup of other high-end “enthusiast” or business-grade NAS units. What needs to happen is that QNAP has to continue evolving this class of M.2 SSD-powered ultra-mobile NAS and demonstrate a wide range of use cases for that form factor.
Facebook is undertaking a rebranding exercise by calling itself Meta.
But this will apply to the company name rather than the social network service that was this company’s foundation stone. It does not affect the brand name of each service they offer, and has come about thanks to their acquisitions of Instagram, WhatsApp and Oculus.
The rebrand leans on that acquisition of Oculus, a company focused on augmented reality and virtual reality. It will also allow the creation of separate business and financial reports for their online services and their hardware investments.
This will be very similar to Google’s current company structure where the Alphabet company encompasses Google’s services, Nest’s smart-home portfolio, the YouTube online multimedia service portfolio and Jigsaw.
There are four user-account pools within the Meta brand – Facebook, Instagram, WhatsApp and Oculus Quest. At the moment, the Facebook account pool is being used as a single sign-on for Instagram, Oculus Quest and third-party apps wanting a social sign-on option. But there has been pressure within the company to bind these account pools together, even where a user subscribes to only one of these services.
The “Meta” brand is about the concept of a “metaverse” which is a shared virtual environment that is accessed by many people with different types of devices. It is seen by Silicon Valley as the next-generation Internet. The idea is to have a collection of shared 3D virtual spaces that are linked together to create a virtual universe with the focus being on augmented reality.
It capitalises on videoconferencing of the Zoom, Skype or Microsoft Teams kind; games like Minecraft, Fortnite or Pokemon Go with infinite open worlds; email and messaging; the Social Web; and live streaming. The user experience will be focused around augmented reality with devices like augmented-reality smart glasses or smartphones that run augmented-reality apps and games.
The company logo will be a blue “infinity” symbol that symbolises the metaverse, in which Facebook is showing strong interest. Australian users may see this as being similar to the Lissajous-curve logo that the ABC public-service broadcaster has used for a very long time, representing a waveform associated with radio broadcasting.
But why is this change taking place now? A lot of cynics reckon that Facebook is facing increased scrutiny from governments and media on a significant number of fronts, including competition / antitrust regulation. This is because of its social networks, including WhatsApp’s group messaging, being a hive for fake news and disinformation; end-user privacy concerns; mental-health issues around social-media use and the Insta-influencer culture; how it deals with the news establishment; and its part in an online display-ads duopoly, amongst other things.
Here Facebook’s use of the new brand and underscoring the “metaverse” concept may be seen as a distraction from these realities. This is more so where the metaverse is becoming a talking point within the consumer and business IT world as a potential trend.
Personally, I see Facebook’s Meta and Google’s Alphabet efforts as emblematic of Big Tech restructuring itself into a business-flexible form in order to have its hands on different aspects of the Internet. It can be about facilitating the spawning of different business sectors or taking over businesses in the IT and online-services space in order to carve out the areas of our online lives they want to conquer. Who knows when Amazon or another of the Big Tech companies will engage in a corporate restructure and rebranding exercise.