Author: simonmackay

Telstra steps to the fore with a 3-WAN carrier-supplied router

Articles

Telstra Gateway Frontier modem router press picture courtesy of Telstra

Telstra Gateway Frontier 4G/VDSL2/Ethernet modem router – ready for instant Internet or to provide failover service for the Internet Of Things

Telstra’s Gateway Frontier Modem Gives You A 4G Backup For Your ADSL Or NBN | Gizmodo

From the horse’s mouth

Telstra

Gateway Frontier (Product Page)

My Comments

Previously, I wrote an article about trends affecting carrier-supplied modem routers that customers receive when they sign up for Internet service but don’t order a “wires-only” or “BYO modem” deal.

One of the trends I called out was for a router to be equipped with an integrated mobile-broadband modem alongside a DSL modem and/or Ethernet connection as its WAN (Internet) options. The use cases for this include providing wireless “instant Internet” to subscribers while the wired connection is being established at their premises. Other use cases include a fail-over setup should the wired Internet connection fail or be in the process of being overhauled, a “fatter pipe” for the broadband connection by combining both links, or a quality-of-service measure that directs particular traffic like email or Web browsing down the slower path while video streaming or downloading takes the quicker path.
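
To give a rough idea of how a triple-WAN router might steer traffic across these links, here is a minimal sketch in Python. The interface names, traffic classes and link states are invented for illustration and are not Telstra’s actual firmware logic.

```python
# Minimal sketch of per-traffic-class WAN selection with 4G fail-over.
# Interface names, traffic classes and link states are illustrative only.
WAN_LINKS = {"vdsl2": False, "ethernet": False, "lte": True}   # current link-up status

PREFERRED_PATHS = {
    "video-streaming": ["ethernet", "vdsl2", "lte"],   # quicker paths first
    "web-browsing":    ["vdsl2", "ethernet", "lte"],   # slower path is acceptable
    "email":           ["vdsl2", "ethernet", "lte"],
    "monitored-alarm": ["vdsl2", "ethernet", "lte"],   # must always reach *some* path
}

def choose_wan(traffic_class: str) -> str | None:
    """Return the first preferred WAN link that is currently up, or None if all are down."""
    for link in PREFERRED_PATHS.get(traffic_class, ["vdsl2", "ethernet", "lte"]):
        if WAN_LINKS.get(link):
            return link
    return None

# With both wired links down, everything fails over to the 4G modem.
print(choose_wan("monitored-alarm"))   # -> "lte"
```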

The wireless fail-over connection will have a strong appeal to households with building-security, personal-safety, medical-monitoring or similar technology that connects to a monitoring facility via the home network and the Internet. Here, if the wired connection dies due to old and decrepit telephony infrastructure, this essential link can be maintained over the wireless link. This also extends to small businesses that need Internet connectivity to continue trading.

I thought it would take a long time for this kind of equipment to show up as real consumer products but I had seen Telstra’s latest modem router on display at one of their shops in an outer-suburban shopping centre. I looked at some further details about this modem router and noticed that this device, the Gateway Frontier, was also equipped with a 4G mobile-broadband modem.

This device has a triple-WAN approach with the 4G mobile-broadband modem, ADSL2/VDSL2 modem and a separate Ethernet connection. This is intended to support the use of different NBN connection types – the VDSL2-based “fibre-to-the-node” or “fibre-to-the-curb” connections; or the fixed-wireless broadband, fibre-to-the-premises or HFC coaxial connections which rely on an external modem or ONT that uses an Ethernet connection to the router.

Personally, I would like to see the VDSL2 modem be a “software modem” that can be field-programmed as a G.Fast modem for NBN FTTC (FTTdp) and FTTB deployments that implement G.Fast technology. This would be in conjunction with the 4G mobile-broadband modem being able to become a femtocell that boosts mobile-phone coverage around the router when you are using fixed broadband, along with a continual software-maintenance approach for security, performance and stability.

This is a full “home-network” device with four Gigabit Ethernet connections along with an 802.11g/n/ac 4-stream dual-band Wi-Fi wireless network. It even supports NFC-based WPS connection that allows “touch-and-go” network enrolment for your NFC-equipped Android or Windows phone. This is in addition to push-button-based WPS setup that benefits open-frame computing devices that honour this function.

There is support for bandwidth sharing using the Telstra Air bandwidth-sharing platform along with support for the T-Voice VoIP “virtual cordless phone” function on your mobile phone. But this functionality only works on a fixed-broadband (DSL / Ethernet) connection, and the mobile-broadband service is limited to 6Mbps download and 1Mbps upload speeds.

For a carrier-supplied consumer customer-premises-equipment router, the Telstra Gateway Frontier, like the BT Smart Hub with its “beyond ordinary” Wi-Fi performance, shows that carriers can provide first-class equipment that meets up-to-date requirements rather than the second-rate equipment they feel they have to supply.


VOD content-search aggregation

Article

Netflix official logo - courtesy of Netflix

Netflix – one of many SVOD providers

Say Goodbye to Video on Demand Browsing Fatigue | LinkedIn Pulse

My Comments

A common situation is that we sign up to Netflix while also using our Apple ID and credit card to purchase or rent video content on iTunes. But we subsequently learn of one or more other video-on-demand services with content libraries that appeal to us. This becomes more real as boutique video-on-demand enters the spotlight or newer operators join the video-on-demand scene.

SBS On Demand Windows 10 platform app

SBS On-Demand – an example of an advertising-funded boutique VOD provider

Here, we end up heading down a path where we have to switch between multiple user interfaces on our smart TV, video peripheral or tablet to find the content we want to watch. In some cases, it could end up with us acquiring extra video peripherals and selecting different sources on our TVs in order to go to different video-on-demand providers, because they don’t appear on the connected-video platform we primarily use.

But often we are after a particular title, or shows like it, and want to know where it is available without spending a lot of time searching for it. This is where content-search aggregation can come in handy.

Multiple VOD and catch-up TV providers on a smart-TV or set-top box vie for our attention

What is this? It is where you supply the name of a particular piece of video content and the content-search-aggregation engine lists which video-on-demand providers carry it. It would support the different business models that video-on-demand providers work on, such as subscription, transactional and advertising.
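
To make the concept concrete, here is a minimal sketch in Python of how such an aggregation layer might behave. The provider names, stub catalogues and result fields are hypothetical stand-ins for each provider’s real catalogue API or ingest feed.

```python
# Minimal sketch of an aggregated VOD content search (hypothetical providers and fields).
from dataclasses import dataclass

@dataclass
class Listing:
    provider: str              # e.g. a subscription, transactional or ad-funded service
    title: str
    model: str                 # "subscription" | "transactional" | "advertising"
    price: float | None = None  # cost to view, for transactional services

# Stub catalogues standing in for each provider's real search API.
_CATALOGUES = {
    "SubscriptionVOD": [Listing("SubscriptionVOD", "Example Film", "subscription")],
    "DownloadStore":   [Listing("DownloadStore", "Example Film", "transactional", price=4.99)],
    "AdFundedVOD":     [Listing("AdFundedVOD", "Example Film", "advertising")],
}

def search_all(title: str, my_services: set[str] | None = None) -> list[Listing]:
    """Return every provider listing for a title, optionally limited to the
    services the viewer actually deals with, cheapest transactional last."""
    hits: list[Listing] = []
    for provider, catalogue in _CATALOGUES.items():
        if my_services and provider not in my_services:
            continue
        hits.extend(l for l in catalogue if l.title.lower() == title.lower())
    # Free-to-view models first, then transactional listings sorted by price.
    return sorted(hits, key=lambda l: (l.price is not None, l.price or 0.0))

if __name__ == "__main__":
    for hit in search_all("Example Film"):
        print(hit)
```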

It is very similar to the success of TuneIn Radio, vTuner and Radioline in providing an aggregated Internet-radio directory used in Internet radios, both of the “big set” (hi-fi system) and “small set” (table radio, portable) kind, and in the Internet-radio apps available on every desktop, mobile or smart-TV platform.

I would like to see these aggregated-content-search engines offer, as part of their personalisation efforts, a results view based on the services you deal with, such as the subscription VODs you subscribe to, the transactional VODs you are registered with and the advertising-funded VODs you regularly visit. For transactional services, including “download-to-own” or “download-to-view” storefronts, the results could be sorted by the cost to view in your local currency. It could also be feasible to run an advertising service on these search engines that lists other VOD services in your area carrying the same kind of content, especially boutique providers that specialise in it. This can put new streaming or download-driven online-video providers “on the map” as far as viewership is concerned.

Client-based implementations could work with your downloaded content library along with the streaming and download-based services in order to search through these catalogues for what you are after.

Similarly, these search engines can aid the content-discovery process by allowing us to find content similar to a specific title, or content with specific attributes, hosted by the providers you deal with. If you are using a VOD service that has an account system for payment or personalisation, it could be feasible for these search engines to “pass” the title to the service so you can put it in your viewing list or favourite-content list, instigate a purchase or rental transaction for that title in the case of a transaction-driven service, or have it start playing immediately.

To the same extent, the aggregated-content-search platform that links with your accounts on the various VOD services can provide the ability to show an aggregate view of the content recommendations that the services provide based on your past viewing.

Another factor that influences our viewing choices is the content recommended by film critics, radio hosts and other personalities that we follow. Some of these personalities, or the publishers and broadcasters they work with, maintain some form of Web presence, typically through a social-media account, blog or something similar, and they may use this presence to provide a list of content they recommend or simply to cite a particular film or TV series.

But these Web presences can be made more powerful, either through RSS feeds for “recommended-viewing” lists or through the ability to link a film’s title to the search engine. The latter can be facilitated through an express hyperlink to the aggregated-content-search engine’s entry for that title; alternatively, the combination of standardised structured-data markup and software that interprets this markup and passes the data to these search engines could provide a competitive approach. In the case of “recommended-viewing” lists delivered as RSS feeds, aggregated-content-search engines could implement a mechanism similar to Web-based newsfeed readers of the Feedly kind for adding these lists.
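
As a rough sketch of what the RSS-driven approach could look like, the Python below parses a small “recommended viewing” feed and turns each title into a deep link into an aggregated-content-search engine. The feed contents and the search URL pattern are invented purely for illustration.

```python
# Sketch: turning a critic's "recommended viewing" RSS feed into aggregated-search links.
# The feed content and the search-engine URL pattern are hypothetical examples.
import urllib.parse
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Critic's picks</title>
  <item><title>Example Film</title></item>
  <item><title>Another Great Series</title></item>
</channel></rss>"""

SEARCH_URL = "https://vod-search.example.com/title?q={query}"   # placeholder engine

def recommended_links(feed_xml: str) -> list[tuple[str, str]]:
    """Return (title, deep-link) pairs for each item in the recommendation feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    links = []
    for item in channel.findall("item"):
        title = item.findtext("title", default="").strip()
        if title:
            links.append((title, SEARCH_URL.format(query=urllib.parse.quote(title))))
    return links

for title, link in recommended_links(SAMPLE_FEED):
    print(f"{title}: {link}")
```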

To the same extent, these personalities could contribute their knowledge about titles to an aggregated-content-search engine to turn it into a rich video-content portal that helps viewers choose the content they are after.

A few of these aggregated-content-search engines already exist, but they are primarily Web-based or mobile services, with Roku offering theirs as part of its set-top-box platform. They currently link with the main video-content resources like IMDb and Rotten Tomatoes along with the currently popular VODs.

But they have to be able to work across multiple platforms, including smart-TV and set-top-box platforms, support extensive end-user personalisation and allow users to follow the content recommendations their favourite personalities offer. As well, if the concept of “download-to-own” picks up, an aggregated-content-search platform could be used to find content that you already have in your collection or could “pick up on” through “download-to-own” storefronts.


Software-defined microphone arrays – an idea worth considering

Creative Labs LiveCam Connect HD Webcam

A Webcam could end up being part of a multi-microphone array

Increasingly, there are setups where multiple microphone devices become available to a regular or mobile computing device like a laptop or smartphone. Examples of these include:

  • A headset audio adaptor (whether USB or Bluetooth connected) that has an integrated mic but is used along with a headset that has its own microphone system
  • A (wired or Bluetooth) headset with an integrated mic connected to a computer that has its own mic or is also connected to a Webcam or similar peripheral that has its own mic
  • The use of one or more stereo-microphone setups, whether a single-piece (2-element) stereo microphone or a pair of mono microphones, connected to a computer.
Dell A2 Performance USB Headset

as could the microphone integrated in a feature headset

All these setups can yield a multiple-microphone array, which allows for more accurate voice capture and improved background-noise rejection. This is important for telecommunications but is just as important when you are dealing with voice-recognition applications like voice-driven personal assistants (Siri, Cortana, Google Now) or voice-to-text transcription.

Here, the microphone array would be created at the operating-system level rather than the hardware level. The OS would enumerate all connected and active microphone devices and establish a software-defined microphone array from them.

This would lead to the software having to learn about the microphone-array setup including the proximity of the mics to each other as well as how they pick the sound up. This is to create an ideal “voice focus” that is required to gain benefit from the microphone array.
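
As a very rough sketch of the kind of processing involved, the Python below implements a simple delay-and-sum “voice focus” across several captured microphone channels. In practice the per-microphone delays would come from the learned array geometry described above; here they are illustrative values only.

```python
# Sketch: delay-and-sum "voice focus" across several enumerated microphones.
# The per-mic delays (in samples) would come from the learned array geometry;
# the values used in the example below are illustrative only.
import numpy as np

def delay_and_sum(channels: list[np.ndarray], delays: list[int]) -> np.ndarray:
    """Align each microphone's signal towards the talker and average the result.
    channels: one 1-D sample array per microphone, all at the same sample rate."""
    length = min(len(c) for c in channels)
    aligned = []
    for signal, delay in zip(channels, delays):
        shifted = np.roll(signal[:length], -delay)   # advance the later-arriving mics
        if delay > 0:
            shifted[-delay:] = 0.0                   # zero the wrapped-around tail
        aligned.append(shifted)
    # Speech that is coherent across mics adds up; diffuse noise tends to average down.
    return np.mean(aligned, axis=0)

# Example: three mics picking up the same 1 kHz tone with different arrival times.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
mics = [np.roll(tone, d) + 0.1 * np.random.randn(len(t)) for d in (0, 3, 7)]
focused = delay_and_sum(mics, [0, 3, 7])
```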


In some cases, this may be achieved automatically, but there are situations where the operator may need to adjust the settings manually. These situations may come about with microphones that have different characteristics, such as pick-up ranges and sensitivities.

Another factor that affects this kind of setup is whether a microphone device will be active at all times during the usage session. This can happen with, for example, a headset that is connected to or disconnected from a tablet or laptop on an ad-hoc basis. Similarly, the user may move around with their headset while using the microphone-equipped tablet or laptop, an activity more feasible with a Bluetooth headset or adaptor. Here, this factor requires the software to re-define the microphone array so as to “catch” the user’s voice.
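
One way to handle this, sketched below in Python, is simply to re-enumerate the capture devices periodically and rebuild the array whenever the set of microphones changes. The enumerate_microphones() helper is a hypothetical stand-in for whatever the operating system’s audio API actually provides.

```python
# Sketch: rebuilding the software-defined array when microphones come and go.
# enumerate_microphones() is a hypothetical stand-in for the OS audio API.
import time
from typing import Callable

def enumerate_microphones() -> set[str]:
    """Placeholder: return identifiers of the capture devices currently active."""
    return {"built-in-mic", "usb-webcam-mic"}    # would be queried from the OS

def watch_for_changes(rebuild_array: Callable[[set[str]], None],
                      poll_seconds: float = 2.0) -> None:
    """Poll the device list and re-form the array whenever it changes."""
    known = enumerate_microphones()
    rebuild_array(known)
    while True:
        current = enumerate_microphones()
        if current != known:              # e.g. a Bluetooth headset appeared or vanished
            known = current
            rebuild_array(known)          # re-learn geometry and re-focus on the talker
        time.sleep(poll_seconds)
```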

In the case of a user moving around between microphones, the requirement would be about readjusting the microphone array in real time to identify the key sounds to put the focus towards.

These are issues that may limit the idea of creating a software-defined microphone array, especially for voice recognition or telecommunications. Let’s not forget that a software-defined microphone array will also demand computer resources, which can be very limiting on less powerful hardware.

But once the concept of software-defined microphone arrays is proven and able to be implemented at the operating system level, it could be a path towards allowing users to gain the benefits from a microphone array while being able to use a combination of existing microphone-equipped devices.


Netgem proposes to integrate the set-top box and soundbar in one unit

Article

Combining the STB, TV soundbar and Alexa means telcos can stand out from the crowd | VideoNet

From the horse’s mouth

Netgem

SoundBox set-top box and soundbar

Product Page

Video (Click / Tap to play in YouTube)

My Comments

Soundbars and TV speaker bases are becoming an increasingly-valid path for improving your TV’s sound because they provide the sound through just one box, perhaps along with a subwoofer enclosure. This is because the typical flat-panel TV is becoming slimmer but doesn’t have much thought put into its sound quality, and most of us want to hear our shows through something a bit better than that.

As I mentioned in another article on this topic, they will appeal to people who have their TV set up in the traditional manner with it being in the corner of the lounge so as to avoid it competing with the view offered by a feature window or fireplace. They also will appeal to those of us who like our music via a dedicated stereo system with its own speakers, something that is considered to be important thanks to the “back to basics back to vinyl” trend.

In some countries where there is a competitive market for “triple-play” Internet service or subscription-based TV service, the features that a set-top box or PVR offers are seen as a selling point for each of the service providers. As well, most of these telcos or pay-TV providers want to be in a position to upsell customers to better services.

This has led Netgem, a French set-top-box designer, to offer these providers a device that combines a soundbar and a set-top box in the one housing. It will have the ability to work with a variety of online video and music services and can be controlled by the traditional remote control or a smartphone app. But this box is also being equipped with Amazon Alexa support, which allows it to work in a similar vein to the Amazon Echo wireless speaker. The Amazon Alexa agent will also learn media-navigation skills pertaining to this device so you can simply select what you want to watch by voice.

Philips achieved a similar goal by offering a soundbar with an integrated Blu-Ray player, 2-band (FM / Internet) radio and network media player in order to provide a soundbar equivalent to the “home theatre in a box” systems.

The idea behind this box is to allow a telco or pay-TV provider to supply a device that is better than usual in order to differentiate itself from the others, more so where they are focused on selling a “solution” rather than just a product or service. In most cases, it could be seen simply as an optional device that customers can request rather than the standard device for a premium package, because some customers will have their own soundbar or home-theatre setup as the way to improve their TV’s sound and simply want a set-top box as the gateway to an IPTV service.

As well, implementing HDMI-ARC, DLNA MediaRenderer, AirPlay / Google Cast playback and similar functionality can make sure that this device earns its keep as part of your networked personal A/V setup.

What is showing up is that, especially in Europe’s competitive markets like France, there is a strong interest amongst those offering triple-play broadband service in providing something that offers that bit extra.


The home-network gateway device to become advanced

D-Link Covr router and wireless extender package press image courtesy of D-Link

Expect a lot more out of the router that comes with your Internet service when Technicolor gets its way

The device that represents the network-Internet “edge” for your home network, i.e. the router, won’t just serve that function in a standalone way anymore. Here, it will work in tandem with other Internet-side and network-side computing devices to become a highly-sophisticated “hub” for your home network.

One of these drivers is to provide a simplified customer-support process, especially for those of us who use carrier-provided equipment at the edge. Here, the support and provisioning process can be fulfilled by the router supplying information to your carrier or ISP regarding your Internet service’s and home network’s performance, without wasting time requiring the customer to supply this information during a support call. This may be considered controversial but has value for the support and troubleshooting process, which can perplex those of us who aren’t competent with technology, such as a lot of older people.

It also encompasses the fact that distributed Wi-Fi will be the “new norm” for the home network, whether through multiple access points connected to a wired or dedicated-wireless backbone, the use of one or more wireless range extenders or a mesh-driven distributed wireless network. Here, it may be about simplifying the process of commissioning the “satellite” wireless devices and making sure that they are performing as expected to assure maximum Wi-Fi coverage across your premises.

The other factor is the call for always-maintained software in these devices, thanks to issues being raised regarding the security of our home networks and the Internet. This was underscored by the recent distributed denial-of-service attacks against various Internet services and blogs using the Mirai botnet, which ran on compromised routers, network cameras and the like that hosted poorly-maintained software.

Let’s not forget that the home-network gateway device will be expected to do more in conjunction with cloud services. Here, the intention is to provide this kind of functionality in the same context as the “app store” commonly associated with mobile computing platforms, increasingly associated with regular computing platforms, and now with a growing number of dedicated-purpose devices like printers. It is where a customer can add extra functionality to their home-network router after they have bought and installed that device rather than buying and installing a new device to achieve this goal.

I learnt about this thanks to a news release offered to me by Diego Gastaldi from Technicolor Connected Home regarding this topic. Technicolor came in on this game thanks to buying into Thomson, which supplies a lot of the customer-premises equipment provisioned by telcos and ISPs for their broadband Internet service, especially the triple-play services. This company presented at Mobile World Congress some of their new concepts for home-network gateway devices that will be pitched to the likes of Telstra or Bouygues Télécom for their services, along with how they can add that extra value.

This is in conjunction with Technicolor announcing their solutions for managed distributed-Wi-Fi setups along with devices supporting wireline broadband and mobile wireless broadband on the Internet (WAN) side. The latter trend existed mainly with small-business equipment, but its appeal for the home network is being underscored by the “quick-to-provide” goal of an interim wireless service before a wireline service is rolled out; a “fatter pipe” for broadband service by aggregating wireline and mobile broadband services; and always-available broadband for business, e-health / ageing-at-home applications and the smart home’s security.

The typical applications being called out would be to provide business-style “unified threat management” for the home network as a network-security measure, or to join a “community wireless” platform like Fon where households can share Wi-Fi bandwidth with guests or customers.

But they are also highlighting applications like monitoring elderly loved ones at home to be sure they are OK. Back in 2010, I had a conversation with a representative from Ekahau regarding their Wi-Fi-based Real Time Location System in a residential or small-business environment. This was more so with their T301BD Wi-Fi Pager Tag, pitched primarily as a name tag with duress-alert abilities for healthcare and similar enterprise-level applications, being used as part of an “ageing at home” or similar home-based care scenario. At the time I noticed some initial doubt about this kind of application in the home, but such setups could be made real with distributed Wi-Fi and with these services being offered on a cloud-driven “as-a-service” model.

By using a multiple-computer “cloud” approach, there is no need to overload the router with extra processing circuitry, which would require a larger device to be designed. Typically this would be fulfilled by one or more data centres connected to the Internet, like the Amazon Web Services approach Technicolor are using. But, as the compact network-attached-storage device maintains its appeal as an on-premises network storage hub, with most of these devices offering “remote access” or “personal cloud” functionality, this kind of “cloud” approach could encompass these devices along with other “function-specific” hubs like smart meters or security systems.

But what is happening is that there will be more expectations out of the router device that sits between the home network and the Internet with it being a “gateway” to more online services.


Google provides a tool to forums and the like to combat trolls

Article

ThinkBroadband forum

ThinkBroadband Forum – an example of a forum where content moderation can be simplified using Google Perspective

Google’s new innovative technology aims to combat online trolls | Android Authority

My Comments

Previously, I wrote an article regarding the Internet troll problem faced by people who have an Internet presence on a social network, forum, chat-room or similar environment. It is where they face toxic comments and online harassment on these boards from miscreants who use their presence there to make trouble.

A small business’s Facebook page – Google Perspective can make the management of these pages simple when it comes to visitor comments left on these pages

This is made worse for those of us who operate a corporate Internet presence on these environments for a business or other organisation, or who moderate a forum, blog or similar online presence. In some cases, it has caused some of us not to run these forums at all or to turn off the commenting ability for the blogs or other online content we publish.

But Google has come to the fore by developing software that allows moderators and users to gain better control over toxic Internet comments. This is based on a human-driven review cycle for comments such as end-users reporting or moderators disallowing comments discovered to be toxic. The Google Perspective software described here learns from the identified troublesome comments in order to determine what happens with new comments.

It is being offered as a set of application-programming-interface “hooks” that can be integrated with content-management systems, social networks and the like. But who knows how long it would take for the APIs that support this functionality to be offered simply as “wrapper” plugins for the popular extensible content-management or forum-management platforms. Similarly, an “outboard” comment host like Livefyre or Disqus could benefit by offering the Google Perspective technology as a feature to help moderators manage incoming comments.

They promoted the ability for a moderator to use Perspective to conditionally manage comments that are to be held for moderation or to red-flag potential “flame-wars” and miscreants. But they also put forward the idea for users to filter or sort comments by toxicity which can be of use for most of us who simply don’t want to waste time reading that junk.
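
As a rough sketch of how a publisher might wire these “hooks” into their own moderation workflow, the Python below scores incoming comments against a Perspective-style comments:analyze endpoint, holds anything above a threshold for moderation and sorts the rest least-toxic first. The request and response shape follow Google’s published Perspective API at the time of writing, but treat that as an assumption and check the current documentation; the API key and threshold are placeholders.

```python
# Sketch: using a Perspective-style scoring endpoint to hold or rank comments.
# Endpoint and payload follow Google's published Perspective API shape; the
# API key and threshold below are placeholders.
import requests

API_KEY = "YOUR_PERSPECTIVE_API_KEY"
ENDPOINT = ("https://commentanalyzer.googleapis.com/v1alpha1/"
            f"comments:analyze?key={API_KEY}")
HOLD_THRESHOLD = 0.8   # comments scoring above this go to the moderation queue

def toxicity(comment_text: str) -> float:
    """Return the TOXICITY summary score (0..1) for a single comment."""
    payload = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(ENDPOINT, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

def triage(comments: list[str]) -> tuple[list[str], list[str]]:
    """Split comments into (publish, hold_for_moderation); publishable ones
    come back sorted least-toxic first, as a reader-side filter might show them."""
    scored = sorted((toxicity(c), c) for c in comments)   # least toxic first
    publish = [c for score, c in scored if score < HOLD_THRESHOLD]
    held = [c for score, c in scored if score >= HOLD_THRESHOLD]
    return publish, held
```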

What I see is that the Google Perspective comment-management software could make it easy for those of us involved in the Internet conversation to dodge the troublemakers.


What will 802.11ax Wi-Fi wireless networking be about?

ASUS RT-AC5300 router press picture courtesy of ASUS

802.11ax will be the next Wi-Fi standard that will grace our routers, but this will require newer hardware

There is the impending plan to define the IEEE 802.11ax Wi-Fi wireless local-area-network standard which is intended to supplant the 802.11ac standard used for general-purpose Wi-Fi networks. Qualcomm are even offering an initial lot of silicon for this standard in order to have something that can be proven.

But what is it about?

One of the main benefits is increased capacity, with up to five times more bandwidth than 802.11ac offers. But there is also the idea that we will see Gigabit throughput levels being offered for real rather than as headline speeds based on a “link-level” rate without any error correction.

This is brought about with increased MIMO multiple-antenna / multiple-front-end abilities such as MIMO-OFDM, which are expected to improve Wi-Fi’s robustness. The MU-MIMO functionality, which effectively provides optimum bandwidth to each client device, will work for both downstream and upstream data.

Yarra's Edge apartment blocks

802.11ax Wi-Fi wireless will benefit apartments, hotels and trade shows where many Wi-Fi networks do co-exist

802.11ax Wi-Fi implements spatial frequency reuse to improve network reliability in high-density setups. Current Wi-Fi setups don’t really perform reliably when faced with a high-density situation like a trade show, with connections dropping off too easily. But the ability to reuse frequencies and co-exist assures improved reliability in these situations. It also answers a reality of Wi-Fi and high-density urban living, where each small apartment, office or shop in a large building ends up being equipped with its own Wi-Fi network, something that will become even more common as next-generation broadband service is delivered to these premises.

Something more real that will underscore the robustness that 802.11ax provides

To the same extent, this level of robustness in dense Wi-Fi environments also applies to situations where Wi-Fi networks with multiple access points, including range extenders, are being implemented by most people to assure optimum network coverage for their portable devices. It is a practice underscored by the reality that a Wi-Fi router is typically installed at one end of the premises because it has to be co-located with the socket that facilitates the wired broadband connection, such as a telephone or cable-TV socket.

Let’s not forget that the Wi-Fi WMM and WMM Power Save standards will be improved under this specification to assure continual throughput for streamed multimedia content; along with power-efficiency for battery operated devices. These standards will be improved to cater towards an increased volume of data.

The 802.11ax Wi-Fi standard is not intended to be set in stone before 2019, although equipment based on earlier drafts will be released over the next few years. This is a practice that happened with 802.11n and 802.11ac Wi-Fi, with the Wi-Fi Alliance even certifying equipment against draft versions before the IEEE had ratified the standards. But 802.11ax could be seen more or less as the wireless local-network standard to complement next-generation fibre-optic or 5G wireless broadband Internet services that offer Gigabit or more bandwidth.


What’s inside your computer (INFOGRAPHIC)

Some of you who have a traditional “three-piece” desktop computer system, where a separate box houses all the activity, may refer to that box as the “hard disk” even though it is properly known as the “system unit”. This is because the hard disk, amongst other key computing subsystems like the CPU and the RAM, lives in that box.

This infographic shows what the key parts of your computer are and is based on one of the newer small-form-factor designs that are common in the office and home.

Desktop computer system unit - inside view

What’s inside your computer

 


Fact-checking now part of the online media-aggregation function

Article – From the horse’s mouth

Google

Expanding Fact Checking at Google (Blog Post)

My Comments

ABC FactCheck – the ABC’s fact-checking resource that is part of their newsroom

Previously, we got our news through newspapers, magazines and radio / TV broadcasters who had invested significant amounts of money and time in journalism. Now the Internet has reduced the friction associated with publishing content – you can set up an easily-viewable Website for very little time and cost and pump it with whatever written, pictorial, audio or video content you wish.

Google News – one of the ways we are reading our news nowadays

This has allowed an increasing amount of news content of questionable accuracy and value to be made easily available to people. It is exaggerated by online services such as search and aggregation services of the Google or Buzzfeed ilk, and social media of the Facebook ilk, being a major “go-to” point, if not the “go-to” point, for our news reading. In some cases, this is thanks to these services using “virtual newspaper” views and “trending-topic” lists to make it easy to see what has hit the news.

As well, with traditional media reducing their newsroom budgets, which leads to fewer journalists in the newsroom, it gets to the point where content from online news-aggregation services ends up in the newspapers or in traditional media’s online presence.

The fact that news of questionable accuracy or value is creeping into our conversation space, with some saying that it has affected elections and referenda, is bringing terms like “post-truth”, “alternative facts” and “fake news” into the lexicon. What is being done about it so that people can be sure they are getting proper information?

Lately, a few news publishers and broadcasters have instigated “fact-checking” organisations or departments where they verify the authenticity of claims and facts that are coming in to their newsrooms. This has led to stories acquiring “Fact-check” or “Truth-meter” gauges along with copy regarding the veracity of these claims. In some cases, these are also appearing on dedicated Web pages that the news publisher runs.

In a lot of cases, such as Australia’s ABC, these “fact-checking” departments work in concert with another standalone organisation like a university, a government’s election-oversight department or a public-policy organisation. This partnership effectively “sharpens the fact-checking department’s knives” so they can do their job better.

But the question facing us is how we can be sure that the news item we are about to click on in Google or share on Facebook is kosher. Google have taken this further by integrating the results from fact-check organisations into articles listed on the Google News Website or in the Google News & Weather iOS / Android mobile news apps, calling these results out with a “fact check” tag. The same feature is also being used on the News search tab when you search for a particular topic. Initially this feature was rolled out in the US and UK markets, but it is slowly being rolled out in other markets like France, Germany, Brazil and Mexico.

Google is also underpinning various fact-checking initiatives by helping publishers build up their operations and by instigating event-specific efforts like CrossCheck, which involved 20 French newsrooms in the lead-up to the French presidential election. This is in addition to supporting the First Draft Coalition, which helps assure the integrity of the news being put up on the Internet, and using the Digital Initiative Fund to help newsrooms and others instigate or improve their fact-checking operations.

A question that will also be raised is how to identify the political bias of a particular media outlet and convey that in a search engine. This is something that has been undertaken by the Media Bias / Fact Check Website which is an independently-run source that assesses media coverage of US issues and how biased the media outlet is.

But something that needs to appear is the ability for fact-check organisations who implement those “accuracy gauges” to share these metrics as machine-usable metadata that can be interpreted through the rich search interfaces that Google and their ilk provide. Similarly, the provision of this metadata and its interpretation by other search engines or social-media sites can provide a similar advantage. But it would require the use of “news categorisation” metadata relating to news events, locations and the actors who are part of them to make this more feasible.
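
Some of this machinery already exists in the form of schema.org’s ClaimReview markup, which Google’s fact-check tagging draws on. The Python sketch below emits a ClaimReview block as JSON-LD; the fact-checking organisation, URLs and rating scale are invented purely for illustration.

```python
# Sketch: emitting schema.org ClaimReview metadata so a fact-check verdict can be
# read by search engines. Field names follow schema.org's ClaimReview type; the
# organisation, URLs and rating values are invented for illustration.
import json

def claim_review_jsonld(claim: str, verdict: str, rating: int, review_url: str) -> str:
    markup = {
        "@context": "https://schema.org",
        "@type": "ClaimReview",
        "url": review_url,
        "claimReviewed": claim,
        "author": {"@type": "Organization", "name": "Example Fact Check Unit"},
        "reviewRating": {
            "@type": "Rating",
            "ratingValue": rating,       # e.g. 1 (false) .. 5 (true)
            "bestRating": 5,
            "worstRating": 1,
            "alternateName": verdict,    # the human-readable "accuracy gauge" label
        },
    }
    return json.dumps(markup, indent=2)

print(claim_review_jsonld(
    claim="Example claim made during an election campaign",
    verdict="Mostly false",
    rating=2,
    review_url="https://factcheck.example.org/reviews/123",
))
```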

Similarly, a social network like Facebook could use the fact-checking resources out there to identify where fake news is being spread so that users can be certain if that link they intend to share is questionable or not.

To the same extent, engaging government election-oversight departments like the Australian Electoral Commission, the Federal Election Commission in the USA and the Electoral Commission in the UK in the fact-checking fabric can help assure proper and fair elections. This is more so as these departments perform a strong role in overseeing the campaigns that take place in the lead-up to an election, and they could use the fact-checking organisations to identify where campaigns are being run with questionable information or in an improper manner.

As part of our research in to a news topic, we could be seeing the fact-checking resources playing an important role in sorting the facts from the spin and conspiracy nonsense.


HP to introduce virtual-hardware security for Web browsing

Article

HP Elitebook x360 G2 press picture courtesy of HP USA

HP Elitebook x360 G2 – to be equipped for Sure Click

HP hardens EliteBook protection with Sure Click, a browser secured in virtual hardware | PC World

From the horse’s mouth

HP

Press Release

Bromium

Press Release

Video explaining the Bromium micro-virtualisation approach (Click / Tap to play)

My Comments

A very common attack vector identified for endpoint computing devices, especially regular desktop or laptop computers, is the Web browser, because the browser is essentially the “viewport” to the Internet for most reading-based tasks.

But most recent browser versions have implemented software-based “hardening” against the various Internet-based attacks. This is in conjunction with the main desktop operating systems being “hardened” through each and every update and patch automatically applied. These updates facilitate practices like “sandboxing” where software of questionable provenance is effectively corralled in a logical quarantine area with minimal privileges so it doesn’t affect the rest of the system.

HP and Bromium have developed a “virtual hardware” approach where a browsing session takes place in a separate “logical computer”, a concept made practical by the multi-core CPUs that are the hub of today’s computer systems. This can provide improved security because this hardware-backed approach effectively runs its own operating system and destroys its data at the end of the session. Here, it restricts the effect of malware like ransomware picked up during a “drive-by” download, because the software can only run within that separate “logical computer”.

At the moment, this feature is initially being rolled out to the EliteBook x360 G2 convertible business laptop but will trickle out across the next generation of “Elite” premium manageable business computers to be launched in the second half of the year. It will only work with Microsoft’s Internet Explorer and Google’s open-source Chromium browser for now. What I would like to see is this feature “trickle down” to HP’s consumer, education and small-business product ranges, albeit in a more “self-service” manner, because households, small businesses and volunteer-driven community organisations could equally benefit from it.
