Dell jumps on the prosumer bandwagon with the XPS Creator Edition computers

Articles

Dell XPS 17 laptop press picture courtesy of Dell Australia

Dell is offering variants of the latest XPS 17 desktop-replacement laptop that will be pitched at prosumers and content creators

What is Dell’s XPS 17 ‘Creator Edition?’ | Windows Central

Dell Reveals Redesigned XPS 15 and Powerful New XPS 17 Aimed at Creators | Petapixel

Dell’s new XPS Desktop looks to be a premium powerhouse PC | PC World Australia

From the horse’s mouth

Dell

XPS 17 Series (USA product page with Creator Edition packages)

XPS Desktop series (USA product page with Creator Edition packages)

NVIDIA

RTX Studio program (Product Page)

My Comments

As I have previously reported, computer-equipment manufacturers are waking up to the realisation that prosumers and content creators are a market segment to address. This group of users was heavily courted by Apple with the MacOS platform but Windows-based computer vendors are answering this need as a significant amount of advanced content-creation and content-presentation software is being written for or ported to Windows 10.

Here, the vendors are tailoring the specifications of some of their performance-focused computers towards the kind of independent content creator or content presenter who finds their own work and manages their own IT. This can range from hobbyists, to those of us who create online content to supplement other activities, to small-time professionals who get work “by the job”. It can also appeal to small organisations that create or present content but don’t necessarily have their own IT departments, or don’t have the same kind of IT department that big corporations have.

Lenovo answered this market with a range of prosumer computers in the form of its Creator Series, which encompassed two laptops and a traditional tower-style desktop. Now Dell is stepping up to the plate with its Creator Edition computer packages. Here, the approach is to have computers that are specified for content creation or content presentation but aren’t workstation-class machines, identified with a distinct “Creator Edition” logo.

The first of these are the Creator Edition variants of the latest Dell XPS 17 desktop-replacement laptop. These have, for their horsepower, an Intel Core i7-10875H CPU and a discrete GPU in the form of the NVIDIA GeForce RTX 2060 with 6GB of display memory, based on the NVIDIA Max-Q mobile graphics approach. This will run RTX Studio graphics drivers that are tuned for content-professional use, as part of the RTX Studio program that NVIDIA runs for content professionals.

The display used in these packages is a 17” 4K UHD touch display that is rated for 100% Adobe RGB colour-gamut coverage. The storage capacity on these computers is 1 terabyte of solid-state storage. The only difference between the two packages is that the cheaper variant runs with 16GB of system RAM while the premium variant has 32GB.

Dell is also offering a Creator Edition variant of its XPS-branded desktop computer products. This will be a traditional tower-style desktop computer equipped with the latest Intel Core i9 CPU and an NVIDIA GeForce RTX 2070 Super graphics card, able to be specced with up to 64GB of RAM and up to 2TB of storage. It has all the expandability of a traditional form-factor desktop computer, something that would come in handy for project studios where special audio and video interface cards come into play.

What is being shown here is that computer manufacturers are recognising the content-creator and prosumer market segment, who want affordable but decent hardware that can do the job. It will be interesting to see which other large computer manufacturers step up to the plate with a product range courting content creators and prosumers.

Amazon to get property managers on the Alexa bus

Article

Alexa for Residential lets landlords create smart apartments | Engadget

From the horse’s mouth

Amazon

A new, easy way for properties to add Alexa to residential buildings (Blog Post)

Video – Click or tap to play on YouTube

My Comments

Amazon is wooing owners corporations, property managers, whole-building landlords and the like towards a customised Alexa experience for residential buildings.

This is expected to be about catering towards people who want the “smart home” within their rented apartment or condominium / strata-plan apartment. It will also be about courting the retirement living, supported accommodation and serviced apartment segments where there are people who support or provide services to residents who live in their own apartments.

This will involve the ability for a property manager or similar entity to purchase and deploy a fleet of pre-programmed Echo smart speakers that work with the pre-provisioned Wi-Fi network and smart-home devices. There will be the ability for these entities to have the Echo devices loaded with off-the-peg or custom Alexa Skills to suit the building’s and residents’ needs. Examples of these could include booking of communal facilities, paying rent or other dues, knowing when building-specific events are scheduled or providing feedback to the property manager or similar entity. It may also be about interlinking entryphone systems to the Alexa device so you can use it to communicate with your visitors and let them in if desired.
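
To illustrate what a building-specific skill of that kind could look like, here is a minimal sketch using the Alexa Skills Kit SDK for Python (ask-sdk-core). The intent name, slot name and spoken responses are invented for illustration only; they are not part of Amazon’s actual Alexa for Residential tooling, and a real skill would connect to the building’s own booking system.

```python
# Hypothetical sketch of a building-specific Alexa Skill using the
# Alexa Skills Kit SDK for Python (ask-sdk-core). Intent and slot names
# are invented for illustration; they are not Alexa for Residential APIs.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.utils import is_request_type, is_intent_name

sb = SkillBuilder()

@sb.request_handler(can_handle_func=is_request_type("LaunchRequest"))
def launch_handler(handler_input):
    # Greets the resident when they open the skill
    speech = ("Welcome to the building concierge. You can book a communal "
              "facility or ask about building events.")
    return handler_input.response_builder.speak(speech).ask(speech).response

@sb.request_handler(can_handle_func=is_intent_name("BookFacilityIntent"))
def book_facility_handler(handler_input):
    # Reads the facility the resident asked for from the intent slot
    slots = handler_input.request_envelope.request.intent.slots or {}
    facility_slot = slots.get("facility")
    facility = facility_slot.value if facility_slot and facility_slot.value else "the facility"
    # A real skill would call the building's booking system here
    speech = f"I have requested a booking for {facility}. The building manager will confirm shortly."
    return handler_input.response_builder.speak(speech).response

# Entry point when the skill is hosted as an AWS Lambda function
lambda_handler = sb.lambda_handler()
```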

At the turnkey level, these Alexa devices will offer what the property manager has pre-defined within them, along with access to online information and audio services. But users can add their Amazon account to these devices to carry over all the Alexa-platform customisations they have established, including all of the Alexa Skills they currently use with their own Alexa-platform devices.

As far as I know, these devices will keep users’ data away from the landlords or property managers, assuring some form of user privacy. For turnkey setups, the voice data is purged daily from the speakers, while a “brick wall” exists between the user’s Amazon account data and all pre-configuration data associated with the property. But there are still doubts about any IT service that the likes of Amazon, Google or Facebook offer due to their disdain for end-user privacy.

There will also be the ability for the property manager to remotely reset a device they are responsible for, something that would be important for whenever the residents move out. As well, there will be the ability to run custom skills while an apartment is vacant thus catering for things like guided tours or question-and-answer sessions for prospective tenants / purchasers.

A question that I would have regarding the Alexa for Residential platform is how this kind of setup would work with the “BYO Internet service” arrangement common in Australasia, the UK and Europe. This is where residents who are living in their apartments for the long term will choose and set up their own Internet service and home network rather than having their landlord, property manager or similar entity provide and set up this service. Here, it may be about having these devices able to work with the building’s services using the resident’s network and Internet service.

Similarly, how would it cope with residents installing additional Alexa-platform audio devices and wanting to “bind” them to both their own Amazon account and the Alexa for Residential deployment’s configurations? This may be about using an additional Echo device in another room, or using something like the Echo Show in lieu of the standard Echo speaker that is part of the original setup. There may also be a requirement to support the concurrent use of two Amazon accounts on Alexa-platform devices.

Likewise, there would be the issue of residents bringing in smart appliances like lamps, A/V equipment, robotic vacuum cleaners and the like that suit their needs. In a lot of cases, it is about users wanting to have their home how they want it, and there may be an expectation that resident-supplied equipment works as though it is part of the whole system.

At the moment, the Amazon Alexa for Residential platform needs to be worked out to answer different residential setup needs, especially to suit the needs of long-term residents.

Multi-gigabit wired network connections for small networks could be real

Articles

WD MyNet Switch rear Ethernet connections

The next affordable unmanaged Ethernet switch will soon appear as a multi-gigabit type

The cheapest multi-gigabit switches (2.5G, 5, & 10Gbps) you can buy now – Affordable 10GbE & 2.5GbE networking | Just Android (UK)

My Comments

A trend that is starting to appear is the increased availability of multi-gigabit wired network hardware at reasonable prices, and this trend will continue over the next few years.

Examples of this include affordable PCI Express network interface cards for traditional desktop computers and USB 3 Ethernet adaptors that support 2.5Gbps network speeds. These will use Category 5e cable and RJ45 modular plugs.

It also extends to standard-form-factor motherboards for “three-box” desktop computers pitched at the performance end of the market now coming equipped with multi-gigabit Ethernet connections.

As well, newer high-end Synology and QNAP network-attached-storage units are being equipped with the ability for users to upgrade their device’s network connection to 2.5Gb Ethernet at a reasonable price. This is in conformance with the way Synology and QNAP are designing their NAS units to be computers in their own right.

Let’s not forget that some affordable Ethernet switches are appearing with at least one 2.5Gbps Ethernet connection, like this 5-port unmanaged unit from QNAP. The use of extant Category 5e cabling infrastructure for a 2.5Gbps Ethernet run means that you don’t have to pull new cabling through to upgrade an existing “wired-for-Ethernet” installation to that speed.

Of course, the 10Gbps idea will be seen as more expensive because of the use of newer cable types that support the higher bandwidth. A cabling upgrade of this kind can be done to an existing “wired-for-Ethernet” setup with the legacy cable being used to pull the newer cable type through, avoiding the need to drill through walls to install the new cable.

What do I see as driving the takeup of multiple-gigabit Ethernet networks for home and small business use?

One of these trends is Wi-Fi 6 and Wi-Fi 7 wireless networks having the possibility of multiple-gigabit speeds. Here, you could use high-performance Wi-Fi 6 access points, including distributed-wireless systems supporting that technology, with a multi-gigabit Ethernet as a wired-network backhaul for those access points. This is especially if you want stable operation from a multi-AP Wi-Fi 6 or Wi-Fi 7 network.

As well, some countries and neighbourhoods are laying the groundwork for high-speed Internet. This is through strong efforts to increase the penetration of fibre-optic next-generation broadband infrastructure through a neighbourhood, with cities and towns wanting to claim bragging rights to “Gigabit City” or “Gigabit Town” titles. That is where every household or business has the ability to have Internet bandwidth of at least 1Gbps.

The bar for these communities will then be raised to multiple-gigabit levels through “in-rack” upgrades done to existing fibre-optic networks. This is where a network is upgraded simply with the upgrading of network infrastructure electronics that exists in the equipment racks at ISP central offices, headends and exchanges. It is rather than rolling out trucks and digging up roads to pull new fibre-optic cable through a neighbourhood.

Another is the increased ubiquity of 4K UHDTV, with an increased number of affordable sets of the right screen size for entry-level or secondary-lounge-area/bedroom use appearing on the market. This would lead to multiple 4K UHDTV sets being installed around a house. It is underscored by an increased number of video-on-demand services delivering 4K UHDTV content, with reasonable subscription prices in the case of SVOD services. This will lead to concurrent viewing of 4K video content in multiple-adult households.

In fact, the multiple-adult household is being seen as the norm, especially in urban areas where land prices are increasing rapidly and housing, whether to own or rent, is becoming very expensive for a young couple. Similarly, there is the appeal of multiple-generation living with a family living with their older parents. It facilitates the concept of “ageing at home”, which avoids the need for older parents who need extra care to be sent to questionable aged-care facilities.

Another key driver is the rise of content creators working from home with jobs that involve large files. Examples would include video content with a resolution of 4K or higher, or multichannel / multitrack sound mixes. Such users, especially those who work for themselves on a “job-by-job” basis or use this to support a hobby or other endeavour, are now considered a key market segment for personal IT. As well, it is even driven by the COVID-19 pandemic, which has had us work from home more.

This latter use case is also being underscored by Matrix Audio developing and releasing a series of “audiophile-grade” multiple-Gigabit Ethernet switches that exploit the higher bandwidth associated with this kind of high-capacity Ethernet. Here, the claim is that the headroom and stability associated with multiple-Gigabit Ethernet protect lower-bandwidth multimedia network traffic thanks to reduced timing errors.

What will hinder the takeup of this kind of connection

At the moment, the main hindrance to multiple-Gigabit wired Ethernet being ubiquitous is the current-generation Internet connection offered to most people. This includes the routers, modems and other equipment installed at the customers’ premises.

As well, use cases associated with multiple-gigabit Ethernet need to be demonstrated to the greater populace in order to justify this concept. This may be about including a higher-throughput backbone for Wi-Fi 6 distributed-Wi-Fi applications, having a network that handles multiple 4K UHDTV streams or simply being ready for higher-bandwidth broadband Internet service.

How should you go about this kind of upgrade?

A content professional, whether working for someone else or running their own shop, could justify this kind of network, more so where large multimedia files are the norm for the work. This can also extend to other professionals like architects and designers who deal with large files.

But it can also be seen as a long-term wired-network upgrade goal, especially if you want to create a high-speed trunk link between multiple network-device clusters. This can be facilitated with a single few-port multiple-gigabit switch at the “hub” of your home network and a few Gigabit Ethernet switches, each with one multiple-Gigabit Ethernet socket, at each “branch” of the network, creating a “data freeway” between the different clusters. Even if you start out with just the few-port multiple-gigabit switch at the hub of your home network’s wired Ethernet segment, that switch will create its own “high-performance data freeway” within itself.
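
If you want to check that a new multi-gigabit trunk link is actually delivering more than Gigabit speeds, a rough throughput test between two computers on the network can help. The sketch below is my own minimal illustration using Python’s standard socket library; the port number, block size and test duration are arbitrary choices, and a dedicated tool like iperf3 will give fuller results.

```python
# Minimal single-stream throughput check between two hosts on the LAN.
# Run "python3 linktest.py server" on one machine, then
# "python3 linktest.py client <server-ip>" on the other.
# Port and block size are arbitrary; a tool like iperf3 is more thorough.
import socket, sys, time

PORT = 5201
CHUNK = 1024 * 1024  # send data in 1 MiB blocks

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, addr = srv.accept()
        total, start = 0, time.time()
        while True:
            data = conn.recv(CHUNK)
            if not data:
                break
            total += len(data)
        secs = time.time() - start
        print(f"Received {total/1e9:.2f} GB in {secs:.1f} s "
              f"= {total*8/secs/1e9:.2f} Gbps from {addr[0]}")

def client(host, seconds=10):
    payload = b"\0" * CHUNK
    with socket.create_connection((host, PORT)) as conn:
        end = time.time() + seconds
        while time.time() < end:
            conn.sendall(payload)
    print("Done sending; check the figure reported on the server side.")

if __name__ == "__main__":
    if sys.argv[1] == "server":
        server()
    else:
        client(sys.argv[2])
```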

Such a setup can also come into its own if you are upgrading a Wi-Fi 6 network to access points that are capable of using that kind of connection as a wired-backhaul option. As well, the new Wi-Fi 7 wireless-network technology will underscore more of a need to upgrade your wired Ethernet network towards multi-gigabit technology.

The 10 Gigabit technology will also appeal to people who are considering an optical-fibre LAN link, such as a robust link between a house and an outbuilding. Here, such a link will satisfy future needs and avoid the problem of an inter-building link becoming unstable due to weather conditions. Such links could go up to 300 metres for multimode fibre, or 40 kilometres for single-mode fibre which is more costly.

Conclusion

The idea behind the affordable multi-gigabit Ethernet technology for local area networks is to provide an upgrade path for wired network infrastructure to support higher bandwidth. It is more useful as a long-term upgrade approach or whenever you are dealing with many large files.

Updates

Originally posted on 7 September 2020 and updated 13 March 2024 to factor in the arrival of Wi-Fi 7 wireless network technology with multiple-Gigabit Ethernet gaining more relevance as a wired backhaul for Wi-Fi 7 wireless networks. A subsequent update posted on 5 September 2025 has been added to reflect Matrix Audio offering audiophile/multimedia-grade multi-Gigabit Ethernet switches that exploit the higher bandwidth to protect and assure reliable multimedia streaming.

How do I use an XBox One game controller with my Windows computer?

Article

XBox One games console press photo courtesy Microsoft

You can use the same XBox One controller with your Windows computer as well as that console

How to use an Xbox One controller with your PC | Windows Central

My Comments

If you do play games using your Windows computer, you may want to use a game controller as a better alternative to the keyboard and mouse or trackpad. This may be of importance with most fast-paced games where games-console-style controllers may suit you better.

The same game controllers that work with Microsoft’s XBox One games console can work with your Windows computer out of the box. This is without needing to add any extra drivers to your Windows setup.

It doesn’t matter what form factor the controllers come in, so flight-yoke or steering-wheel controllers designed for the XBox One will also work with Windows computers. As well, the XBox Adaptive Controller, which brought video gaming to those with limited abilities, can also work with Windows computers.

Logitech G Adaptive Gaming Kit press picture courtesy of Logitech International

This also applies to the XBox Adaptive Controller and its custom switches, which open up gaming for those of us with limited mobility

For that matter, the XBox Adaptive Controller could open up the same kind of real-world-interface programming that was pitched during the 1980s for computers of the BBC Micro or Commodore 64 ilk. This would be feasible if you know how to write XBox controller functionality into the software you are developing.
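
As a rough illustration of what that kind of programming could look like (my own minimal sketch, not something from the article), the cross-platform pygame library exposes a connected XBox controller as a generic joystick whose sticks and buttons you can poll:

```python
# Minimal sketch: reading an XBox controller from Python via pygame.
# Assumes the controller is already connected to Windows (wired or wireless).
import pygame, time

pygame.init()
pygame.joystick.init()

if pygame.joystick.get_count() == 0:
    raise SystemExit("No game controller detected")

pad = pygame.joystick.Joystick(0)
pad.init()
print("Using controller:", pad.get_name())

while True:
    pygame.event.pump()           # let pygame refresh the controller state
    left_x = pad.get_axis(0)      # left stick, horizontal (-1.0 to 1.0)
    left_y = pad.get_axis(1)      # left stick, vertical
    a_button = pad.get_button(0)  # button numbering varies by driver
    print(f"Left stick: ({left_x:+.2f}, {left_y:+.2f})  Button 0 pressed: {bool(a_button)}")
    time.sleep(0.2)
```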

Physical connections

Wired setup

This is very simple if you have a full-function “charge and sync” Micro-USB cable on hand of the type that was used to charge or transfer data to “open-frame” mobile devices. These will typically have a standard USB Type-A connection which will work with most computers.

Dell WD19TB Thunderbolt dock product image courtesy of Dell

USB-C and Thunderbolt 3 docks and adaptors like the Dell WD19TB Thunderbolt 3 dock can allow you to use the XBox One game controllers with your newer laptop’s USB-C port

If your computer uses USB-C or Thunderbolt 3 as its only peripheral connection option, you can buy a USB-C to Micro-USB cable (Amazon, Harvey Norman) and use that instead of the aforementioned USB-A to Micro-USB cable for connecting your controller. Or you can simply use a USB-C adaptor with a female USB-A socket along with your existing cable. As well, if you use the computer with a USB-C hub or dock that has at least one USB-A port, you can use the ordinary Micro-USB cable with that setup.

You may also find that connecting two controllers, either directly to the host or via a USB hub, works for local multiplayer games.

Wireless setup

Older XBox One game controllers will need to be used with a Microsoft XBox Wireless Adaptor, which is a USB transceiver dongle for XBox One game controllers. This plugs into any USB Type-A port on your computer, but look at the notes in the above section about wired setups if your computer only has USB-C ports.

Dell Inspiron 15 Gaming laptop

If your computer, like this Dell Inspiron 15 Gaming high-performance laptop, supports Bluetooth, you don’t need to use a receiver dongle with very recent XBox controllers

If your computer supports Bluetooth, which most laptops, all-in-ones and low-profile computers do, you can connect current-generation game controllers wirelessly because they have Bluetooth connectivity built in. This also holds true if you are using a Bluetooth adaptor dongle or expansion card with a desktop computer that doesn’t have Bluetooth of its own. In all these cases, make sure that your computer’s Bluetooth functionality is turned on and active.

Pairing your games controller with your Windows computer is very simple, but you should have the games controller turned off beforehand. You typically open “Settings” in Windows by clicking on the “gear” icon in the Start menu, then click the Devices option so your computer can listen for devices about to “come online”.

Press the large X button on your XBox games controller to turn it on, then press and hold the small sync button on the controller’s top edge until the X button flashes, which puts it in pairing mode. Subsequently, your Windows computer will list it as a newly-discovered device. Click this device to finish off the pairing process.

If you are setting up multiple controllers for a local multiplayer game, you may have to repeat this process for each controller.

Further notes

You will need to install the XBox Accessories app from the Microsoft Store to get the most out of your controller. This is important for setting up different control layouts, which may be necessary for different games and different players.

It will also enable a “co-pilot mode” to allow two controllers to manipulate the same character in a game. This can be of use in helping novice players get the hang of a game or can allow you to use two different controller types as complementary controllers.

What this will mean for people who play games on the XBox One console or a Windows computer is that they only need one set of controllers that can be used with both devices, rather than having to keep one controller for the PC and another for the XBox One.

Lenovo has premiered a lightweight slim performance-class laptop

Articles

Lenovo Legion Slim 7i gaming laptop press image courtesy of Lenovo

The Lenovo Legion Slim 7i gaming laptop that improves on portability for performance-class laptops

Lenovo Is Making a Gaming Laptop That Weighs Less Than 2 kg | Gizmodo

Lenovo’s Legion Slim 7i gaming laptop weighs less than four pounds | Engadget

Lenovo’s new Legion Slim 7i is ‘world’s lightest’ 15-inch gaming laptop with RTX graphics | Windows Central

From the horse’s mouth

Lenovo

Lenovo™ Reveals Smarter Innovation and Design with Holiday Consumer Lineup (Press Release – includes reference to Legion Slim 7i)

Legion Slim 7i (Product Page – PDF)

My Comments

A problem with laptop design is that you can’t effectively mix the idea of a portable aesthetically-pleasing computer with a performance-focused design. It is still the Holy Grail of laptop design to combine these aspects in one machine.

This comes down to the requirement to provide enough power to the computer’s main processors – the central processing unit and the graphics processor – for them to work your data and “paint” your screen. In some applications, the graphics processor is tasked with supplementary processing activities like rendering or transcoding edited video files or calculating statistics. As well, there is the need to remove the waste heat generated by the processing silicon so it can perform to expectation even when working hard.

Lenovo Legion Slim 7i gaming laptop keyboard view press image courtesy of Lenovo

As well, there is the proper full-size full-function keyboard on this gaming laptop

What typically happens is that a lightweight highly-portable computer won’t be engineered for anything beyond everyday computing tasks, while a performance-focused computer fit for gaming, photo-video editing or CAD will be a heavier and thicker machine that doesn’t look as aesthetically pleasing as the lightweight machine. Some of these computers even convey a look equivalent to an American or Australian muscle-car of the 1970s, but most convey a look very similar to the medium or large family cars that appeared at the end of the 20th century.

Lenovo is getting close to this Holy Grail by designing a 15” gaming laptop that is slimmer and lighter than typical gaming or other high-performance laptops of the same screen size. This laptop, known as the Legion Slim 7i, has had a significant amount of hardware and firmware engineering to achieve the goal of combining portability and performance.

It will use 10th-generation Intel Core i-series CPU silicon and NVIDIA Max-Q graphics silicon, with the latter designed to avoid yielding too much waste heat for mobile use. But even Max-Q graphics silicon cannot handle excess waste heat, and the Intel Core silicon will underperform if there is too much of that heat.

Lenovo is implementing NVIDIA’s Dynamic Boost technology to steer power to the graphics processor where needed during graphics-intensive tasks like fast-paced gaming. It is augmented by NVIDIA’s Advanced Optimus technology that allows for task-appropriate graphics-processor switching – whether to work with Intel integrated graphics for everyday computing as a “lean-burn” approach or to work the NVIDIA GPU for graphics-intense activity.

There is also ColdFront 2.0 hardware-and-software-based thermal engineering which is about increasing airflow within the computer while under load. There are small perforations above the keyboard to allow the computer to draw in air for cooling along with a many-bladed fan that comes in when needed to move the air across three heat pipes.

The Legion Slim 7i gaming laptop will have a full-sized keyboard with a numeric keypad and media keys, with a feel similar to a desktop mechanical keyboard. There is a 71 watt-hour battery in the computer which could last up to 7.75 hours.

Lenovo Legion Slim 7i gaming laptop rear view press image courtesy of Lenovo

The baseline variant will weigh in at 2 kilograms and cost $1329. But it can be specced up to an Intel Core i9 CPU and NVIDIA RTX 2060 Max-Q graphics silicon. It can also have at the maximum 32GB of current-spec RAM and 2TB of NVMe solid-state storage. The screens are available either as a 4K UHD 60Hz display, a Full HD 144Hz display or a Full HD 60Hz display.

For connectivity, these units offer Thunderbolt 3, which means access to external graphics modules, along with Wi-Fi 6 and Bluetooth 5 support. You may have to use a USB-C or Thunderbolt 3 dock with an Ethernet connection if you are considering low-latency game-friendly Ethernet or HomePlug powerline network technology.

The Lenovo Legion Slim 7i gaming laptop is expected to be on the market by November this year in the USA at least. Personally, I could see this as a push towards performance being about beauty as well as grunt.

Dell designs their business USB-C docks for the long haul

Article – From the horse’s mouth

Dell WD19TB Thunderbolt dock product image courtesy of Dell

The Dell WD19TB Thunderbolt 3 dock – an example of the modular USB-C docks that Dell offers

Dell

WD19TB Thunderbolt 3 dock (Product Page)

My Comments

Dell has defined a series of business USB-C docks that can have their host connectivity technology upgraded or replaced by the user.

What are these docks about?

This series of expansion modules, known as the Dell WD19 family, have in common video connections in the form of a single HDMI port, two DisplayPorts and a USB-C port with DisplayPort alt-mode connectivity. That USB-C DisplayPort-enabled port, along with another USB-C port located up front, offers data transfer and Power Delivery power-source functionality. There are three USB 3.1 Type-A sockets, one of them up front, along with a Gigabit Ethernet network-adaptor function. As well, there is a basic USB sound module with a headphone/microphone socket up front and a line-out socket behind, which may suit a wired headset, powered speakers or that old stereo amplifier connected to those old speakers you use for computer sound.

The devices are pitched for business use, especially with large businesses that practice hot-desking a lot, using shared workspace setups where you connect a laptop computer to at least one large screen as well as a full-size keyboard, full-size mouse and Ethernet network connection. This leads to separate modules being available for USB-C connectivity, Thunderbolt 3 connectivity and dual-USB-C connectivity, depending on the performance needs of the workspace’s user group.

The power available on these units is up to 90 watts for equipment adhering to the current USB Power Delivery specification. But Dell takes this further to 130W for their own products, because the specification currently doesn’t address the likes of the XPS 17 that demand more power. This may be something for the USB Implementers Forum to investigate so USB Power Delivery can support higher-powered devices, namely powerful large-screen laptops or “next-unit-of-computing” desktops.

For that matter, the Thunderbolt 3 variant has another USB-C port that supports Power Delivery, USB-C and Thunderbolt 3 data transfer and DisplayPort alt mode.

If you are buying the docks, you can choose between the different units offering the different host connectivity types and pay appropriately for the connection type. But Dell sells these modules as a separate accessory so you can upgrade your dock to a better host-connectivity type like Thunderbolt 3.

What I like about this family of docks and the user-replaceable host-connectivity modules that Dell offers is that if a host-connectivity module fails, you can simply replace that module rather than the whole dock becoming useless. There is also the ability to upgrade your dock to newer expectations at a later time.

Although this is optimised to work primarily with Dell computers, the WD19 series of docks can work with any computer that has a USB-C or Thunderbolt 3 connection. This is in a totally “plug-and-play” manner without the need to install device drivers.

Room to innovate

But it could allow Dell to have a range of business-class docks ready for full-on USB4, Thunderbolt 4 or any future host-peripheral connection technology. This is with the ability for users to upgrade them to that technology when the time comes.

Also having user-replaceable host-connectivity modules could open up to Dell the idea of external graphics modules with soldered-in graphics chipsets that can be added on to these docks. Most likely this idea would be limited to high-end mobile graphics chipsets that give a bit of “pep” to your Ultrabook’s graphics rather than desktop graphics chipsets that provide the full performance.

As well, having the dock part as a separate module can allow Dell to build on this system further. For example, it could also be about creation of a multimedia variant of this dock with a better sound module having line inputs or SPDIF connectivity along with more USB connections. Similarly, there could be a dock with multiple-Gigabit Ethernet connectivity that could appeal to “workstation-class” network computing.

Limitations that are identified

From all of the material I have seen on the Internet about these devices, there are some limitations that show up here.

For example, for the single-USB-C or Thunderbolt-3 connection modules, Dell could fit each module with a USB-C socket for the upstream (host-side) connection rather than using a captive USB-C cable. This would allow the user to use longer USB-C cables, giving installation flexibility. It would also allow the user to replace a broken cable themselves, something that will become necessary if they frequently plug and unplug their laptop from the dock.

From a video review that I have seen, the Thunderbolt 3 variant could support Thunderbolt-level multiple-screen output across three displays. This would work better with the Apple Macintosh platform, whereas “open-platform” implementations like Windows don’t need to worry about this issue much. But it may not work properly with the modular approach behind this dock’s design.

Conclusion

But the Dell WD19 business USB-C dock family underscores the reality that you have to pay dearly for something that is robust and will last you into the long term. It also shows that a design platform can be achieved for premium, business and multimedia docks where there is a goal to see them last longer and be future-proof.

NETGEAR brings back the electronic photo frame as a content source

Article

NETGEAR Meural Wi-Fi Photo Frame press image courtesy of NETGEAR

NETGEAR brings back the desktop digital photo frame with its Meural online content service and photo exchange

Meural’s New Digital Photo Frame Might Resurrect the Comic Strip Calendar | Gizmodo

From the horse’s mouth

NETGEAR Meural

Product Page

My Comments

Meural is a brand owned by NETGEAR who offer an online photo frame and content platform.

It is reinvigorating a product class that fell by the wayside thanks to the popularity of smartphones and mobile-platform tablets. But what is this product class?

It is the electronic photo frame that shows pictures, usually held on removable storage, on a built-in screen. These devices would show each picture for a pre-determined time period, then bring up another picture automatically. They were initially seen as a way for your parents to see digital images of their grandchildren, but have also appealed to businesses as cost-effective digital signage that can be located on the reception desk.

There were a variety of these units that connected to your home network and worked with an online photo-exchange service like Ceiva so people could send digital photos to them. Users had control over who could send photos to them, to avoid distasteful imagery appearing on these devices. Some of these photo frames were even tied to online content services so that stock photos, fine art and the like could be shown on them.

NETGEAR’s sub-brand Meural has continued the latter trend by offering a range of electronic photo frames that are centred around content services. This is about repositioning these devices as “digital art frames”, especially in the form of wall-mounted large-screen devices. As well, the Meural platform will do what Ceiva had done by having an online photo exchange where you, and others whom you approve, can post photos to appear on these frames.

But they have brought back the classic desktop electronic-photo-frame form factor and substantiated it with a comic-calendar content service. It is a throwback to desk calendar products that featured a comic strip for each day. All of the content services are available for US$70 per year, but they are offering the Peanuts comic-strip archive, including Snoopy, for US$30 per year as a stand-alone package.

These electronic photo frames implement touch-free gestures as a way of interacting with them, avoiding the ugly look of fingerprints on the glass or having to grope around the back to press buttons to change images. As well, they work with voice-driven home assistant platforms.

They also use an ambient light sensor so they effectively blend in to the room’s lighting. As well, they turn themselves off overnight so they don’t become too bright while you sleep.

What NETGEAR are realising is that the electronic photo frame can be seen as a digital content distribution medium for art and photography. As well, they are encouraging us not to forget about the idea of the electronic photo frame as a device to display photographs and the like, along with keeping us interested in “digital photo exchange” services.

Should videoconference platforms support multiple devices concurrently?

Zoom (MacOS) multi-party video conference screenshot

The idea of a Zoom or similar platform user joining the same videoconference from multiple devices could be considered in some cases

Increasingly, when we use a videoconferencing platform, we install the client software associated with it on all the computing devices we own. Then we log in to our account associated with that platform so we can join videoconferences from whatever device we have that suits our needs.

But most of these platforms only allow a user to use one device at a time to participate in the same videoconference. Zoom extends on this by allowing concurrent use of devices of different types (smartphone, mobile-platform tablet or regular computer) by the same user account in the same conference.

But why support the concurrent use of multiple devices?

There are some use cases where multiple devices used concurrently may come in handy.

Increased user mobility

Dell Inspiron 14 5000 2-in-1 - viewer arrangement at Rydges Melbourne (Locanda)

especially with tablet computers and 2-in-1s located elsewhere

One of these is to assure a high level of mobility while participating in a videoconference. This may be about moving between a smartphone that is in your hand and a tablet or laptop that is at a particular location like your office.

It can also be about joining the same videoconference from other devices that are bound to the same account. This could be about avoiding multiple people crowding around one computing device to participate in a videoconference from their location, which can lead to user discomfort or too many people appearing on one small screen in a “tile-up” view of a multiparty videoconference. Or it can be about some people participating in a videoconference from a more appropriate room like a lounge area or den.

Lenovo Yoga Tablet 2 tablet

like in a kitchen with this Lenovo Yoga Tab Android tablet

Similarly, one or more users at the same location may want to simply participate in the videoconference in a passive way but not be in the presence of others who are actively participating in the same videoconference. This may simply be to monitor the call as it takes place without the others knowing. Or it could be to engage in another activity like preparing food in the kitchen while following the videocall.

As far as devices go, there may be the desire to use a combination of devices that have particular attributes to get the most out of the videocall. For example, it could be about spreading a large videoconference across multiple screens such as having a concurrent “tile-up” view, active speaker and supporting media across three screens.

Or a smartphone could be used for audio-only participation so you can have the comfort of a handheld device while you see the participants and are seen by them on a tablet or regular computer. As well, some users may operate two regular computers like a desktop or large laptop computer along with a secondary laptop or 2-in-1 computer.

Support for other device types by videoconferencing platforms

.. or a smart display like this Google-powered Lenovo smart display

Another key trend is for videoconferencing platforms to support devices that aren’t running desktop-platform or mobile-platform operating systems.

This is exemplified by Zoom providing support for popular smart-display platforms like Amazon Echo Show or Google Smart Display. This is even though some of the voice-assistant platforms that offer smart displays already support videocall functionality on platforms owned by the voice-assistant platform’s developer or by one or more other companies they partner with.

Or Google providing streaming-vision support for a Google Meet videoconference to a large-screen TV via Chromecast. It is something that could reinvigorate videoconferencing on smart-TV / set-top box platforms, something I stand for so that many people, like a whole family or household, can participate in a videoconference from one end. This is once factors like accessory Webcams, 10-foot “lean-back” user interfaces and the like are worked out.

It can also extend to the idea of voice-assistant platforms extending this to co-opting a smart speaker and a device equipped with a screen and camera to facilitate a videoconference.  This could be either with you hearing the videoconference via the smart speaker or the display device’s audio subsystem.

What can be done to make this secure for small accounts?

There can be security and privacy issues with this kind of setup, with people who are away from the premises but operating the same account being able to join a videoconference uninvited. Similarly, a lot of videoconferencing platforms that offer a service especially to consumers may prefer to offer this feature as part of their paid “business-class” service packages.

One way to make this kind of participation secure for a small account would be to use logical-network verification. This is to make sure that all devices are behind the same logical network (subnet) where multiple devices are to participate from the same account in the same videoconference. It may not work well with devices that have their own modem, such as smartphones, tablets or laptops directly connected to mobile broadband, or people plugging USB mobile-broadband modems into their computers. Similarly, it may not work with public-access or guest-access networks that are properly configured to prevent devices discovering each other on the same network.
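
A minimal sketch of that kind of subnet check, using Python’s standard ipaddress module, might look like the following. The addresses and prefix length are made up for illustration; a real platform would run this kind of comparison server-side against the addresses it sees from each client.

```python
# Sketch: verify that two client devices appear to sit on the same IPv4 subnet.
# Addresses and prefix length are invented for illustration only.
from ipaddress import ip_interface

def same_subnet(addr_a: str, addr_b: str) -> bool:
    """True if both interface addresses (e.g. '192.168.1.20/24') share a network."""
    return ip_interface(addr_a).network == ip_interface(addr_b).network

# Example: a laptop and a smart display on the same home network
print(same_subnet("192.168.1.20/24", "192.168.1.57/24"))   # True
# A phone on mobile broadband would report a different network
print(same_subnet("192.168.1.20/24", "10.214.3.9/24"))     # False
```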

Similarly, device-level authentication, which could facilitate password-free login, can also be used to authenticate the actual devices operated by an account. A business rule could place a limit on the number of devices, of any class, operated by the same consumer account that can concurrently join a videoconference at any one time. This could realistically be set at five devices, allowing for the fact that a couple or family may prefer to operate the same account across all the devices owned by the members of that group rather than have each member maintain an individual account.
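
As a rough sketch of how such a rule could be enforced server-side (my own illustration; the five-device cap reflects the suggestion above, not any platform’s documented policy):

```python
# Sketch: cap the number of devices from one account in a single videoconference.
# The limit of five reflects the suggestion above, not a real platform's rule.
from collections import defaultdict

MAX_DEVICES_PER_ACCOUNT = 5

class Conference:
    def __init__(self):
        # Maps an account ID to the set of device IDs currently joined
        self.devices_by_account = defaultdict(set)

    def try_join(self, account_id: str, device_id: str) -> bool:
        """Admit the device unless the account already has the maximum joined."""
        joined = self.devices_by_account[account_id]
        if device_id not in joined and len(joined) >= MAX_DEVICES_PER_ACCOUNT:
            return False
        joined.add(device_id)
        return True

    def leave(self, account_id: str, device_id: str) -> None:
        self.devices_by_account[account_id].discard(device_id)

# Example: the sixth device from the same household account is refused
conf = Conference()
for n in range(6):
    print(f"device-{n}:", conf.try_join("family-account", f"device-{n}"))
```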

Conclusion

The idea of allowing an account’s multiple devices to concurrently participate in the same videoconference is worth considering. This can be about increased mobility or user comfort, or to cater towards the use of newer device types in the context of videoconferencing.

Gizmodo examines the weaponisation of a Twitter hashtag

Article

How The #DanLiedPeopleDied Hashtag Reveals Australia’s ‘Information Disorder’ Problem | Gizmodo

My Comments

I read in Gizmodo how an incendiary hashtag directed against Daniel Andrews, the State Premier of Victoria in Australia, was pushed around the Twittersphere, and I am raising it here as part of keeping HomeNetworking01.info readers aware of disinformation tactics as we increasingly rely on the Social Web for our news.

What is a hashtag

A hashtag is a single keyword preceded by a hash ( # ) symbol that is used to identify posts within the Social Web that feature a concept. It was initially introduced on Twitter as a way of indexing posts created on that platform and making them easy to search by concept. But an increasing number of other social-Web platforms have enabled the use of hashtags for the same purpose. They are typically used to embody a slogan or idea in an easy-to-remember way across the social Web.

Most social-media platforms turn these hashtags into a hyperlink that shows a filtered view of all posts featuring that hashtag. They even use statistical calculations to identify the most popular hashtags on the platform, or the ones whose visibility is increasing, and present this in meaningful ways like ranked lists or keyword clouds.
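
As a simplified illustration of the indexing idea (not any platform’s actual ranking algorithm), a list of the most-used hashtags can be produced by extracting every #-prefixed keyword from a batch of posts and counting how often each one appears. The sample posts here are made up.

```python
# Toy illustration of hashtag indexing and ranking; the sample posts are made up.
import re
from collections import Counter

posts = [
    "Stay safe everyone #COVID19 #lockdown",
    "Great walk along the river today #melbourne",
    "More testing sites open this week #COVID19",
]

hashtag_pattern = re.compile(r"#\w+")

counts = Counter(
    tag.lower()
    for post in posts
    for tag in hashtag_pattern.findall(post)
)

# The most frequent tags are what a platform would surface as "trending"
for tag, n in counts.most_common(5):
    print(tag, n)
```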

How this came about

Earlier in the COVID-19 coronavirus pandemic, a hashtag called #ChinaLiedPeopleDied was working the Social Web. It pushed the idea, with very little truth to it, that the Chinese government didn’t come clean about the genesis of the COVID-19 plague, with its worldwide death toll, and its role in informing the world about it.

That hashtag was used to fuel Sinophobic hatred against the Chinese community and was one of the first symptoms of questionable information floating around the Social Web regarding COVID-19 issues.

Australia passed through the early months of the COVID-19 plague, and one of its border-control measures was to require incoming travellers to stay in particular hotels for a fortnight as a quarantine measure before they could move around Australia. The Australian federal government put this program in the hands of the state governments but offered resources, like the use of the military, to these governments as part of its implementation.

The second wave of the COVID-19 virus was happening within Victoria, and a significant number of the cases were to do with some of the hotels associated with the hotel quarantine program. This caused a very significant death toll and had the state government resort to a raft of very stringent lockdown measures.

A new hashtag called #DanLiedPeopleDied came about because the Premier, Daniel Andrews, as the head of the state’s executive government, wasn’t perceived to have come clean about any and all bungles associated with the management of the hotel quarantine program.

On 14 July 2020, this hashtag first appeared in a Twitter account that initially touched on Egyptian politics and delivered its posts in the Arabic language. But it suddenly switched countries, languages and political topics, which is one of the symptoms of a Social Web account existing just to peddle disinformation and propaganda.

The hashtag had lain low until 12 August, when a run of Twitter posts featuring it were delivered by hyper-partisan Twitter accounts. This effort, also underscored by newly-created or suspicious accounts that existed to bolster the messaging, was to make it register on Twitter’s systems as a “trending” hashtag.

Subsequently a far-right social-media influencer with a following of 116,000 Twitter accounts ran a post to keep the hashtag going. There was a lot of very low-quality traffic featuring that hashtag or its messaging. It also included a lot of low-effort memes being published to drive the hashtag.

The above-mentioned Gizmodo article has graphs to show how the hashtag appeared over time which is worth having a look at.

What were the main drivers

But a lot of the traffic highlighted in the article was driven by the use of new or inauthentic accounts which aren’t necessarily “bots” – machine operated accounts that provide programmatic responses or posts. Rather this is the handiwork of trolls or sockpuppets (multiple online personas that are perceived to be different but say the same thing).

As well, there was a significant amount of “gaming the algorithm” activity going on in order to raise the profile of that hashtag. This is due to most social-media services implementing algorithms to expose trending activity and populate the user’s main view.

Why this is happening

As with other fake-news, disinformation and propaganda campaigns, the #DanLiedPeopleDied hashtag is an effort to sow seeds of fear, uncertainty and doubt while bringing about discord with information that has very little truth to it. As well, the main goal is to cause popular distrust in leadership figures and entities, as well as their advice and efforts.

In this case, the campaign was targeted at us Victorians who were facing social and economic instability associated with the recent stay-at-home orders thanks to COVID-19’s intense reappearance, in order to have us distrust Premier Dan Andrews and the State Government even more. As such, it is an effort to run these kind of campaigns to people who are in a state of vulnerability, when they are less likely to use defences like critical thought to protect themselves against questionable information.

As far as I know, Australia is rated as one of the most sustainable countries in the world by the Fragile States Index, in the same league as the Nordic countries, Switzerland, Canada and New Zealand. This means the country is known to be socially, politically and economically stable. But a targeted information-weaponisation campaign can be used to destabilise even such a country, and we need to be sensitive to such tactics.

One of the key factors behind the problem of information weaponisation is the weakening of traditional media’s role in the dissemination of hard news. This includes younger people preferring to go to online resources, especially the Social Web, portals or news aggregator Websites for their daily news intake. It also includes many established newsrooms receiving reduced funding thanks to reduced advertising, subscription or government income, reducing their ability to pay staff to turn out good-quality news.

When we make use of social media, we need to develop a healthy suspicion regarding what is appearing. Beware of accounts that suddenly appear or develop chameleon behaviours especially when key political events occur around the world. Also be careful about accounts that “spam” their output with a controversial hashtag or adopt a “stuck record” mentality over a topic.

Conclusion

Any time where a jurisdiction is in a state of turmoil is where the Web, especially the Social Web, can be a tool of information warfare. When you use it, you need to be on your guard about what you share or which posts you interact with.

Here, do your research on hashtags that suddenly trend around a social-media platform and play on your emotions, and be especially careful of new or inauthentic accounts that run these hashtags.

Intel to offer integrated graphics fit for newer video games

Article

Intel Xe graphics strategy slide courtesy of Intel Corporation

Intel’s GPU strategy is rooted in Xe, a single architecture that can scale from teraflops to petaflops. At Architecture Day in August 2020, Intel Chief Architect Raja Koduri, Intel fellows and architects provided details on the progress Intel is making. (Credit: Intel Corporation)

Intel’s Xe Graphics Might Mean You No Longer Need a Separate Graphics Card to Play Games | Gizmodo Australia

Intel Xe Graphics: Release Date, Specs, Everything We Know | Tom’s Hardware

From the horse’s mouth

Intel

Intel Delivers Advances Across 6 Pillars of Technology, Powering Our Leadership Product Roadmap (Press Release)

My Comments

When one thinks of Intel’s graphics processor technology, they often think of the integrated graphics processors that use system RAM to “paint” the images you see on the screen. Typically these graphics processors are not considered as capable as the dedicated graphics processors that NVIDIA or AMD offer, which use their own display memory.

Such processors are often associated with everyday business and personal computing needs like Web browsing, office productivity applications or consuming video content. They could be useful for basic photo editing or playing casual or “retro” games that aren’t graphically demanding, but wouldn’t do well with high-demand tasks like advanced photo/video editing or today’s video-game blockbusters.

Integrated graphics technology is typically preferred for use within laptops, tablets and 2-in-1s as an everyday graphics option for tasks like word-processing, Web surfing, basic video playback and the like. This is especially because these computers need to run in a power-efficient and thermal-efficient manner, due to them being designed for portability and to be run on battery power. Let’s not forget that laptops with discrete graphics also implement integrated graphics for use as a power-efficient “lean-burn” option.

This same graphics technology also appeals to low-profile desktop computers including some “all-in-ones” and “next unit of computing” systems due to the chipsets yielding less heat and allowing for the compact design.

But typically most regular computers running desktop operating systems are nowadays specified with at least 8GB of system RAM, if not 16GB. Here, the amount of memory used by the integrated graphics may be inconsequential for some graphics tasks on the computer’s own screen. Let’s not forget that the Full HD (1080p) screen resolution is often recommended for a laptop’s integrated screen due to it being a power-efficient specification.
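
To put rough numbers on that (my own back-of-envelope figures, not Intel’s), the raw framebuffer for a screen is tiny compared with 8GB or 16GB of system RAM, even though games and editing software will claim considerably more shared memory for textures and working buffers:

```python
# Back-of-envelope framebuffer sizes at 4 bytes per pixel (8-bit RGBA).
# These are raw single-frame figures only; real workloads use far more shared RAM.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

print(f"Full HD (1920x1080): {framebuffer_mb(1920, 1080):.1f} MB per frame")
print(f"4K UHD  (3840x2160): {framebuffer_mb(3840, 2160):.1f} MB per frame")
# Even triple-buffered 4K output stays well under 0.1 GB of the system RAM.
```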

Intel has defined its new Xe graphics infrastructure platform that will be part of the Tiger Lake computing platform to be more capable than this. These GPU chips will maintain the same physical die size as prior Intel integrated graphics chips so as to avoid the need to reengineer computer designs when a silicon refresh to Tiger Lake is needed.

The more powerful Intel Xe variants will be offered with more powerful Tiger Lake CPUs. It will be similar to the current-issue Intel Iris Plus integrated graphics processors, and will be pitched for content creators. But I would say that these will simply appear in products similar to the former “multimedia laptops” that have increased multimedia performance.

One of the design goals for the Intel Xe LP (low power) integrated GPUs, especially the higher-performance variants, is to play a graphically-rich AAA action game at Full HD resolution with a good frame rate. Being able to play such a game at Full HD caters towards the preference for Full HD displays within 13”-15” laptops and similar portable computers, due to this display specification being more power-efficient than 4K UHD for that screen-size range.

A question I would raise is whether the frame rate would approach the 60-frames-per-second standard, and how much of a load this places on the computer’s battery. As well, one would also need to know how much of the game’s “eye-candy” is being enabled during play on an Intel Xe LP integrated graphics setup.

Intel Xe-HP graphics chipset presentation slide courtesy of Intel Corporation

Xe-HP is the industry’s first multitiled, highly scalable, high-performance architecture, providing data center-class, rack-level media performance, GPU scalability and AI optimization. It covers a dynamic range of compute from one tile to two and four tiles, functioning like a multicore GPU. At Architecture Day in August 2020, Intel Chief Architect Raja Koduri, Intel fellows and architects provided details on the progress Intel is making. (Credit: Intel Corporation)

Intel also intends to offer a dedicated graphics processor in the form of a chipset codenamed DG1. It will be the first dedicated GPU that Intel has offered since 1998-2000, when they offered a graphics card partnered for use with Pentium III and Celeron CPUs. This family of GPUs will be capable of ray-tracing amongst other high-end gaming activities, and it will be interesting to see how these chipsets stand up to AMD or NVIDIA performance gaming GPUs.

The Intel Xe HP graphics platform will primarily be pitched at data center and server applications. But Intel is intending to offer a “client-computing” variant of this high-performance graphics platform as the Xe HPG. Here, this will be pitched at enthusiasts and gamers who value performance. But I am not sure what form factors this will appear in, be it a mobile dedicated GPU for performance-focused laptops and “all-in-ones” or small external graphics modules, or as a desktop expansion card for that gaming rig or “card-cage” external graphics module.

But Intel would need to offer this GPU not just as a “contract install” unit for computer builders to supply on a line-fit basis, but also through the “build-it-yourself” / computer-aftermarket sectors that serve hobbyist “gaming-rig” builders and the external-graphics-module market. This is the territory that NVIDIA and AMD currently dominate.

The accompanying software will implement adaptive graphics optimisation approaches including “there-and-then” performance tuning in order to cater towards new high-performance software needs. This would be seen as avoiding the need to update graphics driver software to run the latest games.

It could be seen as an attempt by Intel to cover the spread between entry-level graphics performance and mid-tier graphics performance. This could be a chance for Intel to make a mark for themselves when it comes to all-Intel computers pitched for everyday or modest computing expectations.

I also see Intel’s Xe graphics processor products as a way for them to be a significant third force when it comes to higher-performance “client computer” graphics processing technology. This is with NVIDIA and AMD working on newer graphics silicon platforms, and could definitely “liven up” the market there.

But it could lead to one or two of these companies placing a lot of effort on the high-end graphics technology space including offering such technology to the aftermarket. This is while one or two maintain an effort towards supplying entry-level and mid-tier graphics solutions primarily as original-equipment specification or modest aftermarket options.