Desktop security moves from virus-hunting to more tasks according to Symantec

Article

“Antivirus is dead” says maker of Norton AntiVirus | PC World

Antivirus Is Dead — Long Live Antivirus | Krebs On Security

My Comments

What did anti-virus software do?

McAfee LiveSafe desktop security program

A typical desktop-security program in action

Previously, an anti-virus program regularly vetted software against a signature-based list of known viruses or, in some cases, Trojan-horse software. Better programs of this class also implemented “heuristics-based” detection that observed software behaviour for known virus-like characteristics.
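
To give a rough idea of the difference between the two approaches, here is a minimal Python sketch of a signature check versus a heuristic check. The hash value and the “suspicious” byte markers are made up purely for illustration and aren’t taken from any real anti-virus product.

```python
import hashlib
import sys

# Hypothetical signature list: hashes of known-bad files (illustrative only)
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Hypothetical heuristic markers: byte patterns that *suggest* virus-like behaviour
SUSPICIOUS_MARKERS = [b"CreateRemoteThread", b"WriteProcessMemory"]

def signature_scan(path):
    """Flag a file whose whole-file hash matches a known-bad signature."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in KNOWN_BAD_HASHES

def heuristic_scan(path):
    """Flag a file that contains enough 'virus-like' markers, even if its hash is unknown."""
    with open(path, "rb") as f:
        data = f.read()
    hits = sum(1 for marker in SUSPICIOUS_MARKERS if marker in data)
    return hits >= 2

if __name__ == "__main__":
    for target in sys.argv[1:]:
        verdict = "signature hit" if signature_scan(target) else (
            "heuristic hit" if heuristic_scan(target) else "clean")
        print(target, "->", verdict)
```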

The software authors behind the anti-virus programs were playing cat-and-mouse with the malware authors who were trying to get their rotten software on to our computers. For example, malware authors use “crypting” services to hide their software from the detection software, typically through the use of obfuscation.

What have the anti-virus software programs evolved to?

These have evolved to robust “desktop security” software suites that perform many different security functions for the computers they are protecting.

Firstly, they work with your email client software to vet your incoming email for spam and phishing emails. This will typically work with client-based email setups like Outlook, Apple Mail, Windows Live Mail and others rather than Webmail setups like Gmail or Hotmail.

As well, they implement a desktop firewall that verifies traffic coming to and from the Internet and home network so that malware can’t easily “report to sender” to fulfil its task.

They also implement a wider malware-checking mandate, such as catching out rootkits, adware and spyware. Sometimes this is done through a “software reputation” mechanism or by observing particular behaviour traits.

Another function is to implement a “reputation check” for the Websites that you visit. This checks whether a Website is hosting questionable software or implementing other questionable practices. It may also be paired with a desktop content-filtering function which filters against pornography, hatred and other undesirable content.

They also work as a privacy watchdog by monitoring Websites or social-media services for improper activity that threatens your privacy or that of your child or another vulnerable person.

But, wait, there’s more!

Some of these programs offer extra functionality in the form of a password vault which looks after the passwords for the Websites and other resources you visit.

They may offer a client-server VPN so you can use the Web from other networks, like your friends’ and relatives’ homes or public networks, in a secure manner. Similarly, they offer a secure file-storage option, whether on the cloud or on your local machine.

Different levels of functionality available

Most desktop security suites pitched at the home or small-business user tend to be sold with client-focused manageability, where you set their parameters to manage that particular client computer. If you have multiple computers, you have to manually replicate that same setup across those computers. As well, they are priced either “per machine” or in a licence-pack that covers up to five or, in some cases, ten machines. If you are lucky, the software may be provided as a site-licence that covers equipment owned by a particular household.

Conversely, desktop-security software that is targeted at the big business or at some small businesses is set up for management of multiple machines from one logical point. This includes the ability to deploy the same software across multiple machines while preserving the same standards across all of them. They are typically priced in licence-packs that encompass many machines or may also offer a site-licence deal which covers all equipment kept at a particular location or by a particular organisation.

The BASIC computer language turns 50

Article

BASIC, The 50-Year-Old Computer Programming Language For Regular People | Gizmodo

How Steve Wozniak wrote BASIC for the original Apple from scratch | Gizmodo

My Comments

Those of us who had a chance to tinker with personal computers through the 1980s or were taught computer studies during that time dabbled in a computer programming language called “BASIC”. This language was provided in an “interpreter” form with nearly all of the personal computers sold from the late 1970s, and it is now celebrating its 50th anniversary.

It was developed in the early 1960s by two Dartmouth professors who wanted a simplified language for programming a computer, because mainframe-type computers of that era were difficult to program. The language was built around words common to the English language along with the standard way mathematical formulae were represented. It was initially implemented as a compiler for the mainframes, which turned the source code into object code or an executable image in one pass, but was eventually written as an interpreter which executed each line of source code one at a time.
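
For the technically curious, here is a rough idea of what “executing each line of source code one at a time” means in practice. This is a toy sketch in Python of a BASIC-like interpreter handling just LET and PRINT statements, not a representation of the real Dartmouth BASIC.

```python
# Toy line-at-a-time interpreter for a tiny BASIC-like language (illustrative only)
program = [
    'LET A = 2',
    'LET B = A * 21',
    'PRINT "THE ANSWER IS"',
    'PRINT B',
]

variables = {}

for line in program:                      # execute one source line at a time
    keyword, _, rest = line.partition(" ")
    if keyword == "LET":                  # LET <name> = <expression>
        name, _, expression = rest.partition("=")
        variables[name.strip()] = eval(expression, {}, variables)
    elif keyword == "PRINT":              # PRINT <string literal> or <expression>
        text = rest.strip()
        if text.startswith('"') and text.endswith('"'):
            print(text[1:-1])
        else:
            print(eval(text, {}, variables))
```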

Bill Gates and Paul Allen worked on a successful BASIC interpreter for the Altair microcomputer in 1975 and used this as the founding stone for Microsoft, with the interpreter initially being implemented in a variety of microcomputers and some manufacturers implementing slight variations of it in personal computers like the Tandy TRS-80. Similarly, Steve “The Woz” Wozniak wrote the BASIC interpreter for the original Apple computer from scratch in 1976, a path followed by other computer manufacturers like Commodore, Acorn (BBC Micro), Sinclair (ZX80, ZX81, ZX Spectrum) and Amstrad.

This language was not just taught in the classrooms; people also taught themselves how to program these computers using the manuals supplied with them and the many articles printed in various computing and electronics magazines. There were even books and magazines published through the 1980s replete with “type-in” BASIC source code for various programs, which people could transcribe into their computers to run those programs.

BASIC – the cornerstone of the hobby computing movement of the 1980s – turns 50

How this relates to the network-connected lifestyle is that the BASIC language gave us a taste of home computing and computer programming as a hobby. Even as Microsoft evolved the language towards QuickBASIC and Visual Basic for the DOS / Windows platform, it exposed us to the idea of an easy-to-understand programming language that got most of us interested in this craft.

Underriver to benefit from Gigabit broadband Internet courtesy of Gigaclear

Article

Gigaclear starts installation of its network for Underriver in Kent | ThinkBroadband

From the horse’s mouth

Gigaclear

Press Release

Product Page – Underriver

My Comments

Underriver, a small affluent village just southwest of Sevenoaks in Kent, has now started working towards a fibre-based Gigabit next-generation broadband service courtesy of Gigaclear. Like other Gigaclear projects and similar efforts, it is about bringing a value-priced real broadband service to the small towns and villages around the UK.

There is a goal to have the service pass 2000 homes and businesses and to provide a symmetrical Gigabit broadband service, which would please a lot of small businesses, professionals working from home or intending to do so, and people in long-distance relationships. This is because the upload speed is as quick as the download speed, which satisfies cloud computing, online storage, Web content creation and VoIP amongst other needs.

Of course, one of these “fibre-to-the-door” deployments is considered a value-added feature for a premises that is being sold or rented out at a later time. This was something I touched on when RightMove added this factor to the criteria their customers could search on when seeking out property to buy.

Who knows what other villages and small towns in the “Garden Of England” could duplicate what is happening up in Underriver?

Purchasing and Specification Journal–A new playout computer for our church

New desktop computer at church

As I had mentioned in a previous article, I had moved to a new church congregation and, a few services later, my new pastor had approached me for advice about specifying a new computer for the church. This was because the then-existing computer that was being used to show the song lyrics during worship and to sometimes show video material during a service or similar church event was nearly on its way out.

A risk I often identify with non-profit organisations of any size is that they could end up buying capital equipment that is undersized for their needs or is very likely to fail too frequently. They are also likely to fall for purchasing mistakes where they buy from a vendor who offers the goods cheaply but doesn’t offer good-quality after-sales service and support. In a lot of cases, these organisations are likely to source goods from a “friend of a friend” or “my friend’s boss”, where they are not likely to get the best deal, and this can take a toll on friendships and relationships.

Identifying the application

I identified that this computer is to be used for AV playout during services and other church activities. One activity that this church also engages in very regularly is a concert outreach, with band members playing the appropriate Christian songs as part of the concert. In these concerts, the computer would earn its keep by playing out video material or backing tracks for the performances.

These requirements placed an emphasis on multimedia work, thus requiring a computer that could handle this kind of work very smoothly. As well, we were moving towards a newer media-playout practice, which is to handle file-based media provided on a “transfer now, play later” basis. This means that the pastor or one of the church elders can receive the media via an Internet path, or create the media themselves at home, and transfer it to a USB stick to take to church. Then they copy it to the computer’s hard disk for playback and work from that file when the time comes to play the material.

The existing system was an orthodox “tower-style” white-box desktop computer that was running Windows XP but was underperforming for today’s requirements due to its small RAM and hard-disk space. It was connected to a local screen at the sound desk for cue/monitor purposes as well as a “front-of-house” video projector for the congregation to see the material.

For that matter, a “white-box” computer is a computer, typically a desktop, that is built by a value-added reseller or independent computer store using components that the reseller purchases. This can be a custom-built system or a package that is available “off-the-rack” for a known price, like this computer. It was in fact the way most small businesses and home users bought their desktop computers from the 1990s onwards.

What can benefit this application

For this application, I identified certain key features that are important. These are increased processor capability and speed along with a dedicated graphics subsystem, so as to allow the system to work with the local monitor and the projector in a highly responsive way.

As well, I placed importance on the computer having as much RAM and hard-disk capacity as the church could afford, with the minimum being 4GB of RAM and 750GB to 1TB of hard-disk capacity. One of the computer dealers also recommended a solid-state drive in their quote, which can give the computer some extra speed, especially when loading software such as during startup.

I made sure that the computer came with a legitimately-licensed copy of Windows 7 so that most of those in the AV ministry wouldn’t need to learn new skills, as they would if Windows 8.1 were in place. This was assuming that most of these people were operating computers running Windows 7 on their home network or at work.

Obtain competitive quotes

Before any money changed hands, I made sure that the church obtained quotes from a few different vendors. This had the advantage of establishing how much a computer system of this standard would cost, and it also allowed the pastor to use these quotes as a bargaining tool to get the best value for money.

I made sure that the vendors we had on our shortlist had a local “bricks-and-mortar” storefront because of the issue of service and support. Here, we would be able to talk with the vendor rather than an offshore call centre if the machine did break down. It also allowed one of the church elders to put the computer in their car and take it to the store if it needed repairs.

The kinds of vendors we went for were national computer-store chains or independent computer stores who were able to build a system to the specifications or have one already built. For that matter, smaller independent or local computer vendors are likely to supply a “shop-built” white-box system for better value with local support.

The new system in place

We purchased a small “white-box” system to the specification, installed the necessary software onto it, such as EasiSlides, and set it up for use in the church. As I was worshipping God through the first Sunday morning service after the computer was installed, I noticed that there was very little “lag” with the song-lyrics display.

There were still a few issues with the operators getting used to Windows 7 on the new computer after being used to handling Windows XP on the previous one, which I found out after that service. It is something I often notice when people are confronted with new equipment.

Conclusion

As I had mentioned in my previous article about purchasing technology for a small business or community organisation, it is important to spend some time “doing your homework” when purchasing the technology. This is to make sure you are buying the equipment that represents the best value for money and can serve you in the long run.

In this case, it involved defining a set of baseline specifications that you won’t go below, along with a price range that suits your budget, then seeking quotes from a few different vendors on systems that meet those baseline specifications to get the best price within your range before buying the actual equipment. As well, placing importance on vendors with a local physical shopfront allows you to obtain prompt service and support if the equipment malfunctions.

A convertible or a detachable–It’s Acer’s Switch 10

Articles

Acer announces new devices including new 2-in-1 laptop and 23-inch All-in-Ones | Windows Experience Blog

From the horse’s mouth

Acer

Press Release

Previous Coverage

Convertible Or Detachable – Where To Go?

My Comments

A detachable of the ilk of HP’s x2 Series or ASUS Transformer Prime series is either a conventional laptop when clipped to its keyboard base or a tablet that lies flat on the table or is cradled in your hands.

But Acer has changed this view with the Switch 10 detachable tablet. This is one which can be positioned in a manner not dissimilar to most convertibles like the Lenovo Yoga series or the Sony VAIO Fit 13a where you can arrange the screen to be positioned at an angle for convenient touchscreen operation or viewing of pictures and video.

This is implemented with Acer’s Snap Hinge, a special hinge that clips the keyboard base and tablet together as normal or simply allows the tablet to be attached with the screen facing out. This means that the tablet can be in a “tent” mode or an angled display mode as well as the usual laptop or tablet modes. As well, this 10” detachable runs on an Intel Bay Trail chipset with 2GB RAM and 64GB of SSD storage and uses Windows 8.1 as its operating system.

But what I see of this is that it could become a way to present a computer that offers the advantages of a detachable tablet, in the form of lightweight operation, and of a convertible laptop which can be swivelled around for viewing or creating content. It is another way of making sure that the portable computer idea doesn’t forget that the keyboard has relevance for creating content.

Prototyping electronics the inkjet way–to come soon

Article

Home inkjet printer fabricates circuit boards on photo paper | ComputerWorld

Video

My Comments

There have been a few methods for building electronic-circuit prototypes which involved a “breadboard” of some form, with the components anchored down by mere friction, by wires inserted in a spring, or screwed down with a screw and washer. This was either used to teach electrical and electronics concepts or to “rough out” an electronic-circuit idea and have it working properly before spending time on building a printed circuit board for permanent deployment.

Now Microsoft have come up with a method of making printed-circuit boards using the common inkjet printing method that most printers (including a lot of the ones reviewed here) implement. Here, users could design a circuit using a regular computer and print it out on photo paper. But the inkjet printer would have to be equipped with a cartridge that holds a special conductive ink while it thinks it is printing something in black and white.

You could then think that you have to solder down the various regular components like diodes, resistors, capacitors and transistors, punching their wires through the paper. But this could prove to be difficult with heat-based soldering and wires attached to the components. Instead, Microsoft and 3M are implementing “stick-on” components that are conductive through the tape as a way to build the circuit; these work in a similar way to a SIM card, where there are conductive areas on the card.

There would be a requirement to use card-grade paper for improved mechanical reliability, as well as the ability to connect regular wire to these circuits, whether to hook up regular batteries, switches and the like or to connect two circuits together. Another issue would be to provide the conductive-ink cartridges for most of the currently-issued inkjet printers so that one can get going with using these printers for turning out the printed circuit boards.

But what I see of this is that equipment used by most computer users could come into its own for learning electronics or building electronics circuits, especially “short-order” circuits. At the moment, the idea hasn’t been commercialised, but the kind of people who could make it sell would include the educational sector or electronics shops of the Maplin, Jaycar or Radio Shack kind.

Fibre broadband passing your business? What it means for you

Article

Guest blog: Fibre broadband: what does it mean for your business? | Go E-Sussex

My Comments

A situation may occur for your small business where fibre-optic next-generation broadband passes your office (including your home office). This is due to efforts underway to make this technology available in most places through progressive rollouts by different companies.

In some cases, the next-generation broadband rollouts are public-private efforts with national, state or local governments putting money towards the efforts as a way of investing in their constituencies. It is very similar to improving infrastructure like roads, rails or utilities in a neighbourhood to make it worth investing or doing business in that area.

How could this benefit my business?

Use of remote storage and cloud services

One obvious application is the increased use of off-premises computing services. Typically these are in the form of remote storage and backup services like Dropbox or Box.com, often marketed as “cloud storage” services. Some of these services essentially function as an off-premises data-backup tool, or they can simply work as an invitation-only file-exchange service, whether between you and business partners or simply as a way to shift files between your regular computer and your mobile devices while “on the road”.

For organisations with a Web presence, this will encompass uploading Web content as you maintain your Web page or even backing up that Web page.

Even if a business implements on-premises storage technology such as a NAS or server, there is also an increased desire for remote “on-the-road” access to these resources. Similarly, it could be feasible for a business with two or more locations to have the ability to shift data between these locations such as storing data that is worked on at home using a NAS or server at the shop or caching data between two shops.

Another key direction is to head towards cloud-computing where the software that performs business tasks is hosted remotely. This will typically have you work either with a Web page as the software’s user interface or you may be dealing with a lightweight “peripheral bridge” for industry-specific peripherals. In some cases, various programs such as some business-grade security programs implement cloud computing in order to offload some of the processing that would normally be required of the local computing system. This is being pitched as a way for small business to “think like big business” due to the low capital-equipment cost.

The next-generation broadband services can improve this kind of computing by reducing the time it takes to transfer the files and allows for silky-smooth cloud-computing operations.
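
As a rough back-of-the-envelope illustration of the time savings, the little Python sketch below works out idealised transfer times for some made-up file sizes at different upload speeds, ignoring protocol overhead and contention.

```python
def transfer_time_seconds(size_gigabytes, speed_megabits_per_second):
    """Idealised transfer time: ignores protocol overhead and link contention."""
    size_megabits = size_gigabytes * 1000 * 8     # GB -> megabits (decimal units)
    return size_megabits / speed_megabits_per_second

# Hypothetical workloads and link speeds, purely for illustration
for size_gb, label in [(0.5, "photo gallery upload"), (5, "nightly data backup")]:
    for speed_mbps in (1, 20, 100):               # ADSL-class, FTTC-class, FTTP-class uploads
        minutes = transfer_time_seconds(size_gb, speed_mbps) / 60
        print(f"{label}: {size_gb} GB at {speed_mbps} Mbps ~ {minutes:.1f} minutes")
```

On those made-up figures, a 5GB backup that ties up an ADSL-class 1Mbps upload for around eleven hours drops to roughly half an hour at 20Mbps and under seven minutes at 100Mbps.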

IP Telecommunications

A very significant direction for business Internet use is IP-based telecommunications. This gives businesses some real capabilities through savings on operational expenditure while also opening up some newer pathways that had previously been out of small business’s reach.

One application is IP-based videocalls using Skype and Lync technology – totally real, not science fiction anymore. These technologies can provide video-based real-time teleconferencing, even to high-quality visual displays, and some of them even allow for multi-party videocalls. The next-generation broadband services can exploit this technology by permitting smooth, reliable videocalls with high-resolution video.

Another IP-telecommunications application appealing to small business is the concept of the IP-based business telephone system. This can be facilitated with an on-premises IP-based PBX server that is linked to the outside world via an IP-based “trunk”, or a hosted IP-telephony system which is run simply as a service. The phones that sit on the desks are primarily IP-based extensions or legacy phones connected via analogue-telephone adaptor devices, with DECT cordless phones linked up to an IP DECT base. In some cases, a regular computer or a mobile device (smartphone or tablet) could run a “softphone” application which uses the device’s control surface and audio infrastructure to make it become an extension.

These appeal to businesses due to low telephony costs, especially for long-distance calls, or, for that matter, free calls between multiple business locations through what is effectively a tie-line that the business doesn’t need to rent.

The next-generation broadband can allow IP-based telecommunications to take place while data is being transferred or the Internet is being used without impeding data-transfer speed or voice / video call quality.

IP-based video surveillance

Shopkeepers and other small-business owners would find greater justification to install an IP-based video-surveillance system or upgrade an existing video-surveillance system to IP-based technology. This could allow, for example, one location to be watched over from another, or permit backup recording of the video footage at another location, whether that be a storage provider you rent space from or a NAS installed at home or at another business location.

This also allows for use of newer cameras that implement higher-resolution sensors and support on-camera video analytics. The increased bandwidth means that more of the video footage from these cameras can be streamed at once to a remote location.
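
For those wondering what the streaming side looks like in practice, most IP cameras expose their footage as an RTSP stream that commodity software can read. Below is a minimal sketch using Python and OpenCV; the camera address, credentials and stream path are placeholders and vary between camera makes.

```python
import cv2  # OpenCV, commonly used to read RTSP streams from IP cameras

# Placeholder address: the real URL, credentials and path depend on the camera vendor
CAMERA_URL = "rtsp://user:password@192.168.1.50:554/stream1"

capture = cv2.VideoCapture(CAMERA_URL)
if not capture.isOpened():
    raise RuntimeError("Could not connect to the camera stream")

frame_count = 0
while frame_count < 300:           # grab roughly 10 seconds of a 30fps feed
    ok, frame = capture.read()     # one decoded video frame as a NumPy array
    if not ok:
        break
    frame_count += 1
    # A recorder would write frames out here, e.g. to a NAS at another site
capture.release()
print(f"Received {frame_count} frames")
```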

Working from home

If you work from home, whether to telecommute or to run your business from home, you will find that the next-generation broadband service is important for you. This is more so if you are dealing with graphics, CAD or multimedia content, or even using a VoIP or videocall service as your communications technology.

As well, you benefit from reduced Internet-service contention with other household members when you have the next-generation broadband service. This is because the increased bandwidth could allow you to do intense cloud-based work computing or a large file transfer while they do something like engage in a VoIP voice call or stream video content.

Offering your customers or guests public Wi-Fi Internet service

If you run a bar, café, hotel or similar venue, you could offer your customers or guests a public Internet service and not worry that this service will cramp your business Internet style. In some cases, you could handle more customers’ data-transfer needs at once, which would happen at “peak occupancy”, or allow them to engage in high-bandwidth applications like business data transfer, videocalls or enjoying online video content and have the best experience with these activities.

To the same extent, the public Wi-Fi Internet service is being seen by mobile telephony carriers as an “offload” service to increase their mobile network capacity. As well, some mobile carriers are even implementing femtocells which are mobile base stations that cover a small area like a home or business premises as a method of improving indoor mobile coverage in a particular premises or increasing mobile capacity in a popular location.

Is your small business’s network ready?

There are some things you would have to do to get your small business’s network ready for the next-generation-broadband service.

Your router

One would be to make sure you have a small-business router that is optimised for next-generation very-high-speed broadband. One critical feature it would need is an Internet (WAN) connection with Gigabit Ethernet. This would allow for use with an optical-network modem that delivers the high-speed throughput these networks provide. As well, having Gigabit Ethernet for each LAN Ethernet connector and 802.11n/ac dual-band Wi-Fi wireless where applicable would be considered important for the local network side of the equation.

Most of the current-issue high-end or “small-business-grade” routers would cut the mustard when it comes to having this kind of connectivity and this goal could be achieved with the current network-equipment replacement cycle.

Your network

As well, you would need to bring your Ethernet infrastructure to Gigabit standard while also evolving your Wi-Fi wireless infrastructure to the 802.11n or 802.11ac standard with simultaneous dual-band operation, i.e. N600 or better. A HomePlug segment in this network could preferably be brought to the HomePlug AV2 standard, due to its higher throughput and improved robustness compared with HomePlug AV500 or HomePlug AV.

Wireless hotspots

Those of you who run a wireless hotspot or similar public-internet service which is managed by a special router dedicated to this task may have to evolve it to a component-based system so you can implement high-throughput networking technology.

Component-based hotspot systems use a wired hotspot router with Ethernet connections as its only network connections. Then you connect a VDSL2 modem, optical-network terminal or similar appropriate device to your hotspot router’s WAN / Internet port, and a dedicated 802.11n access point – at least N300 for a 2.4GHz single-band unit or N600 for a simultaneous dual-band unit – to its LAN side. This is also a good time to make sure you have optimum public Wi-Fi coverage across your business premises.

You may also have to make sure that the system has improved quality-of-service support for multimedia-based tasks, especially if your business happens to be a hotel or similar type of venue. This is because a lot of people are increasingly using smartphones, tablets and ultraportable laptops to engage in Skype videocalls or stream video content from catch-up TV or video-on-demand services, and poor quality-of-service severely ruins the user experience with these services.

Considering the full-fibre option

Another issue that can be worth considering for small-business fibre broadband is the “full-fibre” option in a fibre-copper setup. This is being offered by some UK next-generation broadband services and is also being offered in Australia as an option under the Coalition’s preferred fibre-copper National Broadband Network setups.

Here, the same service provider who would normally provide a fibre-copper service like fibre-to-the-cabinet / fibre-to-the-node would also provide a fibre-to-the-premises service as an extra-cost option. A small business could opt for these services especially if they are using cloud services a lot or uploading data to online storage frequently.

Even a business using a fibre-copper setup could look at the feasibility of the full-fibre option as a long-term goal as they make more use of the Internet. As well, this would be a valuable option for premises that are underserved by VDSL-based fibre-copper services. 

It is also worth noting that when you have a “full-fibre” install to a premises, the sale or lease value can increase because of the availability of really high-speed broadband service at that location.

Conclusion

When fibre-based next-generation broadband passes your business, it becomes a valuable option to sign up to one of these services due to the costs saved and the higher throughput available with them. It also allows the business to “grow up” and adapt to increased data-throughput needs.

CableLabs have given their blessing for DLNA CVP-2 standards for premium-content delivery in the home

Article – From the horse’s mouth

CableLabs

DLNA CVP-2: Premium Content to Any Device in Any Room

My Comments

Sony PS3 games console

Consoles like these could be able to pick up pay TV from a DLNA CVP-2 gateway device

CableLabs have cemented their approval of the current iteration of DLNA Commercial Video Profile 2 to provide for improved in-home pay-TV setups using the home network. This effectively works towards an FCC goal that requires device-independence for cable-TV setups in the home, rather than users being required to lease a set-top box for each TV in the home or install a “TV Everywhere” app provided by the cable company on each mobile device if they want cable TV on the extra screens.

What is DLNA CVP-2?

This is a super-standard defined by DLNA which uses a group of standards to assure pay-TV networks that their content is being delivered securely and surely to the display device via the home network. Here, the display device can be a Smart TV or video peripheral with “Connected TV” capabilities or software in a regular desktop / laptop computer or mobile device (tablet / smartphone) to show the TV content on the screen.

Sony BDP-S390 Blu-Ray Disc Player

.. as could these Blu-Ray players

It will typically require a so-called “gateway device” connected to the cable system, satellite dish and/or Internet service, such as a broadcast-LAN tuner, router with broadcast-LAN capabilities or a PVR in the customer’s home while display devices and software would have to authenticate over the home network with the standards that are part of the package. The PVR solution may typically be connected to the main TV set in the lounge or family room where most TV viewing is done while TVs installed in other rooms like the bedroom can use the home network to “pull down” live or recorded TV content using “smart-TV” abilities integrated in the set or a games console / Blu-ray player.

DLNA media directory provided by server PC

.. as could these Smart TVs

There is the use of the DTCP-IP secure-content-delivery specifications for IP-based home networks to authenticate access to content according to cable-TV / content-studio / sports-league requirements. As well, setups that implement DLNA CVP-2 implement RVU, which provides the same kind of user interface expected when you use pay-TV services; this could facilitate things like access to video-on-demand and pay-per-view content, access to the service provider’s TV-hosted storefront and magazine, or the ability to schedule PVR recordings.

Another benefit provided by DLNA CVP-2 is support for endpoints that implement a very-low-power standby mode, allowing them to use wakeup and network-reservation mechanisms so that the power-efficient modes can operate while still providing proper usability and serviceability. This avoids service issues that are likely to happen if a device goes into an ultra-low-power quiescent mode when not needed and finds that it has to create a brand new connection to the network and its peers when it is needed.

Do I see this as a change for delivery of the multichannel pay-TV service?

One reality is that DLNA CVP-2, like other technologies affecting TV, won’t change the calibre of the content offered on pay-TV services. You will still end up with the same standard of content i.e. a lot of channels with nothing worth viewing.

But it will affect how a pay-TV company delivers services pitched towards a multiple-TV household. They could offer a “multiple-TV” service either as part of the standard service, as part of an upsellable premium service or as an optional item. This would allow a person to have the pay-TV service appear on all suitably-equipped screens instead of paying for each TV to be equipped with a set-top box.

Similarly, the main device could change from an ordinary set-top box with PVR abilities to either one with the “gateway” abilities integrated into it or a “headless” gateway device with broadcast-LAN and PVR abilities. In this case, the main TV would either be a suitably-equipped Smart TV or be connected to a video peripheral that has this kind of “connected TV” functionality built in. It could also shift the focus of the value of the customer’s bill towards the content services rather than the customer-premises equipment.

For consumers, it could be a path for those of us who move between pay-TV or triple-play services whether due to moving location or moving to a better offer. This is because there isn’t the need to mess around with set-top boxes or create infrastructure for a pay-TV service that implements different methodologies.

Windows 8–How about apps that exploit both the Desktop and Modern UI?

After upgrading to Windows 8 on my main computer and utilising Windows 8 on review-sample laptop computers, I had a good chance to use the classic Desktop user interface along with the newer Modern user interface for a lot of computing needs.

Windows 8 Modern UI start screen

Windows 8 Modern UI has some benefits for some tasks

What I found was that each of the “views” appealed to different tasks and working conditions. For example, I could use the Desktop view for applications that required detailed work and were more mouse / keyboard focused. That said, I did use the touchscreen with this interface for coarse navigation tasks like selecting functions on a toolbar or hyperlinks on a Webpage.

The Modern view, previously known as the Metro view, came in handy when I wanted a simpler user experience for a task like viewing a PDF or photograph. Even using Skype or Facebook with the Modern view gave that “dashboard” look which has everything at a glance. This worked well with the mouse on my main computer and with touchscreen setups on suitably-equipped laptops, but was a bit of a pain when using just the trackpad on laptops that didn’t come with a touchscreen.

Windows 8.1 Update 1 has integrated Modern UI apps and Desktop apps into the Desktop user interface by allowing users to pin the Modern UI apps to the Desktop UI’s taskbar. This is augmented with the Modern UI apps also having a control strip that can be brought up to minimise or close these apps.

The current problem

Application with Desktop user interface

Skype with uncluttered Modern user interface

The current problem with the way applications are written for Windows 8 is that two different programs need to be delivered through different channels if you want to perform the same function on both interfaces. Firstly, I would have to install one application through the traditional paths for a regular computer, i.e. install it from a CD or other removable medium, or download it from the developer’s site and install that download file. Then, if I want to have the “full” Modern user experience, I would have to visit the Windows Store to download a separate app that exploits that interface.

How could we improve on this?

One direction that Microsoft could take is to allow developers to deliver the Desktop and Modern UI programs as part of a single Windows 8.1 application install package. Here, the user installs this one package as one action and finds both a Desktop-view application and a Modern-view application for the same task on their machine.

This could come in the form of separate apps for each of the user experiences or a monolithic app that presents one way for the simplified Modern user interface and another way for the detailed Desktop user interface. This could also cater for a “live tile” option to show always-updating data. The user then has the choice of seeing a simplified user interface that works well with the touchscreen or mouse-based operation, or a detailed user interface.

There also has to be the ability to assure data continuity between both the Desktop view and the Modern view, which is important for a lot of tasks. Some tasks like VoIP or working on a document can play a difficult hand if you switch between views, whereas other “read-only” tasks which relate to a common data source can cope properly with a user-interface switch.
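
To make the idea of “one data source, two presentations” concrete, here is a small illustrative Python sketch, with class and function names of my own invention rather than anything from Microsoft’s toolkits. A single call-history object is rendered by both a simplified “dashboard” view and a detailed view, so switching views never disturbs the underlying data.

```python
from dataclasses import dataclass, field

@dataclass
class CallHistory:
    """Single source of truth shared by both user interfaces (illustrative names only)."""
    calls: list = field(default_factory=list)

    def add_call(self, contact, minutes):
        self.calls.append((contact, minutes))

def render_dashboard(history):
    """Simplified 'Modern-style' summary: everything at a glance."""
    total = sum(minutes for _, minutes in history.calls)
    return f"{len(history.calls)} calls, {total} minutes total"

def render_detailed(history):
    """Detailed 'Desktop-style' listing for fine-grained work."""
    return "\n".join(f"{contact}: {minutes} min" for contact, minutes in history.calls)

history = CallHistory()
history.add_call("Alice", 12)
history.add_call("Bob", 3)
print(render_dashboard(history))   # a view switch changes the presentation only,
print(render_detailed(history))    # never the shared data underneath
```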

The only problem with this ideal is giving the user the ability to determine the view they want to run, because it is possible for a Desktop-view app to be launched from the (Modern-view) Start Screen. Similarly, from Windows 8.1 Update 1, it is possible to put a Modern-view Windows Store app on the Taskbar and launch it from there.

Conclusion

If Microsoft could provide a single-install single-update experience for those of us who run Windows 8 and newer operating systems, this could encourage software developers to work the Modern UI as a clean “dashboard” user experience while the regular Desktop view serves as a “detailed” user experience for those of us who want more control.

50 years ago was the first public demonstration of the videophone concept

Article

50 years ago today, the public got its first taste of video calls | Engadget

My Comments

When we use Skype, Apple FaceTime, 3G mobile telephony or similar services for a video conversation where we see the other caller, we are using a concept that was first brought to fruition 50 years ago courtesy of Bell Telephone.

Here, a public “proof-of-concept” setup was established between the site of the World’s Fair in Flushing Meadows in New York City and Disneyland in California. People who wanted to try this concept sat in special phone booths where they talked into a box with a small TV screen and camera as well as a speaker and microphone. They were able to see their correspondent as a 30-frames-per-second black-and-white TV image on this device, and many people had a chance to give it a go for the duration of that World’s Fair.

Bell had a stab at marketing the “Picturephone” concept in different forms, but the cost to purchase and use it was prohibitive for most people and it got to the point where it had, at best, limited corporate / government videoconferencing appeal. As well, a lot of science-fiction movies and TV shows written in the 1960s and 1970s, most notably “2001: A Space Odyssey”, sustained the “Picturephone” and video telephony as something to look forward to in the future, along with space travel for everyone. For me, that scene in “2001: A Space Odyssey” with Dr. Heywood Floyd talking to his daughter on the public videophone at the space station stood out in my mind as what it was all about.

But even as the IP technology that underpins the Internet has made it cheap to use Skype and FaceTime, there are some of us who still find it difficult to make eye-contact with the correspondent due to having to know where the camera is on each side of the call.

In essence, the Bell public demonstration certainly proved the concept, taking it from fiction to reality by allowing people to try it as part of a “world expo”.