Category: Milestones to the Connected Lifestyle

Google celebrates Hedy Lamarr, whose invention is behind how Bluetooth works

Articles

Hedy Lamarr: Five things you didn’t know about the actress and inventor in today’s Google doodle | The Independent

From the horse’s mouth

Google

Celebrating Hedy Lamarr (Blog post)

Video

My Comments

Google is celebrating Hedy Lamarr, the Austrian film actress who moved to France and then America, where she established herself as one of Hollywood’s most beautiful film stars. They have done this with one of their animated “Google Doodles”, which traces her pathway from film star to the home network.

But it wasn’t just being the most beautiful woman in Europe or starring in these films that made her significant. While Hedy was married to her first husband, an Austrian arms dealer, she took an interest in the applied science that made military technology work.

After fleeing Austria, she wanted to contribute to the Allied war effort during World War II in a more useful way than selling war bonds. She realised that Hitler’s U-Boats were sinking passenger liners and wanted to do something about it.

Hedy Lamarr worked with George Antheil, an avant-garde composer who was her neighbour in Hollywood, to invent a frequency-hopping system for radio communications. This was a tool to allow communications to take place without the risk of radio jamming, which the Germans were good at.

The proof of concept was based around a player piano, a.k.a. a pianola, and the perforated rolls that hold the music and tell the instrument what to play. Here, a piano roll switched the radio equipment unpredictably between 88 different frequencies, with the setup dwelling on each frequency for only a short time. The hopping sequence was known only to the controlling ship and the torpedo, and in those days it would have required a lot of power to jam such a large swathe of frequencies.
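The principle the piano roll embodied can be sketched in a few lines of Python: both ends derive the same pseudo-random hop sequence from a shared secret, so they stay in step while a jammer who lacks the secret cannot predict the next frequency. This is a minimal illustration only; the channel count of 88 comes from the Lamarr–Antheil design, but the seed strings and function names here are my own invention.

```python
import random

NUM_CHANNELS = 88  # the Lamarr/Antheil design used 88 frequencies,
                   # matching the number of keys on a piano

def hop_sequence(shared_seed, hops):
    """Derive a pseudo-random channel sequence from a secret shared
    between transmitter and receiver (the role the piano roll played)."""
    rng = random.Random(shared_seed)
    return [rng.randrange(NUM_CHANNELS) for _ in range(hops)]

# Both ends hold the same secret, so both derive the same hops...
ship = hop_sequence("secret-roll", 10)
torpedo = hop_sequence("secret-roll", 10)
assert ship == torpedo

# ...while a jammer guessing at the sequence lands on different channels.
jammer = hop_sequence("wrong-guess", 10)
print(ship)
print(jammer)
```

Because only a narrow channel is occupied at any instant, a jammer must either know the sequence or blanket all 88 channels at once, which is exactly the power problem the original scheme exploited.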

The US Navy did not implement this until the early 1960s, when it earnt its keep as part of the blockade of Cuba. But this technology, and the spread-spectrum ability it allowed for, ended up as an integral part of today’s digital-radio communications technology. Examples include Bluetooth; CDMA, used in some cordless-phone and mobile-phone applications; and COFDM, which is used in the Wi-Fi wireless networking that nearly every home network runs on, as well as in DAB, digital satellite radio and DVB-T digital TV.

Here is a film star who rose through Hollywood’s Golden Era but also contributed to features essential to mobile computing and home networking.

Send to Kindle

Windows 95–20 years old

Previous coverage

Special Report – Windows 95 now 15 years old and a major change to the PC computing platform

Video

“Start Me Up” video ad

To sum up

Windows 95 launch campaign billboard poster

Start Me Up campaign billboard for Windows 95

Windows 95 was an operating system that led to a revolution in PC-based computing and was treated as such when it was launched. It revised how one thought of the Microsoft DOS / Windows computing platform, towards something on a par with other computing platforms.

One of the key features highlighted was the integration of the Windows GUI front-end and MS-DOS into one package. Firstly, you didn’t have to buy Windows as a separate program, nor did you have to type WIN at the DOS prompt to pull up the graphical user interface. Nor was there any need to run various menu utilities to provide a user-friendly operating interface where programs were easy to find and run.

Instead, you used the Start Menu to find programs and Windows Explorer to find your way around the computer’s file system. There was even the ability to give files and folders a meaningful long name rather than a very short name that wasn’t meaningful at first glance.

Another key feature was doing away with the need to run extra software to add functionality to a computer. Previously, if you wanted to use a CD-ROM drive, network abilities or any other added functionality, you had to load certain memory-resident programs, which became very awkward for most people.

This led to an operating system that was “ready for the Internet and the network” out of the box, thus opening up the possibility for small organisations and households to set up easy-to-administer networks for sharing files and printers, and gaining access to the Internet.

Happy 20th Birthday, Windows 95! Start Me Up!

Send to Kindle

The Commodore Amiga turns 30, marking a turning point for desktop computing

Article

Iconic computer and game system Amiga turns 30 | The Age

Video

Amiga Demos of the late 80s

My Comments

In 1985, Commodore released the Amiga series of computers, which brought the concept of advanced graphics, video and music to the desktop computer.

These computers had hardware that was advanced for its day, such as the Motorola 68000-series processors and dedicated graphics and sound chipsets. Initially there was the Amiga 1000, but the popular machines that represented the Amiga platform at its peak were the Amiga 500 and the Amiga 2000.

They could generate high-resolution moving graphics, which made them a platform for CGI animated video. As well, they were capable of turning out music that was either synthesised or sampled, an ability that became very important during the “Acid House” era of the late 80s, when house, techno and other electronic dance music came on the scene.

For that matter, if you ever saw a Commodore Amiga in action or used one of these computers yourself, you may have dabbled with the “demos”. These were self-running programs that showed a moving-graphics display on the screen set to music, typically the electronic dance music of the day. I have linked in a YouTube clip of some of these “demos” so you can see what this computer was about.

The Amiga’s popularity in Europe instigated the European game-development scene, where many of the graphics-rich game genres we now take for granted were first exposed courtesy of this computer, at times called “the game machine of all time”. For business applications, the Amiga platform even became the heart of some public-facing computer applications where a graphically-rich user interface was considered important, along with being used to create computer graphics for film and video content.

This computer demonstrated the concept of a desktop computer serving graphically-rich applications, whether games, video content or the like, an ability that other computer platforms acquired through the 1990s and now provide as a matter of course.

Send to Kindle

The BASIC computer language turns 50

Article

BASIC, The 50-Year-Old Computer Programming Language For Regular People | Gizmodo

How Steve Wozniak wrote BASIC for the original Apple from scratch | Gizmodo

My Comments

Those of us who had a chance to tinker with personal computers through the 1980s, or were taught computer studies in that era, dabbled in a computer programming language called “BASIC”. This language, supplied in “interpreter” form with nearly all of the personal computers sold from the late 1970s, is celebrating its 50th anniversary.

It was developed in the early 1960s by two Dartmouth professors who wanted a simplified language to program a computer with, because mainframe-type computers had more difficult ways of being programmed. The language was built around words common to the English language, along with the standard way mathematical formulae were represented. It was initially implemented as a compiler for the mainframes, which turned the source code into object code or an executable image in one pass, but was eventually written as an interpreter which executed each line of source code one at a time.
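The interpreter approach can be illustrated with a toy sketch in Python: each numbered line is read and acted on in turn, rather than the whole program being translated ahead of time. This is a hypothetical miniature that supports only LET and PRINT statements, nowhere near real Dartmouth BASIC, and the use of Python’s `eval` for expressions is purely a shortcut for illustration.

```python
def run(source):
    """Execute a tiny BASIC-like program one line at a time.
    Supports only LET and PRINT; a hypothetical toy, not real BASIC."""
    variables = {}
    for line in source.strip().splitlines():
        number, keyword, rest = line.split(maxsplit=2)
        if keyword == "LET":            # e.g. 10 LET A = 6
            name, expr = rest.split("=", 1)
            variables[name.strip()] = eval(expr, {}, variables)
        elif keyword == "PRINT":        # e.g. 30 PRINT B
            print(eval(rest, {}, variables))
    return variables

program = """
10 LET A = 6
20 LET B = A * 7
30 PRINT B
"""
run(program)  # prints 42
```

A compiler would instead translate all three lines into machine code before anything ran; the line-at-a-time approach traded speed for the immediacy that made BASIC so approachable on the home computers of the era.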

Bill Gates and Paul Allen wrote a successful BASIC interpreter for the Altair microcomputer in 1975 and used it as the founding stone for Microsoft, with slight variations of it implemented in a variety of personal computers like the Tandy TRS-80. Similarly, Steve “The Woz” Wozniak wrote the BASIC interpreter for the Apple II computers from scratch in 1976, a path followed by other computer manufacturers like Commodore, Acorn (BBC Micro), Sinclair (ZX80, ZX81, ZX Spectrum) and Amstrad.

The language was not just taught in classrooms: people taught themselves to program these computers using the manuals supplied with them and the many articles printed in various computing and electronics magazines. There were even books and magazines published through the 1980s replete with “type-in” BASIC source code that people could transcribe into their computers to run the programs.

BASIC – the cornerstone of the hobby computing movement of the 1980s – turns 50

How this relates to the networked connected lifestyle is that the BASIC language gave us a taste of home computing and of computer programming as a hobby. Even as Microsoft evolved the language into QuickBASIC and Visual Basic for the DOS / Windows platform, it exposed us to an easy-to-understand programming language that got many of us interested in this craft.

Send to Kindle

50 years ago was the first public demonstration of the videophone concept

Article

50 years ago today, the public got its first taste of video calls | Engadget

My Comments

When we use Skype, Apple FaceTime, 3G mobile telephony or similar services for a video conversation where we see the other caller, we are using a concept brought to fruition 50 years ago courtesy of Bell Telephone.

Here, a public “proof-of-concept” setup was established between the site of the World’s Fair at Flushing Meadows in New York City and Disneyland in California. People who wanted to try the concept sat in special phone booths where they talked into a box with a small TV screen and camera as well as a speaker and microphone. They saw their correspondent as a 30-frames-per-second black-and-white TV image on this device, and many people had a chance to give it a go for the duration of that World’s Fair.

Bell had a stab at marketing the “Picturephone” concept in different forms, but the cost to purchase and use it was prohibitive for most people, to the point where it had only limited corporate / government videoconferencing appeal. As well, many science-fiction movies and TV shows written in the 1960s and 1970s, most notably “2001: A Space Odyssey”, sustained the Picturephone and video telephony as something to look forward to in the future, along with space travel for everyone. For me, the scene in “2001: A Space Odyssey” with Dr. Heywood Floyd talking to his daughter on the public videophone at the space station stood out in my mind as what it was all about.

But even as the IP technology that underpins the Internet has made services like Skype and FaceTime cheap to use, some of us still find it difficult to make eye contact with the correspondent because of having to know where the camera is on each side of the call.

In essence, the Bell public demonstration proved the concept, taking it from fiction to reality by allowing people to try it as part of a “world expo”.

Send to Kindle

Len Deighton set the standard for creating the magnum opus using a computerised word processor

Article

Len Deighton’s Bomber, the first book ever written on a word processor. – Slate Magazine

My Comments

Before the arrival of the word processor, whether as part of today’s office suites that you buy for your computer or even the basic text-editor programs that come with most operating systems, the creation of a novel or other book required a lot of work.

This work took the form of a continuous cycle of typing and reviewing the chapter drafts for that magnum opus. Len Deighton had a few IBM Selectric “golfball” typewriters which he used to work out his manuscripts, and his personal assistant, Eleanor Handley, ended up typing the chapter drafts as part of the review cycle for his novels. In 1968, she mentioned this hard work to an IBM technician who serviced these workhorses under contract to the author.

At that time, IBM had just launched the MT/ST, a Selectric typewriter that worked alongside a tape-drive system so documents could be stored on reels of magnetic tape. The technician mentioned that IBM had a machine that could help with this and future magnum opuses. Deighton leased the machine, and the delivery men had to remove his front window so it could be hoisted in by crane.

The features and tricks this device offered allowed him to review, insert and substitute text in each of the chapters stored on each reel of tape. He exploited the “marker codes” that could be placed on the tape to locate the passages he was working on, and the machine came into play as he wrote “Bomber”, a novel set in World War II.

Word processing then evolved from these dedicated machines, which worked alongside electronic or, in some cases, electric typewriters, to the dedicated computer programs you could buy for most desktop computer systems from the early to mid 1980s.

With the arrival of the graphical user interface in 1984 courtesy of the Apple Macintosh, the ability to turn out copy that was hard to distinguish from published copy became a key feature. Here, the computer could draw the typeface itself without the printer needing support for that font. As well, the word processor became part of integrated “office” computing packages that combined this function with database, spreadsheet, presentation / business-graphics and email functionality.

For me, the concept of creating a document from “go to whoa” using a computer’s word-processor software rather than handwriting it was a marked change. The time it took to turn out a polished document was significantly reduced, and I realised that I could easily work “from mind to document” without “scratching out” text with a black line or “wedging in” text using the caret symbol.

As for novelists and other authors, the time it took to create copy for a paper or electronic destination was significantly reduced, and you could make sure the copy was “how you wanted it”.

Similarly, the fact that Len Deighton had the MT/ST equipment at his home was a prediction of things to come in the form of desktop and personal computing, which arrived about ten years later.

Send to Kindle

The evolution of the mobile lifestyle summed up in a video

GSMA YouTube videoclip

http://youtu.be/wOryfTLTc1o

Click to view (if you can’t see it below or want to “throw” to a Smart TV with a TwonkyMedia solution)

My Comments

This video sums up in two minutes and thirty seconds how the online lifestyle has evolved: from the hobbyist computers of the late 70s, through the popularity of video games, to the brick-like mobile phones and the desktop computers coming into every office.

It then shows how the mobile online life is becoming integrated into everyone’s daily lifestyle and workstyle. Have a look at this clip!

Send to Kindle

Even the 2012 London Olympics honours the founder of the World Wide Web

Article

Berners-Lee, Web take bow at Olympics | CNet

Video

http://youtu.be/KHmF14LaX5g

My Comments

Those of you who watched the Opening Ceremony of the 2012 London Olympics may have expected the obvious things associated with Britain, like the cottages, the Industrial Revolution or the Beatles, to be honoured in this ceremony.

But think again! As part of a celebration of recent popular history centred on life in an archetypal UK semi-detached house, there was a chance to celebrate the foundation of what has made the Internet-driven life tick. Here, the house fell away to reveal Sir Tim Berners-Lee, the inventor of the World Wide Web, tapping out a Tweet that celebrated this milestone to the connected lifestyle on the very invention he stood behind.

This ceremony definitely integrated the foundation stone of the home network and the connected life, with some of us perhaps watching it through the Web rather than on regular broadcast TV.

Send to Kindle

Farewell Steve Jobs–one of the pillars of the personal computer

Initially, when I heard that Steve Jobs was to permanently resign from Apple due to ill health, I thought it was simply a retirement from one of the pillar companies of the personal-computing age.

Now the man responsible for the Macintosh computing platform, which commercialised and legitimised the “WIMP” (windows, icons, menus, pointer) user-interface style, and for the iPhone and iPad devices, which did the same for touchscreen computing, has passed away.

Many will remember his style of commercialising these technologies through a vertically-integrated method that required the use of Apple products and services for full benefit, but this also let competitors bring the same usage metaphors to their own platforms.

It all started with Steve Jobs selling his VW Bus (Kombi van) and turning the proceeds into capital for the Apple company. Here, Steve Jobs and Steve Wozniak worked on the development of the Apple II, which became one of the beacons of the personal-computing age in the late 1970s.

Many commentators have said that Steve Jobs, through his efforts at Apple with the Apple II, the Macintosh and the iPhone and iPad devices, personalised computing. I observed this through the demonstration software that came with Apple II computers in the 1980s, the boot sequence used in all incarnations of the Macintosh platform, and the design of computing products from the iMac onwards.

Whether it’s through the evolution of a computing technology or the passing of one of the people who influenced the direction of personal computing and communications, I see this simply as a milestone to the connected lifestyle.

Send to Kindle

Now it’s firm – Steve Jobs resigns as chief executive of Apple

Articles

Steve Jobs resigns as Apple Chief Executive | SmartCompany.com.au

Steve Jobs steps down from Apple | CNet

Steve Jobs quits as Apple CEO | The Age (Australia)

My comments

There has been a lot of press about Steve Jobs intending to resign from Apple’s chief-executive position due to ill health. Now it has happened: he has resigned. He is still maintaining a position on Apple’s board of directors, both as a director and as chairman of the board.

I see it as something that had to happen for another of personal computing’s “old dogs”, the people who founded companies that were instrumental in the development and marketing of commercially-viable personal computers. A few years ago, Bill Gates stepped back from Microsoft, the company he had founded.

This is more a “changing of the guard” at the top of these “pillar companies” as the technology behind these computers leads to highly-capable equipment for the home and business. This includes affordable touch-operated tablet computers and the smartphone that has become a “jack of all trades”, working as a phone, personal stereo, handheld email terminal, handheld Web browser and more.

It is so easy to cast doubt over a company once a figurehead relinquishes the reins, but I have seen many companies keep their spirit alive and continue to demonstrate their prowess at their core competencies.

As well, even though people may criticise him for how he managed the iTunes App Store and the Apple platforms, as in keeping them closed, Steve Jobs and Apple are in essence milestones to the connected lifestyle.

Send to Kindle