Computer Games Are Older Than You Think

I read with interest the recent news reports of the death of Ralph H Baer, the German-born electronics engineer who is credited with inventing the first video game console in 1966.  Baer, who died aged 92 on 6 December, pioneered the concept of a unit which would allow two people to play a selection of simple interactive video games using a domestic television set as the display device.  He and two colleagues developed a prototype unit which incorporated two controllers, each with two input dials and a pushbutton, and a bank of switches for selecting which game to play from a choice of 12.  This was subsequently licensed to US consumer electronics firm Magnavox and introduced as the Magnavox Odyssey in August 1972.

The success of the Odyssey prompted other companies such as Atari to introduce similar products and by 1976 the video game console market was worth over $240 million per year in the US alone.  The market has continued to grow at an astounding rate and is now worth an estimated $49 billion per year in worldwide sales.  Ralph Baer’s invention was the first successful hardware implementation of an interactive video game and his contribution to the birth of an industry was recognised by his adopted country in 2004 when he was awarded the US National Medal of Technology.

Of course, you no longer have to buy a dedicated game console in order to play interactive video games.  Most popular console games are also available as software versions which can be installed and run on a standard personal computer (provided that it meets the minimum hardware specification required by the game itself).  In conducting the research for my book, I was amazed to discover that the earliest known example of such a game predates the early video game consoles of the 1970s by more than 20 years.  It was a bouncing ball simulation written in 1950 for the MIT Whirlwind computer by Charles W Adams and John T Gilmore, two of the eight programmers on the Whirlwind development team.  Adams and Gilmore created a program that employed three differential equations to calculate the trajectory of a bouncing ball and display it on the computer’s oscilloscope screen.  By adjusting the bounce frequency using a control knob, the ball could be made to pass through a gap as if it had gone down a hole in the floor.
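
The principle behind Adams and Gilmore’s demonstration can be sketched in a few lines of modern code.  The following is purely illustrative (the original solved three differential equations and drew on an oscilloscope; here a simple time-stepped integration with an invented restitution constant stands in for them):

```python
G = 9.81           # gravitational acceleration (m/s^2)
RESTITUTION = 0.7  # fraction of vertical speed kept after each bounce (invented)
DT = 0.001         # integration time step (s)

def simulate(x=0.0, y=2.0, vx=1.0, vy=0.0, t_max=5.0):
    """Return the (x, y) trajectory of a bouncing ball as a list of points."""
    points = []
    t = 0.0
    while t < t_max:
        vy -= G * DT          # gravity accelerates the ball downwards
        x += vx * DT
        y += vy * DT
        if y < 0.0:           # floor hit: reflect and damp the vertical velocity
            y = 0.0
            vy = -vy * RESTITUTION
        points.append((x, y))
        t += DT
    return points

traj = simulate()
peak = max(p[1] for p in traj)   # highest point reached after release
```

Plotting the points traces the familiar series of ever-smaller arcs; lowering `RESTITUTION` plays the role of the Whirlwind operator’s control knob.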

This was not the only notable achievement of the MIT Whirlwind project.  The machine itself is one of the most significant of the first generation of stored-program computers.  It provided a platform for MIT’s contribution to the development of magnetic core memory, a revolutionary type of memory technology which transformed the speed and reliability of early electronic computers.  More importantly, it is also the source of virtually all of the earliest developments in computer graphics and its design was later adopted by the US Air Force for the largest computer system ever built, the Project SAGE (Semi-Automatic Ground Environment) air defence system.


Not Turing Again!

Following my earlier post on the legacy of computer pioneer Alan M Turing (Turing’s Legacy), Turing’s achievements have again hit the headlines with the news last week that members of the Institution of Mechanical Engineers (IMechE) have voted the Bombe at Bletchley Park as their favourite Engineering Heritage Award winner from the past 30 years.

The Bombe was an electromechanical code-breaking device which emulated the rotors of an Enigma machine, the typewriter-like device used by the German military to encrypt radio messages during World War II.  It employed a motorised mechanism to step rapidly through every possible combination of rotor settings and apply a logical test to each one to determine if it met the conditions for a possible solution which had been wired into the machine before operation.  The Bombe was closely based on an earlier device developed by the Polish Cipher Bureau in 1938.  Alan Turing improved the Polish design by incorporating 36 sets of rotors to allow multiple checks to be made in parallel.  Turing’s design was then turned into a fully-engineered device by the British Tabulating Machine Company under the direction of chief engineer Harold H Keen.
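
The search strategy is easy to illustrate in code.  The toy sketch below substitutes a simple Caesar cipher for Enigma, but the structure is the same: step through every possible key and apply a logical test based on a crib (a guessed fragment of plaintext).  The cipher, crib and values are all invented for illustration:

```python
import string

ALPHABET = string.ascii_uppercase

def caesar_encrypt(text, shift):
    """Shift each letter of an upper-case message by `shift` places."""
    return "".join(ALPHABET[(ALPHABET.index(c) + shift) % 26] for c in text)

def brute_force(ciphertext, crib):
    """Try all 26 keys; keep those whose decryption passes the logical test."""
    candidates = []
    for shift in range(26):
        decrypted = caesar_encrypt(ciphertext, -shift)
        if crib in decrypted:          # the 'logical test' for a possible solution
            candidates.append((shift, decrypted))
    return candidates

ciphertext = caesar_encrypt("WEATHERREPORT", 7)
hits = brute_force(ciphertext, "WEATHER")   # recovers the key: shift 7
```

A Caesar cipher has only 26 keys, so this finishes instantly; the Bombe’s motorised rotors performed the same exhaustive stepping over a vastly larger key space, with 36 checks running in parallel.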

I must confess to being somewhat surprised by the result of the IMechE vote, as the Bombe was a workmanlike device which employed a brute force approach to cryptanalysis and lacks the sophistication and ingenuity of later developments at Bletchley Park (although these later developments employed electronic technology which probably renders them ineligible).  It also ranks low in terms of Alan Turing’s achievements, as the concept was not entirely his.

There have been 100 winners of the Engineering Heritage Award to date, so the IMechE members had many other remarkable examples of engineering excellence to choose from, including the world’s first supersonic airliner, Concorde, which came second in the vote.  The fact that they chose something designed by Alan Turing may be due in part to the high level of media attention he is receiving at the moment.  This attention is likely to increase further with the release next month of the Hollywood film The Imitation Game, a dramatised account of Turing’s cryptanalysis work at Bletchley Park during World War II.  Let’s hope the filmmakers have done their research and that Turing’s achievements are portrayed accurately.  I certainly plan to go see it and will report back in a future post.


Computing Anniversaries

The recent events held to mark the 70th anniversary of the Normandy Landings during World War II (a.k.a. D-Day) set me thinking about computer-related anniversaries.  Here are a few worth noting that have occurred over the past few months:-

  • 5 February was the 70th anniversary of the introduction of Colossus, the first large-scale electronic digital calculator.  Colossus was a massive step forward in the development of electronic computation but it was not the world’s first programmable electronic digital computer as is often reported.  It was a special-purpose machine created to perform Boolean operations on alphanumeric characters represented as 5-bit data.  For further information on Colossus, see Chapter 4 of my book.
  • 14 February was the 90th anniversary of the birth of International Business Machines.  IBM was actually established several years earlier in June 1911 under the name Computing-Tabulating-Recording Company (C-T-R).  The company’s early history is covered in Chapter 3 of my book.
  • 7 April was the 50th anniversary of the introduction of IBM’s System/360 family of medium and large scale computers (see my earlier post on the impact of the System/360 here).
  • 1 May was the 50th anniversary of the birth of the BASIC (Beginner’s All-purpose Symbolic Instruction Code) programming language.  BASIC was incredibly important in helping to establish a market for microcomputers in the late 1970s and it also contributed to the early success of Microsoft.  BASIC also has personal significance for me, as it was one of the first programming languages I learned to use.  I was also using FORTRAN during the same period and BASIC, though far less powerful, was much quicker and easier to use.
  • 7 June was the 60th anniversary of the death of computer pioneer Alan Turing (see my earlier post on Turing’s legacy here).

There are also a number of computer-related anniversaries coming up in the next few months which I’ll highlight in future posts.

IBM’s Game Changer

I was pleasantly surprised to see the BBC and several technology web sites pick up on this week’s 50th anniversary of the introduction of IBM’s System/360 family of medium and large scale computers.  The System/360 doesn’t often receive the attention it deserves but it really was a game changer for the industry, as it was the first range of computers to address the issue of software compatibility between different classes of machine, allowing code developed for one System/360 model to run on any other model in the family.  This was achieved through a technique known as microprogramming which was conceived by British computer scientist Maurice Wilkes at Cambridge University in 1951.
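
The principle of microprogramming can be illustrated with a toy sketch (all instruction and micro-operation names below are invented, not IBM’s actual microcode): each instruction in the visible instruction set is defined as a sequence of simpler micro-operations, so machines with quite different internals can run the same programs as long as their microcode implements the same instruction set.

```python
def run(program, microcode):
    """Execute a list of (opcode, operand) pairs over the given microcode."""
    state = {"acc": 0, "mem": {}}
    for opcode, operand in program:
        for micro_op in microcode[opcode]:   # expand instruction into micro-ops
            micro_op(state, operand)
    return state

# Micro-operations: the primitive steps this model's control unit can perform.
def set_imm(st, n):  st["acc"] = n                      # load immediate value
def store(st, addr): st["mem"][addr] = st["acc"]        # accumulator -> memory
def add(st, addr):   st["acc"] += st["mem"].get(addr, 0)  # memory + accumulator

# One model's microcode: how each visible instruction maps to micro-operations.
# A different model could implement the same opcodes with different primitives.
MICROCODE = {
    "LDI": [set_imm],
    "STO": [store],
    "ADD": [add],
}

program = [("LDI", 2), ("STO", "x"), ("LDI", 3), ("ADD", "x")]
final = run(program, MICROCODE)   # final["acc"] == 5
```

Because the program is written against the opcodes rather than the micro-operations, swapping in a different `MICROCODE` table (backed by different hardware primitives) leaves it running unchanged, which is the essence of what made System/360 models compatible.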

Before System/360, organisations looking to upgrade their computer system would have to rewrite their software to run on the new hardware, unless there happened to be an emulator available for the new model which could automatically translate and run code written for the old machine.  However, emulators were difficult to come by in the 1960s and many organisations remained locked in to ageing or inadequate systems as they could not afford the time or investment required to rewrite their software.  System/360 changed all this by allowing organisations whose needs had outgrown a particular class of machine to migrate to a larger model without having to worry about software compatibility.

The development of the System/360 range was one of the largest projects ever undertaken by IBM (or any computer company for that matter).  Development costs were estimated at a whopping $5 billion, equivalent to over $37 billion today.  The development of the operating system was so problematic that it inspired the manager responsible, Fred Brooks, to write his seminal book on software project management, The Mythical Man-Month: Essays in Software Engineering, which was first published in 1975.  Also, as the new family would replace IBM’s entire data processor product range and its development would necessitate a halt to all existing product development projects, including mid-life upgrades to existing models, it put the company in a highly vulnerable position until the new line came on stream.  One IBM executive summed it up perfectly by telling Fortune magazine, “We call this project ‘you bet your company’”.

Fortunately for IBM, the System/360 range was a huge success and the System/360 architecture would form the basis of the company’s data processor products for the next two decades.  In the 1960s where IBM led, the rest of the computer industry followed and other firms soon began offering their own families of compatible machines.  Some of these companies, such as RCA and Sperry Rand, went even further by also making their machines software compatible with System/360.

The impact of System/360 on the computer industry was immense, rivalled only by the IBM PC and Apple Macintosh in the 1980s.  It’s good to see this finally reflected in the media attention System/360 is now receiving as a result of its 50th anniversary.

Turing’s Legacy

It was good to hear the recent news that computer pioneer Alan M Turing has been granted a posthumous royal pardon for his 1952 conviction for gross indecency (as a result of a homosexual act, which at that time was illegal in the UK).  His punishment, for which he chose chemical castration rather than a prison sentence, and subsequent loss of security clearance for his cryptanalysis work are thought to have led to his untimely death by suicide two years later in 1954.  However, Turing’s pardon has also caused considerable controversy, as the thousands of other men who were convicted of homosexual acts during the same period are unlikely to receive the same treatment.  It seems that Turing has been singled out for a pardon as a result of his status as a national hero rather than for the degree of injustice that he suffered.

Alan Turing is one of the few British computer pioneers who is a household name.  This is well-deserved and an accurate reflection of the magnitude of his achievements.  However, many of the comments made in the media in the wake of the royal pardon announcement credit Turing with having a much greater influence on the development of the computer than is perhaps deserved.  Turing’s greatest achievements were in the field of mathematical logic and relate to the theory of computation rather than the practical development of computers.  His philosophy on the design of computers was to keep the hardware as simple as possible and rely on clever programming instead.  Consequently, his only computer design, for the NPL ACE (Automatic Computing Engine), made very little impact.

Alan Turing’s influence on the development of the computer is discussed in an excellent article by Professor Simon Lavington written last year as part of the celebration to mark the centenary of Turing’s birth.  The article, which is entitled Alan Turing: Is he really the father of computing?, questions how influential Turing’s pioneering work on early computers proved to be in their later development.  Lavington sums this up as follows:-

… his codebreaking work at Bletchley Park, and indeed Turing’s Ace design, exerted little influence on commercially viable computers as markets began to open up in the late 1950s.  Nevertheless, Alan Turing’s ideas remain to this day embedded in the theories of both codebreaking and computing.

Like those of Charles Babbage a century earlier, Alan Turing’s achievements went largely unrecognised during his lifetime and can only be fully appreciated when put into the context of later developments.  However, we must be careful not to place credit where it is not due.  Turing didn’t invent the modern computer any more than Babbage did but his work did provide a solid theoretical foundation for later developments and for this he deserves to be honoured.

Computing Heroes

It was good to see Ada Lovelace Day attracting a large amount of press coverage this year.  The international day of celebration, which was held on 15 October, has been running for only 4 years and aims to raise the profile of women in science, technology, engineering and maths by encouraging people around the world to talk about the women whose work they admire.  This year’s Nobel Prizes also attracted similarly high levels of press attention, particularly here in the UK as a result of the Physics prize having been awarded to Professor Peter Higgs of Edinburgh University.

Unfortunately, there isn’t a Nobel Prize in Engineering.  The closest we have to it is probably the Queen Elizabeth Prize for Engineering which was awarded for the first time in June.  The inaugural prize was jointly awarded to Robert Kahn, Vint Cerf and Louis Pouzin for their contributions to the protocols that make up the fundamental architecture of the Internet, Sir Tim Berners-Lee, for inventing the World Wide Web, and Marc Andreessen, who wrote the Mosaic web browser.  All five richly deserve their award, as the Internet and World Wide Web have in the words of the judges “initiated a communications revolution which has changed the world”.

Ada Lovelace is also worthy of honour.  Known as the Enchantress of Numbers, she was the person who contributed most to our understanding of Charles Babbage’s Analytical Engine through her notes of 1843.  Her contribution may have had less of an impact than that of the five people mentioned above but she is an excellent role model and there is little doubt that she would have played a pivotal role in putting the machine to work had Babbage actually succeeded in constructing it.

This got me thinking about who I would pick as my own hero amongst the many hundreds of contributors to the development of the computer featured in my book.  In terms of engineering, it would have to be the American electrical engineer J Presper Eckert.  Eckert’s name is associated with several of the most significant innovations in the early years of electronic computation, including the mercury delay line, magnetic drums and disks, the electrostatic storage tube, magnetic core storage and the stored-program concept.  He and Moore School of Electrical Engineering colleague John W Mauchly were also responsible for the development of the first general-purpose electronic digital calculator, ENIAC, and the UNIVAC I, which was the most influential of the early large-scale computers to reach the market.

Presper Eckert was not only a highly successful engineer.  He also co-founded the Eckert-Mauchly Computer Corporation, one of the earliest computer companies, which was later absorbed into Remington Rand.  Eckert died in 1995 as a result of complications from leukaemia.  Though he is not exactly an unsung hero, most of the awards he received during his lifetime were shared with John Mauchly and it is high time that he was recognised in his own right as one of the greatest engineers of all time.

A Fine Collection

I’ve recently returned from a short holiday in London where I was able to spend a couple of hours at the Science Museum in South Kensington.  This was my third visit to the Museum but the previous two visits were hurried attempts to cover the entire Museum as quickly as possible so this was my first opportunity to take my time and concentrate on examining the Museum’s fine collection of computer-related exhibits in the Computing Gallery.

The highlight of the collection is probably the full-scale replica of Charles Babbage’s Difference Engine No. 2 which contains over 8,000 components and weighs 5 tonnes.  Seeing this mechanical marvel up close really brings home the astonishing achievements made by Babbage more than a century before the birth of the computer industry.  Other Babbage items on display include the trial model and some of the engineering drawings for the Analytical Engine, Babbage’s general-purpose programmable calculating machine which was never built.  These provide a poignant reminder of a lost opportunity and food for thought on how much more advanced computer technology would have been had Babbage succeeded in completing this incredible machine.

I was pleasantly surprised to see one of Jesse Ramsden’s circular dividing engines on display.  Ramsden’s work on scientific instruments and machine tools in the 1770s led to major advances in precision engineering.  These advances not only allowed Babbage to create the intricate mechanisms for his Engines but they also provided the foundation for the successful mass production of mechanical calculating machines in the latter part of the 19th century.

The Museum’s collection does an excellent job of covering the mechanical and electromechanical eras and it was reassuring to see analogue computing well represented.  The electronic age is less well represented, however, with only a handful of medium and large scale electronic computers on display.  I’m aware that the Museum has many more items in storage than it can possibly display in the space available but it would be good to see a few more examples of electronic computers from recent times.

Unlike many of London’s tourist attractions, entry to the Science Museum is free.  Despite visiting at the height of the tourist season, there were no long queues at the doors, possibly as a result of stiff competition from the Natural History Museum and Victoria & Albert Museum which are both located in the same area.  If you are ever in South Kensington and have an hour or two to spare, I would highly recommend a visit.  If not, you could try reading Chapter 1 of my book which covers the work of Jesse Ramsden and Charles Babbage.

History in the Making

Earlier this week I attended a conference on the history of computing held at the Science Museum in London.  The conference was entitled Making the History of Computing Relevant and was organised by the International Federation for Information Processing (IFIP) in conjunction with the Science Museum and the Computer Conservation Society.

This was the first event of its kind to be held in the UK for several years and attracted participants from all over the country plus a large contingent of overseas visitors from as far afield as Australia, Japan and Hawaii.  The main theme of the conference was exploring what could be done to make the history of computing relevant and interesting to the general public.  The programme featured 28 presentations spread over 2 days.  These were structured into a number of sub-themes, each of which included a generous allocation of time for questions and discussion.

The presentations were uniformly excellent but the two stand-out ones for me were ‘The Voice of the Machine’ by Dr Tom Lean and ‘Reconstruction of Konrad Zuse’s Z3’ by Dr Horst Zuse:-

  • Dr Lean’s presentation featured video and audio clips of interviews with some of the pioneers of computing in the UK which he has been gathering as part of the British Library’s ‘Voices of Science’ oral history project.  Listening to these fascinating clips gave the subject matter a more human dimension which should certainly help in making the history of computing more compelling to a non-technical audience.
  • Dr Zuse’s presentation was a hilarious account of his efforts to build a replica of the legendary Z3 electromechanical computer developed by his father in 1941.  It included clips from a video diary Dr Zuse had kept on the project, featuring such comic scenes as DHL couriers struggling to deliver heavy boxes containing thousands of electromagnetic relays to Dr Zuse’s top floor flat.  It was by far the funniest conference presentation I have ever had the pleasure of witnessing.

A reception was held in the Science Museum’s Alan Turing exhibition on the evening of Day 1.  This only added to the general air of conviviality and shared enthusiasm for the subject which permeated the whole event, although there was one moment of heated debate on Day 2 regarding whether museums should make more of an effort to give working demonstrations of their exhibits in situations where the equipment is still in working order.

I was also struck by the number of septuagenarian delegates sporting the latest smartphones, tablets and laptops who were clearly much more skilled at using them than I am.  Proof perhaps of the adage “once a computer geek, always a computer geek”?

I am indebted to Google for sponsoring the conference; without that support I probably could not have afforded to attend, as the delegate fees for conferences of this quality are usually several hundred pounds.  Thanks also to Dr Tilly Blyth and her team from the Science Museum for organising such an enjoyable and well-run event.  Hopefully, the programme committee will decide to make this a regular event.  I’m already looking forward to attending the next one!

Teashop Technology

In my day job as a Business Development Manager at the University of Glasgow, I’m always looking for good case studies on the benefits of business-university collaboration.  One industry sector that has benefited enormously from university expertise is the computer industry.  A fascinating example of this is a project which dates back to the very start of the industry in the UK.  It involved a collaboration between catering firm J Lyons & Company and the University of Cambridge to develop the first computer specifically for business use.  Amazingly, this landmark project in the history of technology grew out of a simple requirement for the company to more accurately monitor sales of cakes throughout its chain of teashops.

The project started in May 1949 when the Lyons board took the decision to build an electronic computer to serve the company’s growing data processing needs.  This bold decision had been prompted by a fact-finding visit to the USA by two senior managers of the company, Raymond Thompson and Oliver Standingford, two years earlier in May 1947.  Despite the mundane nature of the company’s products, Lyons had a refreshingly forward-thinking attitude to business and had pioneered innovative management techniques in its quest for office efficiency.  Thompson and Standingford’s original remit had been to investigate wartime developments in office systems and equipment but this was extended to include computers following tantalising snippets of information that were beginning to appear in the press about US developments in electronic computation.

During their trip Thompson and Standingford met with Herman Goldstine, a leading figure in the US computer research community, who informed them of efforts to develop a computer taking place much closer to home at the University of Cambridge.  The Cambridge group was led by the pioneering computer scientist Maurice Wilkes.  Wilkes had been one of a handful of non-US academics invited to attend a series of lectures held at the University of Pennsylvania’s Moore School of Electrical Engineering in the summer of 1946 to disseminate the results of US government-sponsored research into electronic digital computers.  Top of the agenda was one of the Moore School’s own projects, a radical new computer design based on a stored-program architecture.  This was the breakthrough which would turn electronic computers from sequence-controlled calculators into powerful machines capable of carrying out all manner of complex tasks.  Fired up by what he had seen and heard on his visit to the US, Wilkes began working on his own version of this design during the voyage home.  The machine would be called the Electronic Delay Storage Automatic Calculator (EDSAC).

Goldstine arranged for Thompson and Standingford to be contacted by the Cambridge group on their return to the UK.  Following positive discussions, Lyons agreed to donate £3,000 in cash plus the services of an electronics technician for 1 year to support the Cambridge group in the development of EDSAC in return for advice on the company’s plans for computerisation of its data processing operation.  These extra resources helped the Cambridge group to not only catch up with but to overtake the far more experienced Moore School team in the race to construct the world’s first full-scale stored-program computer.  The fruitful relationship between Lyons and Cambridge continued beyond the initial 1-year agreement and when the company finally took the decision to build its own computer, it was obvious to all concerned that this should be based on EDSAC.  The result was the Lyons Electronic Office (LEO), the world’s first business computer.

Built at a cost of £150,000, LEO featured a 17-bit word length and a clock speed of 526 kHz.  Like other full-scale computers of the pre-transistor era, LEO was the size of a room.  Its electronic circuits were constructed using thousands of bulky thermionic valves and its capacious 2,048-word memory was made up of 64 individual acoustic delay lines, long cylinders filled with liquid mercury that stored data in the form of sound waves.  Input and output were via punched cards and paper tape.
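
As a quick sanity check on those figures, a few lines of arithmetic show what 2,048 seventeen-bit words spread across 64 delay lines amounts to:

```python
WORD_LENGTH_BITS = 17
TOTAL_WORDS = 2048
DELAY_LINES = 64

words_per_line = TOTAL_WORDS // DELAY_LINES   # words held by each delay line
total_bits = TOTAL_WORDS * WORD_LENGTH_BITS   # size of the whole store in bits
total_bytes_equiv = total_bits / 8            # rough modern equivalent
```

Each mercury tube therefore held just 32 words, and LEO’s entire “capacious” store comes to under 5 kilobytes in modern terms, a vivid measure of how far memory technology has come.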

LEO ran its first live data processing job in November 1951, calculating the value of cakes, pies and pastries for dispatch to Lyons’ retail and wholesale outlets.  As one of the first electronic computers in the UK, LEO also attracted considerable attention from organisations interested in using the machine for scientific applications.  Lyons was able to take advantage of this situation by making LEO available to a number of these organisations on a fee-paying basis in what was probably the earliest example of a computing bureau service.

LEO was conceived for internal use only but the enormous interest in the project prompted Lyons to enter the computer industry and a subsidiary company, LEO Computers Limited, was created in November 1954 to manufacture computers “for sale or hire”.  LEO Computers later merged with the Data Processing & Control Systems Division of English Electric to form English Electric Leo Computers which eventually became part of International Computers Limited (ICL).  On its formation in 1968, ICL was the largest non-US manufacturer of computers with a workforce of over 34,000 employees.

So are there any lessons that businesses today can learn from the success of the LEO project?  Here are three I can think of:-

  1. Play to each other’s strengths.  Lyons took a lead from the Cambridge group on the design of the hardware for LEO.  This allowed the company to focus its efforts on developing the application software, a task which made the best use of in-house expertise in office systems.  The company also subcontracted much of the construction work to other firms.
  2. Don’t be tempted to reinvent the wheel.  The LEO developers did not try to create the most advanced machine possible.  LEO’s conservative design was closely based on EDSAC which itself was closely based on the Moore School’s design.  Proven components and subsystems were chosen over newer technologies.  This combination of modest design goals and the use of readily available technology facilitated speed to market.
  3. Seconding staff can be one of the most effective methods of transferring knowledge from one organisation to another.  The technician chosen for the secondment, Ernest Lenaerts, quickly became a valued member of the Cambridge group.  When Lenaerts later returned to the company, the knowledge he had gained transferred back with him, making the job of building LEO much easier.

If you want to read more about LEO, I can recommend the book A Computer Called LEO by Georgina Ferry.  Also worth reading is Memoirs of a Computer Pioneer by Maurice V Wilkes.  My own book will also cover the development of EDSAC in Chapter 5 and LEO in Chapter 6.

The Military Origins of Electronic Computers

The BBC News web site carried an article last week about the tragic death of a British soldier in Afghanistan.  What made this story different from the depressingly constant stream of similar reports was that his death was caused by a data error.  He was killed by a smoke shell fired from a British artillery weapon which fell short of its target due to an error in the data used to compensate for the effects of weather conditions on the shell’s flight.  The computer system which would normally provide this data automatically was not working so the gunners had to input the data manually.  Unfortunately, they used the wrong data, with tragic consequences.

Ironically, one of the driving forces behind the development of electronic computers was the need for accurate artillery firing tables.  In the days before fully computerised fire control systems, these tables provided the settings that gunners require to compensate for the effects of external factors (such as wind speed and direction, air temperature and the rotation of the Earth) on the trajectory of an artillery shell.  However, the preparation of such tables was very labour intensive, with a single trajectory taking two full working days to calculate using a mechanical desk calculator, and the increased demand for them during World War II prompted the US Army to fund the development of calculating machinery which could speed up this process.  This resulted in the development of the first general-purpose electronic digital calculator, the Electronic Numerical Integrator and Computer (ENIAC).
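
Each entry in such a table is the solution to a problem like the one sketched below: a numerical integration of the shell’s equations of motion.  This toy model (constant drag coefficient, flat Earth, no wind, with invented constants) is far simpler than real exterior ballistics, but it conveys why a single trajectory took two working days on a desk calculator:

```python
import math

G = 9.81      # gravity (m/s^2)
K = 0.00005   # illustrative drag constant (1/m), not a real ballistic value
DT = 0.01     # integration time step (s)

def shell_range(speed, elevation_deg):
    """Return the horizontal distance travelled before the shell lands."""
    vx = speed * math.cos(math.radians(elevation_deg))
    vy = speed * math.sin(math.radians(elevation_deg))
    x = y = 0.0
    while True:
        v = math.hypot(vx, vy)          # current speed, for the drag term
        vx -= K * v * vx * DT           # drag always opposes the motion
        vy -= (G + K * v * vy) * DT     # gravity plus the vertical drag component
        x += vx * DT
        y += vy * DT
        if y < 0.0:                     # shell has returned to ground level
            return x

flat = shell_range(500.0, 30.0)
steep = shell_range(500.0, 80.0)
```

A complete firing table repeats this for every combination of charge, elevation and correction factor, which is thousands of such integrations, performed here in step sizes a human computer would have had to work through by hand.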


ENIAC was developed at the University of Pennsylvania’s Moore School of Electrical Engineering in Philadelphia.  The completed machine weighed over 30 tonnes and its cabinets lined the walls of a room 50 feet long by 30 feet wide.  Although ENIAC’s main purpose was to perform calculations for artillery firing tables, it also possessed the flexibility to cope with a wider range of computational problems.  The insights which the ENIAC project team gained through providing this extra flexibility led directly to the development of the stored-program computers that we use today.

The two men who led the development of ENIAC, John Mauchly and Presper Eckert, went on to create the UNIVAC I, one of the first computers to reach the market.  UNIVAC I was designed primarily for business data processing but its origins, and the origins of all stored-program electronic digital computers, can be traced back to the US Army’s need for artillery firing tables during World War II.