Posts

A Juicy Pair of Apples

Following my last post on the subject, The Apple Falls, two more Apple-1 computers have come up for auction in the past few months, with a puzzlingly wide disparity between the prices fetched.  In October 2014, the Henry Ford Museum in Dearborn, Michigan, smashed the previous record by paying $905,000 for a working example of an Apple-1 at an auction held by Bonhams in New York.  Two months later, another working example was auctioned by Christie’s in New York for $365,000, less than half the price paid only weeks before.  Both machines included a full complement of accessories and peripherals, although the higher-priced example was said to be in “superb overall condition”.

These latest sales bring the total number of Apple-1 computers which have appeared at auction in recent years to 13.  There are also a similar number of examples held in other museums around the world plus around 30 known to be in private collections.  By my reckoning, that still leaves a few undiscovered examples in circulation from the 63 which are thought to have survived out of an estimated total of 175 Apple-1 machines originally built.  However, it’s very unlikely that one of these undiscovered treasures will turn up in a junk shop or car boot sale in my neighbourhood anytime soon, as Apple had no UK sales distribution channels at that time.

The original batch of 50 units was sold through the Byte Shop, an early computer retail store based in Mountain View, California, and the remaining machines were sold directly by Jobs and Wozniak to their fellow Homebrew Computer Club members located around the Silicon Valley area.  A handful were also sold as a result of adverts placed by Apple in two computer hobbyist magazines, Byte and Dr Dobb’s Journal, but none are known to have gone to overseas buyers.  A significant proportion of these machines were subsequently taken out of circulation by the generous trade-in allowance offered on the Apple ][ following its introduction in April 1977, which tempted many Apple-1 owners into trading in their ‘bare bones’ machine for the sleek new model; hence the relatively low figure of 63 thought to have survived.

Given the high prices paid in recent years for Apple-1 computers, there are likely to be some fakes in circulation and authenticating genuine examples will be an extremely difficult task as Apple did not use serial numbers.  It will be very interesting to see what happens when the next Apple-1 comes up for auction.  With around 30 examples still in private hands, it should only be a matter of time before one of these owners succumbs to temptation and puts his or her precious machine up for sale.

Computer Games Are Older Than You Think

I read with interest the recent news reports of the death of Ralph H Baer, the German-born electronics engineer who is credited with inventing the first video game console in 1966.  Baer, who died aged 92 on 6 December, pioneered the concept of a unit which would allow two people to play a selection of simple interactive video games using a domestic television set as the display device.  He and two colleagues developed a prototype unit which incorporated two controllers, each with two input dials and a pushbutton, and a bank of switches for selecting which game to play from a choice of 12.  This was subsequently licensed to US consumer electronics firm Magnavox and introduced as the Magnavox Odyssey in August 1972.

The success of the Odyssey prompted other companies such as Atari to introduce similar products and by 1976 the video game console market was worth over $240 million per year in the US alone.  The market has continued to grow at an astounding rate and is now worth an estimated $49 billion per year in worldwide sales.  Ralph Baer’s invention was the first successful hardware implementation of an interactive video game and his contribution to the birth of an industry was recognised by his adopted country in 2004 when he was awarded the US National Medal of Technology.

Of course, you no longer have to buy a dedicated game console in order to play interactive video games.  Most popular console games are also available as software versions which can be installed and run on a standard personal computer (providing that it meets the minimum hardware specification required by the game itself).  In conducting the research for my book, I was amazed to discover that the earliest known example of such a game predates the early video game consoles of the 1970s by more than 20 years.  It was a bouncing ball simulation written in 1950 for the MIT Whirlwind computer by Charles W Adams and John T Gilmore, two of the eight programmers on the Whirlwind development team.  Adams and Gilmore created a program that employed three differential equations to calculate the trajectory of a bouncing ball and display it on the computer’s oscilloscope screen.  By adjusting the bounce frequency using a control knob, the ball could be made to pass through a gap as if it had gone down a hole in the floor.
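
For readers curious about what such a simulation involves, here is a minimal sketch in Python of the sort of calculation the program performed.  It is purely illustrative: the original was written in Whirlwind machine code and drew to an oscilloscope, and the equations of motion, time step, damping factor and ‘hole’ position below are my own assumptions rather than details of the Adams and Gilmore program.

```python
# A minimal sketch (for illustration only) of the kind of calculation the
# Whirlwind bouncing-ball program performed.  The time step, damping factor
# and hole position are assumptions made for this sketch.

def bounce(duration=5.0, dt=0.02, g=9.81, damping=0.7, hole=(3.0, 3.3)):
    """Trace a ball dropped from rest, bouncing on the floor at y = 0.
    If the ball reaches the floor while x is inside the 'hole' interval,
    it falls through instead of bouncing (the trick the MIT demo played)."""
    x, y = 0.0, 1.0          # start one metre above the floor
    vx, vy = 1.0, 0.0        # constant horizontal speed, dropped from rest
    t = 0.0
    points = []
    while t < duration:
        vy -= g * dt         # gravity acts on the vertical velocity
        x += vx * dt
        y += vy * dt
        if y <= 0.0:
            if hole[0] <= x <= hole[1]:
                break        # ball has gone "down the hole in the floor"
            y = 0.0
            vy = -vy * damping   # bounce, losing some energy
        points.append((round(t, 2), round(x, 2), round(y, 3)))
        t += dt
    return points

if __name__ == "__main__":
    for t, x, y in bounce()[::10]:   # print every 10th point of the trace
        print(f"t={t:5.2f}  x={x:5.2f}  y={y:6.3f}")
```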

This was not the only notable achievement of the MIT Whirlwind project.  The machine itself is one of the most significant of the first generation of stored-program computers.  It provided a platform for MIT’s contribution to the development of magnetic core memory, a revolutionary type of memory technology which transformed the speed and reliability of early electronic computers.  More importantly, it is also the source of virtually all of the earliest developments in computer graphics and its design was later adopted by the US Air Force for the largest computer system ever built, the Project SAGE (Semi-Automatic Ground Environment) air defence system.

Publishers Rejoice!

I’ve done it.  After 11 years and god knows how many hours of work, I’ve finally completed the first draft of my book on the history of the computer.  Weighing in at just over 235,000 words, it covers the entire 400-year history of computing technology, from humanity’s earliest efforts to mechanise calculation in the 17th century to the latest devices which merge portable computing and mobile telecommunications technologies to enable new methods of social interaction.

Researching and writing the book has taken up most of my spare time over the past 11 years but, rather than the expected feeling of huge relief, finishing the book has been something of an anticlimax so far.  I’m already missing the discipline of dragging myself away from the TV or the Internet every night in order to write a few more paragraphs.  I could start another book but there isn’t really any point in doing so unless I can do something useful with this one.  Therefore, I now need to make an effort to get the book published.

In my earlier post on the subject (The Self-Publishing Dilemma), I identified three options: (i) securing a publishing deal with a book publisher, (ii) self-publishing the book, or (iii) putting the entire book up on the Web as a free download.  I am under no illusions as to how difficult it will be to secure a deal with a mainstream publisher, especially for someone who has no track record as an author.  Most mainstream publishers won’t accept unsolicited book proposals in any case.  It might be possible to interest one of the academic publishers in my book (e.g. Oxford University Press, MIT Press, etc.), as these publishers do accept unsolicited proposals, but they also expect their authors to have the relevant academic credentials and I am not an academic.  The size of the book could also discourage all but the most enthusiastic of publishers, as it would appear to be more than double the word count of a typical non-fiction title.

The sensible option is probably to self-publish but, before I venture down this costly and time-consuming path, it might be worthwhile making at least one attempt to secure a publishing deal.  Therefore, I’ve compiled a ‘hit list’ of non-fiction publishers who accept unsolicited proposals and whose catalogues include history, popular science and/or business titles.  The next stage will be to produce and submit proposals for each of them in the specified format (which seems to be different for each publisher) then cross my fingers and see what happens.  I’ll let you know how I get on.

Not Turing Again!

Following my earlier post on the legacy of computer pioneer Alan M Turing (Turing’s Legacy), his achievements have again hit the headlines with the news last week that members of the Institution of Mechanical Engineers (IMechE) have voted the Bombe at Bletchley Park their favourite Engineering Heritage Award winner of the past 30 years.

The Bombe was an electromechanical code-breaking device which emulated the rotors of an Enigma machine, the typewriter-like device used by the German military to encrypt radio messages during World War II.  It employed a motorised mechanism to step rapidly through every possible combination of rotor settings and apply a logical test to each one to determine if it met the conditions for a possible solution which had been wired into the machine before operation.  The Bombe was closely based on an earlier device developed by the Polish Cipher Bureau in 1938.  Alan Turing improved the Polish design by incorporating 36 sets of rotors to allow multiple checks to be made in parallel.  Turing’s design was then turned into a fully-engineered device by the British Tabulating Machine Company under the direction of chief engineer Harold H Keen.
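
To illustrate the brute-force principle (and only the principle; this is not the Bombe’s actual crib-based test), here is a toy sketch in Python which steps through every combination of rotor positions and applies a test to each one, collecting the candidate settings, or ‘stops’, that pass.  The rotor model and test function are placeholder assumptions made purely for the sketch.

```python
# A toy illustration (not the Bombe's actual logic) of the brute-force idea:
# step through every combination of rotor settings, apply a test to each,
# and keep the candidates ("stops") that pass.  The alphabet-position rotors
# and the test function are placeholder assumptions for this sketch.
from itertools import product
from string import ascii_uppercase as ALPHABET

def search_settings(passes_test, num_rotors=3):
    """Return every rotor-position combination that satisfies the test.
    'passes_test' stands in for the logical test the Bombe applied, which
    in reality was wired up from a 'crib' (a guessed fragment of plaintext)."""
    stops = []
    for setting in product(ALPHABET, repeat=num_rotors):   # 26^3 = 17,576 combinations
        if passes_test(setting):
            stops.append(setting)
    return stops

# Example usage with a dummy test that accepts one arbitrary setting:
if __name__ == "__main__":
    candidates = search_settings(lambda s: s == ("W", "Z", "A"))
    print(candidates)    # [('W', 'Z', 'A')]
```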

I must confess to being somewhat surprised by the result of the IMechE vote, as the Bombe was a workmanlike device which employed a brute force approach to cryptanalysis and lacks the sophistication and ingenuity of later developments at Bletchley Park (although these later developments employed electronic technology which probably renders them ineligible).  It also ranks low in terms of Alan Turing’s achievements, as the concept was not entirely his.

There have been 100 winners of the Engineering Heritage Award to date, so the IMechE members had many other remarkable examples of engineering excellence to choose from, including the world’s first supersonic airliner, Concorde, which came second in the vote.  The fact that they chose something designed by Alan Turing may be due in part to the high level of media attention he is receiving at the moment.  This attention is likely to increase further with the release next month of the Hollywood film The Imitation Game, a dramatised account of Turing’s cryptanalysis work at Bletchley Park during World War II.  Let’s hope the filmmakers have done their research and that Turing’s achievements are portrayed accurately.  I certainly plan to go see it and will report back in a future post.

Inching Towards Completion

In my last post on progress with the book (Only One More to Go) which I posted back in February, I mentioned that I was about to begin the final chapter and had set myself an ambitious target date for completion of 31 July this year.  Well, that date has now passed but I’m not quite there yet.

With the home straight in sight, initial progress was indeed swift but work and summer holidays have got in the way in recent weeks, with the result that I’m still several weeks from completion.  I’ve written just over 14,000 words out of an estimated total of around 17,000.  I also have to finish the introduction to the book, which will require an additional 1,500 words or so, plus a much-needed edit of Chapter 10 which I’ve been working on intermittently over the past few months.

The final chapter, ‘Getting Personal – The World According to Wintel’, has been reasonably straightforward to write in comparison with the tortuous Chapter 12, helped along by the plentiful source material available for this part of the story.  Unlike with the other chapters, I’ve lived through the entire period covered and followed the events closely as they unfolded, so I had a fairly clear idea of what I wanted to write from the outset.

The final chapter covers the emergence of the ‘Wintel’ platform in the 1980s and 1990s, and how a software company, Microsoft, came to dominate the industry.  It also brings the story up to date by including the development of the World Wide Web, the ensuing Browser Wars and how the incredible advances in portable computing over the past decade have led to an explosion in the use of Information Technology throughout the developed world.

With the total word count already well over 200,000 words, it has not been possible to cover the development of portable computing devices in any detail.  To do this topic justice would require an entire book.  Therefore, if I can get The Story of the Computer published, this would be the perfect topic for my next book.  Now all I need to do is find a willing publisher…

Anniversaries

The recent events held to mark the 70th anniversary of the Normandy Landings during World War II (a.k.a. D-Day) set me thinking about computer-related anniversaries.  Here are a few worth noting that have occurred over the past few months:-

  • 5 February was the 70th anniversary of the introduction of Colossus, the first large-scale electronic digital calculator.  Colossus was a massive step forward in the development of electronic computation but it was not the world’s first programmable electronic digital computer as is often reported.  It was a special-purpose machine created to perform Boolean operations on alphanumeric characters represented as 5-bit data.  For further information on Colossus, see Chapter 4 of my book.
  • 14 February was the 90th anniversary of the Computing-Tabulating-Recording Company (C-T-R) being renamed International Business Machines.  The company itself was established several years earlier, in June 1911.  Its early history is covered in Chapter 3 of my book.
  • 7 April was the 50th anniversary of the introduction of IBM’s System/360 family of medium and large scale computers (see my earlier post on the impact of the System/360 here).
  • 1 May was the 50th anniversary of the birth of the BASIC (Beginner’s All-purpose Symbolic Instruction Code) programming language.  BASIC was incredibly important in helping to establish a market for microcomputers in the late 1970s and it also contributed to the early success of Microsoft.  BASIC has personal significance for me too, as it was one of the first programming languages I learned to use.  I was using FORTRAN during the same period and BASIC, though far less powerful, was much quicker and easier to use.
  • 7 June was the 60th anniversary of the death of computer pioneer Alan Turing (see my earlier post on Turing’s legacy here).

There are also a number of computer-related anniversaries coming up in the next few months which I’ll highlight in future posts.

IBM’s Game Changer

I was pleasantly surprised to see the BBC and several technology web sites pick up on this week’s 50th anniversary of the introduction of IBM’s System/360 family of medium and large scale computers.  The System/360 doesn’t often receive the attention it deserves but it really was a game changer for the industry, as it was the first range of computers to address the issue of software compatibility between different classes of machine, allowing code developed for one System/360 model to run on any other model in the family.  This was achieved through a technique known as microprogramming, which was conceived by British computer scientist Maurice Wilkes at Cambridge University in 1951.
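
The idea behind microprogramming can be illustrated with a deliberately simplified sketch in Python: each instruction visible to the programmer is carried out by a stored sequence of lower-level micro-operations, so two machines built from quite different hardware can present exactly the same instruction set.  The instructions and micro-operations below are invented purely for illustration and bear no relation to the actual System/360 microcode.

```python
# A deliberately simplified sketch of the idea behind microprogramming.
# Each visible machine instruction ("LOAD", "ADD", ...) is carried out by a
# stored sequence of micro-operations, so two differently built machines can
# present the same instruction set to software.  All names are invented.

# "Fast" model: hardware can add full words in a single micro-operation.
microcode_model_a = {
    "LOAD": ["fetch_operand", "write_register"],
    "ADD":  ["add_registers"],
}

# "Cheap" model: no full-word adder, so ADD takes several smaller steps --
# slower hardware, but the same instruction set as the fast model.
microcode_model_b = {
    "LOAD": ["fetch_operand", "write_register"],
    "ADD":  ["add_digit", "propagate_carry", "add_digit", "propagate_carry"],
}

def run_program(program, microcode):
    """Interpret each machine instruction as its micro-operation sequence."""
    trace = []
    for instruction in program:
        for micro_op in microcode[instruction]:
            trace.append(micro_op)        # a real machine would execute it here
    return trace

if __name__ == "__main__":
    program = ["LOAD", "ADD"]             # the same code runs on both models
    print(run_program(program, microcode_model_a))
    print(run_program(program, microcode_model_b))
```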

Before System/360, organisations looking to upgrade their computer system would have to rewrite their software to run on the new hardware, unless there happened to be an emulator available for the new model which could automatically translate and run code written for the old machine.  However, emulators were difficult to come by in the 1960s and many organisations remained locked in to ageing or inadequate systems as they could not afford the time or investment required to rewrite their software.  System/360 changed all this by allowing organisations whose needs had outgrown a particular class of machine to migrate to a larger model without having to worry about software compatibility.

The development of the System/360 range was one of the largest projects ever undertaken by IBM (or any computer company for that matter).  Development costs were estimated at a whopping $5 billion, equivalent to over $37 billion today.  The development of the operating system was so problematic that it inspired the manager responsible, Fred Brooks, to write his seminal book on software project management, The Mythical Man-Month: Essays on Software Engineering, which was first published in 1975.  Also, as the new family would replace IBM’s entire data processor product range and its development would necessitate a halt to all existing product development projects, including mid-life upgrades to existing models, it put the company in a highly vulnerable position until the new line came on stream.  One IBM executive summed it up perfectly by telling Fortune magazine, “We call this project ‘you bet your company’”.

Fortunately for IBM, the System/360 range was a huge success and the System/360 architecture would form the basis of the company’s data processor products for the next two decades.  In the 1960s, where IBM led, the rest of the computer industry followed, and other firms soon began offering their own families of compatible machines.  Some of these companies, such as RCA and Sperry Rand, went even further by also making their machines software compatible with System/360.

The impact of System/360 on the computer industry was immense, rivalled only by that of the IBM PC and Apple Macintosh in the 1980s.  It’s good to see this finally reflected in the media attention System/360 is now receiving as a result of its 50th anniversary.

Only One More to Go

I’ve finally completed Chapter 12 of my book (Bringing It All Together – The Graphics Workstation), which leaves only one more chapter to write.  As the title suggests, Chapter 12 covers the development of the graphics workstation from the earliest efforts to create a high-performance personal computer system for scientific and technical applications at MIT in the 1960s to the establishment of a commercial market for graphics workstations by companies such as Apollo Computer and Sun Microsystems in the early 1980s, and the subsequent adoption of workstation technology by Apple for the Macintosh.  It also includes networking, an important building block which led to the creation of the global system of interconnected computers that we now call the Internet.

Weighing in at nearly 25,000 words, Chapter 12 is the longest chapter in the book and writing it took me 5 months longer than I’d originally estimated.  The main reason for this was the dearth of source material on this subject.  Unlike other areas of computer history, the development of the graphics workstation is not well documented so I had to conduct more research for this chapter than with previous chapters.  I also wanted to describe the contribution of Xerox PARC in some detail, as this was where so much of the graphical user interface technology we now take for granted originated.

The final chapter of the book will cover the development of Microsoft Windows and the emergence of the so-called ‘Wintel’ platform as the dominant platform for personal computing from the mid 1990s onwards.  Given the copious amounts of source material available on this topic, the research required should take less time than for Chapter 12 so I’ve set myself an ambitious target date for completion of 31 July this year.  I’ll keep you posted on progress.

Turing’s Legacy

It was good to hear the recent news that computer pioneer Alan M Turing has been granted a posthumous royal pardon for his 1952 conviction for gross indecency (as a result of a homosexual act, which at that time was illegal in the UK).  His punishment, for which he chose chemical castration rather than a prison sentence, and subsequent loss of security clearance for his cryptanalysis work are thought to have led to his untimely death by suicide two years later in 1954.  However, Turing’s pardon has also caused considerable controversy, as the thousands of other men who were convicted of homosexual acts during the same period are unlikely to receive the same treatment.  It seems that Turing has been singled out for a pardon as a result of his status as a national hero rather than for the degree of injustice that he suffered.

Alan Turing is one of the few British computer pioneers who is a household name.  This is well-deserved and an accurate reflection of the magnitude of his achievements.  However, many of the comments made in the media in the wake of the royal pardon announcement credit Turing with having a much greater influence on the development of the computer than is perhaps deserved.  Turing’s greatest achievements were in the field of mathematical logic and relate to the theory of computation rather than the practical development of computers.  His philosophy on the design of computers was to keep the hardware as simple as possible and rely on clever programming instead.  Consequently, his only computer design, for the NPL ACE (Automatic Computing Engine), made very little impact.

Alan Turing’s influence on the development of the computer is discussed in an excellent article by Professor Simon Lavington written last year as part of the celebration to mark the centenary of Turing’s birth.  The article, which is entitled Alan Turing: Is he really the father of computing?, questions how influential Turing’s pioneering work on early computers proved to be in their later development.  Lavington sums this up as follows:-

… his codebreaking work at Bletchley Park, and indeed Turing’s Ace design, exerted little influence on commercially viable computers as markets began to open up in the late 1950s.  Nevertheless, Alan Turing’s ideas remain to this day embedded in the theories of both codebreaking and computing.

Like those of Charles Babbage a century earlier, Alan Turing’s achievements went largely unrecognised during his lifetime and can only be fully appreciated when put into the context of later developments.  However, we must be careful not to place credit where it is not due.  Turing didn’t invent the modern computer any more than Babbage did, but his work did provide a solid theoretical foundation for later developments and for this he deserves to be honoured.

Are References Necessary in Non-Fiction Books?

A few people have asked me if I intend to include references in my book.  References are common in non-fiction books and usually take the form of a superscript number at the end of a sentence which links to a numbered list at the foot of the page, end of the chapter or back of the book containing the references to the source material.

Personally, I find references of this kind very annoying when reading a book.  They are difficult to ignore but severely interrupt the flow of the text when followed.  I appreciate that they are necessary for academic publications, but are they really necessary for books aimed at a more general readership?  Few non-academic readers will want to check out a reference and those who do can easily look it up on the Web.  References shouldn’t be required in order to support the author’s credibility, as the publisher will have made sure that the author knows his or her subject thoroughly before agreeing to publish the book in the first place.

If you look at the sample chapters of my own book, you will see that I’ve taken a slightly different approach.  At the end of each chapter there is a section entitled Further Reading which lists a selection of the source material for that chapter plus any related publications which may be of interest.  Any readers who want to delve deeper into the subject can do so by obtaining copies of this material, much of which can be found on the Web.

Of course, a publisher may take a different view and insist that I include full references for every scrap of source material used in the book.  I hope this won’t be the case but I guess it would be a small price to pay for the privilege of having my book published!