
Publishers Rejoice!

I’ve done it.  After 11 years and god knows how many hours of work, I’ve finally completed the first draft of my book on the history of the computer.  Weighing in at just over 235,000 words, it covers the entire 400-year history of computing technology, from humanity’s earliest efforts to mechanise calculation in the 17th century to the latest devices which merge portable computing and mobile telecommunications technologies to enable new methods of social interaction.

Researching and writing the book has taken up most of my spare time over the past 11 years but, rather than the expected feeling of huge relief, finishing the book has been something of an anticlimax so far.  I’m already missing the discipline of dragging myself away from the TV or the Internet every night in order to write a few more paragraphs.  I could start another book but there isn’t really any point in doing so unless I can do something useful with this one.  Therefore, I now need to make an effort to get the book published.

In my earlier post on the subject (The Self-Publishing Dilemma), I identified three options: (i) securing a publishing deal with a book publisher, (ii) self-publishing the book, or (iii) putting the entire book up on the Web as a free download.  I am under no illusions as to how difficult it will be to secure a deal with a mainstream publisher, especially for someone who has no track record as an author.  Most mainstream publishers won’t accept unsolicited book proposals in any case.  It might be possible to interest one of the academic publishers in my book (e.g. Oxford University Press, MIT Press, etc.), as these publishers do accept unsolicited proposals, but they also expect their authors to have the relevant academic credentials and I am not an academic.  The size of the book could also discourage all but the most enthusiastic of publishers, as it would appear to be more than double the word count of a typical non-fiction title.

The sensible option is probably to self-publish but, before I venture down this costly and time-consuming path, it might be worthwhile making at least one attempt to secure a publishing deal.  Therefore, I’ve compiled a ‘hit list’ of non-fiction publishers who accept unsolicited proposals and whose catalogues include history, popular science and/or business titles.  The next stage will be to produce and submit proposals for each of them in the specified format (which seems to be different for each publisher) then cross my fingers and see what happens.  I’ll let you know how I get on.


Not Turing Again!

Following my earlier post on the legacy of computer pioneer Alan M Turing (Turing’s Legacy), Turing’s achievements have again hit the headlines with the news last week that members of the Institution of Mechanical Engineers (IMechE) have voted the Bombe at Bletchley Park as their favourite Engineering Heritage Award winner from the past 30 years.

The Bombe was an electromechanical code-breaking device which emulated the rotors of an Enigma machine, the typewriter-like device used by the German military to encrypt radio messages during World War II.  It employed a motorised mechanism to step rapidly through every possible combination of rotor settings and apply a logical test to each one to determine if it met the conditions for a possible solution, which had been wired into the machine before operation.  The Bombe was closely based on an earlier device developed by the Polish Cipher Bureau in 1938.  Alan Turing improved the Polish design by incorporating 36 sets of rotors to allow multiple checks to be made in parallel.  Turing’s design was then turned into a fully-engineered device by the British Tabulating Machine Company under the direction of chief engineer Harold H Keen.
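
For readers who like to think in code, here is a minimal sketch of that brute-force idea: step through every combination of rotor positions and keep only those that satisfy a logical test fixed before the run begins.  It is purely illustrative (the ‘menu’ of constraints, the rotor count and all names are my own hypothetical choices) and is not a model of the Bombe’s actual electromechanical logic.

```python
from itertools import product

ROTOR_POSITIONS = range(26)  # each rotor can sit at one of 26 positions

def satisfies_menu(settings, menu):
    """Hypothetical logical test: True only if the candidate rotor settings
    are consistent with every constraint (the 'menu') fixed before the run."""
    return all(constraint(settings) for constraint in menu)

def search(menu, rotor_count=3):
    """Step through every combination of rotor positions and keep those that
    pass the test - the candidates an analyst would then check by hand."""
    candidates = []
    for settings in product(ROTOR_POSITIONS, repeat=rotor_count):
        if satisfies_menu(settings, menu):
            candidates.append(settings)
    return candidates

# A trivial one-constraint 'menu', purely for illustration.
toy_menu = [lambda s: (s[0] + s[1]) % 26 == s[2]]
print(len(search(toy_menu)))  # 676 of the 17,576 possible settings pass
```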

I must confess to being somewhat surprised by the result of the IMechE vote, as the Bombe was a workmanlike device which employed a brute-force approach to cryptanalysis and lacked the sophistication and ingenuity of later developments at Bletchley Park (although these later developments employed electronic technology, which probably renders them ineligible).  It also ranks low in terms of Alan Turing’s achievements, as the concept was not entirely his.

There have been 100 winners of the Engineering Heritage Award to date, so the IMechE members had many other remarkable examples of engineering excellence to choose from, including the world’s first supersonic airliner, Concorde, which came second in the vote.  The fact that they chose something designed by Alan Turing may be due in part to the high level of media attention he is receiving at the moment.  This attention is likely to increase further with the release next month of the Hollywood film The Imitation Game, a dramatised account of Turing’s cryptanalysis work at Bletchley Park during World War II.  Let’s hope the filmmakers have done their research and that Turing’s achievements are portrayed accurately.  I certainly plan to go see it and will report back in a future post.

Inching Towards Completion

In my last post on progress with the book (Only One More to Go), back in February, I mentioned that I was about to begin the final chapter and had set myself an ambitious target date for completion of 31 July this year.  Well, that date has now passed but I’m not quite there yet.

With the home straight in sight, initial progress was indeed swift, but work and summer holidays have got in the way in recent weeks, with the result that I’m still several weeks from completion.  I’ve written just over 14,000 words out of an estimated total of around 17,000.  I also have to finish the introduction to the book, which will require an additional 1,500 words or so, plus a much-needed edit of Chapter 10 which I’ve been working on intermittently over the past few months.

The final chapter, ‘Getting Personal – The World According to Wintel’, has been reasonably straightforward to write in comparison with the tortuous Chapter 12, helped along by the plentiful source material available for this part of the story.  Unlike the other chapters, I’ve also lived through the entire period covered and have followed the events closely as they unfolded so I had a fairly clear idea of what I wanted to write from the outset.

The final chapter covers the emergence of the ‘Wintel’ platform in the 1980s and 1990s, and how a software company, Microsoft, came to dominate the industry.  It also brings the story up to date by including the development of the World Wide Web, the ensuing Browser Wars and how the incredible advances in portable computing over the past decade have led to an explosion in the use of Information Technology throughout the developed world.

With the total word count already well over 200,000 words, it has not been possible to cover the development of portable computing devices in any detail.  To do this topic justice would require an entire book.  Therefore, if I can get The Story of the Computer published, this would be the perfect topic for my next book.  Now all I need to do is find a willing publisher…

 

Anniversaries

The recent events held to mark the 70th anniversary of the Normandy Landings during World War II (a.k.a. D-Day) set me thinking about computer-related anniversaries.  Here are a few worth noting that have occurred over the past few months:-

  • 5 February was the 70th anniversary of the introduction of Colossus, the first large-scale electronic digital calculator.  Colossus was a massive step forward in the development of electronic computation but it was not the world’s first programmable electronic digital computer as is often reported.  It was a special-purpose machine created to perform Boolean operations on alphanumeric characters represented as 5-bit data.  For further information on Colossus, see Chapter 4 of my book.
  • 14 February was the 90th anniversary of the birth of International Business Machines.  IBM was actually established several years earlier in June 1911 under the name Computing-Tabulating-Recording Company (C-T-R).  The company’s early history is covered in Chapter 3 of my book.
  • 7 April was the 50th anniversary of the introduction of IBM’s System/360 family of medium and large scale computers (see my earlier post on the impact of the System/360 here).
  • 1 May was the 50th anniversary of the birth of the BASIC (Beginner’s All-purpose Symbolic Instruction Code) programming language.  BASIC was incredibly important in helping to establish a market for microcomputers in the late 1970s and it also contributed to the early success of Microsoft.  BASIC also has personal significance for me, as it was one of the first programming languages I learned to use.  I was also using FORTRAN during the same period, and BASIC, though far less powerful, was much quicker and easier to use.
  • 7 June was the 60th anniversary of the death of computer pioneer Alan Turing (see my earlier post on Turing’s legacy here).

There are also a number of computer-related anniversaries coming up in the next few months which I’ll highlight in future posts.

IBM’s Game Changer

I was pleasantly surprised to see the BBC and several technology web sites pick up on this week’s 50th anniversary of the introduction of IBM’s System/360 family of medium and large scale computers.  The System/360 doesn’t often receive the attention it deserves but it really was a game changer for the industry, as it was the first range of computers to address the issue of software compatibility between different classes of machine, allowing code developed for one System/360 model to run on any other model in the family.  This was achieved through a technique known as microprogramming, which was conceived by British computer scientist Maurice Wilkes at Cambridge University in 1951.
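
A rough way to picture how microprogramming delivers that compatibility is shown in the sketch below: every model presents the same instruction set, but each supplies its own microcode routines to carry those instructions out on its particular hardware.  This is purely illustrative (the toy instruction set, the two ‘models’ and all names are my own inventions), not a description of the actual System/360 microcode.

```python
# Illustrative sketch only: one common instruction set, with each 'model'
# supplying its own microcode routines to realise those instructions.

COMMON_PROGRAM = [
    ("LOAD", "A", 7),    # put 7 in register A
    ("LOAD", "B", 5),    # put 5 in register B
    ("ADD", "A", "B"),   # A = A + B
    ("STORE", "A", 0),   # write A to memory address 0
]

class Model:
    """A 'model' differs only in its microcode (and, on real hardware, in the
    underlying circuitry); the instruction set it presents is identical."""
    def __init__(self, microcode):
        self.microcode = microcode
        self.registers = {"A": 0, "B": 0}
        self.memory = [0] * 16

    def run(self, program):
        for op, *args in program:
            self.microcode[op](self, *args)  # model-specific realisation
        return self.memory[0]

# Two hypothetical models with different microcode for the same instructions.
small_model_microcode = {
    "LOAD":  lambda m, reg, val: m.registers.__setitem__(reg, val),
    "ADD":   lambda m, r1, r2: m.registers.__setitem__(r1, m.registers[r1] + m.registers[r2]),
    "STORE": lambda m, reg, addr: m.memory.__setitem__(addr, m.registers[reg]),
}

def _large_model_add(m, r1, r2):
    # Imagine this routine driving wider, faster adder circuitry.
    m.registers[r1] = m.registers[r1] + m.registers[r2]

large_model_microcode = dict(small_model_microcode, ADD=_large_model_add)

# The same program runs unchanged on both models and gives the same answer.
print(Model(small_model_microcode).run(COMMON_PROGRAM))  # 12
print(Model(large_model_microcode).run(COMMON_PROGRAM))  # 12
```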

Before System/360, organisations looking to upgrade their computer system would have to rewrite their software to run on the new hardware, unless there happened to be an emulator available for the new model which could automatically translate and run code written for the old machine.  However, emulators were difficult to come by in the 1960s and many organisations remained locked in to ageing or inadequate systems as they could not afford the time or investment required to rewrite their software.  System/360 changed all this by allowing organisations whose needs had outgrown a particular class of machine to migrate to a larger model without having to worry about software compatibility.

The development of the System/360 range was one of the largest projects ever undertaken by IBM (or any computer company for that matter).  Development costs were estimated at a whopping $5 billion, equivalent to over $37 billion today.  The development of the operating system was so problematic that it inspired the manager responsible, Fred Brooks, to write his seminal book on software project management, The Mythical Man-Month: Essays in Software Engineering, which was first published in 1975.  Also, as the new family would replace IBM’s entire data processor product range and its development would necessitate a halt to all existing product development projects, including mid-life upgrades to existing models, it put the company in a highly vulnerable position until the new line came on stream.  One IBM executive summed it up perfectly by telling Fortune magazine, “We call this project ‘you bet your company’”.

Fortunately for IBM, the System/360 range was a huge success and the System/360 architecture would form the basis of the company’s data processor products for the next two decades.  In the 1960s, where IBM led, the rest of the computer industry followed, and other firms soon began offering their own families of compatible machines.  Some of these companies, such as RCA and Sperry Rand, went even further by also making their machines software compatible with System/360.

The impact of System/360 on the computer industry was immense, rivalled only by the IBM PC and Apple Macintosh in the 1980s.  It’s good to see this finally reflected in the media attention System/360 is now receiving as a result of its 50th anniversary.

Only One More to Go

I’ve finally completed Chapter 12 of my book (Bringing It All Together – The Graphics Workstation), which leaves just one more chapter to write.  As the title suggests, Chapter 12 covers the development of the graphics workstation, from the earliest efforts to create a high-performance personal computer system for scientific and technical applications at MIT in the 1960s to the establishment of a commercial market for graphics workstations by companies such as Apollo Computer and Sun Microsystems in the early 1980s, and the subsequent adoption of workstation technology by Apple for the Macintosh.  It also covers networking, an important building block which led to the creation of the global system of interconnected computers that we now call the Internet.

Weighing in at nearly 25,000 words, Chapter 12 is the longest chapter in the book and writing it took me 5 months longer than I’d originally estimated.  The main reason for this was the dearth of source material on this subject.  Unlike other areas of computer history, the development of the graphics workstation is not well documented, so I had to conduct more research for this chapter than for previous chapters.  I also wanted to describe the contribution of Xerox PARC in some detail, as this was where so much of the graphical user interface technology we now take for granted originated.

The final chapter of the book will cover the development of Microsoft Windows and the emergence of the so-called ‘Wintel’ platform as the dominant platform for personal computing from the mid 1990s onwards.  Given the copious amounts of source material available on this topic, the research required should take less time than for Chapter 12 so I’ve set myself an ambitious target date for completion of 31 July this year.  I’ll keep you posted on progress.

Turing’s Legacy

It was good to hear the recent news that computer pioneer Alan M Turing has been granted a posthumous royal pardon for his 1952 conviction for gross indecency (as a result of a homosexual act, which at that time was illegal in the UK).  His punishment, for which he chose chemical castration rather than a prison sentence, and subsequent loss of security clearance for his cryptanalysis work are thought to have led to his untimely death by suicide two years later in 1954.  However, Turing’s pardon has also caused considerable controversy, as the thousands of other men who were convicted of homosexual acts during the same period are unlikely to receive the same treatment.  It seems that Turing has been singled out for a pardon as a result of his status as a national hero rather than for the degree of injustice that he suffered.

Alan Turing is one of the few British computer pioneers to have become a household name.  This is well-deserved and an accurate reflection of the magnitude of his achievements.  However, many of the comments made in the media in the wake of the royal pardon announcement credit Turing with having a much greater influence on the development of the computer than is perhaps deserved.  Turing’s greatest achievements were in the field of mathematical logic and relate to the theory of computation rather than the practical development of computers.  His philosophy on the design of computers was to keep the hardware as simple as possible and rely on clever programming instead.  Consequently, his only computer design, for the NPL ACE (Automatic Computing Engine), made very little impact.

Alan Turing’s influence on the development of the computer is discussed in an excellent article by Professor Simon Lavington, written last year as part of the celebrations to mark the centenary of Turing’s birth.  The article, which is entitled Alan Turing: Is he really the father of computing?, questions how influential Turing’s pioneering work on early computers proved to be in their later development.  Lavington sums this up as follows:-

… his codebreaking work at Bletchley Park, and indeed Turing’s Ace design, exerted little influence on commercially viable computers as markets began to open up in the late 1950s.  Nevertheless, Alan Turing’s ideas remain to this day embedded in the theories of both codebreaking and computing.

Like those of Charles Babbage a century earlier, Alan Turing’s achievements went largely unrecognised during his lifetime and can only be fully appreciated when put into the context of later developments.  However, we must be careful not to place credit where it is not due.  Turing didn’t invent the modern computer any more than Babbage did, but his work did provide a solid theoretical foundation for later developments and for this he deserves to be honoured.

Are References Necessary in Non-Fiction Books?

A few people have asked me if I intend to include references in my book.  References are common in non-fiction books and usually take the form of a superscript number at the end of a sentence which links to a numbered list at the foot of the page, end of the chapter or back of the book containing the references to the source material.

Personally, I find references of this kind very annoying when reading a book.  They are difficult to ignore but severely interrupt the flow of the text when followed.  I appreciate that they are necessary for academic publications, but are they really necessary for books aimed at a more general readership?  Few non-academic readers will want to check out a reference and those who do can easily look it up on the Web.  References shouldn’t be required in order to support the author’s credibility, as the publisher will have made sure that the author knows his or her subject thoroughly before agreeing to publish the book in the first place.

If you look at the sample chapters of my own book, you will see that I’ve taken a slightly different approach.  At the end of each chapter there is a section entitled Further Reading which lists a selection of the source material for that chapter plus any related publications which may be of interest.  Any readers who want to delve deeper into the subject can do so by obtaining copies of this material, much of which can be found on the Web.

Of course, a publisher may take a different view and insist that I include full references for every scrap of source material used in the book.  I hope this won’t be the case but I guess it would be a small price to pay for the privilege of having my book published!

The Apple Falls

The downward trend in prices paid for working examples of the rare Apple-1 microcomputer continued last week when the latest example to be sold at auction went for only $330,000, a fall of over $57,000 from the previous Apple-1 sold by Christie’s in July, and less than half the record price of $671,400 paid for a similar example in May.  The reason for this isn’t clear, as the computer was in excellent condition and included the original box plus monitor, software and peripherals.  It may be that the Apple-1 is no longer seen as quite so rare, as this was the fifth to come up for auction in only 18 months.

The auction, which was held by Auction Team Breker in Cologne, Germany, also featured an Arithmometer manufactured by Thomas de Colmar in Paris in the 19th century.  This rare example of the first mass-produced mechanical calculating machine sold for $313,000, a new world record price for an Arithmometer.  The date of manufacture was given by the auction house as 1835 but this is almost certainly incorrect, as Thomas did not finalise the design of his machine until 1848 and the presence of a serial number (No. 541) on the front panel suggests that it was one of a later batch of machines manufactured between 1867 and 1870.


I’m a huge fan of early Apple computers, having used an Apple II and an Apple Macintosh extensively in the 1980s.  However, I always felt that they were overvalued by collectors in comparison to genuine antiques such as the Arithmometer, which are much older and in most cases rarer than early microcomputers, so it’s heartening to see signs that this disparity in prices may be coming to an end.

Computing Heroes

It was good to see Ada Lovelace Day attracting a large amount of press coverage this year.  The international day of celebration, which was held on 15 October, has been running for only 4 years and aims to raise the profile of women in science, technology, engineering and maths by encouraging people around the world to talk about the women whose work they admire.  This year’s Nobel Prizes also attracted similarly high levels of press attention, particularly here in the UK as a result of the Physics prize having been awarded to Professor Peter Higgs of Edinburgh University.

Unfortunately, there isn’t a Nobel Prize in Engineering.  The closest we have to it is probably the Queen Elizabeth Prize for Engineering, which was awarded for the first time in June.  The inaugural prize was jointly awarded to Robert Kahn, Vint Cerf and Louis Pouzin for their contributions to the protocols that make up the fundamental architecture of the Internet, Sir Tim Berners-Lee, for inventing the World Wide Web, and Marc Andreessen, who wrote the Mosaic web browser.  All five richly deserve their award, as the Internet and World Wide Web have, in the words of the judges, “initiated a communications revolution which has changed the world”.

Ada Lovelace is also worthy of honour.  Known as the Enchantress of Numbers, she was the person who contributed most to our understanding of Charles Babbage’s Analytical Engine through her notes of 1843.  Her contribution may have had less of an impact than that of the five people mentioned above but she is an excellent role model and there is little doubt that she would have played a pivotal role in putting the machine to work had Babbage actually succeeded in constructing the Analytical Engine.

This got me thinking about who I would pick as my own hero amongst the many hundreds of contributors to the development of the computer featured in my book.  In terms of engineering, it would have to be the American electrical engineer J Presper Eckert.  Eckert’s name is associated with several of the most significant innovations in the early years of electronic computation, including the mercury delay line, magnetic drums and disks, the electrostatic storage tube, magnetic core storage and the stored-program concept.  He and Moore School of Electrical Engineering colleague John W Mauchly were also responsible for the development of the first general-purpose electronic digital calculator, ENIAC, and the UNIVAC I, which was the most influential of the early large-scale computers to reach the market.

Presper Eckert was not only a highly successful engineer.  He also co-founded the Eckert-Mauchly Computer Corporation, one of the earliest computer companies, which was later absorbed into Remington Rand.  Eckert died in 1995 as a result of complications from leukaemia.  Though he is not exactly an unsung hero, most of the awards he received during his lifetime were shared with John Mauchly, and it is high time that he was recognised in his own right as one of the greatest engineers of all time.