
Book Review: "Turing's Cathedral"

Jason Sachs | November 20, 2014 | 6 comments

My library had Turing’s Cathedral: The Origins of the Digital Universe by George Dyson on its new acquisitions shelf, so I read it. I’d recommend the book to anyone interested in the history of computing.

Turing’s Cathedral primarily covers the period in early computing from 1940 to 1958, and bridges a gap between a few other popular books: on the historical side, between Richard Rhodes’s The Making of the Atomic Bomb and Dark Sun, and Steven Levy’s Hackers; on the biographical side, between Richard Feynman’s Surely You’re Joking, Mr. Feynman! and Stanislaw Ulam’s Adventures of a Mathematician. Dyson has a meandering style, which is both a strength and a weakness. The book does not have a single primary subject, but rather focuses on the convergence of three entities: Princeton’s Institute for Advanced Study, physicist and mathematician John von Neumann, and the IAS machine, a pioneering electronic computer designed and built at IAS back in the days of vacuum tubes. Despite what the title hints, it does not focus on Alan Turing, although Dyson devotes one chapter of the book to Turing and his work in both theoretical and practical computation.

Dyson has written several books on the history of science. This book is close to home: Dyson grew up at IAS (his father, physicist Freeman Dyson, held a professorship at IAS from 1953 to 1994) and had first-hand access to several of the book’s characters.


From the viewpoint of a Serious Historical Work, I think Dyson does a wonderful job. He seems to take the Ken Burns documentary approach; this is a scholarly book on an interesting time: it abounds in quotes and endnotes, and really paints a good picture of a number of interrelated characters and subjects. It conveys things like the motivations for the founding of IAS, the urgency of von Neumann getting married to his second wife Klára and getting her out of Hungary prior to the start of the Second World War, and the social tension between the IAS mathematicians and the engineers working on von Neumann’s machine.

As far as technological coverage goes, I have mixed thoughts. Dyson does go into a fair amount of detail about the technology of 1940s and 1950s computers. Part of the success and financial viability of the IAS machine depended on the mass-market 6J6 vacuum tube. (Remember: the transistor was not invented until 1947, and not commercialized until several years later.) The memory of these early computers used several different techniques that managed to stuff multiple bits into one device, from mercury delay lines to Williams tubes; none of them had the stability of today’s SRAM or DRAM, and the contents had to be rewritten every time the memory was read. The first few computers, like ENIAC, used hard-wired programming: the wiring literally had to be changed any time a different program was to be executed. But one of von Neumann’s contributions (now inherent in the von Neumann architecture) was to realize, perhaps from Turing’s theoretical work on universal machines, that programs could be represented as data. In the late 1940s, when the IAS machine was still under construction and work on simulating the ignition of hydrogen bombs grew increasingly urgent, von Neumann came up with the idea of converting ENIAC to a stored-program computer to speed up the process of software modifications. Some of the reliability aspects of computers we take for granted today were nowhere near possible back then: the IAS team used a 40-bit machine word, and creating shift registers in an ALU that consistently produced the correct result took quite a bit of engineering design and testing. Dyson portrays these issues very well.
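To make the stored-program idea concrete, here is a minimal sketch in Python (an obvious anachronism) of a machine in which instructions and data share one memory. The opcodes, the instruction format, and the single-accumulator model are invented purely for illustration; they are not the IAS machine’s actual instruction set.

    # Toy stored-program machine: instructions and data share one memory,
    # so a program can be loaded, copied, or even modified as ordinary data.
    # Opcodes and word layout are invented for illustration only.

    LOAD, ADD, STORE, HALT = 0, 1, 2, 3

    def run(memory, pc=0):
        """Execute (opcode, operand) pairs stored in memory, starting at pc."""
        acc = 0                      # single accumulator, as on many early machines
        while True:
            opcode, operand = memory[pc]
            pc += 1
            if opcode == LOAD:       # acc = memory[operand]
                acc = memory[operand]
            elif opcode == ADD:      # acc += memory[operand]
                acc += memory[operand]
            elif opcode == STORE:    # memory[operand] = acc
                memory[operand] = acc
            elif opcode == HALT:
                return memory

    # Cells 0-3 hold the program; cells 4-6 hold data. Same memory.
    memory = [
        (LOAD, 4),
        (ADD, 5),
        (STORE, 6),
        (HALT, 0),
        2, 3, 0,                     # operands: 2 + 3, result goes in cell 6
    ]

    print(run(memory)[6])            # prints 5

Because the program is just data in memory, it can be loaded or modified by other code rather than by rewiring, which is exactly what made converting ENIAC to stored-program operation attractive.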

On the other hand, he consistently refers to what we would call “software” and “programs” as “codes”, and his understanding and explanation of some software engineering principles seem very vague and hand-wavy. For example:

To numerical organisms in competition for computational resources, the opportunities are impossible to resist. The transition to virtual machines (optimizing the allocation of processing cycles) and to cloud computing (optimizing storage allocation) marks the beginning of a transformation into a landscape where otherwise wasted resources are being put to use. Codes are becoming multicellular, while the boundaries between individual processors and individual memories grow indistinct.

Huh? Dyson misses the point somewhat about virtual machines and cloud computing, which are much more about relocatable and rescalable computing: rather than having to house and maintain computers to handle the peak computational load, we can run software remotely as a service and purchase resources on an as-needed basis; the providers of these services can reroute and reallocate resources in real time without any intervention by or impact on their customers. There’s something in Dyson’s statement alluding to parallel processing, but I’m stuck in a fog of unconventional words like “codes” and “multicellular”. This is only one example; Dyson’s book would have greatly benefited from a few strategically placed diagrams.

His writing style is fine, but nothing fantastic. If you want a good, engaging read on computing, check out Levy’s Hackers, Tracy Kidder’s The Soul of a New Machine, or Richard Preston’s The Mountains of Pi, among other books on technological and scientific topics.

Still, I did enjoy reading Turing’s Cathedral, and I feel like I learned a lot about the unfamiliar methods of early computing, and about the determination and brilliance of John von Neumann and his fellow scientists. One mistake we tend to make in modern times is to underestimate the intelligence and inventiveness of earlier software engineers. We have the benefit of modern technology and hindsight: we have available, at our fingertips, advanced IDEs, debuggers, massive amounts of processing power and storage, high-level languages, and good practices in application design and project management. The early pioneers of computing did not. Computing was slow, expensive, severely constrained, and not always reliable. The bulk of programming at the time was in machine code, and programs had to be created on punch cards or paper tape. And still, they were able to innovate and accomplish amazing tasks like the simulation of an atomic bomb. These days we don’t seem to have legendary scientists like Einstein, von Neumann, Turing, Ulam, and Feynman. Instead we have teams in industry, with the occasional Linus Torvalds, Steve Jobs, or Larry Page and Sergey Brin standing out. Our society has changed in the way we innovate, and we can only guess at how things will change in years to come.


© 2014 Jason M. Sachs, all rights reserved.



Comment by amoore, November 20, 2014
Very well-written book review. Aside from Mr Sachs' insights into the book itself, I appreciate the links to other, related books and articles (readers are always seeking things to read). I would [occasionally] welcome other book reviews, particularly on books more directly related to the craft of implementing embedded systems.
Comment by stephaneb, October 7, 2016

Hi Jason,

If I have done things right, you should receive a notification by email letting you know about this new comment. Please reply to this comment so I know that the notification system works fine and so you can test the new system. Thanks!

Comment by jms_nh, October 7, 2016
yep, got it
Comment by stephaneb, October 7, 2016

Ok great!  Thanks.

Comment by jms_nh, October 7, 2016

Is there any way you could include the comment in the email notification itself? I like visiting this site but it would be convenient to see the comment earlier.

Comment by stephaneb, October 7, 2016

I agree that it would be convenient, but it's not easy to do well because comments are in HTML and can include MathJax code, pictures, etc. There are scripts that will convert HTML to text, but in my experience so far, very few do a good job. What I might try to do is send HTML emails. That would solve the images issue but not the MathJax one.

Thanks for the suggestion, I'll keep this in mind. 
