
Richard Feynman, Tony Hey and Robin Allen

Feynman Lectures on Computation

Feynman Lectures on Computation is based on a series of lectures given by Richard Feynman in the early 1980s. Note that it isn't a comprehensive look at computer science in the mould of his Lectures on Physics; it's more like 'A physicist looks at computation'. Rather than taking the 'black box' view of computers, Feynman clearly wants to know what's in those boxes and how it works. So whilst you might think that the lectures in this book would be out of date, I would say that they still contain much of interest, presented in Feynman's usual (reasonably) easy-to-read style.

The book starts off by examining the basics of computers, such as logic gates and how to make them from transistors. This is followed by chapters on the theory of computation (Turing machines and the like) and on information theory. Feynman then looks at issues related to the thermodynamics of computation. The next chapter is on quantum mechanical computers, though it's interesting to note that the idea here is to do classical computation on the scale of atoms, rather than to achieve an exponential speedup. The last chapter examines the physics of actual computers, in particular VLSI technology.
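
To give a flavour of the gate-level material in those early chapters, here is a minimal sketch (mine, in Python, not taken from the book) of the standard point that a single NAND gate is universal: NOT, AND and OR can all be composed from it.

def nand(a, b):
    # NAND returns 0 only when both inputs are 1
    return 0 if (a and b) else 1

def not_(a):
    # NOT via NAND: tie both inputs together
    return nand(a, a)

def and_(a, b):
    # AND is NAND followed by NOT
    return not_(nand(a, b))

def or_(a, b):
    # OR via De Morgan's law: NAND of the negated inputs
    return nand(not_(a), not_(b))

# Quick truth-table check
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", and_(a, b), "OR:", or_(a, b))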