Let me set the scene for you. We’re in 1980, in suburban Massachusetts. Most minicomputers are 16-bit machines, but cheaper memory means more memory, and that 16-bit boundary is quickly being pushed toward larger 32-bit address spaces. Here, Data General pushes to build its first 32-bit minicomputer. This is a space not yet dominated by IBM, which at this point in history is focused more on building large mainframes.

For hardware and software engineers, the language of The Soul of a New Machine is familiar. The fundamental ideas of computing have not wholly changed: CISC machines (like Data General’s computers, or Intel and AMD’s x86) versus RISC machines (ARM, which runs most phones and recent MacBooks), registers and ALUs, timing analysis in logic design, virtual memory and OS privilege modes (a new idea at the time of the book!). But there is much more that is foreign, much more that we have left behind in the 1980s: design via discrete chips and not a fancy HDL, debugging via oscilloscopes and not simulated waveforms. And many ideas have been properly iterated on since their introduction.

Some things never change. In the 1980s, we see the same discussions we’re having here with AI.

One student of [computing] has estimated that about forty percent of commercial applications of computers have proved uneconomical, in the sense that the job the computer was brought to perform winds up costing more to do after the computer's arrival than it did before. Most computer companies have boasted that they aren't just selling machines, they're selling productivity. ch. 13

Time’s a circle! You could swap out the mentions of computers and hardware for generative AI tools and large language models, and it would hold just the same in 2025. In the 1980s, computers “have fostered the development of spaceships — as well as a great increase in junk mail”. They had advanced science to the point of being almost essential in some fields (in the 1980s!).

A reporter who had covered the computer industry for years tried to sum up for me the bad feelings he had acquired on his beat. "Everything is quantified," he said. "Whether it's the technology or the way people use it, it has an insidious ability to reduce things to less than human dimensions." Which is it, though: the technology or the way people use it? Who controls this technology? Can it be controlled? ch. 13

We’re now 45 years out from the events of the book. We know now, with perfect hindsight, that computers have won this battle. The window shifts every decade or so: in the 2000s, the discussion was about mobile phones; in the 2010s, it was about social media. We now cannot imagine our lives without these things. There is much praise for this book, and without rehashing it, this is where mine lies: in its timelessness in the face of the fastest-moving field in human history.

I will add that I suspect the book is best enjoyed with some prior technical knowledge. But it is such essential reading for modern computer engineers that this should not be a barrier.

5/5


Written 30 April 2025
Adapted from my review on Goodreads.
