Aspects of the book that have stuck with me include:
- His firm belief that interfaces should be small relative to the functionality behind them (that modules should be "deep"); it's not clear he's right about this.
- Many elegant examples, both from his teaching and his engineering.
- The wonderful treatment of information leakage, including some subcategorization of the problem. (Very roughly, leakage can be divided into leakage through an interface and "back-door" leakage. Interface-based leakages can be divided into those that result from oversized interfaces and those from inadequate defaults. Back-door leakages can be divided into those resulting from the use of structural information in several places, those resulting from there being too many objects, and those resulting from other causes.)
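To make one back-door category concrete--structural information used in several places--here is a minimal sketch (the example and all names are mine, not Ousterhout's): two classes that both encode knowledge of a record format, so any change to the format forces coordinated edits to both.

```python
# Hypothetical illustration of back-door leakage: both classes below
# "know" that a record is serialized as "name,age" -- the format has
# leaked into two places, so changing it (say, to "age,name") requires
# editing both classes in lockstep.

class RecordWriter:
    def write(self, name: str, age: int) -> str:
        return f"{name},{age}"          # knows the field order

class RecordReader:
    def read(self, line: str) -> tuple[str, int]:
        name, age = line.split(",")     # knows the field order too
        return name, int(age)

# A leakage-free design would confine the format to a single module,
# e.g. one Record class owning both serialization and parsing.
line = RecordWriter().write("Ada", 36)
print(RecordReader().read(line))
```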
The sections below cover important topics from the book in slightly more detail:
Design it twice
Because designing systems is hard, Ousterhout recommends considering at least two designs before choosing one; see chapter 11.
Note that this is analogous to standard advice in cue sports.
We can decompose this advice into (1) caring enough about getting the design right to put in a bit of extra effort and (2) learning about something--in this case, a design--by asking "if this is wrong, how is it wrong?"
Re: (1), when you're reading a book about software design, it's easy to care about getting design right. But in industry, there are so many reasons to lose sight of it. A few of these are: the pressure of short-term bug fixes, the assumption that a system will naturally optimize itself over time, the difficulty of communicating about system design, and the fact that software education often neglects the sort of design in question.
Re: (2), for generality and power, that question has to be one of the most effective cognitive tools there is. (Its converse, "what follows from this if this is right?," is at least as powerful and lies at the core of the method of analysis.)
Invest continuously
Ousterhout recommends spending "about 10-20% of your total development time" on investments such as reconsidering class designs, improving documentation, and fixing bugs more deeply than is required to close a ticket. See sections 3.2-3.4.
This is a nice example of compounding returns paying off more quickly than we might suspect--and, therefore, of the more general phenomenon that people aren't good at estimating the effects of exponential growth.
I suspect that the right number is closer to 20%--actually, I suspect that the right number is higher than that. My investment rate even for "throwaway" projects is at least 10-15%, and surely investment gets more valuable when project time is measured in weeks and years instead of hours.
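A toy model makes the compounding visible (the specific rates here are my assumptions, not the book's): suppose investing 15% of each week makes all subsequent weeks 2% more productive, while a non-investor produces a constant 1.0 per week.

```python
# Toy model of compounding returns on design investment.
# Assumed numbers (not from the book): 15% of each week invested,
# yielding a 2% productivity gain that applies to every later week.

def output(weeks: int, invest: float = 0.15, gain: float = 0.02) -> float:
    total, productivity = 0.0, 1.0
    for _ in range(weeks):
        total += (1 - invest) * productivity  # time left over for features
        productivity *= 1 + gain              # the investment compounds
    return total

# Compare cumulative output against a "sprinter" producing 1.0/week.
for weeks in (4, 12, 26, 52):
    print(weeks, round(output(weeks) / weeks, 2))
```

Under these assumed rates the investor falls behind for the first few months and then pulls ahead for good--which is the shape of the claim above: the crossover arrives sooner than intuition about "losing 15% of my time" suggests.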
Although some software engineers truly don't believe that these investments are worth it, many suffer from a lack of discipline at either the personal or the organizational level. Here as elsewhere, modest improvements in discipline and culture can matter more than large improvements in engineering skill, narrowly construed.
Pass-through methods
A pass-through method is one that invokes another method, has a similar signature to it, and does not add much functionality to it. Ousterhout discusses the problem and argues that it is a red flag (see below) in section 7.1.
I don't have much to add to Ousterhout's excellent discussion except a speculative diagnosis: constructing layers of abstraction that are only subtly different is sometimes intended (and interpreted) as a sign of care and sophistication. (My code review reference contains a brief discussion of this, under "This layer of indirection...")
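A minimal sketch of the smell (my example, not the book's): `Wrapper.save` below nearly mirrors `Store.save`'s signature and adds no behavior of its own.

```python
# Hypothetical illustration of a pass-through method: Wrapper.save
# adds no functionality and has essentially the same signature as
# Store.save -- the layer of indirection buys nothing.

class Store:
    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def save(self, key: str, value: str) -> None:
        self._data[key] = value

class Wrapper:
    def __init__(self, store: Store) -> None:
        self._store = store

    def save(self, key: str, value: str) -> None:
        # Pure pass-through: same signature, no added behavior.
        self._store.save(key, value)
```

The usual fixes are to expose the inner class directly, merge the two classes, or make the outer layer contribute real functionality (caching, validation, a genuinely different abstraction).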
Red flags
Red flags are discussed throughout the book and summarized in an appendix.
This might be the most valuable aspect of the book; fluency with a set of red flags is a big efficiency boost (especially in code reviews). I hope that other engineers will publish on the subject.
Temporal decomposition
Temporal decomposition is the mistake in which "the structure of a system corresponds to the time order in which operations occur"; see section 5.3.
This mistake is vastly more common and more severe than acknowledged; if anything, Ousterhout doesn't do enough to emphasize it. (But by mentioning it at all, he does more than most.)
Related: if what you need to be reasoning about is a dependency graph, don't confuse it with a data flow diagram. (Do not confuse the dependency relation with the data-that-used-to-be-here-is-now-there relation.)
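A minimal sketch of the mistake (my example, not the book's): code structured as one class per execution step, which spreads knowledge of a comma-separated format across the steps instead of confining it to one place.

```python
# Temporal decomposition: one class per step in time (read, parse,
# write), so knowledge of the comma-separated format leaks into two
# of the three steps.

class FileReader:                # step 1: read
    def read(self, text: str) -> list[str]:
        return text.splitlines()

class Parser:                    # step 2: parse -- knows the format
    def parse(self, lines: list[str]) -> list[list[str]]:
        return [line.split(",") for line in lines]

class Writer:                    # step 3: write -- also knows the format
    def write(self, rows: list[list[str]]) -> str:
        return "\n".join(",".join(row) for row in rows)

# Better: structure around knowledge rather than time order -- e.g. a
# single class that owns the format, with reading and writing as methods.
```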
This bias extends beyond software. Perhaps some biographies should go through the life chronologically, but far too many of them do.