# On quadratic complexity

Many algorithms have quadratic complexity: sometimes this is necessary, but often it isn't. That much is widely known but underappreciated: quadratic complexity is more pervasive, and explains more other phenomena, than is generally recognized.

### Quadratic complexity is everywhere

It often happens that you need to track some pairwise relationship across all pairs of objects in some set. These situations are easiest to spot when the objects are elements of an array. Sometimes, though, the objects are the components of a software system: there, "gluing" those components together generates tasks that are quadratic in the number of components you have.
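As a minimal sketch of the array case (the meetings example is mine, not from anything above): checking a relationship over every unordered pair means nested-loop-shaped work, n·(n−1)/2 checks for n items.

```python
from itertools import combinations

def find_conflicts(items, conflict):
    """Check a relationship over every unordered pair: O(n^2) checks."""
    return [(a, b) for a, b in combinations(items, 2) if conflict(a, b)]

# Example: flag pairs of meetings whose time ranges overlap.
meetings = [(9, 11), (10, 12), (13, 14)]  # (start, end) hours
overlaps = find_conflicts(meetings, lambda a, b: a[0] < b[1] and b[0] < a[1])
print(overlaps)  # [((9, 11), (10, 12))]
```

Nothing about `find_conflicts` is clever; the point is that its cost is fixed by the number of pairs, not by anything you do inside the loop.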

Within a component, one often sees quadratic complexity at higher levels than arrays. This is one reason it's so important to model your domain correctly: if, for example, you make something a property of *A* when it should be a property of *B*, the penalty is often that operations that formerly required checking all the *A*s __or__ all the *B*s now require checking all the *A*s __and__ all the *B*s. You're paying a quadratic cost across a larger domain. (Also, your code tends to be harder to reason about, because it's harder for your brain to invoke some hodgepodge of intuitions about *A* and *B* than either one individually--but that's a separate issue.)
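A toy illustration of that penalty (widgets and owners are hypothetical names, not from anything above): record the owner on each widget and "who owns this widget?" touches one object; misplace the property by recording widget lists on owners instead, and the same query must walk every owner *and* every owner's list.

```python
# Good placement: the property lives on the widget.
widgets = {"w1": "ada", "w2": "bob"}  # widget -> owner

def owner_of_good(widget):
    return widgets[widget]  # one lookup

# Misplaced property: owners carry widget lists instead.
owners = {"ada": ["w1"], "bob": ["w2"]}

def owner_of_bad(widget):
    # Must check every owner AND every owner's list of widgets.
    for owner, ws in owners.items():
        if widget in ws:
            return owner
    return None

print(owner_of_good("w2"), owner_of_bad("w2"))  # bob bob
```

Both functions give the same answer; only the cost, and the amount of structure you must hold in your head to see why they agree, differs.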

### Quadratic complexity can explain other things

We do (and ought to!) think about how fast exponential growth happens, but we sometimes forget that quadratic growth is pretty fast too. Think about an old riddle: suppose the scum on the surface of a pond doubles every day (growing exponentially), and at the end of April it covers the whole pond. When did it cover only 1/4 of the pond? April 28, two days before the end of the month.

Now suppose the scum is growing quadratically instead: then, on the 28th, the pond is about 87% full (28²/30² ≈ 0.87). That's way more than a quarter, but less than the ~93% a naive linear extrapolation (28/30) might first suggest.
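The arithmetic behind both answers, as a quick check (assuming a 30-day month and "full" meaning fraction 1.0 on day 30):

```python
# Exponential: doubling daily, full on day 30, so day d holds 2**(d - 30).
exp_frac = 2 ** (28 - 30)      # 0.25: a quarter full on the 28th
# Quadratic: area proportional to t**2, full on day 30.
quad_frac = 28**2 / 30**2      # ~0.871: about 87% full on the 28th
print(exp_frac, round(quad_frac, 3))  # 0.25 0.871
```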

If you're programming, and need to keep track of all the pairwise relationships in a set of items, it matters a *lot* whether that set has four or five things in it: that's the difference between 6 and 10 pairwise relationships. Whatever you think of the research about short-term memory, 10 is a lot bigger than 6 as far as your "active brain" is concerned.
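Those two counts are just C(n, 2) = n·(n−1)/2:

```python
from math import comb

# Unordered pairs among n items: C(n, 2) = n*(n-1)/2.
print(comb(4, 2))  # 6 relationships among four things
print(comb(5, 2))  # 10 relationships among five things
```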

This can explain why getting stuck in programming can feel (i) so sudden and (ii) so intractable. At least sometimes, "why is this so hard to reason about now when it wasn't a minute ago" is answerable with: "because those two extra things you added generated another couple dozen relations (or cases to check or failure modes)."
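The "couple dozen" isn't rhetorical; for instance, going from ten things to twelve:

```python
from math import comb

# Two extra items on top of ten: how many new pairwise relations?
print(comb(12, 2) - comb(10, 2))  # 21 new relations (66 vs. 45)
```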

If you've ever wondered why Norris's number exists (the wall, somewhere around 1,500–2,000 lines of code, where an untrained programmer's progress stalls): look here, perhaps.