Not all of this is new
There's a lot that's new about generative AI, but transformational change in software has happened before, and there are some useful analogies between this transformation and previous ones.[1]
When cloud computing was new, a lot of people reacted with contempt, on the grounds that:
- It was inefficient;
- It discarded the hard-won knowledge of local database and system administrators;
- It gave too much power to a handful of companies;
- It necessitated data centers that would have catastrophic environmental effects;
- It was irresponsible to ship off data entrusted to you by your users;
- It would never survive long-term regulatory rigor ("don't these people know about HIPAA?");
- It was just plain ugly to replace traditional techniques with a bunch of API calls.
When Python and Ruby were new, the contempt was even greater, and more vigorous:
- They were so far from machine code that you couldn't hope to understand, much less fix, lower-level problems: problems, that is, with the actual code;[2]
- They made it so easy to prototype that people would get distracted from getting the details right and keeping software maintainable;
- It was easy enough to get started that the profession would be flooded with people who could just barely write Python scripts or pass out of Rails bootcamps, and so much ignorance would have corrosive effects.
To be clear: generative AI really is new and transformational. And, whether or not you like their vibes, not all of those claims were entirely false. But we've seen many of the problems we face before. We accumulated a lot of tech debt very quickly with Python, but Python also enabled countless verification, testing, and infrastructure tools that improved code health and helped pay that debt down. The "script kiddies" and bootcamp grads brought new perspectives and skill sets to the profession, and this has made us much better off. And so on.
There are more lessons to draw, not all of which would lead us to optimism. My point is simply that current developments are revolutionary, but not so revolutionary as to have made history irrelevant.
[1] I'm sure that everything I say here has been said many times, but the point seems often to be missed, perhaps because many people find it too obvious to need saying, and many others don't know about it.
[2] Calling Python "interpretable pseudocode" was always supposed to be funny, but it wasn't always intended as a joke. Many people thought, and still think, that working with sufficiently high-level languages, including Python, is simply not really programming.