All that is solid melts into code
It’s been wild to see the arc of AI bend so powerfully towards software development this year. I’m not sure that, in the summer of 2024, anybody was sitting around saying, wow, one of the chief problems facing the world today is the difficulty of producing lines of code. (It was not.) Yet language models (1) are naturally very good at operating inside this magic circle, and, more importantly, (2) can very effectively be trained to become even better.
The second point in particular makes it feel, in retrospect, inevitable: code matches formal(ish) verification, “answers you can check”, to broad application, “answers you care about”, in a way that might be unique. Let ’er rip!
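To make "answers you can check" concrete, here's a minimal sketch, purely an illustration: a candidate solution (the hypothetical candidate_sort below, standing in for whatever a model proposes) is run against a few test cases and receives an unambiguous pass/fail verdict. That verdict is the checkable signal; whether the answer matters is the separate, broader question.

```python
# A minimal sketch of "answers you can check": a proposed solution is
# run against test cases and earns an unambiguous pass/fail verdict.

def candidate_sort(xs):
    # Stand-in for code a language model might produce.
    return sorted(xs)

def check(solution) -> bool:
    """Return True only if the proposed solution passes every test case."""
    cases = [([3, 1, 2], [1, 2, 3]), ([], []), ([5, 5], [5, 5])]
    return all(solution(list(inp)) == expected for inp, expected in cases)

if __name__ == "__main__":
    print("verified" if check(candidate_sort) else "rejected")
```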
Picture an over-the-row olive harvester. Most olive oil production at medium-or-greater scale depends on machines of this kind; they trundle over trees planted in long rows, almost like continuous hedges, and collect the fruit with vibrating fingers. Machine-harvested olives (1) are cheaper, and (2) arrive at the mill in better shape than olives harvested by hand.
One catch: most olive varieties can’t be cultivated in this configuration, called super-high-density; the trees don’t thrive so close together. Only a handful of varieties will tolerate it … so that handful has been planted in huge numbers … and the flavor of global olive oil has changed as a result.
Automation never meets a task in the world and simply does it. There’s always negotiation —
The fact that language models are great at code means there is suddenly a strong incentive for more things to become code. Like: very strong. The majority of valuable work can’t be reformulated in this way; see: the olive harvest. But plenty can be, and plenty is on the cusp, and the cuspy work will feel intense pressure to go 100% code.
If AI is the superfast harvester, then code is the high-density olive variety: there will be more of it now.
It’s not that all work needs to be broken down into flow charts; language models can totally handle a mess. A large part of the excitement here emerges from the understanding that this round of digitization won’t be like the last one, wedging ambiguous human processes into rigid database schemas, being surprised, every time, when they don’t fit.
But language models do prefer their mess to be symbolic —
So it’s paradoxical: language models are some of the most organic technologies ever produced, totally capable of coaxing computation out into the realm of the human … yet instead they’ll pull a vast field of human activity deeper into the domain of code. This is just my prediction, of course —
In the late 2020s, I think a lot of people are going to discover that their job has become: “Translate your work into code. Translate yourself, while you’re at it.”
As with most of the AI stuff, I’m ambivalent, in the sense of having many thoughts and feelings at once. I do think the “path not taken”, of using this technology, in all its flexibility, as a lever to prise ourselves OUT of digital systems, AWAY from the internet, is a tragic one to miss.
There are potential remedies, secret roads —
P.S. AI continues to be a spectacle of strange cause and effect. In another universe without a strong culture of open source, there’s not an enormous pile of freely available code —