Josef and I had a chat about ChatGPT and LLMs in general on a recent episode of Memhaz (hopefully I'll remember to edit in a link; I'm midway through migrating the site to a different host, so it's not finished yet 😅), and I think it intersects with something we'd spoken about earlier in a different context, which I wanted to draw out.
When ChatGPT came out and people started talking about how it, or things like it, might affect different businesses, I flashed back to a chat I'd had with Josef about the "shape rotators vs wordcels" meme. It occurred to me that LLMs might shift some societal and economic power back to the wordcels from the shape rotators. The ability to get computers to do what you want is an incredibly powerful one, and handsomely remunerative, but harnessing their full potential is limited to code nerds, who tend to be, stereotypically at least, "shape rotators". If there's a new class of human/machine interaction that's chiefly based around correctly phrasing your request in natural language, that tips things in a certain direction.
“Learning to code” is obviously, at least in part, an exercise in learning how to think logically, memorising what various things are called, and so on, but it’s just as much an exercise in forcing yourself to think the way the computer does. I’ve been helping someone learn to code recently, and it’s really reminded me of the problems I had when I was learning years ago: "why is it like this, I just want to do this one thing, why do I have to do it in this weird particular way". You are not the machine's master; coding is not controlling the computer, it's forcing your brain into computer-think and using that to cajole the machine into doing what you want.
LLMs change things because they move where the fulcrum of this dynamic sits. The computer has eaten so many people-words that it’s become amenable to some kind of people-persuasion, so now all the wordcels can be, to some extent, computer people. As for the shape rotators in the nerd department: one potential effect of this, on some level, could be the… not deskilling, exactly, but the self-checkoutification of code. They become the support staff for the machines. (Josef suggested that jobs of this nature are far more likely to be bug-spotting and squashing than anything else.) Automation is almost never about entirely replacing people with machines, but about reducing the number of people involved as much as possible, keeping them on to maintain the machines rather than to do anything actually useful themselves.
It feels like this ends with the real collapse of something or the Butlerian Jihad.