Making It Seem Unimpressive

I don't know if this is a function of the thing itself, or of my media diet, or of the human ability to quickly get bored of anything, but the more I think about all the LLM stuff, the more I'm impressed at how unimpressive it has come to seem.

On the one hand, you're able to ask a computer to draw a certain kind of picture, or write a certain kind of text or code, and it's able to do it. That's absolutely wild. On the other hand, there's very often something a bit weird and a bit wrong about it. Images are where it's most obvious, but (though I'm sure this is changing) text written by an AI often seems to have a very "written by AI" flavour.

Prior to the dawn of these models, you could ask the computer for a certain kind of picture, or a certain kind of text or code, and it would find it for you from things that other people had made. This was called a 'search engine', and it's against this standard that the AI has been judged. The pictures returned might sometimes be bad or clumsy, but comprehensibly so. The code returned might not be exactly what you were after, but you could be sure that was because someone had been trying to solve a subtly different problem.

Presenting the models to users as chatbots was a stroke of genius, because it takes advantage of the already-known "people are easily convinced by robots if you present them as conversation partners" cognitive exploit. But the more people start treating it like search, the more, imo, they realise that it compares unfavourably to existing search: the images have all these weird properties, the code isn't quite right, somehow. It's almost like the difference between progressive and baseline JPEGs: progressive loading gives you the whole of an image at once, but it's blurry and only slowly de-blurs, whereas baseline gives you all the accuracy up-front but loads in line by line.
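
If you've never run into that distinction, here's a minimal sketch of it using Pillow in Python (the file names are placeholders): the same image saved both ways, so you could watch the two load differently over a slow connection.

```python
# Minimal sketch, assuming Pillow is installed; "photo.jpg" is a placeholder input.
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")

# Baseline: encoded in a single top-to-bottom scan, so a partial download
# shows the top of the picture at full quality and nothing below it.
img.save("baseline.jpg", "JPEG", progressive=False, quality=85)

# Progressive: encoded in successive passes over the whole frame, so a
# partial download shows the entire picture at once, blurry, then sharpening.
img.save("progressive.jpg", "JPEG", progressive=True, quality=85)
```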

This is the point where I realised I've reasoned myself awkwardly into an analogy that Ted Chiang already made, but in a slightly different way that's worse and makes less sense. oh well