ChatGPT: Eraser of the Implausible

But the implausible does happen

Alberto Romero
6 min read · Apr 12, 2023

[Header image: Midjourney]

ChatGPT is five months old, i.e., ancient. During this time, one of the most practiced AI sports has been the search for the most succinct and precise description of what it is and what it does.

The original definition is along the lines of: ChatGPT is a system trained to predict the next token given a history of previous ones, then further tuned to follow human instructions. Andrew Kadel shared on Twitter a snarkier one his daughter came up with: ChatGPT is a “say something that sounds like an answer” machine. On the same note, writer Neil Gaiman observed that “ChatGPT doesn’t give you information. It gives you information-shaped sentences.” Then, at the more imaginative end of the spectrum, we have the claim that ChatGPT is a language model with emergent capabilities that allow it to understand and reason about the world.
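For readers who want to see what “predict the next token given a history of previous ones” looks like in practice, here is a minimal sketch. It uses GPT-2 through the Hugging Face transformers library purely as a stand-in (ChatGPT’s own model isn’t public), uses greedy decoding rather than the sampling a production system would use, and leaves out the instruction-tuning step entirely; the prompt and token count are arbitrary choices for illustration.

```python
# A toy version of the "super-autocomplete" loop: feed the model a history of
# tokens, ask for the most likely next one, append it, repeat.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# The "history of previous tokens" is just the prompt, tokenized.
input_ids = tokenizer("The capital of France is", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(5):  # generate five tokens, one at a time
        logits = model(input_ids).logits            # a score for every vocabulary token
        next_id = logits[:, -1, :].argmax(dim=-1)   # greedily take the single most likely one
        input_ids = torch.cat([input_ids, next_id.unsqueeze(-1)], dim=-1)

print(tokenizer.decode(input_ids[0]))
```

That loop is the whole trick behind the “say something that sounds like an answer” framing: nothing in it checks whether the continuation is true, only whether it is likely given the history.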

But we’re all tired of this game. Mainly because it’s irrelevant: people are using ChatGPT as a Google replacement anyway. But also because ChatGPT’s functional complexity forces any particular description to illuminate only some aspects while leaving out others. The descriptions above are all partially right (the “super-autocomplete” system and the “reasoning engine” takes aren’t at the same level of correctness for me, though; the first is a bit conservative but the second…
