Nov 22, 2024
The reason is the tokenizer. It's not about a lack of intelligence! It's about how the model chops text into pieces before it ever reasons about them. ChatGPT doesn't see letters the way we do; it sees tokens, chunks of characters that each map to an ID in a long vocabulary, which are only then turned into vectors. Since the model never directly observes the individual letters inside a token, that's why it sometimes gets it right and sometimes wrong.
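Here's a minimal sketch of what that looks like, using OpenAI's tiktoken library (the word "strawberry" and the cl100k_base encoding are just illustrative choices; the exact split depends on the encoding):

```python
# pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4-era models

word = "strawberry"
token_ids = enc.encode(word)
pieces = [enc.decode_single_token_bytes(t).decode("utf-8") for t in token_ids]

print(token_ids)  # a short list of vocabulary IDs, not 10 separate letters
print(pieces)     # subword chunks, e.g. something like ['str', 'aw', 'berry']
```

Run it and you'll see the word arrives as a handful of chunks rather than a sequence of letters, which is exactly why questions about individual characters trip the model up.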