leftzero@lemmynsfw.com to TechTakes@awful.systems • LLMs can’t reason — they just crib reasoning-like steps from their training data (English)
3 months ago

Paraphrasing Neil Gaiman: LLMs don’t give you information; they give you information-shaped sentences.
They don’t encode semantics. They encode the statistical likelihood that each token will follow a given sequence of tokens.
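A toy sketch of what “statistical likelihood of the next token” means, using a hypothetical bigram model (the simplest possible case — real LLMs condition on long sequences with neural networks, but the principle of picking likely continuations rather than meanings is the same):

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; a real model trains on vastly more text
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each token follows a given token (a bigram model)
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(token):
    # Pick the statistically most frequent follower.
    # No semantics involved -- just counts.
    return follows[token].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat" -- it followed "the" most often
```

The model “knows” that “cat” tends to follow “the” in its training data, but nothing about what a cat is — which is the whole point of the comment above.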
Yeah, artificial general intelligence LLMs are definitely not. Human-level intelligence, though… that depends on which particular human you’re talking about.
(Though, to be fair, this isn’t limited to LLMs… it also applies to ELIZA, for instance, or your average lump of granite.)