I’m already sick and tired of the “hallucinate” euphemism.
It isn’t a cute widdle hallucination; it’s the damn product being wrong. Dangerously, stupidly, obviously wrong.
In a world that hadn’t already gone well to shit, this would be considered an unacceptable error and a demonstration that the product isn’t ready.
Now I suddenly find myself living in this accelerated idiocracy where Wall Street has forced us - as a fucking society - to live with a Ready, Fire, Aim mentality in business, especially tech.
I think it’s weird that “hallucination” would be considered a cute euphemism. Would you trust something that’s perpetually tripping balls and confidently announcing whatever comes to it in a dream? To me that sounds worse than merely being wrong.
I think the problem is that it portrays them as weird exceptions, possibly even echoes from some kind of ghost in the machine, instead of the statistical inevitability you get when you’re asking for the next predicted token rather than meaningfully examining a model of reality.
“Hallucination” applies only to the times when the output is obviously bad, and hides the fact that it’s doing exactly the same thing when it incidentally produces a true statement.
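To make that concrete, here’s a toy sketch in plain Python (nothing to do with any real model’s internals; the tokens and probabilities are invented for illustration). The point is just that “predict the next token” is the same code path whether the completion happens to be true or not, and truth is never checked anywhere in the loop.

    import random

    # Toy next-token distributions; the contexts and probabilities are made up.
    FAKE_MODEL = {
        ("The", "capital", "of", "France", "is"): {"Paris": 0.85, "Lyon": 0.10, "Berlin": 0.05},
        ("The", "capital", "of", "Wakanda", "is"): {"Birnin": 0.60, "Paris": 0.25, "London": 0.15},
    }

    def next_token(context):
        """Pick the next token by sampling the distribution -- truth never enters into it."""
        dist = FAKE_MODEL[context]
        tokens, weights = zip(*dist.items())
        return random.choices(tokens, weights=weights, k=1)[0]

    # Both calls run the identical sampling step; one usually comes out true,
    # the other can't possibly be, and the code has no way to tell the difference.
    print(next_token(("The", "capital", "of", "France", "is")))
    print(next_token(("The", "capital", "of", "Wakanda", "is")))

So calling the bad outputs “hallucinations” and the good ones “answers” is a distinction that exists in our heads, not in the mechanism.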
You forgot the best part, the screenshot of the person asking ChatGPT’s “thinking” model what Altman was hiding:
AI is a complete joke, and I have no idea how anyone can think otherwise.