Why scaling won't fix hallucination

(echosphere.io)

3 points | by AIThemis 7 hours ago

3 comments

  • chrisjj 7 hours ago

    Bogus clickbait title.

    > Why do LLMs hallucinate?

    > The answer was not “insufficient data” or “temperature settings.” The answer was that they have no mechanism for checking if a claim is valid.

    Nothing could be further from the truth. Genuinely intelligent beings are quite capable of answering without fabrication in the absence of a validation mechanism.

    Drop the "hallucination" euphemism, recognise it as fabrication, and the true answer becomes obvious. Parrots fabricate because that's all they are made to do.

  • [deleted] 7 hours ago
  • fuzzfactor 6 hours ago

    Scaling sure has made it a lot more believable so far . . .