Notes on hallucination

  • Analysts estimated that chatbots hallucinate as much as 27% of the time, with factual errors present in 46% of their responses.
  • Indeed, ChatGPT and other sophisticated chatbots regularly produce false information. But that information is packaged in such eloquent, grammatically correct prose that it’s easy to accept as truth.
  • Errors in encoding and decoding between text and internal representations can cause hallucinations. When an encoder learns the wrong correlations between different parts of the training data, the result can be generations that diverge from the input.
  • AI hallucinations are caused by a variety of factors, including biased or low-quality training data, a lack of context in the user’s prompt, or limitations in the model that prevent it from correctly interpreting information.
  • Northwestern’s Riesbeck said generative AI models are “always hallucinating.” Just by their very nature, they are always “making up stuff.” So removing the possibility of AI hallucinations ever generating false information could be difficult, if not impossible.
  • Temperature is a parameter that controls the randomness of an AI model’s output. It essentially determines the degree of creativity or conservatism in its generated content, where a higher temperature increases randomness and a lower temperature makes the output more deterministic. In short: the higher the temperature, the more likely a model is to hallucinate. Companies can provide users with the ability to adjust the temperature settings to their liking, and set a default temperature that strikes a proper balance between creativity and accuracy.

No temperature settings are required for humans, who can actually be creative and accurate at the same time.
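The temperature mechanism described above can be sketched in a few lines. This is a minimal illustration, not any particular model's implementation: the logits (raw next-token scores) are made up, and real systems apply this over vocabularies of tens of thousands of tokens.

```python
import math
import random

def softmax_with_temperature(logits, temperature):
    """Turn raw scores into a probability distribution.

    temperature > 1 flattens the distribution (more random, more "creative");
    temperature < 1 sharpens it (more deterministic, closer to greedy).
    """
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(probs):
    """Draw one token index according to its probability."""
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

# Hypothetical scores for four candidate next tokens.
logits = [2.0, 1.0, 0.5, 0.1]

cold = softmax_with_temperature(logits, 0.2)  # near-deterministic
hot = softmax_with_temperature(logits, 2.0)   # flatter, riskier

print([round(p, 3) for p in cold])
print([round(p, 3) for p in hot])
```

At low temperature the top-scoring token takes almost all the probability mass; at high temperature the tail tokens get a real chance of being sampled, which is exactly where off-distribution (hallucinated) continuations become more likely.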

Sources:
https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
https://builtin.com/artificial-intelligence/ai-hallucination
https://www.lighton.ai/blog/llm-glossary-6/turning-up-the-heat-the-role-of-temperature-in-generative-ai-49
https://builtin.com/artificial-intelligence/eliza-effect
