Hallucination is an entirely inaccurate description of what's going on.
These machine learning systems are mis-predicting.
They don't think, and they aren't possessed of an imagination. They select probable streams of words based on the query, drawing on a body of similar queries and responses that is in no way guaranteed to resolve to an objectively correct answer.
The outputs read well, but the only thought involved in the process has been abstracted into the training database.
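A rough illustration of that selection process (a toy sketch, not how any particular model is implemented, and the probabilities below are invented): the sampler prefers whatever continuation was common in the training text, with nothing checking whether it happens to be true.

```python
# Toy sketch: a "model" scores candidate next tokens by probability and emits
# a likely one. Nothing here verifies that the resulting sentence is correct;
# it only cares that the continuation is probable.
import random

# Hypothetical, hand-made probabilities for what follows the prompt below.
next_token_probs = {
    "Sydney": 0.55,    # frequent in the imagined training text, but wrong
    "Canberra": 0.40,  # correct, but written less often
    "Melbourne": 0.05,
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Pick a token in proportion to its probability, as samplers typically do."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

prompt = "The capital of Australia is"
print(prompt, sample_next_token(next_token_probs))
# More often than not this prints "Sydney": a fluent, probable, incorrect output.
```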
The fact that they read well is the fundamental source of most of the misconceptions about how ML systems work. That, along with nonsense marketing and credulous articles that propagate terms like "hallucinations."