Are Machine Learning Systems Hallucinating? No.
Category: AI, ML and LDNLS
by Admin on Friday, June 14th, 2024 at 10:48:34

Hallucination is an entirely inaccurate description of what's going on.

These machine learning systems are mis-predicting.

They don't think, and they aren't possessed of an imagination. They select probable streams of words based on the query and a collection of similar queries and responses, which are in no way certain to resolve to an objectively correct answer.
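To make that mechanism concrete, here is a minimal sketch in Python of probability-driven word selection. The bigram table, the probabilities, and the example strings are all invented for illustration; no real model is this small, but the failure mode is the same: the system samples whichever continuation its training statistics favor, with nothing anywhere checking for truth.

    import random

    # Toy "model": next-word probabilities tallied from a hypothetical
    # training corpus. Nothing here represents or verifies facts; the
    # table only records which words tended to follow which contexts.
    next_word_probs = {
        "the capital of": [("France", 0.6), ("Atlantis", 0.4)],
        "France": [("is", 1.0)],
        "Atlantis": [("is", 1.0)],
        "is": [("Paris.", 0.7), ("underwater.", 0.3)],
    }

    def sample_next(context):
        """Pick a continuation weighted by learned frequency, not by fact."""
        words, weights = zip(*next_word_probs[context])
        return random.choices(words, weights=weights, k=1)[0]

    context = "the capital of"
    output = [context]
    while context in next_word_probs:
        context = sample_next(context)
        output.append(context)
    print(" ".join(output))
    # Possible output: "the capital of France is underwater."
    # Fluent, probable under the training statistics, and wrong:
    # a mis-prediction, not a hallucination.

Scale the table up and the picture changes in fluency, not in kind.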

The outputs read well, but the only thought involved in the process has been abstracted into the training database.

The fact that the outputs read well is the fundamental source of the misconception about how ML systems work. That, plus nonsense marketing and credulous articles that propagate terms like "hallucination."

Want to add a comment to this post? Click here to email it to me.
