Tim Cook is “not 100 percent” sure Apple can stop AI hallucinations
(www.theverge.com)
I'm 100% sure he can't. Or at least, not from LLMs specifically. I'm not an expert, so feel free to ignore my opinion, but from what I've read, "hallucinations" are a feature of the way LLMs work: the model only ever picks the next most likely token, and nothing in that loop checks whether the result is true.
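To make that concrete, here's a toy sketch of next-token sampling, the core step of LLM decoding. Everything in it is made up for illustration (the candidate tokens and their probabilities are invented, not from any real model); the point is only that the sampler consults likelihood, never a fact base.

```python
import random

# Hypothetical next-token distribution for the prompt below.
# Probabilities reflect how often a continuation appeared in training data,
# not whether it is factually correct.
next_token_probs = {
    "Paris": 0.55,      # common continuation, happens to be true
    "Lyon": 0.25,       # plausible-sounding but wrong
    "Marseille": 0.15,  # plausible-sounding but wrong
    "Berlin": 0.05,     # rarer, also wrong
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Sample a token in proportion to its probability.
    Nothing here verifies truth; likelihood is the only criterion."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

prompt = "The capital of France is"
print(prompt, sample_next_token(next_token_probs))
# Roughly 45% of runs, this toy "model" confidently names the wrong city --
# which is all a hallucination is at the decoding level.
```

You can bolt retrieval or fact-checking on around this loop, but the generation step itself has no notion of "true", which is why it's hard to promise hallucinations go away entirely.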
One can have an expert system assisted by ML for classification. But that's not an LLM.
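Roughly what I mean, as a sketch (the classifier, labels, and thresholds here are all hypothetical stand-ins, not any particular product): an ML model does the fuzzy classification, and hand-written rules decide what happens next, so every outcome is auditable.

```python
def classify_ticket(text: str) -> tuple[str, float]:
    """Stand-in for an ML classifier (e.g. logistic regression over TF-IDF).
    Returns a label and a confidence score; the logic here is a placeholder."""
    if "refund" in text.lower():
        return "billing", 0.92
    return "general", 0.40

def expert_system_route(text: str) -> str:
    """Hand-written rules act on the classifier's output.
    The rules can be read, tested, and audited -- unlike free-form generation."""
    label, confidence = classify_ticket(text)
    if label == "billing" and confidence >= 0.80:
        return "route to billing team"
    if confidence < 0.50:
        return "escalate to human review"
    return f"route to {label} queue"

print(expert_system_route("I want a refund for my order"))
```

That kind of system can be made predictable because the ML part only classifies; it never generates text on its own.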