Hallucinations
I hadn't put my finger on it, but Ethan nailed it. The AI tools we use today are literally text-based probability engines. They aren't generating novel thought; they're predicting which words go in which order based on the order of the words you gave them.
AI knows neither truth nor lies. If the training corpus leans into conspiracy, you're going to get conspiracy.
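The "probability engine" idea can be sketched as a toy bigram model. This is a minimal, hypothetical illustration: real LLMs use neural networks over tokens rather than word counts, but the principle of predicting the next word from what came before is the same.

```python
import random
from collections import defaultdict

# Toy corpus. A bigram model only tracks which word follows which --
# it has no notion of truth, only of frequency in its training text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count continuations: for each word, collect every word that followed it.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def predict(word):
    # Sample a continuation weighted by how often it appeared.
    # This is a probability play, not a lookup of facts.
    return random.choice(following[word])

print(predict("the"))  # one of: "cat", "mat", or "fish"
```

If the corpus leaned into conspiracy text, `predict` would lean into conspiracy too; the mechanism is indifferent to what the words mean.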
Everything — everything — that comes out of these “AI” platforms is a “hallucination.” Quite simply, these services are slot machines for content. They’re playing probabilities: when you ask a large language model a question, it returns answers aligned with the trends and patterns they’ve analyzed in their training data.
- Ethan Marcotte, Hallucinating
AI is a groupthink engine, which is why you have to engage your own thinkmeats when you use it.