Basics

AI Hallucination

Hallucination is when an AI tool confidently gives you information that sounds right but is made up. It might invent a statistic, cite a law that doesn't exist, or claim someone worked at a company they never did. The tricky part is that the AI doesn't know it's wrong: it presents false information with the same confident tone as true information. Always double-check any facts, figures, or legal details an AI gives you before acting on them.

Real-world example

A Galway solicitor used ChatGPT to look up a court ruling, and the AI cited a case that never existed. She caught the error by checking the Courts Service website before the advice went to a client.

Related guide

AI Policy for Your Business

Step-by-step guide for Irish business owners — plain English, no jargon.

Open guide →
