THE JAR(GONE) PODCAST

How to Prevent AI Hallucinations

March 10, 2025

Many companies are hesitant to adopt AI because of the potential for incorrect outputs. In this episode, Bill Aimone and Peter Purcell share strategies for preventing AI hallucinations, which occur when an AI model produces incorrect or misleading answers. Hallucinations are common in large language models, but they're preventable with the right AI data strategy, proper training and guardrails, and human governance. Bill and Peter discuss how to adopt AI effectively and securely without putting the business at risk, and they share practical advice for organizations serious about implementing AI.

Apple Podcasts · Spotify · YouTube Podcasts

Get Jar(gone) the Book

In Jar(gone), Trenegy authors Bill Aimone and Peter Purcell clarify the buzzwords currently plaguing the consulting world. What are actionable analytics? Why is change management necessary? What is value stream mapping? With expertise, wit, and straight talk, Jar(gone) will help you understand what the buzzwords really mean and, more importantly, what they don't.

"I am glad to see the Trenegy consultants debunking business buzzwords instead of adding more just to make themselves look innovative."

—Chris Robbins, Chief Financial and Accounting Officer, USD Group