Even the most experienced pros can get burned by AI — from an MSP leader ordering a fake bag to Deloitte submitting an AI-written government report riddled with errors. Whether you’re a solopreneur or a Fortune 500 consultancy, the same truth applies: AI is a tool, not a truth machine. - by Jennifer Gilligan, President of IntegraMSP

AI Can Save Time—But It Can Also Blow Up Your Reputation

If Deloitte Can Blow It With AI, What’s Stopping You?

None of us like to admit when we’ve been fooled. But last week, I did exactly that — publicly — after ChatGPT led me to a counterfeit shopping site. Yes, I clicked. Yes, I paid. Yes, I got scammed.

That post hit a nerve. Why? Because most of us are playing with AI tools as if we've been trained on them. We haven't.

Not long before I shared my own cautionary tale, another story was already making waves: Deloitte got caught using AI to write a government report—and it didn’t go well.

Let’s pause on that.

We’re not talking about an intern cutting corners. We’re talking about one of the largest, most respected consulting firms on the planet turning in an AI-written report to the Australian government... that was riddled with hallucinated facts, inaccurate citations, and errors a human expert would’ve never let through.

And here’s the kicker: Deloitte billed for expert work, under a contract reportedly worth about AU$440,000, and later agreed to refund the final installment.

What Actually Happened?

The Australian Department of Employment and Workplace Relations (DEWR) commissioned Deloitte to produce an independent assurance review of its Targeted Compliance Framework, the automated system that enforces welfare compliance. Deloitte delivered the report... but reviewers found fabricated citations, references to academic papers that don’t exist, and a quote attributed to a court judgment that was never made. Deloitte admitted that generative AI had been used in drafting it, but the blow to credibility was already done. The “Targeted Compliance Framework Assurance Review” was finalized in July and published by DEWR in August (an Internet Archive copy preserves the original version).

Why This Matters (Yes, Even for Small Businesses)

Let’s connect the dots. Deloitte’s mistake wasn’t just about a bad report—it’s about over-trusting a tool that isn’t built for accuracy.

And if they can fall into that trap with entire departments and compliance systems in place, how many smaller firms are sleepwalking into similar mistakes?

Whether it’s:

  • Reposting AI-generated blog content without fact-checking,
  • Building sales materials with hallucinated “testimonials,”
  • Or relying on ChatGPT to research product vendors (hi, that was me)...

You’re not just taking a shortcut—you’re potentially putting your reputation on the line.

Trust Is the Real Currency

Clients don’t care how fast you can churn out content. They care that the advice, information, and insight you’re offering is correct, reliable, and rooted in truth.

That’s why AI shouldn’t replace your brain—it should extend it. Use it to brainstorm. Use it to draft. But always review, vet, and apply human judgment.

Because at the end of the day, people buy trust, not tactics. And trust isn’t scalable if you hand it over to a machine.

Key Takeaways:

  • AI tools like ChatGPT are helpful—but they hallucinate. Don’t treat them as fact engines.
  • If global consulting firms like Deloitte can damage their credibility by over-relying on AI, small businesses are even more vulnerable.
  • Your brand’s reputation is built on accuracy and accountability. Don’t shortcut the review process.
  • AI should assist your expertise—not replace it.

Final Thought:

Let this be your gut check. If you’re using AI in your business, great. But if you’re trusting it blindly? You're not saving time—you’re setting a trap.

Do what Deloitte didn’t: Validate before you publish. Check before you click.

And if you need a second pair of eyes on how AI is being used in your business? That’s what I’m here for.