AI cannot detect lies

AI detectors = not so smart after all?

Christopher Penn, co-founder and Chief Data Scientist at TrustInsights.ai, just dropped a truth bomb about AI detectors on LinkedIn. Spoiler alert: they're not as clever as you might’ve thought.

Here's the scoop: Penn ran the U.S. Declaration of Independence through an AI detector (you know, tools like ZeroGPT that claim to sniff out AI-generated content).

The result? ZeroGPT said the Declaration of Independence had a 97% chance of being AI-generated. Wait, what? Did Thomas Jefferson have a time machine and a ChatGPT subscription?

Nope. Turns out, AI detectors aren't exactly rocket science. Here's why they goofed:

  1. They use metrics like “perplexity” (how predictable the text is) and “burstiness” (how much the writing style varies). The Declaration's precise language and similar sentence lengths made it look suspiciously AI-like (see the quick sketch after this list).

  2. These detectors are trained on vast amounts of text... including historical documents. So when they see the Declaration, they're like, “Hey, I've seen this before! Must be AI!”
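Curious what “burstiness” actually measures? Here's a rough Python sketch of the general idea: how much sentence lengths vary across a piece of text. This is our own illustration, not ZeroGPT's (or any detector's) actual code, and real tools pair it with a language-model-based perplexity score that we're skipping here.

```python
# Illustrative only: a toy "burstiness" score based on sentence-length variation.
# Uniform sentence lengths score low, which is part of why formal, even-keeled
# prose like the Declaration can look "AI-like" to simple detectors.
import re
import statistics


def burstiness(text: str) -> float:
    """Coefficient of variation of sentence lengths (in words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    return statistics.stdev(lengths) / statistics.mean(lengths)


uniform = "We hold these truths. We state these facts. We list these wrongs."
varied = ("Stop. Now consider, for a moment, how wildly human writing can "
          "swing between fragments and sprawling clauses.")

print(f"uniform-ish text: {burstiness(uniform):.2f}")  # ~0.00
print(f"varied text:      {burstiness(varied):.2f}")   # much higher
```

The uniform example scores near zero, the varied one much higher. That's the whole trick: low variation plus highly predictable wording nudges these tools toward “probably AI,” no matter who actually wrote it.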

Penn's verdict? AI detectors are “worthless.” Ouch.

So what's the solution? Penn suggests we humans need to prove our work's authenticity, kind of like how fancy cheeses come with certificates of origin.

Funnily enough, on Tuesday, Adobe announced an app coming out next year that lets creators add that kind of digital signature to their work to prove it's human-made.

Our take: If the Founding Fathers can be mistaken for AI, we're all in trouble. Want to sound more human in your writing (whether you're using AI or not)? Try these tips:

  1. Mix up your sentence lengths. Short ones. Longer, more complex ones. You get the idea.

  2. Throw in some unexpected phrases… like comparing AI detectors to cheese certificates!

  3. Add a personal touch. AI probably won't mention that time you spilled oatmeal on your J Crew tee while writing about AI…

Remember, the goal isn't to trick AI detectors—it's to communicate effectively and authentically.
