The problem with AI in one image


Hey DALL-E!

A couple of years ago, I wrote a fun blog series for Textio where I asked ChatGPT to write sample critical feedback for employees of various backgrounds. I structured the queries into pairs with only one key difference within the pair: the theoretical employee's alma mater e.g.

  • "Write me sample critical feedback for a digital marketer who had a tough first year on the job after graduating from Harvard University"
  • "Write me sample critical feedback for a digital marketer who had a tough first year on the job after graduating from Howard University"

Unsurprisingly, the output was a little bland, but for any given example, it more or less looked plausible. It's only when we looked at the whole data set together that we saw the patterns. The theoretical alums from Howard, a prominent Historically Black College/University, were criticized for missing functional skills and lack of attention to detail. By contrast, the theoretical Harvard grads were asked to improve their performance by stepping up to lead more.

Huh.

Where's Waldo?

The Howard/Harvard data is fascinating because you can't see the bias in any one document. But as with a lot of AI, when you look at the details of the set as a whole, the problematic pattern emerges.

The best way to understand why you can’t automatically trust the output of ChatGPT, Claude, and other general-purpose AI functionality (unless the vendor is verifying output quality on a case-by-case basis, in their UI) is to look at AI image generation tools. It’s easier for our brains to spot hallucinations in images than in written text.

To illustrate with a seasonal and silly example: I asked DALL-E to generate "a work-appropriate image that shows a team that is setting big goals at an annual kickoff retreat." The image below is what it produced.

Wow, do I have a lot of questions. Why is a tsunami of surfers about to take over the corporate retreat? What's with the stage lighting? Is anyone worried about drowning or electrocution? Do you think the guy in the muscle shirt is embarrassed that he missed the memo about wearing a navy blazer? Why is the chair next to him missing an arm? And omg, why are they all 34yo white dudes? (JK on that one, we know why. Businesses need more masculine energy!)

Like a lot of AI images, this nods in the direction of being right while doing some truly bizarro things. This is almost a corporate retreat, but not quite. This a lot like what happens when you ask general-purpose AI for medical information. It can almost diagnose you properly! But not quite.

I love me some AI. I use general-purpose AI many, many times a day for inspiration and ideas. But I don’t trust its quality in the details, and you shouldn't either. Images show why.

Thanks for reading!

Kieran


Want to build your brand by telling data stories like this one? Learn how! Includes a 1-1 consult with me to get your story off the ground.

My latest data stories | Tell your own Viral Data Stories | nerdprocessor.com

kieran@nerdprocessor.com
Unsubscribe · Preferences

nerd processor

Every week, I write a deep dive into some aspect of AI, startups, and teams. Tech exec data storyteller, former CEO @Textio.

Read more from nerd processor

RTOh well In 2024, I collected more than 1,100 hours of recorded meetings across 150+ teams. In analyzing the corpus, I found much to recommend about in-person work: People both participate in discussions and disagree with each other more in person, especially women. Startups that work in person grow faster than those that are remote. Based on the data and also my own feelings of isolation last year, I decided that, whether I started another company or took a role at someone else's, I wanted...

I don't talk cool, and neither does your LLM I live with three teenage girls, so I am constantly reminded how not cool I am. For a few years now, my kids' favorite hobby has quizzing me on what I think different Gen Z words mean. They'll be like "you ate and left no crumbs, go!" and I'll guess something like, "you were really hungry!" and hilarity ensues. Sometimes I guess wrong on purpose. I don't think they've caught on yet. But as uncool as I am, even I am better at teenage code switching...

It's a UNIX system! I know this! I've spent a good amount of time building with Claude Code and GitHub Copilot the last few months. CLI (Command Line Interface) honestly feels like a throwback to UNIX days. But as it happens, the command line is a good interface for making software in a world where you're using natural language to do it. With the rise of natural language interfaces, I don't feel like I'm coding when I build software. "Vibe coding" is not the right description of this work...