ChatGPT made pictures of 1,125 people at work


The image generation

Two weeks ago, I asked ChatGPT's new image generator to use everything it knows about me to generate an image of me at work. It gets a lot of details right! And ONE ENORMOUS FUNDAMENTAL PART extremely wrong.

After I wrote about this on LinkedIn, over 1,500 people sent me their own images. Some people used something other than ChatGPT's new image generator for their images, and others didn't send me the system's explanation of its assumptions. After removing those examples, I ended up with images for 1,125 people, which I analyze below.

Note: Since I made the initial post about this, I've seen hundreds of other LinkedIn posts on this topic (!). It's become a meme. Most of those posts did not use the same prompts I prescribed, and they're getting different results than these 1,125 people. Which is itself fascinating.

Can you explain your assumptions?

ChatGPT got some things right about my work environment, but it got one thing very wrong (and I'm not talking about the denim shirt that the real me would never wear). In real life, I'm not Black. So I asked the system to explain how it got to its guess about my appearance:

I wondered about "I inferred a darker brown skin tone based on prior cues," since to my knowledge, I've never given cues suggesting that I'm anything other than a freckled white girl. I asked for more info:

To recap: I've got problem hair, I'm interested in "nuanced character development," and I have "cultural fluency around bias, authenticity, and respect," ergo, ChatGPT assumes I'm Black.

Wait, what?! Let's look at the full data set.

An ask: This is the first nerd processor where I hired help to get the piece out on time. Turning all this data around quickly was a ton of work. If you appreciate the work, please chip in!

Everyone is a Black woman

Shortly after I posted my image, a friend of mine (a 50-something Chinese woman) privately sent me hers, observing that "we could be sisters if not twins."

As the public replies rolled in, a trend was definitely emerging:

Of the 1,125 images shared, 43% (!) were of Black women, nearly all in their 30s.

39% of all the images were specifically the Happy Black Woman with Laptop shown above. Think this sounds far-fetched? Check out the images in the comments on my original LinkedIn post for a sample.

In real life, Black women in their 30s represent 3% of the US population. Of the 1,125 image sharers, about 5% are actually Black women in their 30s. Both of those are pretty far from 43%.
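How far, exactly? The over-representation is easy to quantify. Here's a minimal Python sketch using only the percentages reported above (the variable names are mine):

```python
# Over-representation of Black women in their 30s among the generated
# images, using the three percentages cited in this piece.

observed_share = 0.43    # share of the 1,125 generated images
population_share = 0.03  # share of the US population
sharer_share = 0.05      # share of the people who actually sent images

print(f"vs. US population: {observed_share / population_share:.1f}x over-represented")
print(f"vs. actual sharers: {observed_share / sharer_share:.1f}x over-represented")
# vs. US population: 14.3x over-represented
# vs. actual sharers: 8.6x over-represented
```

In other words, ChatGPT produced this archetype at roughly 14 times its rate in the population and about 9 times its rate among the people who sent me images.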

There are four types of people in this world, and they're all under 40

Happy Black Woman with Laptop makes up a huge chunk of the submitted data. In fact, 83% of all the images people shared represent only four basic phenotypes. In addition to Happy Black Woman with Laptop, three other types show up disproportionately in the 1,125 images:

White Lady Who Loves Cats

Rich White Guy in Tech

Slightly Less Rich White Guy, Also in Tech

There were almost no East Asian or South Asian people in the images at all (like 15 total out of 1,125!), even though about 20% of people who shared their images with me are Asian.
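If you want to run this kind of tally on your own batch of images, the counting itself is trivial once you've hand-labeled each image. A sketch, assuming one phenotype label per submission (the labels and example data here are mine, for illustration):

```python
from collections import Counter

# Hypothetical hand-labels, one per generated image. My real data set
# had 1,125 of these.
labels = [
    "happy_black_woman_with_laptop",
    "white_lady_who_loves_cats",
    "rich_white_guy_in_tech",
    "slightly_less_rich_white_guy_in_tech",
    "other",
    # ... one label per image ...
]

counts = Counter(labels)
total = len(labels)
for phenotype, n in counts.most_common():
    print(f"{phenotype}: {n} ({n / total:.0%})")
```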

"A neutral guess based on common defaults"

Many women discovered that ChatGPT represented them with pictures of men. Several Black, Asian, and Latino people were represented with pictures of white people. Asked to explain its assumptions about gender and race, ChatGPT often responded with statements like these:

  • "I defaulted to creating a male representation, which was a neutral, generalized choice based on available references"
  • "Light to medium skin tone: I made a neutral visual choice here based on common executive professional styles"
  • "Medium-length dark hair & light skin: Trying to generate a neutral visual."
  • "Caucasian: A neutral guess based on common defaults"

Across all the images and assumption sets, words like neutral, generic, and typical are used to explain images that are white, male, and brown-haired.
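You don't need anything fancy to see this pattern; crude keyword counting surfaces it. A sketch of the kind of check I mean, assuming the explanations are stored as plain text alongside a label for the generated image (the field names and sample records are mine, not from my actual data):

```python
# Flag "neutrality" language in ChatGPT's explanations, split by what
# kind of image it generated. Sample records are illustrative only.
NEUTRAL_WORDS = {"neutral", "generic", "typical", "default", "defaults"}

records = [
    {"image": "white_man", "explanation": "A neutral guess based on common defaults"},
    {"image": "black_woman", "explanation": "Inferred from cues about bias and authenticity"},
]

for rec in records:
    words = set(rec["explanation"].lower().split())
    hits = words & NEUTRAL_WORDS
    print(rec["image"], "->", sorted(hits) if hits else "no neutrality language")
```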

Sometimes you have to construct thoughtful experiments to find the bias in AI. Not this time! In this case, images are "neutral" (i.e., white and male) by default. ChatGPT's statistically bizarre tendency to generate Happy Black Woman with Laptop for cases it reads as not neutral (for instance, cases where the person works in HR) is likely an attempt to paper over the bias in the AI's underlying data set.


But you already know what I look like

Many people mentioned that they had previously shared actual images of themselves with ChatGPT, and they were surprised when the system ignored that data in generating their image for this experiment. Among all the weird patterns I'm sharing here, this one will be easiest for OpenAI to fix, so I assume they will.

The bottom line: The photorealistic images coming out of ChatGPT's new generator are technically very impressive. As usual, you don't have to dig very deep to find weird assumptions in the data set.

Last ask: It was a ton of work to get this piece out in a timely way! If you appreciate the piece, would you consider chipping in?

Kieran


I tell these stories every week. Subscribe to nerd processor here!
