AI Bias

November 25, 2024      Kevin Schulman, Founder, DonorVoice and DVCanvass

AI bias is a hot topic, but not the kind you’re thinking of.  Let’s talk about a different kind of bias—the kind that fuels the wave of dismissive, viral hot takes like these:

  • “Relying solely on AI for creative tasks can result in homogenized outputs, stifling innovation and originality.”
  • “While AI can produce content quickly, it often lacks the depth and nuance that come from human experience and emotion.”
  • “The rise of AI-generated content is leading to a flood of low-quality material online, making it harder to find genuine human creativity.”

These takes are popular and sound insightful, but they’re like a thimbleful of knowledge in an ocean of untapped potential.

The real problem isn’t what AI can’t do—it’s the things we think we know about its limitations that just ain’t so.

Let’s drop some data anchors to steady ourselves in the swirling currents of hot takes.

  • Creativity: Not Just for Humans Anymore
    • Professional musicians couldn’t reliably distinguish between AI-composed and human-composed classical pieces. AI’s work was often rated as more original.
    • Art experts struggled to tell whether abstract artworks were created by AI or humans. In some cases, AI’s pieces were rated as more creative.
    • Readers couldn’t consistently identify whether flash fiction stories (under 1,000 words) were written by humans or AI.
  • Persuasion and Communication: AI Takes the Lead
    • AI-generated public health messages designed to increase vaccine uptake outperformed CDC-authored messages, eliciting more positive attitudes and engagement.
    • Political messages created by AI were rated as more persuasive, easier to read, and more positive in tone than those written by humans.
    • AI-crafted ads tailored to personality traits performed as well or better than human-created ones in terms of persuasiveness and response rates.
  • Trust and Realism: Seeing Is Believing
    • AI-generated faces were perceived as more trustworthy and real than actual human faces.
    • News articles written by AI were indistinguishable from those by professional journalists.
    • Scientific abstracts generated by AI fooled experts into believing they were human-written. Some were rated higher in quality than human-authored ones.

AI doesn’t just produce “good enough” content—it challenges the boundaries of what we consider uniquely human. In creativity, persuasion, and trust, AI is not merely complementing human effort but replacing and amplifying it. It’s a force multiplier, pushing the limits of what’s possible, and a democratizer, making advanced capabilities accessible to all.

Critics say AI lacks depth, nuance, or originality, but the evidence tells a different story. If anything, it’s revealing our biases against what AI can achieve.

The next time someone dismisses AI as a shortcut to mediocrity, ask them this: Are they basing that opinion on data—or just on what “everyone knows”?  Because what we know for sure might just need a serious update.

Kevin

 

4 responses to “AI Bias”

  1. Stephen Best says:

    I appreciate it when people citing ‘facts’ also cite their sources. Based on the “facts” in this article, it appears that AI (whatever it is in these instances) is ‘better’ (whatever that means in these anecdotes) than a randomly chosen person.

    Musing here, but it seems you’re setting the stage to market AI copywriters to your clients.

    • Kevin Schulman says:

      Stephen, maybe it’s just semantics, but I wouldn’t characterize these as ‘facts’; instead, they are experiments and results. Not all experiments replicate. We write lots of copy for our clients and use AI for all of it, but we wouldn’t charge for the AI any more than we’d charge for Microsoft Word and its (limited) editing functions. Nor do we see a market in selling AI for copy as a product, since it’s been quickly commoditized – i.e. widely available and low cost.

  2. Stephen Best says:

    Curious, are there A/B tests of AI copy vs proven copywriters’ copy in terms of response? That’s the only valid comparison. I recall when ‘desktop’ publishing came along, and early adopters thought they could get rid of their graphic designers and use the receptionist (literally) to create publications. By all means, use AI and I’m sure costs will go down but, it seems, actual results will remain unmeasured.

    I’ve seen A/B tests of envelope colour. How about AI vs. human copy?

  3. Paul McFate says:

    This is an interesting take. Yet if I spend an hour on YouTube I find a score of examples of bad AI writing, fake-sounding voice-overs, repetitive material, and flagrant plagiarism. As a professional songwriter, I participated in a double-blind study pitting AI compositions against real songwriters. AI won out overall. But not against professional songwriters. Because AI songs are so easy to generate, people are uploading millions of mediocre songs to music platforms and also knock-off material mimicking popular artists. The result is an ocean of mediocrity that will flood every crevice of meaningful human experience. As AI gets better, human experience becomes less and less relevant. Machines will make the money, the food, the art, the music, the wars, the inventions, the movies. What will be our role and purpose? To be spoon-fed everything by machines? No. I think there will be a revolt against AI at some point. For now, it should be constrained and only used to do things that people cannot do for themselves.