Why trust actors but not AI?

In this guest post, Made This managing director Vinne Schifferstein Vidal examines a double standard around authenticity when it comes to AI.

In countless conversations with industry leaders and clients, one theme keeps surfacing: transparency — especially when it comes to using artificial intelligence in our creative work.

Clients often feel compelled to disclose AI use, fearing a backlash that could tarnish their brand’s authenticity. Many are comfortable using AI-generated elements for backgrounds, textures, or product mockups. But replacing real people? That’s still often seen as crossing a line.

But here’s the uncomfortable question: why?

Audiences have never been told that the smiling woman on the billboard had every blemish, freckle, and wrinkle retouched. No one ever disclosed that the idyllic sunset behind a car in a TVC was extended with CGI or, more recently, generative fill in Photoshop. Designers and creatives have been using post-production tools like Facetune, Clone Stamp, After Effects, and now generative AI for years — without a whisper of explanation. No one cared, and no one questioned it.
