Dismiss AI slop at your peril
History is full of dire predictions about technology rotting our brains. The truth is usually more nuanced, writes Shaun Davies.
[Image caption: Slop or not – Show Runner is an AI generator where you write the story]
Let’s talk about slop – the deep-faking, engagement-baiting, time-sapping, work-spoiling AI content that’s rotting our brains, destroying democracy and ripping at the fabric of reality. Or so the story goes.
As 2025 draws to a close, it feels like we’re drowning in AI-generated content. But we’re also drowning in hot takes complaining about slop. Nick Cave says it’s “a grotesque mockery of what it is to be human”. The Atlantic calls it “a tool that crushes creativity”. The Guardian says that AI slop is “destroying the internet”.
I, too, enjoy complaining about slop. I agree that a lot of it is shit. I also agree that it has the potential to cause real-world problems.
But I think it’s worth considering that all this dismissive “slop” talk that dominates media discussion of AI content is a psychological safety blanket, rationalising away the threat posed to our already-disrupted media ecosystems.
Crikey! Nuance, evidence and well-constructed argument in an opinion piece. How did this slip through?
So close, Shaun! But outlining 200+ years of cultural turmoil isn’t comforting; it’s the granddaddy of all sunk-cost fallacies: “we’ve innovated ourselves into misery this long, why stop now?” What if the first-wave naysayers about cultural tech were largely right? Maybe there’s a reason we feel the need to prove or disprove broad “cognitive and cultural collapse” decade after decade, each decade *feeling* that non-collapse to be more true, more deeply. Maybe the collapse was nuanced too. And shills aren’t really the problem – the road to hell is paved with apologists. Say what you will about moral panickers; at least they’re willing to engage with moral questions.
To think is to be
To be is to feel
To feel is to care
To care is to act
To act is to think