Facebook hits out at ACCC’s algorithm transparency proposals
Facebook has hit out at the Australian Competition and Consumer Commission’s proposals for an independent body to review the algorithms used by online media platforms.
Writing on Facebook Australia’s policy blog, Simon Milner, vice president of APAC policy, said the regulator hadn’t provided any evidence to support such a proposal, an idea that has been strongly urged by News Corp, one of Facebook’s fiercest critics.

Facebook’s Milner: “We do not believe that greater algorithmic transparency will solve the problem of how to support sustainable journalism in Australia”
“The ACCC has not made a case or provided any evidence for why they believe an algorithm regulator is a necessary, effective and proportionate response to the business model challenges facing news media. More importantly, people, not regulators, should decide what they see in their news feeds,” Milner wrote.
Simon Milner spends much of his opinion piece talking up Facebook’s efforts to support sustainable journalism, as if the ACCC cares only about competition. Yet the commission’s brief is about consumer protection too.
Milner wants us to believe that “people, not regulators, should decide what they see in their newsfeeds”. That would be nice, but for now it is Facebook that decides what we see, through its algorithms.
For Facebook, news is only a means to an end. Social networking businesses are advertising brokers, with captive markets in the billions, and unprecedented revenues, maximised through ingenious and subtle means of getting to know us. Facebook serves up news, measures in great detail how we respond, and applies that feedback to refine not just the newsfeed but the way it targets its ads.
We have always regulated advertising, especially to safeguard consumers against manipulation. Facebook’s algorithms exist to lead social media users towards advertisers. The case to regulate them could not be plainer.
We have to be very careful here. Although Facebook should certainly apply the same standards to its content as other industries, we must make sure we are not opening the door to censorship. We live in a world of competing narratives. There will always be media that challenges our current narrative, that criticises our governments and praises those we designate as our enemies. This does not mean such media is wrong and requires removal; otherwise we enter an Orwellian world where truth is whatever we are told it is. I like that Facebook asks that regulation be ‘evidence-based’. This is essential, but the devil is in the detail.
“People, not regulators, should decide what they see in their news feeds”
What utter hypocrisy.
This is exactly the problem the ACCC is trying to address. We the people don’t decide what appears in our news feeds, and Facebook’s lack of transparency is what’s causing so much harm – globally – to democratic processes, debate, trust and empathy.
Facebook appears to be operating in an ethical void, which is why there is so much pressure on it, from numerous regulatory bodies around the world, to be more transparent. It can’t be trusted to tell the truth to governments about its role in elections and democratic processes (UK, US, Nigeria etc), or to advertisers about its true reach or video views (good work IAB, Mark Ritson and others for the pressure about this), or to consumers about data collection and privacy (proposed German regulations about data use across FB, WhatsApp, Instagram & Messenger). Facebook can’t be trusted to self-regulate, so it’s essential it opens its doors to others. It’s not censorship; it’s good old-fashioned regulation to ensure balance and fairness and to protect society’s interests.
Good work to the ACCC, IAB and others for fighting the good fight.
And, beautifully put Steve Wilson.
About as transparent as when they were pushing campaign reach data of 140%.