Facebook has introduced a user trustworthiness score – here’s why it should go further
In this article, crossposted from The Conversation, Sharon Coen draws on psychology to explain the mechanisms behind Facebook’s decision to launch a user trustworthiness score.
Facebook has reportedly started giving users a secret trustworthiness score in its attempt to tackle fake news. According to the Washington Post, the score is partly based on users’ ability to correctly flag and report inaccurate news items on the site. Facebook then takes this score into account when working out how a user’s content should be spread around the network (although it doesn’t tell users what their score is).
Why has Facebook started doing this? Following events such as the supposed role of social media misinformation in the 2016 US election and the Brexit campaign, Facebook has come under increasing pressure to curb the spread of fake news. But the platform now faces the possibility that growing awareness of misleading and false information has made users more likely to report articles as “fake” simply because they disagree with them.

Facebook’s recent anti-fake news billboards