Misinformation, violence, and Facebook
Government reforms need to go further if we want to curb online misinformation, argues Naomi Cray
In 2022, Mark Zuckerberg was asked what his greatest regret was. His response? Choosing to join the fencing club in high school, rather than wrestling. That’s quite an answer from a man whose company is so often found at the centre of political crises and violence. This is the man whose company, in Myanmar, at best employed only moderators who spoke the majority language – and at worst none who spoke the native languages at all – a choice which has since been found to have played a significant part in the pogrom against Rohingya Muslims in the country.
Since the 2016 American election, the role that social media plays in politics and society as a whole has become more and more apparent. During the pandemic, misinformation circulated widely on Facebook – anti-vaxxers began taking ivermectin (an anti-parasitic drug better known as a livestock dewormer) to cure Covid, on top of the usual anti-government and anti-lockdown conspiracies.
"the role that social media plays in politics and society as a whole has become more and more apparent"
More recently, Meta contributed to the race riots across the UK after the stabbings in Southport. From the woman who falsely claimed the Southport attacker was a Muslim man named Ali-Al Shackati (about as believable as a Frenchman called Pierre Baguette), to the widely circulated fictitious screenshots of WhatsApp groups claiming to plan the next attack, there is no denying that Meta, X, and TikTok played a role in the escalation of violence.
AI adds a new dimension to misinformation, and one which is only going to evolve. Only this past week, Donald Trump shared an AI-generated image on X (formerly Twitter) showing Taylor Swift endorsing him in the presidential race. Not only is this a lie, it is a lie knowingly circulated by a former president and current presidential candidate, with the full intention of winning him favour among the youth vote. It is no surprise that Musk, who owns X and has fully endorsed Trump, has not had the post removed – which only highlights the political motives behind spreading misinformation.
We also need to factor in addiction when talking about social media, along with the other facets of our psychology that make it so easy to spread misinformation online. With features like infinite scrolling and algorithms designed to feed you increasingly extreme content to hook your attention and drive engagement, social media is addictive by design. The longer the average user spends on a platform, the more ads they’re exposed to, which is how these sites make their money. Simultaneously, the longer a user spends online, the more likely they are to come across misinformation. Fake content is more likely to be extremist (think fake or mislabelled videos that claim to show asylum seekers committing crimes), as this drives engagement.
"social media is addictive by design"
I am not going to defend social media to you, at least not fully. But, there are some benefits to it which are commonly not considered. In recent conflicts across the world, we have seen the rise of reporters whose platform is social media, and who have been able to garner massive audiences through detailing their daily lives amidst bombardment and fear.
There is no easy solution to this problem when corporations and government regulation are involved. In 2023, the landmark Online Safety Act came into law, restricting children’s access to social media and giving adults more control over the content they consume. The issue is that it is rapidly becoming too little, too late: the act contains no provisions against what makes the sites so addictive, and demands nothing more from providers in moderating hate speech and extremist content.
Finding that solution needs to start with removing features such as the infinite scroll and limitless explore pages, to make the sites less addictive. With fewer incentives to stay online, people’s screen time would decrease, and with it the amount of misinformation and extreme content being consumed and spread. Whilst I recognise that this is unlikely ever to happen – it is far too financially damaging to the companies responsible – it is also the change that would have the most immediate impact, both online and off. I hope that even Mark Zuckerberg would agree that a world with fewer lies and more honesty would be better for everyone.