More valuable than oil: data and the individual
The Cambridge Analytica Scandal gave us a glimpse of the data-driven future but, asks Grace Field, what does this mean for the individual and what responsibility do scientists bear?
A Conservative-majority government now controls the future of the UK, at the end of what has been called the most significant election campaign of the century. But, in the wake of the Cambridge Analytica scandal, with our eyes opened to just how much political consultancies are able to influence election campaigns using personal data, many of us are asking: who really won December’s election, and are we comfortable with how they did it?
The Cambridge Analytica scandal has made data privacy one of the biggest ethical, political, and legal issues of our time. As Facebook is hit with a flurry of antitrust investigations, the British government is criticised for allowing Amazon access to NHS data, and the US faces another divisive presidential election, it is more important than ever to look back on what really happened at Cambridge Analytica.
Just over a month ago, a forward-looking group of physicists at the Cavendish Laboratory hosted a film screening of The Great Hack – a Netflix documentary that goes behind the scenes of the scandal, following some of the biggest players in the company through the drama that ensued as the firm came under legal and political scrutiny, and ultimately disintegrated.
The film paints Cambridge Analytica as an immaculately marketed and highly effective political consultancy, whose success rested on breaching data-privacy barriers that its competitors either did not dare, or did not have the technology, to breach. Organisers Dr. Paolo Andrich, Dr. Tiffany Harte, and Dr. Elizabeth Tennyson hoped the screening would encourage the public, scientists, and experts to confront the far-reaching effects of the Cambridge Analytica scandal and tackle questions of scientific harm and accountability.
Although most of the media coverage in 2018 focused on Trump and Brexit – two Western-centric campaigns – the film shows how Cambridge Analytica’s influence went much further: the firm and its parent company, the SCL Group, shaped election campaigns all over the world. Using personal data to generate “micro-targeted” messaging, they aimed to sway the political views of undecided voters they called the “persuadables”, rolling out schemes to do so in countries across the globe, from Malaysia to Lithuania to Nigeria.
In one example, provocative because of its colonial undertones as a subversive continuation of Western meddling in international politics, the film shows how the SCL Group was hired by the United National Congress, one of the political parties vying for power in Trinidad and Tobago’s 2010 election. The firm designed a targeted campaign to suppress voter turnout among the young Afro-Trinidadian population, which resulted in a 40 percent difference in turnout between 18–35-year-old Afro-Trinidadians and Indo-Trinidadians on election day. This margin was enough to swing what would otherwise have been a close vote in the UNC’s favour.
Despite the impulse to sensationalize the whole operation, the film reveals the exact opposite: in some ways, it is the apparent normality and transparency of Cambridge Analytica that is most unsettling. The company did not hide what they were doing with data – they were proud of it, publicly claiming to have 5,000 data points on every American voter. They were different, they said, because they used more data and better data analysis. As whistleblower Brittany Kaiser emphasizes in the film, data is now considered a more valuable asset than oil.
Cambridge Analytica is not the only company using data in this way, however, which makes it difficult to pinpoint blame, or to figure out what we should be doing now to protect our data in the future. Before the film screening, 21 percent of the 237 attendees at the Cavendish event believed that the data analysts and researchers at Cambridge Analytica should be held accountable. After the film, that figure dropped to 5 percent. By the end, most of the audience agreed that the data providers – like Facebook – and the analytics companies – like Cambridge Analytica – should be held responsible, rather than individual scientists.
Yet surely individuals should be encouraged, or even expected, to stop and think about the morality of the tools they are building in their company’s name. And if the individuals are not at fault, then how can those very same individuals work to protect our data and our democracies?
In the post-screening discussion, Paul Hilder, a writer featured in the film, encouraged all of the young data scientists out there to “think very carefully about how you use your talents and your skills… when we have a huge number of social problems that need to be solved and data science could be a huge part of solving them.”
Dr. Julian Huppert, a former MP for Cambridge and current Director of the Intellectual Forum at Jesus College, explained why he hopes that individual scientists will move towards publicly questioning the ethics of their work, pointing out the leverage that individuals have: “when it becomes hard to recruit because you are doing things that people don’t like, and the good people you have start leaving, that is actually possibly more of a problem than some negative press stories.”
While Huppert argued that micro-targeting should be banned altogether, Julian Wheatland, Cambridge Analytica's former COO and CFO, disagreed. Rather than individual scientists being held responsible, he argued, the emphasis should instead be placed on enforcing ethical and transparent corporate governance.
“If you’re running a company, and you don’t give people any tools or any structures or any guidance or any boundaries in which to consider ethical questions of the work that they’re doing [then] to expect the data scientists or the psychologists to be able to put their hand up and say this is where we’ve crossed an invisible line, is not very reasonable.”
Perhaps most radical is Brittany Kaiser’s position. Since turning her back on Cambridge Analytica in 2018, she has been campaigning for individuals to monetize their personal data, arguing that the big tech companies should be paying us for using their services, not the other way around.
Whether or not monetizing personal data is the solution, her stance motivates us to adopt an empowering change of perspective. In thinking about how we as individuals can protect our data and our democracies, we need to fundamentally change the way we think of ourselves and our personal worth.
We have been conditioned to identify as customers and users – of Facebook, Amazon, Google, and every tech giant out there. But we are not the customers, we are the commodity, and we are not the users, we are what they use to make their profit. With that in mind, we can remind ourselves that we are not powerless in this game.