“Who here has seen an advertisement that has convinced you that your phone is listening to your conversations?” asks professor David Carroll to a graduate classroom in the opening scenes of Netflix’s new documentary The Great Hack (2019).
The entire class, without hesitation, puts their hands up, and the room bursts into awkward, terrified laughter. The film takes a deep dive into the Cambridge Analytica revelations and hearings of 2018, and while at first it may seem to tell us what we already know — that yes, our phones are probably listening to us, and yes, we ought to deactivate our Facebook accounts — the real truth of the matter is far more sinister.
Our data leakages—the information that we spill across the Internet with every search, scroll or purchase — are being mined in order to generate specific psychological profiles and personality types. These psychological profiles are then used to target us with specific content. This “content” is not just advertising — in fact the advertising, at this point, is the least of our worries — this content is quite simply, and fundamentally, propaganda.
“People don’t want to admit that propaganda works,” says Carroll, “because to admit it means confronting our own susceptibilities, horrific lack of privacy and hopeless dependency on tech platforms ruining our democracy.”
We are living with propaganda every single day, whether we realise it or not, and this is directly interfering with the democratic process. While propaganda cannot physically force our hand in the voting booth, it can do a lot to guide us along before we reach it.
In March 2018, Carroll, an associate professor at Parsons School of Design in New York, filed a claim in the High Court in London asking the (then relatively unknown) company Cambridge Analytica and its parent company, SCL Elections Ltd., to give him back the data they had procured on him.
A simple request: tell me what you know about me, and how you found out.
Since both companies were registered in the UK, Carroll approached the UK courts and successfully sued for his information. Carroll, who teaches digital media and app development, was acutely aware that the data we produce online doesn’t ‘just evaporate’. The data that companies like Cambridge Analytica collect gives them direct access to our emotional state of being: it’s not just about what we might buy — but what we fear, and how that can be used to get our attention, or worse, change our minds.
Reminder: Facebook is not eavesdropping on your conversations with your phone’s microphone to target you for ads. Otherwise, Zuck probably did perjury. His ad targeting is just freakishly accurate. #TheGreatHack https://t.co/biDFRwxodR
— David Carroll (@profcarroll) August 13, 2019
Carroll’s investigation begins with Project Alamo (named after the Battle of the Alamo, a 13-day siege in 1836 in which Texan rebels fought for independence from Mexico). Project Alamo was the Trump campaign’s digital headquarters — and at its peak, it was spending over $1 million a day just on Facebook ads.
Cambridge Analytica spearheaded the operation, bringing executives from Google and Facebook to the table and building voter profiles around the ‘OCEAN’ five-factor personality model; the company claimed to hold over 5,000 data points on every adult American.
Christopher Wylie, a young data scientist who helped set up Cambridge Analytica (but later turned whistleblower, giving key testimony during the UK and US hearings), says that Aleksandr Kogan, a Cambridge University researcher, built Facebook apps with special permissions that could harvest data not just from the people who installed them but from their entire friend networks. So if one person on your friend list had fallen into the trap, so had you.
“We took things like status updates, likes, and in some cases private messages,” says Wylie, “we weren’t targeting you as a voter, but you as a personality.” Wylie estimates that over 50 million users had their data harvested in this way.
Mark Zuckerberg is afraid of facing international scrutiny https://t.co/KPUBQV041D
— Christopher Wylie (@chrisinsilico) May 28, 2019
These quiz-based apps seemed to tug at the heart of a very 21st-century condition: by making people answer almost therapy-like questions about themselves, they were able to hack into the desire to be understood.
The questions asked respondents to rate statements such as ‘I have a rich inner life’ or ‘I prefer to be left on my own’. Loneliness is perhaps the most prevalent condition of our times, and we manifest this online, searching for community. (For instance, new research shows that one in five millennials say that they don’t have a single friend.)
Without our consent, this desire for community has been turned against us. The film seems to say, ‘It’s like a boomerang — you send your data out, it gets analysed and then it comes back to you as targeted messaging to change your behaviour.’ So while accelerated capitalism has made us more lonely and alienated than ever before, it also uses our loneliness to manipulate us further, turning us into commodities and easy-to-manipulate datasets. While The Great Hack might be about the goings-on of one company and its data usage, it speaks to a much wider issue of data privacy.
Ironically enough, the story of Cambridge Analytica begins with the Obama campaign of 2008, which was the first to successfully use social media. Alexander Nix, CEO of Cambridge Analytica — along with Breitbart’s Steve Bannon, who gave the company its name — identified a “market opportunity”: Republican candidates lacked the same acumen in using social media to their campaigns’ advantage.
In 2015, The Guardian reported that Ted Cruz’s presidential campaign had hired Cambridge Analytica to harvest Facebook user data and boost his numbers. He went from being one of the lowest-polling candidates in the Republican primaries to the last man standing before Trump won the nomination. The Trump campaign promptly hired Cambridge Analytica.
While Cruz’s rise has previously been thought of as the company’s first success story, it is most certainly (and sinisterly) not. The company had been running election campaigns since as early as 2010, and — most frighteningly — it was actively field-testing its manipulation technology in developing countries across the world. From 2010 onwards, Cambridge Analytica was taking on around ten national campaigns for the post of prime minister or president each year: Trinidad and Tobago, Malaysia, Romania, Kenya, Ghana, Nigeria, and even India (in 2010).
In a scene from the film, we see a shot from inside Nix’s office; behind him, if you look closely enough, is an Indian National Congress poster.
Enter Brittany Kaiser, one of the central figures of the Cambridge Analytica investigations and of the film itself. A young American who had been working on elections since she was 15 or 16, Kaiser was Cambridge Analytica’s business development director — and she makes a compelling protagonist.
In the film, we see some great shots of Kaiser swimming in an infinity pool in Thailand and tying a string of whistles to a wooden sculpture at Burning Man (a ham-fisted metaphor for her whistleblowing).
Right before Kaiser appears for her hearing in the UK, she is shown in a red bikini top and a leather jacket, hunched over her laptop at the airport, drafting questions for Mark Zuckerberg’s Senate hearing, which was happening in tandem: ‘How much of Facebook’s revenue comes directly from the monetisation of users’ personal data?’ All of it, she shrugs.
At one point, Kaiser had the keys to Bannon’s townhouse in Washington, was a member of the NRA, and was hardly ever seen without a pearl necklace, cowboy boots and a matching cowboy hat. Kaiser tells us that the company was actively looking for ‘people’s levers of persuasion’, targeting those it called the “persuadables”: vulnerable people who could be most easily influenced.
“Our creative team designed personalised content to trigger those individuals,” Kaiser says, “we bombarded them until they saw the world the way that we wanted them to see it.” Sound familiar?
Kaiser provides crucial evidence as to just how many campaigns Cambridge Analytica was involved in, and how these early tests helped it refine its technique. The film takes us through one instance, in Trinidad and Tobago, through footage of Nix making a sales presentation about it. It’s shocking and powerfully racist.
But of course we knew that. This is where the film really opens up, showing how each of us is affected by what is going on — this is not just a story about America or the UK.
“We are a behaviour-change agency,” begins Nix. He speaks proudly of the “Do So” campaign, which specifically targeted young Afro-Caribbean populations in Trinidad to get them to stop voting, “The campaign had to be non-political because kids don’t care about politics; it had to be reactive because they’re lazy.”
The two main parties in Trinidad were divided along racial lines, the Afro-Caribbean versus the Indian diaspora. Nix was working for the Indians. They had rappers and break-dancers make videos about the power of resistance through denying the vote; it was a campaign that intentionally looked to disenfranchise youth from engaging in politics. And the thing is — it was insanely successful.
The difference in turnout among 18–35-year-olds between the two communities was around 40%, which swung the election by almost 6%. “That’s all we needed,” says Nix, smiling, “because elections are always so close.”
Cambridge Analytica may since have shut down in the wake of the revelations — and of cases like Carroll’s — but its strategies have certainly not been erased from existence. Our data remains just as vulnerable, especially given the ambiguity of data privacy laws.
The film serves as a powerful reminder of what happens when we leave data trails online, and a call to exercise our rights more consciously in what we engage with. We need to ask ourselves: is ageing your face in an app really worth losing your political agency?
Social media has created bubbles — and the irony of the film being hosted on Netflix, which has one of the most powerful prediction algorithms in the world, is not lost here — and we rarely see outside these bubbles of opinion and “content” (read: propaganda). There is a powerful resistance in disinvesting, and in repeatedly questioning how and where your data is put to use.
Skye Arundhati Thomas is a writer and editor based in Mumbai.