This week: Putting a price on privacy, Facebook’s relationship with free will and how to regulate Big Tech.
Name-Place-Animal-Thing is The Wire’s culture newsletter. If you’d like to receive regular updates from this column, please consider subscribing here.
The East India Company was one of the first multinational corporations to undermine and even usurp the state’s role. And now tech giants like Facebook and Google are giving rise to similar anxieties about state power – what happens when corporations start wielding more power over people than governments? Where do citizens and ordinary people fit into the tussle between non-state and state actors? This week’s pieces seek to drive home the real cost of privacy, explain the ethos that drives Facebook and contemplate regulations to rein in Big Tech.
When the price of a t-shirt is a personal picture or text from your phone
What were the last three messages you sent? What about the three most recent pictures on your phone? How much are they worth to you – can you quantify their value in terms of money or material goods?
A few days ago, cybersecurity firm Kaspersky Lab came up with a gimmick to drive home the value of our personal data – the very same data we unthinkingly surrender to Facebook and Google daily. The Data Dollar Store sold mugs, t-shirts, screen prints and original work by street artist Ben Eine. The catch? You could only ‘buy’ something by giving the store’s employees personal data from your phone. A t-shirt, for instance, would cost you the three most recent photographs on your phone, or your last three messages. You couldn’t set the price yourself by picking innocuous messages. If you wanted an Eine original, you had to hand your phone to a Kaspersky employee, who got to choose five photographs or three screenshots of your messages. Haggling was allowed, but the final decision lay with Kaspersky.
Nick Summers, the author of this piece, went in confident that he had nothing to hide; but once faced with the actual exchange, he grew uncomfortable with the explicit intrusion these purchases involved. He ended up paying with pictures that featured neither his face nor anyone else’s, keeping the transaction as impersonal as the store’s conditions allowed.
Eine told Summers, “I want people to be worried about the information they’re giving away and then realise that they’re giving this information away all day, every day.” He added that currently, nobody is rewarded for giving away this information but Eine thinks they ought to be.
He’s not alone; others have recently written about how Facebook owes its users a universal basic income as compensation for the data they contribute to the company’s running and continuing success. Other issues, now routine for Facebook users across the world, keep popping up. What can Facebook censor or promote on its site? Who gets to decide?
A recent example is a post by Humans of Hindutva, a Facebook page that satirises far-right Hindu nationalist rhetoric in the popular Humans of New York format. The page’s anonymous owner posted a status expressing frustration with Facebook for removing posts flagged for inappropriate content without telling the user which posts had been removed, or why.
Part of the message read, “I have never shared any content which is sexist or racist or abusive in any way. I could explain to someone from Facebook but they don’t have a system where one can make their case.”
There is no forum for users to “make their case” – and even if there were, should Facebook alone hold the power to decide? The underlying question here is again a variation of ‘what does Facebook owe its users?’
Champion of individual expression or curtailer of free will?
The US government is starting to grapple with this question on a much larger scale. Did Facebook allow Russian agencies to perpetuate fake news on the platform, tipping public opinion towards Donald Trump?
After months of playing down concerns over the role social media played in the surprising results of the US election, Facebook’s practices have come under intense public scrutiny – more, even, than the debates over net neutrality and the company’s Free Basics service attracted.
Franklin Foer considers the workings of Facebook in this piece for The Guardian, arguing that the company’s promotion of free will and individualism is farcical.
Foer writes, “In reality, Facebook is a tangle of rules and procedures for sorting information, rules devised by the corporation for the ultimate benefit of the corporation. Facebook is always surveilling users, always auditing them, using them as lab rats in its behavioural experiments. While it creates the impression that it offers choice, in truth Facebook paternalistically nudges users in the direction it deems best for them, which also happens to be the direction that gets them thoroughly addicted.”
What Foer is saying shouldn’t be surprising. Firms, like all economic actors, are supposed to be morally neutral, which means they don’t owe their users anything other than the promised service or good in exchange for a price. The important part here is an autonomous and voluntary exchange of valuable goods and services. But since we don’t pay to use Facebook, it monetises all the other information that we do give it – a fact that has prompted introspection in the form of papers, op-eds, government inquiries and gimmicks like The Data Dollar Store.
Foer quotes Zuckerberg to explain how Facebook came to be this prominent in our lives and to capture the ethos of the company. Apart from Zuckerberg’s numerous comments about expanding transparency in all spheres of life, he has also said that Facebook “is more like a government than a traditional company”, adding, “We have this large community of people, and more than other technology companies we’re really setting policies.”
The real power, the very fabric of this platform and other tech giants, lies in their algorithms. Foer compares them to recipes: systems that allow us to automate decisions. But he points out something we all seem to be forgetting – these systems are still human-created, not objective and free of bias. As he puts it, “A system is a human artefact, not a mathematical truism. The origins of the algorithm are unmistakably human, but human fallibility isn’t a quality that we associate with it.” Sure, Facebook sets policies within its domain, but because it also claims it can tell whether users are gay or pregnant, and can influence their moods and behaviour, its policies reach well beyond that domain. Indeed, Facebook is currently under scrutiny for allegedly allowing Russian agencies to promote fake news on the site and thereby manipulate Americans during the US election. That the company only reluctantly reciprocates the transparency it encourages in its users is now glaringly obvious.
If Facebook espouses the individual’s right to freedom of speech and Zuckerberg thinks of the company as a government capable of setting policies, then Facebook is comparable to a democracy. Democratic governments are not supposed to manipulate citizens, and are ultimately answerable to them. But if algorithms now determine what we see, and may soon be proven to have influenced users enough to sway an actual election, then Foer is right to call attention to the subjectivity of the algorithms themselves.
In his concluding paragraph, he warns, “Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction.”
Is data the new oil, or is it the new water?
Governments across the world, especially in Europe, are already trying to nudge Facebook, Alphabet (Google’s parent company) and Amazon into complying with privacy laws. But given the diffuse, global nature of the product and the internet itself, these companies can simply tweak their operations – continuing to draw private data from American users, for instance, while refraining from doing so with European ones.
These monopolies are inviting comparisons with the East India Company, and Big Tech is being called the new Big Oil. This Economist article considers what would happen if Facebook et al were brought under stricter regulation, using a system similar to the one that governs utilities like water and electricity. The author suggests applying a regulated asset base (RAB) to social media giants. The RAB works like this:
“The idea is that the monopolist’s profits should not exceed the level that a competitive market would allow. That means estimating the cost to an imaginary new entrant of replicating the incumbent’s assets (this is the RAB) and calculating the profits the newcomer would make if its returns matched its cost of capital. The actual monopoly’s earnings should not exceed this amount.”
If an RAB were imposed on Facebook, users could choose whether to sell their data to advertisers instead of Facebook just taking it for free and selling it to advertisers (targeted access for advertisers earned the company $27 billion last year). This would also entail paying Facebook a fee for using its services, which we currently don’t do. In effect, the author posits, “Using figures from 2016, the average Facebook user would pay $15 a year to the firm for its return on its RAB, but they would pocket $23 from selling advertisers their data and the right to be advertised to.” More valuable users would earn higher amounts for their data.
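The arithmetic behind the RAB cap and the article’s per-user figures can be sketched in a few lines. This is only an illustration of the mechanism the Economist piece describes – the asset-base and cost-of-capital numbers below are invented for the example, not actual Facebook figures; the $15 fee and $23 data income are the article’s 2016 estimates.

```python
# Hypothetical sketch of the RAB profit cap described above.
# The replication cost and cost of capital are illustrative assumptions.

def allowed_profit(replication_cost: float, cost_of_capital: float) -> float:
    """Profit ceiling: what an imaginary new entrant would earn if its
    returns merely matched its cost of capital on the replicated assets."""
    return replication_cost * cost_of_capital

def user_net_position(fee_to_platform: float, data_income: float) -> float:
    """A user's net annual gain: income from selling their own data
    minus the fee paid to the platform for its services."""
    return data_income - fee_to_platform

# Invented example: a $100bn asset base and an 8% cost of capital
cap = allowed_profit(100e9, 0.08)    # $8bn annual profit ceiling

# The article's 2016 per-user figures: a $15 fee, $23 in data income
net = user_net_position(15.0, 23.0)  # the user pockets $8 a year net
print(f"profit cap: ${cap / 1e9:.0f}bn; user nets ${net:.0f}/year")
```

The point of the mechanism is in the second function: once users are paid for their data and charged for the service, the platform’s take is bounded by the regulator’s cap rather than by what its monopoly position allows.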
It’s not a perfect solution. The author rightly predicts that regulators would struggle to keep pace with the rate of innovation in the tech industry. There are additional challenges: companies could “bundle their services so tightly” that distinguishing the monopolies within a company’s many businesses becomes impossible (Amazon spans e-commerce, video streaming and food, and it’s difficult to tell how much it earns from each). And the Big Oil comparison could just as easily extend to Big Tech’s ability to mould policies to suit its interests.
These pieces lay out the challenges of understanding and regulating Facebook and its contemporaries, but the underlying question remains: Is Big Tech just another version of Big Oil, or is it something entirely different that requires novel policymaking by state actors?
Want to suggest a piece that should be included in this column? Write to me at [email protected]
If you’d like to receive regular updates from this column, please consider subscribing to Name-Place-Animal-Thing.