#NotMyFilter: Bursting Our Own Filter Bubbles to Save Democracy

The content we consume on social media is filtered to lull us into a reassuring bubble that reinforces our own politics. But we can choose to burst it.

A smartphone user shows the Facebook application on his phone in the central Bosnian town of Zenica, in this photo illustration, May 2, 2013. Credit: Dado Ruvic/Reuters

“#NotMyPresident” screamed placards just a day after Donald Trump swept the polls to unexpectedly win the US presidential election. The operative word here is ‘unexpectedly’.

Just consider this – towards the end of October, when news surfaced that an AI system called MogIA was tipping Trump for victory, most people ignored it because the media and the polls told them otherwise. Had they paid a bit more attention, they would have realised that the system was probably the closest thing they could have gotten to an unbiased opinion.

The system – which had been built to develop its own rules instead of relying on algorithms written by developers who themselves might suffer from biases – was assessing over 20 million data points from various online platforms such as Google, YouTube and Twitter. In the process, it cut across party lines to come up with its unbiased prediction. Yet large swathes of the Clinton camp had genuinely believed for months that victory was theirs, only to be blindsided by a result almost no one saw coming.

At least not in their filter bubble.

Not only an American problem

This obliviousness is a critical threat posed by the filter bubbles that social media algorithms create. The algorithms identify the kinds of stories and sources you and your friends like, and try to show you more of the same. The end effect is a closed confine of comforting and relatable news – a filter bubble. The continuous filtering envelops you in a cocoon of reassuring and familiar stories, and eventually locks you within a proverbial echo chamber where you receive only material that conforms with your sensibilities.
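The feedback loop described above can be sketched in heavily simplified form as an engagement-weighted ranker. This is purely an illustrative assumption of how such filtering behaves, not any platform's actual code; real systems use vastly richer signals than a single topic label:

```python
from collections import Counter

def rank_feed(stories, liked_topics, top_n=3):
    """Rank stories by how often their topic matches past likes.
    Illustrative only: the narrowing effect, not a real algorithm."""
    weights = Counter(liked_topics)
    # Stories on topics the user engaged with before float to the top.
    return sorted(stories, key=lambda s: weights[s["topic"]], reverse=True)[:top_n]

# A toy feedback loop: each "session" the user likes what they are shown,
# and the next feed is ranked using those likes.
stories = [
    {"id": 1, "topic": "left"}, {"id": 2, "topic": "right"},
    {"id": 3, "topic": "left"}, {"id": 4, "topic": "sports"},
    {"id": 5, "topic": "right"}, {"id": 6, "topic": "left"},
]
likes = ["left"]  # one initial signal is enough to start the loop
for _ in range(3):
    feed = rank_feed(stories, likes)
    likes.extend(s["topic"] for s in feed)  # user engages with the feed

print(Counter(likes))  # prints Counter({'left': 10})
```

After three sessions, every recorded interaction is with a single topic: a single early "like" has been amplified into a feed that shows nothing else.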

These bubbles can be directly linked to the increasingly polarised debates across the globe. Even in India there has been a worrying trend of late to label everything neatly as right or left. Dissenting views from one’s own are easily labelled as ‘Aaptards’, ‘Sanghis’, ‘Khangressi’ and so on.

While it is easy to blame this on politics alone, we cannot deny the underlying impact of computer algorithms, which view everything in binary terms, washing away the grey areas of our public discourse.

The overall impact of such filtering can be broken down into the following effects, which both individually and cumulatively shape public discourse.

  • Information blinkering: The most basic impact of filtering is that it reduces the visibility of information in your streams. You get digital “blinkers” slapped on, restricting your field of vision to only what the algorithms decide is right for you.
  • Entrenching biases: As this information generally consists of viewpoints you already agree with (decided by what you like, read and share), the algorithms encode your tastes that much deeper. Eventually you receive only material that conforms with your sensibilities, locking you in an echo chamber and entrenching your existing biases.
  • Rationalising fears: The filter shock when you step out of your bubble can be disorienting. This disorientation can make opinions and views different from one’s own seem alien and alarming, feeding a fear psychosis of everything outside the bubble.
  • Opinion entitlement: Repeated bombardment with similar opinions and views helps solidify feelings into “facts”, making it easier to wave off any differing views that occasionally crop up in our feeds. It also helps normalise more extreme views through continuous affirmation from a like-minded network. No wonder Oxford Dictionaries’ word of the year is ‘post-truth’.
  • Discourse divide: The final and most critical impact, deriving from all of the above, is a widening chasm in public discourse, as the algorithms drive diverging views into their own sealed-off bubbles. Just ask the British, who realised after their momentous vote that their public had effectively been voting on completely different agendas.

Bursting our own bubble

While things have not yet reached an impasse in India (I’m glad I can still debate issues with friends without melodrama), it isn’t hard to see that we’re sitting on a ticking time bomb. There is already a growing clamour for platforms such as Facebook to put stricter content guidelines in place. However, past experiments with this have been more worrying than reassuring. Given the complexity of these platforms and the intricacy of the debate around content monitoring, it will be a while before any proper solution emerges.

In the meanwhile, are our social discourse and, by extension, our very democratic principles under threat? Probably, but that doesn’t mean there aren’t measures a discerning reader can take to burst their own bubble.

They can consume news from across the spectrum. One of the key drawbacks of consuming news on social platforms is that they tend to serve you pieces only from your most frequented sources. Over time, this encloses you in a bubble. The best way out is to consistently seek out varied and even contradictory sources, ensuring that you at least hear all sides of an argument.
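One simple way to check whether your own reading habits have narrowed is to measure how evenly your reading is spread across outlets. The sketch below uses normalised Shannon entropy as a rough diversity score; the outlet names and the metric itself are illustrative choices, not a standard from any platform:

```python
import math
from collections import Counter

def source_diversity(read_history):
    """Normalised Shannon entropy of the outlets in a reading history:
    0.0 = everything from one outlet, 1.0 = evenly spread across outlets."""
    counts = Counter(read_history)
    total = sum(counts.values())
    if len(counts) < 2:
        return 0.0  # a single source carries no diversity at all
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return entropy / math.log2(len(counts))

print(round(source_diversity(["siteA"] * 9 + ["siteB"]), 2))       # prints 0.47
print(round(source_diversity(["siteA", "siteB", "siteC", "siteD"]), 2))  # prints 1.0
```

A low score over a week of reading is a hint that the bubble is forming, even before the content itself starts to feel one-sided.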

The reader must also understand the importance of verification. In an age where opinions are coming to matter more than facts, it is the responsibility of each individual to ensure that they are not propagating fallacies. This also helps ensure your news feed does not get populated with dodgy sources. With all sides of the spectrum readily promoting half-truths and even outright lies to suit their narrative, it has become imperative to fact-check as much as possible. The simplest (but not fool-proof) way to do this is a quick Google search to see whether other reputable publications are reporting the same story. If not, don’t click that share button!
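The manual check described above – seeing whether several independent, reputable outlets carry the same story – amounts to a simple corroboration rule. The sketch below makes that rule explicit; the outlet list, domain names and threshold are all placeholder assumptions, and, like the manual check, it is deliberately crude and not fool-proof:

```python
# Placeholder names: in practice this would be a list of outlets
# the reader has personally vetted for editorial standards.
REPUTABLE_OUTLETS = {"outlet-a.example", "outlet-b.example", "outlet-c.example"}

def looks_corroborated(reporting_domains, min_sources=2):
    """A claim passes this (deliberately crude) check only if at least
    `min_sources` pre-vetted outlets are among those reporting it."""
    independent = set(reporting_domains) & REPUTABLE_OUTLETS
    return len(independent) >= min_sources

print(looks_corroborated({"outlet-a.example", "viral-blog.example"}))   # prints False
print(looks_corroborated({"outlet-a.example", "outlet-b.example"}))     # prints True
```

The point is not the code but the habit it encodes: a single source, however loud, is not corroboration.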

Focusing on facts instead of opinion might also help. This is a contentious point, but still one worth considering. With the large amount of rhetoric and exaggeration floating around in the social sphere, it might be best to focus more on facts. Opinions, by their very nature, appeal to our emotions and are hence more difficult to judge. While trying to avoid opinions altogether might be an exercise in futility, a more weighted focus on facts might at the very least steer us clear of gross fallacies. [Note: Sprinkle this point with a heavy dose of Google search.]

Finally, wrapped in our comfort streams, it can often be a shock to hear someone articulate something completely contradictory to our beliefs. While our instant reaction might be one of disbelief or even ridicule, we need to consistently strive to be open to new ideas and opinions. After all, as Thomas Jefferson put it, the cornerstone of democracy rests on the foundation of an educated (and unbiased) electorate.

While these measures might sound like a lot of work – and they probably are – it is still worth the effort to avoid the mass polarisation and political detachment we have seen elsewhere. #NotMyFilter

Sidharth Sreekumar is an in-house entrepreneur with a research and analytics MNC, and has worked extensively in developing platforms for content identification and filtering.
