The Facebook-Kashmir Blocks: Technical Errors, Editorial Mistakes and Invisible Censorship Galore

The experiences of those who were restricted from sharing information on Facebook offer insight into how social media giants act as a final arbiter of what is appropriate and what is not.

The blocking mechanism that Facebook put in for Kashmir content. Credit: Screenshot, The Wire

New Delhi: Over the past month, residents of Jammu and Kashmir have been cut off from mobile internet services, regular internet access and, for three days, even daily newspapers, after the death of Hizbul Mujahideen commander Burhan Wani in a gun battle with security forces sparked a period of tension and violence for the conflict-stricken state.

Far away from the J&K region, from London to California, a group of nearly 30 academics, activists and journalists found that their ability to discuss and share information regarding the happenings in Kashmir on Facebook was similarly curtailed. Over the past three weeks, Facebook’s global content moderation team, led by policy head Monika Bickert, has kicked into overdrive: deleting swathes of posts that apparently violated the company’s ‘community standards’, handing out temporary account bans and restricting the reach of certain Facebook communities.

“The entire experience has generated anxiety and a feeling of being bullied,” said Dibyesh Anand, professor of international relations at the University of Westminster, whose account was blocked for two days over the last two weeks.

This isn’t the first time Facebook’s team has waded into a politically controversial situation: in the last month alone, the company blocked posts linking to the leaked trove of hacked e-mails from the Democratic National Committee, and in late June it blocked several Arab atheist community groups over alleged anti-Islam speech.

The replies Facebook inevitably gives are boilerplate responses that refer to the company’s community standards pages, technical errors, or, in some cases, an editorial cock-up by a member of its content moderation team.

In order to better understand this opaque process – that has significant implications for freedom of speech, online activism and public discourse – The Wire spoke to a number of academics and activists who had their Facebook posts deleted and accounts blocked over the past two weeks.

Although their accounts do not point definitively towards fault on Facebook’s part, their separate experiences offer insights into the way social media giants view politically controversial material, the way they hand out punishments and – disturbingly – how they act as a final arbiter of what is appropriate and what is not.

Anti-community standards

1) Tamoghna Halder, PhD student, University of California – Davis.

Halder’s case borders on the extreme. Not only has he been blocked from posting on Facebook five times since July 15, the period for which he was restricted from posting grew with each subsequent block. His experience is also unique in that he has been blocked for posting both Kashmir content and content from the Kangla protest in Manipur.

“I got blocked for the first time on 15th July, at noon, for 24 hours, for posting a photo of the Kangla protest. When I came back, before I could post anything, I was blocked again for 24 hours on account of an image of Kashmir graffiti which I had taken from Google that I had posted earlier,” he said.

After receiving another three-day ban for re-uploading the Kangla protest photo, he was handed a seven-day block (July 20 to July 27) for uploading three images of Kashmir graffiti. After returning on July 27, he re-uploaded an album of Kashmir graffiti and received a 30-day block (up to August 27).

“I am still under the ban for another 28 days. I currently can’t do any activity other than reading messages and accepting friend requests. I can’t share or comment,” Halder said.
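Halder’s bans trace an escalating ladder: 24 hours, 24 hours, three days, seven days, 30 days. That pattern is consistent with a strike-based penalty policy, which can be sketched in a few lines of Python. The ladder values are taken from Halder’s account; the function name and capping behaviour are hypothetical, not a description of Facebook’s actual system.

```python
from datetime import timedelta

# Ban durations as reported by Halder; the idea that they form a
# fixed escalation ladder is an inference, not a confirmed policy.
BAN_LADDER = [
    timedelta(hours=24),
    timedelta(hours=24),
    timedelta(days=3),
    timedelta(days=7),
    timedelta(days=30),
]

def ban_duration(prior_strikes: int) -> timedelta:
    """Return the posting-ban length for a user with the given number
    of prior violations, capping at the ladder's top rung."""
    index = min(prior_strikes, len(BAN_LADDER) - 1)
    return BAN_LADDER[index]
```

Under such a scheme, a first offence draws a 24-hour ban while a fifth (or any later) offence draws the full 30 days, matching the progression Halder describes.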

2) Arif Ayaz Parrey, senior sub-editor, Centre for Science and Environment

Parrey’s account was blocked on July 26, slightly before noon. Unlike Halder’s, his account was completely disabled, three times over the last four days.

“I was chatting on FB messenger with a friend when I was suddenly logged out. When I tried to log back in, it said my account had been disabled. I was shocked of course, but tried to remain calm and send Facebook my proof of identity as requested,” Parrey said.

His account was restored by the company within an hour after being taken down, but he was also informed that a Facebook page (Kashmir Solidarity Network) for which he was an administrator had been taken down. Other administrators of the same Facebook page apparently also had their accounts blocked.

The same evening, around 7 pm, his account was disabled for a second time but only for a few minutes. After logging back in, Facebook told him that a link that he had posted was removed and that he was banned from posting and commenting for 24 hours.

The link in question is from the Kashmir Reader and is a report on how a group of Hizbul Mujahideen militants had started distributing posters bearing Wani’s likeness, stating that their mission would continue. When Parrey tried posting the same link again, he was blocked from commenting on Facebook for another 24 hours; a block that has yet to be lifted.

Over the past few days, Parrey has posted a number of links about the violence and turmoil in Kashmir, including reports on the consequences of the pellet guns used by Indian security forces.

3) Dibyesh Anand, professor of international relations, University of Westminster

Anand’s first post, which talked of the “Kashmiri self-determination movement” and the alleged double-standards of many Indian Hindus, was taken down on July 12 and he was blocked from posting on Facebook for 24 hours. After his block ended, he received a message from Facebook that apologised and said that his post had “accidentally been removed” by a member of their team.


Then, on July 15, another of his posts was removed and he received a second 24-hour posting ban. After that ban ended, on July 19, he received yet another message from the Facebook content moderation team apologising for the post removal and ban.

“At the very least, Facebook should know that even if we are treated as ‘customers’, we cannot be taken for granted nor can we be bullied. By focusing on specific demands, we are giving FB the opportunity to come clean about its stance on censorship of posts highlighting state atrocities in Kashmir,” said Anand, referring to an online petition he and his fellow supporters have launched.

4) Sanjay Kak, Documentary Film-Maker

Sanjay Kak is a film-maker widely known for his 2007 film Jashn-e-Azadi, a documentary about the conflict in the Kashmir Valley and the “Kashmiri freedom movement”.

The circumstances surrounding Kak’s account ban, which lasted two days, from July 26 to July 28, are equally unique. The post that Kak believes led to his ban was an anecdote he posted on Facebook on July 9. The post itself is about the general atmosphere in Kashmir in the aftermath of Burhan Wani’s funeral; a policeman refuses to let a civilian pass through a barricade saying that “today is the day that Burhan Wani was martyred”.

Kak however suspects that this post was brought to the attention of Facebook after journalist Rahul Pandita referenced it in an article for Firstpost. “The twist is that barely an hour after it appeared online, my FB account was deactivated, and remained so for two days,” Kak said in a Facebook post.

“Since Mr. Pandita is a journalist of some experience I have no doubt that it was not his intention to bring this unwanted attention to my FB account. But sadly we live in times where a careless, unsubstantiated allegation, emanating from a somewhat reductionist reading of a complex anecdote, does end up in harassing people. (And let’s face it, FB being blocked does only qualify as harassment. It’s not physical violence).”

Lessons on censorship

There are a few curious patterns that can be gleaned from these incidents. First, it’s possible to posit that Facebook and other social media giants have a number of factors or criteria that go into handing out bans on posting content; a principle that a number of social media spam analysts confirmed to The Wire as “being natural”.

In the case of Parrey, most of the administrators of the Kashmir Solidarity Network Facebook group received bans on posting content, some of which were revoked after they “confirmed their identity” to Facebook by uploading a few documents. It is possible that profiles like Parrey’s are flagged because of their association with a blocked Facebook page: The Wire tested posting the link that Facebook had warned Parrey about and was neither stopped from posting nor blocked afterwards; there does not appear to be a site-wide ban on that specific Kashmir Reader report.

The content posted by Anand, along with other prominent personalities such as documentary film-maker Sanjay Kak, appear to have largely been removed “by mistake”. While Anand received an apology indicating that his posts had been removed mistakenly by a member of Facebook’s content moderation team, another Facebook source The Wire spoke to said that Kak’s case was also a “technical error”.

That Facebook’s process for moderating content and upholding its community standards has flaws is obvious. For the vast majority of this work, as the company states, it relies on user reports, after which content is removed both algorithmically and by human intervention.
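The workflow described – user reports feeding a mix of automated filtering and human review – can be sketched as a toy two-stage pipeline. Everything here (the report threshold, the watchlist, the function names) is a hypothetical illustration of a report-driven pipeline in general, not Facebook’s actual system.

```python
REPORT_THRESHOLD = 3  # hypothetical: reports needed before any action

def automated_screen(post_text: str) -> bool:
    """Stand-in for an automated classifier: flag posts containing
    any term from a (hypothetical) watchlist."""
    watchlist = {"banned-term"}
    return any(term in post_text for term in watchlist)

def moderate(post_text: str, report_count: int) -> str:
    """Route a reported post: ignore it below the report threshold,
    remove it if the automated screen fires, and otherwise queue it
    for a human moderator - the stage where editorial judgment
    (and, as the cases above show, error) enters the process."""
    if report_count < REPORT_THRESHOLD:
        return "keep"
    if automated_screen(post_text):
        return "remove"
    return "human_review"
```

The point of the sketch is the last branch: anything the automated stage cannot decide ends up in front of a person, which is exactly where the accidental removals described by Anand and Kak would occur.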

It is this human intervention, by Facebook’s content moderation team, that is troubling. When asked to comment for this story, a Facebook spokesperson sent an emailed statement: “Our community standards prohibit content that praises or supports terrorists, terrorist organizations or terrorism, and we remove it as soon as we’re made aware of it.  We welcome discussion on these subjects but any terrorist content has to be clearly put in a context which condemns these organisations or their violent activities.”

The last sentence of the company’s statement is particularly chilling because, as evidenced by the apology Anand received, at the end of the day Facebook’s global team exercises a certain amount of editorial judgment and in the case of Kashmir content, all posts need to be “put in a context that condemns these [terrorist] organisations or their violent activities”.

Who determines this sort of context? How do they do it? The editorial judgement required here inevitably creates a structure of black box censorship and raises a number of obvious questions: How many Indians or Pakistanis are on Facebook’s global community standards team? How does one make sure that they don’t delete posts based on their own biases? This was an issue that cropped up earlier this year during the Facebook trending news controversy.

In a press conference on Saturday, a spokesperson for Pakistan’s Ministry of Foreign Affairs voiced these concerns by publicly stating – without adducing any evidence, however – that Facebook’s Kashmir censorship was being manipulated by “employees from India or other nationals of Indian origin”.

Facebook insists that it has a “global team” for upholding its community standards, with a few sources telling The Wire that the team is “significantly diverse”. According to people with direct knowledge of the matter, Facebook is removing any online content that expresses support for groups involved in militant violence or criminal behaviour in Kashmir. Any posts that support Burhan Wani or the leaders of similar organisations, or which condone their violent activities, will be removed quickly.

Content that falls between these two categories and is difficult to classify, however, can fall through the cracks, as seen in the cases of Anand and Kak. Facebook follows up on the removal of such content with apologies, or refers to it as a technical mistake: posts that were flagged by users and removed by a member of Facebook’s team in error.

There is little doubt Facebook’s policies need to improve, but even if some amount of bias goes into the moderation process, the real question is whether anything can be done about it.

Featured image credit: Reuters