Facebook’s community censors curb free speech

BY GEETA SESHU | IN Censorship | 21/11/2017
Accounts that are satirical, expose hate speech, or are totally harmless are being blocked for ‘violating’ Facebook guidelines.
GEETA SESHU reports

Facebook's community standards page

On October 26, barely two days after US-based lawyer Raya Sarkar created a Facebook ‘Sexual Violence Hall of Shame’ post - what came to be known simply as The List, naming over 50 prominent academics accused of sexual harassment - her account was blocked for seven days, forcing her to scramble to restore it and respond to the flood of reactions and messages.

It’s another matter that the account was restored soon after she complained to Facebook and that, in the interim, there was wild speculation about the identity of those who had mass-reported her and triggered the block. The blocking was widely condemned on Twitter, even by some of her detractors, who argued that a block would amount to censorship.

Sarkar was lucky to get Facebook to unblock her account in a matter of hours; several other victims of Facebook censorship haven’t been so fortunate. For them, it has been the longer route of writing to Facebook through its opaque reporting mechanisms, often being directed through a myriad of options, and then further options, before they manage to actually interact with a human to explain their case.

The two billion behemoth

With Mark Zuckerberg’s announcement in June this year that the ‘Facebook community’ had grown to two billion users, Facebook underlined its pre-eminent status as the most significant platform for all manner of digital interaction and communication.

It has become the preferred platform for everything, from the sharing of personal events and opinions by individual users to business networking and publishing for news media. Add Facebook’s ownership of WhatsApp and Instagram, and its dominance across social media is pretty much sewn up.

Periodically, the Internet explodes with criticism of its policies on regulating content that does not meet its community standards (remember what happened to all those breastfeeding pictures, or the controversy over the blocking of the Norwegian newspaper Aftenposten’s account for posting the iconic napalm photograph from the Vietnam War).

In 2015, Facebook updated its community standards to cover a range of issues, from hate speech, bullying and harassment to sexual violence and criminal activity, and carefully articulated and defended its position on regulating content on the platform on behalf of the Facebook community. The policy is not without its critics, as this report from ProPublica on Facebook’s hate speech guidelines demonstrates by revealing that these guidelines ‘protect white men from hate speech but not black children’.

Some illustrative examples 

In India, in the past year alone, a number of accounts, both individual and organizational, have been blocked. Here are a few of them:

Of course, as this report documented, Kashmir occupies a special place in the hearts of censors. In February, Facebook took down a cartoon by Kashmiri cartoonist Mir Suhail depicting roots growing from Afzal Guru’s grave and stretching towards Kashmir, and blocked him from posting anything that might ‘praise or support terrorist groups or their actions’.

In May, Facebook blocked the account of Baroda-based artist Rolli Mukherjee, whose paintings are empathetic to the suffering of people from the strife-torn state. In July, a host of Kashmiris who uploaded pictures expressing their adulation for Burhan Wani found their profiles blocked by Facebook.

The account of ‘Un-fair Web’, a platform for the African community in India, operated without any problems whatsoever on Facebook till September 18, 2017, when it posted screenshots of WhatsApp messages containing death threats received by journalists, social activists and students who had protested against the killing of Gauri Lankesh. It was abruptly blocked, without warning or notice.

Says Wency Mendes, one of the founders of ‘Un-fair Web’: “We were at a critical juncture in our reporting and collating of these identities when we were blocked. It was a blanket block. A total erasure of all our work. We had begun posting Truecaller searches of the names associated with the phone numbers from which these threats were issued. In fact, there were two searches for a singular number. But the block just killed it.”

After multiple attempts by its founders and digital rights activists to reach Facebook, the company responded that ‘Un-fair Web’ was not a ‘real’ identity and had, therefore, broken its community standards. “I find it weird that Facebook decides that we are a non-real identity only when we started to collate the WhatsApp threats after the killing of Gauri Lankesh,” says Mendes, asking, “What about the number of Facebook bots that exist?”

Tired of battling the block and eager to keep alive this vital platform for the African community in India, members of Un-fair Web then began a dialogue with Facebook to restore its identity. Facebook responded by saying that Un-fair Web could continue as a group or a page, and allowed its founders a month to download all their data and retrieve all 2,600 of their members. But the process, far from over, is arduous. “Try to deep dive into your own timeline. It’s impossible. You need to pull into your archives and Facebook has a problem with that and I keep losing the connection. Now, every time I try to add friends from my own friends list, I get 12-hour blocks. How can I transition to a group at this rate?” Mendes asks.

But he is persevering, despite knowing full well the disadvantage of a group or a page: he will have to pay Facebook to boost a post or suffer digital oblivion.

An active Facebooker, Prasad was blocked for three weeks in November 2014 for a post about the reasons Jawaharlal Nehru was chosen as prime minister over Sardar Vallabhbhai Patel. In November 2015, he was blocked for another three weeks when he posted comments about Tipu Sultan, and again a year later. That time, he had to send a number of identification documents to Facebook, but the problem recurred with the recent controversy over the Tipu Sultan Jayanti celebrations (Union minister Anant Hegde refused to attend, but President Ram Nath Kovind addressed the Karnataka Assembly).

Facebook told Prasad that his posts were classed as hate speech under the social network giant’s community standards. However, he says it is clear that the ‘Hindutva troll army’ (the Bajrang Dal and the Sri Ram Sene) worked hard to ensure that anything he posted would be reported as hate speech. As he told The Hoot, he has no protection from the death threats issued by these two groups and, instead, is made to feel like a criminal.

Vartha Bharati, a 14-year-old Kannada newspaper with a circulation of over a lakh in Bengaluru and Mangaluru, is well known in the region for taking up the causes of dalits, women and farmers. It has also been a trenchant critic of the right-wing Hindutva forces active in the area, highlighting atrocities and outbreaks of violence by these forces.

Vartha Bharati began reaching a larger diaspora after it launched its Facebook page, which now has 110,014 followers. When Facebook blocked the page, the newspaper was cut off from its readers and its ad revenues took a hit. Facebook later apologized for what it termed a ‘technical error’.

“As much as I was confused as to why Facebook removed this post, especially since it didn’t mention any guilty party, what I found troubling was when Facebook warned me that if I continue to post objectionable content, I may be asked to reveal my identity,” the originator of the satirical page Humans of Hindutva said in an article. In disgust, the originator decided to set up a website where ‘…Community Standards won’t be holding us back. This time we make the community and its standards’.

The flaws in the real name policy

Inji Pennu, a reporter with Global Voices and a digital rights activist, has been campaigning vigorously against Facebook’s real name policy for some years now, spurred by the blocking of Malayalam writer Preetha G in 2015. (It is the same real name policy that has been used to block both Un-fair Web and Humans of Hindutva, and that acts against names denoting ethnic and language minorities, LGBTQ identities, nicknames and colloquial names.)

Last year, in response to the repeated blocks and takedowns, she started a spreadsheet detailing the instances, with links and reasons wherever available. “I think this is very important and clearly reveals the patterns. It shows that Facebook blocks accounts of people who are protesting violence, misogyny or hatred – instead of the other way round!” she told The Hoot, adding that she draws on her own experience of restoring her blocked account to follow up with Facebook officials on each block and takedown.

In her case, she held out for six months as Facebook kept demanding identification, which she refused to give. “I encourage everyone not to submit any form of legal identification,” she said. Her account was restored following her sustained protest on the issue.

With India just two years away from its next general election, she fears that political exchanges will grow far more intense in the days to come and that censorship on platforms like Facebook will effectively silence important voices.

The comprehensive Freedom on the Net 2017 report documents the state’s controls on content in detail. The report also features a paragraph on Facebook’s community standards for censoring content (footnotes removed for accessibility):

‘Based on the ruling, Facebook said it would require more formal notifications to restrict content. It restricted 719 items between July and December 2016, citing legal requests from the central government and local law enforcement agencies, down from 14,971 items during the equivalent period in 2015’.

Kashmir content is regularly censored

Content removal based on alleged violations of Facebook’s community standards still attracted controversy, however, particularly for content posted amid protests surrounding the death of a militant in the Kashmir region (see Restrictions on Connectivity). 

In July 2016, Jajeer Talkies, a popular Kashmir-based page that publishes satirical content, was temporarily disabled. Three page administrators also had their profiles disabled, making it harder for them to appeal against the action. A news video published by a local daily was also removed. Facebook said it removes content about terrorism that does not clearly condemn terrorist organizations or their activities, but several academics and journalists were among those temporarily suspended from posting after sharing information about the ongoing crisis.

Facebook does publish a Global Government Requests Report, but this does not cover takedowns or blocks that it has initiated on its own or on the basis of community reports. Despite Facebook’s claim that it issues warnings, those blocked assert that they do not know what the specific complaints against them are or where they come from. First the blocks are initiated; then the scramble to restore the accounts begins. Questions come later, if at all.

Facebook’s official statement

Prompted by the repeated takedowns and blocks, The Hoot wrote to Facebook with a few basic questions: whether any kind of warning or notice is issued to users before their accounts are blocked; whether they are informed of the specific complaint against them and why they are being blocked; how a blocked account gets restored; and what Facebook does to mitigate any damage that may have occurred as a result of the blocking.

A Facebook spokesperson sent in a detailed response that is instructive. Here are the relevant portions:

“We are an open platform for all ideas, a place where we want to encourage self-expression, connection and sharing. At the same time, when people come to Facebook, we always want them to feel welcome and safe. That’s why we have a set of Community Standards that lay out what is and isn't allowed on the service. These include policies against bullying, harassment, and hate speech. Anyone can report content to us if they think it violates our standards.

“One report is enough to take down content if it violates our policies, and multiple reports will not lead to the removal of content if it meets our standards. That said, we’re not perfect when it comes to enforcing our policies. Often there are close calls - and we sometimes get it wrong. When that happens, we quickly act to resolve them, and conduct regular audits and quality assessments to help prevent them from happening again.”

The statement reiterated that anyone can report content they think violates Facebook’s community standards, and said Facebook has ‘real people looking at reported content - they span the globe and review reports 24/7’. ‘These include native language speakers because we know that it often takes a native speaker to understand the true meaning of words and the context by which they are shared’, it added.

While one report is enough for Facebook to review content and remove it if it violates standards, content that meets its standards is not removed merely because it has been reported a certain number of times. The statement said that ‘when people first violate our Community Standards, we send them a warning and a reminder to read our Community Standards. If we continue to see further violations, we may restrict a person’s ability to post on Facebook’.

Facebook makes an exception for the sharing of what may constitute hate speech for ‘non-hateful’ reasons, where the intent is not to spread hate. It also responds to appeals against decisions to remove a person’s page or profile, besides conducting what it terms ‘regular quality audits’ of the decisions its reviewers make on reported pieces of content, such as posts and photos.

At the moment, it relies more on the community than on artificial intelligence to identify and report potential hate speech. The spokesperson also admitted that, with ‘billions of posts and the need for context to assess the meaning and intent of reported posts - there’s not yet a perfect tool or system that can reliably find and distinguish posts that cross the line from expressive opinion into unacceptable hate speech’.

What its community really needs from Facebook

Facebook does not even address the question of compensating users for the losses resulting from its blocks, and it needs to be more transparent about its ‘quality audits’, as of now an internal process. Given the phenomenal reach of this two-billion-strong community, it definitely needs an external audit. But it will also need to spell out what it really means by ‘community’.

And, with the multiple and competing communities on Facebook, it will need to specify which community it upholds in the inevitable event of differences and conflict between these ‘communities’.

Above all, it will need to demonstrate that it enforces community standards equitably, applying them to all, and that it provides democratic space for discussion instead of blocking dissenting voices. In turn, its two billion members will need to be much more vigilant and assertive about their right to remain on Facebook.