Friday 10 May 2013

Facebook's censorship hypocrisy

Facebook's censorship policies have been in the news again recently. This latest row is the result of several videos of people being decapitated. These are (somehow) not in violation of Facebook's community standards, and the site resisted initial pressure to have them removed, only caving when the media started reporting on the story. Their initial reasoning for leaving the videos online was that they were being shared in the spirit of condemnation of the actions depicted, and that Facebook respects "people's rights to describe, depict and comment on the world in which we live". While this statement makes a great soundbite for the site, it rings false the moment you consider how the site enforces its existing censorship policies.

My main beef here is that there seems to be a world of difference in the strictness of the policy when applied to different types of content. Let's look at those community standards again. The guide on hate speech acknowledges a distinction between serious and humorous speech, allowing purveyors of "bad taste" jokes a platform while (theoretically) filtering out the worst excesses of hate groups. In practice, however, the distinction between serious and humorous speech is not so easily made, especially on the Internet, where a person's intended meaning is easily misinterpreted. Here Facebook seems to err on the side of leniency when filtering content, and rightly so - while jokes about dead babies may be grossly offensive to many people, that alone is not reason enough to instigate a policy of censorship. Facebook's policy on phishing and spam is taken equally lightly - to be honest, I was surprised to find it included in their community standards at all. While I find unsolicited spam annoying, I can understand why so much of it survives - Facebook is a business, and the first priority of any business is to make a profit, so fair play.

One of their community standards that is taken seriously is "nudity and pornography". The rule here seems to be no nipples (if female) and nothing in the crotch area, though pretty much anything else goes. (It's worth noting that Facebook makes a specific exemption for breastfeeding photos.) This is probably what gets me the most about the lopsidedness of their censorship policy. The argument most often brought up in the interests of censorship (apart from the "offense" point, which I don't respect enough to consider an argument) is that Facebook's minimum age limit is 13, meaning anyone of that age or above with a Facebook account can potentially access anything posted on Facebook. What I don't understand here is how nipples are more damaging to a child's development than videos of decapitation or extreme violence; though the decapitation videos have been removed, there's plenty of other violent content on the site (don't worry, that link doesn't point to them). To understand my problem here, take a look at these pictures:

[Two images: on the left, a photo copied directly from Facebook; on the right, a Femen protest photo from a French news site.]

It took a surprisingly long time to find a photo of Femen that didn't include swearing.

The one on the left is copied directly from Facebook, and is a fairly vanilla example. The one on the right I had to source from Google Images, as it is apparently too extreme for Facebook. That particular photo comes from a French news site and is hardly pornographic in nature (unless you're really missing the point). Now consider our hypothetical 13-year-old again. A lot of children would be upset by the image of a dead animal - could the same really be said of boobs? Most of us are, after all, exposed to boobs from an early age. That's not to say Facebook should start allowing hardcore pornography on its site, especially in the absence of any kind of age-verification wall, but banning images such as the one above on the same grounds is just silly.

I think the root cause here is that what is considered appropriate varies from country to country, and Facebook is very much an American creation. America has always been somewhat lenient concerning depictions of violence in the media, but much less so when it comes to anything sexual. I have no idea why - perhaps something to do with the country's Christian roots - but that's the way it's always been. However, a (supposedly) international and modern institution like Facebook can and should be more mature about these things. Did I mention that Femen have a Facebook group? They have to censor all their own photos to stop Facebook taking them down.

My point here is that Facebook needs to have a serious rethink about their censorship policy, and about what they consider acceptable for their website. Certainly, when images such as that Femen protest photo have to be censored, they can't claim to respect depictions of "the world in which we live" without reeking of rank hypocrisy. Jérémie Zimmermann makes an excellent point in the article I referenced earlier: "Since Facebook collects and stores so much information it should be able to determine when one of its members is a minor and is about to be exposed to content that has been reported as unsuitable, and display a warning message". I would go one step further - if a user is a minor in their own country (let's not forget that different countries define this differently), they should be barred from seeing certain content outright. This reduces the amount of censorship needed on the site while protecting children from age-inappropriate content. While debate may rage over exactly what counts as age-inappropriate, a simple "mature content" flag on photos and videos such as the ones discussed in this post would suffice to quell the criticism of Facebook almost entirely. Moreover, the more content Facebook hosts, the more options they have for making money from it. One can only wonder why they haven't implemented a system like this already.
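To make the idea concrete, here is a minimal sketch of the kind of check I mean, in Python. Everything in it is made up for illustration - the age table, the User and ContentItem types and the is_viewable function are my own hypothetical names, not anything Facebook actually uses:

    from dataclasses import dataclass

    # Hypothetical age-of-majority table - real values vary by jurisdiction
    # and would need proper legal input; this is purely illustrative.
    AGE_OF_MAJORITY = {"US": 18, "UK": 18, "JP": 20}
    DEFAULT_AGE = 18

    @dataclass
    class User:
        age: int
        country: str  # ISO 3166-1 alpha-2 country code

    @dataclass
    class ContentItem:
        description: str
        mature: bool  # the proposed "mature content" flag

    def is_viewable(user: User, item: ContentItem) -> bool:
        """Bar minors (by their own country's definition) from flagged content."""
        if not item.mature:
            return True
        return user.age >= AGE_OF_MAJORITY.get(user.country, DEFAULT_AGE)

    # A 13-year-old sees the content blocked; an adult sees it untouched -
    # no site-wide takedown required.
    video = ContentItem(description="graphic protest footage", mature=True)
    print(is_viewable(User(age=13, country="UK"), video))  # False
    print(is_viewable(User(age=25, country="UK"), video))  # True

The logic itself is trivial; the real work lies in flagging content and knowing users' ages - and, as Zimmermann points out, Facebook already holds more than enough data for the latter.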
