Branding Facebook: Power and Impact


When updating the Online Hate Prevention Institute’s Facebook page to announce the new report into antisemitism on Facebook, OHPI was automatically offered the option of making the announcement a promoted post in return for paying Facebook a fee. Given the importance of the announcement, we opted to pay the fee.

A short time later OHPI received notification from Facebook that our ‘advertising’ had been rejected. We posted the rejection notice online along with information about the launch of the report. This rejection came just days after OHPI’s page was itself, for the first time, suspended by Facebook, presumably in order to be reviewed in response to complaints. While OHPI believes the systems involved can be improved, we also commend Facebook on their response. The page was restored within hours of being taken down for review. OHPI believes the suspension of a page under investigation, at least the first time it is reviewed, is the correct response.

The risk that a page may, on review, be found to be promoting serious harm to human beings, such as emotional abuse through serious cyberbullying or physical harm through the promotion of self-harm or suicide, justifies the approach of first suspending the page and then reviewing it. The caveat is that such a suspension should trigger an immediate review and the matter should be resolved within hours. OHPI believes Facebook has this part of the system right. We would recommend that Facebook additionally notify page administrators when their page (as a whole) has been reviewed, and what the outcome of that review was. The suspension of a page deserves an explanation.

On the matter of the promoted post, Facebook has reached out to OHPI with a more detailed explanation. Regardless of the content of the response, we commend Facebook for taking the time to further clarify the matter. OHPI’s aim is to work with all stakeholders, including platform providers, to improve the response to issues of online hate which can lead to harm against human beings. This is why we reach out to Facebook, for example by providing them with drafts of forthcoming reports related to them, and by welcoming both their suggested corrections for our consideration and any public responses they wish us to include in these reports.

In this case Facebook’s response reads: “I saw your Facebook post about being unable to pay to promote an OPHI Facebook page post and I wanted to follow up directly upon review. The reason the promoted post did not work was not only because of the Nazi reference, but because you attempted to use our brand in your ad and because you attempted to promote a post that was more than 20% text. This is laid out in our terms because we’ve deemed ads with heavy amounts of text to be a poor user experience. I hope this helps.”

The response identifies three separate issues:
1. The use of the word Nazi
2. Reference to the brand Facebook
3. The post having more than 20% text

Regarding the first issue, OHPI believes the use of the word Nazi certainly should trigger a closer manual review, but it should not automatically lead to a rejection. The term is widely misused in a manner that not only offends and insults the person or brand accused of Nazism, but also trivialises the Holocaust when it is misapplied. That said, prohibiting the use of the word Nazi when it is correctly applied to the German National-Socialist government during the Holocaust would be a mistake. Banning advertising by the US Holocaust Memorial Museum for an exhibition on Nazi propaganda, for example, would be absurd. Facebook needs to ensure that the use of trigger words leads to a considered judgement by a human being, one that depends on the context in which the word is used.

Regarding the second issue, OHPI acknowledges that Facebook has the right to accept or reject advertising based on the interests of Facebook as a company, but we caution that doing so seriously undermines Facebook’s core values. In a 2009 interview with Wired, Facebook’s Founder and CEO Mark Zuckerberg stated that “We believed that people being able to share the information they wanted and having access to the information they wanted is just a better world”. He went on to note that “openness fundamentally affects a lot of the core institutions in society”. While OHPI believes certain kinds of content, namely hateful content, have no place in a community like Facebook, we also believe Zuckerberg is right to praise the openness of the core institutions of society. Using the power that comes from having control over the social network as a means to censor critics and manage the company’s reputation raises huge questions.

OHPI acknowledges that Facebook has not suppressed the report or the post announcing it. Facebook’s system does, however, create a non-level playing field. Visibility depends firstly on a user’s willingness to pay, which we take no issue with, and secondly on Facebook’s willingness to accept the payment. If Facebook filters the second question based on its own interests, that would be highly problematic. There is a reason we value editorial independence in newspapers, and the concept becomes increasingly important when one media owner controls a significant segment of the market. Newspapers must be willing to publish stories that run against their owners’ interests. Governments must be willing to fund government broadcasters that at times publicly and vocally criticize them.

Facebook, in OHPI’s view, must allow the promotion of posts that are critical of Facebook if it is to uphold openness and avoid accusations of effectively censoring content that runs against its corporate interests. The danger of allowing Facebook to control speech based on its corporate interests is ably demonstrated by considering the impact of Facebook deciding it dislikes the policies of one political party and therefore prohibiting the advertising or promotion of posts supportive of that party. While such a position may be legal, it is fundamentally opposed not only to openness, but to democracy itself.

On the final issue, Facebook needs to clarify the concept of promoted posts. If Facebook has a problem with promoted posts that are more than 20% text, Facebook should not display the “promote this post” widget for such posts in the first place. Further, as the post will appear in at least some people’s feeds anyway, and before the introduction of promoted posts would have appeared in more users’ feeds, the argument is somewhat spurious. Facebook needs to distinguish promoted posts from other forms of advertising and ensure the only difference between a promoted post and a post that is not promoted is the number of people who will see it.

OHPI thanks Facebook for their communication to us, as it raises a number of serious questions which we urge Facebook to further consider.

Dr Andre Oboler is CEO of the Online Hate Prevention Institute. His new report, “Recognizing Hate Speech: Antisemitism on Facebook”, was released on March 21, 2013 to mark the International Day for the Elimination of Racial Discrimination.

Originally published as: Andre Oboler, “Branding Facebook: Power and Impact”, Cutting Edge News, March 25, 2013