TRIGGER WARNING: Some links in this passage (marked with a double asterisk) contain graphic depictions of domestic abuse, assault, and rape. The article itself contains some descriptions of such content.
Recently, Facebook has been at the center of various controversies surrounding photos of women and their bodies. The company’s behavior has been twofold: on the one hand, it sparked significant outrage within a portion of the Facebook community when it began removing photographs of women breastfeeding, and on the other it has received widespread criticism from feminists and other activists for refusing to remove content depicting battered or even dead women, as well as sexual images of underage girls posted on the site without their permission.
Despite the explicit statement in Facebook’s Community Standards that breastfeeding photographs are acceptable on the site, the Huffington Post reported in April 2013 that users, including breastfeeding advocates, have had such content removed. Similarly, even though Facebook has community standards against harassment, graphic content, nudity, and breaches of privacy, it has made a habit of refusing to remove incredibly graphic and violent images of battered and bleeding women [graphic content]** bearing captions such as “THAT WILL TEACH YOU… NOW WALK IT OFF AND GET BACK TO THE KITCHEN” and “NEXT TIME, DON’T GET PREGNANT.” In addition to refusing to remove pages featuring such content, or the content itself, Facebook has refused to remove pages containing what is known as “revenge porn,”** sexual photographs of women leaked to the site without their permission for the purpose of anonymously slut-shaming the subjects of the photographs. Notably, many of the women in these images are legal minors, in direct opposition to the Facebook Community Standards’ “strict policy against the sharing of pornographic content and any explicitly sexual content where a minor is involved.”
Despite the explicit prohibition of such content under its own community standards, Facebook refused to respond to reported photographs, and in the case of breastfeeding pictures, it never acknowledged its removal of content and suspension of users. In response to the latter issue, feminist collectives and activists including Women, Action, and the Media (WAM!), the Everyday Sexism Project, and Soraya Chemaly launched an online project to hold Facebook accountable to its standards. Their strategy was to attack Facebook’s weakness: advertising, its source of revenue. Activists began sending screenshots of the offending content displayed beside the Facebook advertising column to the companies sponsoring those ads. Harnessing the power of the Internet, supporters sent over 60,000 messages and 5,000 emails to major companies that advertise on Facebook, including Disney, McDonald’s, Dove, Nissan, and others. Many of these companies pulled their ads, and within a week, the pressure proved too much for Facebook to withstand. It began removing the pages and images, and on May 28, a week after the campaign started, it issued a formal response.
Success! Victory! …Well, not quite. While the Huffington Post does call this story “How Facebook Learned Rape is Bad for Business,” and WAM!’s site celebrates, “IT WORKED!”, Facebook’s official statement and actions since leave much to be desired.
Facebook’s statement acknowledges the controversy but reminds users that it only takes action against “harmful [not simply controversial] content.” Harmful content, according to Facebook, is “anything organizing real world violence, theft, or property destruction, or that directly inflicts emotional distress on a specific private individual.” This may leave many wondering: Are photographs of breastfeeding “harmful” by this definition? What about posting sexual photographs of a minor without her permission, often accompanied by the child’s name? And what about “joke” photographs that encourage or trivialize violence against women?
While Facebook avoids the hard questions, it does take some responsibility for the “controversy,” admitting, “In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate.” Notably, Facebook blames its “systems” for this issue, despite noting several times on the same page that there is a team, the “User Operations team,” whose job it is to “evaluate reports of violations of our Community Standards around hate speech.” Following this admission of its “systems’” error, Facebook notes that it will take several steps to avoid similar issues in the future. These steps range from the helpful, such as reviewing and updating the guidelines used to determine what content should be removed, to the responsibility-dodging, like encouraging women’s activists to work with external groups such as the Anti-Defamation League.
While Facebook’s statement and suggestions already teeter between relevance and empty promises, its actions are even less promising. WAM! writes, “Facebook is claiming that because they took down all the images in our original sample set, there is no more problem. Until Facebook recognizes that the problem is their POLICIES AND PROCEDURES, not any individual pages, we will keep posting fresh examples each day.” I have seen Facebook’s reluctance to change firsthand. The same day Facebook issued its apologetic statement, I reported this image**, picturing a girl with tape over her mouth and the caption, “DON’T WRAP IT AND TAP IT. TAPE HER AND RAPE HER”:
The next day, May 29, I received an email from Facebook reading, “Thanks for your recent report of a potential violation on Facebook. After reviewing your report, we were not able to confirm that the specific page you reported violates Facebook’s Statement of Rights and Responsibilities,” and, ironically, referring me to the Community Standards.
The conclusion I draw from these recent events is that despite Facebook’s commitment to “balance concerns about free expression and community respect” as well as its “mission…to make our platform a safe and respectful place,” it has done no such thing. Facebook has repeatedly ignored victims of privacy violations, even when those violations result in images that legally constitute child pornography being featured on the site. It has, despite explicit statements in its Community Standards, barred breastfeeding mothers from its online spaces. It has both implicitly and explicitly confirmed its acceptance of images and words that celebrate, trivialize, and threaten physical and sexual violence against women. And it refuses to take significant action against these problems, responding only minimally when these practices hurt its bottom line. What’s more, while some might believe Facebook is simply apathetic to such issues (which would be bad enough), the company’s placement of blame on its “systems” rather than on human employees and official policies, as well as its refusal to address these issues with formalized practices, likely betrays a misogynist rape culture within its ranks: an active, if implicit, endorsement of such opinions and actions. The technologist in me thinks Facebook could use more feminist-minded women programmers like me, but until Facebook’s leadership demonstrates a commitment to supporting women users, the rest of me wouldn’t touch a Facebook offer with a ten-foot pole.
By Danaë Metaxa-Kakavouli, Bluestockings web editor, with endless support and admiration for her women friends working in tech.
All Images found via Google Image Search