Facebook didn’t seem to care I was being sexually harassed until I decided to write about it

It’s difficult to describe how it feels to discover hundreds of complete strangers debating how fuckable you are. Or to see them casually suggest they’d like to “smash,” “pump” and “skullfuck” you as an apparent form of punishment for your gender and job title.

Dozens of my friends and coworkers reported the group to Facebook. Several hours later, Facebook responded to them with a message saying it had discovered content that “doesn’t follow our Community Standards.” It added: “We removed that specific content … instead of the entire group.” But subsequent screenshots showed my photo was still there.

A staff manager at HuffPost also contacted Facebook directly to request that the post be taken down as quickly as possible.

His message contained the screenshots of comments and stressed the matter’s urgency. Two days later, a representative followed up and asked him for the link to the post with the photo, which was inaccessible to anyone who did not belong to the private group (and which I had failed to obtain from the growing swarm of “trolls” in my inbox).

With a growing network of more than 2 billion users around the globe, Facebook’s role in shaping the way we communicate online is difficult to overstate. It has public policies in place to penalize those who violate its Community Standards (including posts containing nudity, hate speech and violent or graphic content). Violators may have their content removed or their accounts disabled, or may even be reported to law enforcement.

Moderating hate speech is an issue “that we struggle with continuously,” Zuckerberg said last week. “There’s a lot of content flowing through the systems and a lot of reports, and, unfortunately, we don’t always get these things right when people report it to us.”

He vowed that Facebook would have “more than 20,000 people working on security and content review across the company” by the end of the year, acknowledging that “no amount of people that we can hire will be enough to review all of the content.”

But rather than advocating for more moderators, pundits have long been calling for greater transparency around the company’s internal content review standards to ensure consistent enforcement and accountability ― a plea many say has fallen on deaf ears.

Alejandra, who works at a debt-relief company in Fresno, California, commented: “Smash with a ballgag so she doesn’t speak.” Ironically, in a public bio statement sitting atop her Facebook profile, she has written: “I don’t care who you are if you are kind to me I will be kind to you, it’s that simple.”

Michael, a New York-based account manager who was previously a cadet first lieutenant at a military academy in Pennsylvania, wrote that he would “Smash, but I’ll unload on her face… for the patriarch.” On his professional website, he lists his favorite Bible verse as Psalm 133: “How good and pleasant it is when God’s people live together in unity!”

Far from promoting unity, Facebook provides an ideal ecosystem for hate groups to flourish. During Zuckerberg’s appearances on the Hill last week, much was made of his site’s explosive growth and transformation, but from my vantage point, staring at screenshots from “Emperor Trump’s Dank Meme Stash,” Facebook looks remarkably unevolved. A platform that grew out of a “hot or not” women-ranking site is now a place for users to debate “smash or pass.”

After receiving a request for comment for this article on Friday ― two months after HuffPost initially alerted Facebook to the situation ― the company said it would review “Emperor Trump’s Dank Meme Stash.” By day’s end, the group had been taken down.