Commentary

Facebook Deals With Massive Revenge Porn, Sextortion Allegations

Every month, Facebook considers nearly 54,000 cases of suspected revenge porn or sexual extortion on its platform, according to a new story from the Guardian. The Monday report is a follow-up to earlier Guardian stories that shed light on the social site’s intricate content standards.

In some instances, those standards seem bewildering. In almost all instances, they suggest that Facebook is attempting to impose limits on an Internet culture that has defiantly fought prohibitions since long before video on Facebook was possible. Rules, after all, are from a more timid media age.

On the other hand, some of the rules and commentary suggest Facebook also feels uncomfortable and unsure about what it's doing. The detailed Guardian accounts are worth close examination.

The new story says that in January alone, Facebook disabled 14,000 accounts because of revenge porn or sextortion, and that 33 of those cases involved children.

Facebook relies on human moderators and image-matching software to try to stop explicit content. But the story says Facebook acknowledges “sexual policy is the one where moderators make most mistakes,” in large part because Facebook is increasingly being used for pornography.

Because enforcement is mostly based on viewer complaints, the situation could be weirder, sicker and more exploitative than the monitors actually know.

Some of the language from this point on will be pretty raw.

According to the Guardian, “One Facebook document, titled Sexual Activity, explains it is permitted for someone to say: ‘I’m gonna fuck you.’ But if the post adds any extra detail – for instance, where this might happen or how – it should be deleted if reported. According to this 65-slide manual, other general phrases allowed on Facebook include: ‘I’m gonna eat that pussy’; and ‘Hello ladies, wanna suck my cock?’”

Visually, Facebook allows “moderate displays of sexuality, open-mouthed kissing, clothed simulated sex and pixelated sexual activity” involving adults, according to the Guardian. “The documents and flowcharts then set out what is permitted on Facebook in detailed sub-categories called ‘arousal’, ‘handy-work’, ‘mouth work’, ‘penetration’, ‘fetish’ and ‘groping’.”

On matters related to terrorism or violence, the Guardian said Facebook would remove language that says, for example, “Someone shoot Trump,” but “it can be permissible to say: ‘To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat’ or ‘fuck off and die’ because they are not regarded as credible threats.” Neither is “Let’s beat up fat kids,” but “#stab and become the fear of the Zionist” is.

The story quotes Monika Bickert, ‎Facebook’s head of global policy management, who explains that with nearly 2 billion users worldwide, Facebook can’t easily reach consensus about what’s within bounds.

She calls Facebook “a new kind of company. It’s not a traditional technology company. It’s not a traditional media company. We build technology, and we feel responsible for how it’s used. We don’t write the news that people read on the platform.”

But a Facebook critic says the social giant skates on thin ice with that argument. Because Facebook monetizes its content, it has some responsibility--and a profit motive--regarding those user submissions.

pjbednarski@comcast.net.


@pjbtweet
