
‘No grey areas’: experts urge Facebook to change moderation policies

The Guardian // 22nd May 2017

By Jamie Grierson // theguardian.com, Monday 22nd May 2017 14.04 UTC

Facebook’s ethical standards should not be decided “behind closed doors”, the former chair of an influential parliamentary committee has said after the Guardian revealed the social media giant’s secret rules for moderating extreme content.

Yvette Cooper, chair of the home affairs select committee before parliament was dissolved for the upcoming election, said the files – used by Facebook to moderate violence, hate speech, terrorism, pornography, racism and self-harm – underlined a need for more transparency.

A report from the cross-party committee last month concluded social media companies, including Facebook, should face fines of tens of millions of pounds for failing to remove extremist and hate-crime material.

The Guardian has learned that, according to Facebook guidelines, some photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless there is a sadistic or celebratory element. The social media giant will also allow people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”.

Child protection experts said there should be “no grey areas” when it comes to child abuse and called for an independent regulator who could deal with extremist content online.

Cooper, who is standing as a Labour candidate for Pontefract and Castleford, said on Monday: “These files demonstrate why powerful social media companies, including Facebook, have to be more transparent… They also show why we were right to call on social media companies to urgently review their community guidelines, as too much harmful and dangerous content is getting through.

“None of this is easy, and we welcomed Facebook’s commitment a fortnight ago to hire thousands more staff to tackle the problem and bring in more safety measures.”


On Facebook’s approach to non-sexual abuse of children, Cooper said the social media giant was “getting this very wrong” and that she hoped the guidelines would be changed urgently.

Facebook does not remove such images or videos, partly to allow for the child to be identified and rescued. But Cooper said this was only likely to happen in exceptional circumstances.

“In most cases the reality of sharing vile and violent images of violence and child abuse simply perpetuates the humiliation and abuse of a child,” she said. “Images should be given to the police and removed instead. Facebook are getting this wrong and need to urgently change.

“These companies are hugely powerful and influential. They have given people a platform to do amazing and wonderful things but also dangerous and harmful things.

“Given the impact of the content decisions they make, their standards should be transparent and debated publicly, not decided behind closed doors.”

Details of the more than 100 training manuals, spreadsheets and flowcharts seen by the Guardian will fuel the global debate about the role and ethics of the social media giant.

Claire Lilley, head of child safety online at the NSPCC, said Facebook should rewrite its guidelines from scratch. She added that the company had come a long way in tackling child sexual abuse and should apply the same approach to non-sexual abuse.

Lilley said: “My question to them is why do you think a child being beaten up really brutally is any less difficult either for a child watching it to see or any less victimisation of the child who’s in the image? We want to see them taking non-sexual child abuse imagery just as seriously as they take sexual abuse imagery.”

Lilley, who said it was “shocking” to see Facebook’s rules in “black and white”, added: “They’re really judging the public mood on this the wrong way.”

In 2015, the NSPCC wrote an open letter criticising Facebook for allowing a video of a baby being dunked in a bucket of water to remain online. It was ultimately taken down by the user who posted it.

Facebook defended its decision to leave the video online by saying this allowed the child to be traced – but the NSPCC was told by law enforcement that this argument was “nonsense” because authorities can trace the origin of a video even when it is taken down.

“Facebook are not the arbiter of social norms and expectations,” Lilley said. “They shouldn’t get to decide what’s in the best interests of children or the public. If something needs to be investigated or prosecuted and the perpetrators of that crime brought to light, that’s not for Facebook to make the call. It’s for the police to make the call.”

The NSPCC advocates for algorithms to take down extremist content automatically. And Lilley said: “I’d like to see them take a step back and look at their guidelines that they hand to their moderators and look at the contradictions that are inherent in them. They need to throw them away and start again with a blank sheet of paper.”

A spokesman for Theresa May said: “The abuse that we too often see online has no place in Britain. Social media companies like Facebook need to do more about removing illegal and abusive content and implement proper community standards to keep their users safe.”

