Facebook’s ‘Community Standards’ have been a popular topic of discussion for some time now, so whilst I wasn’t surprised by today’s BBC headline, “Facebook failed to remove sexualised images of children”, I was of course disappointed.
It is clear that whether a post or page gets taken down depends on who at Facebook receives the report, but this really should not be the case. We know that Facebook has abundant funds, which it should obviously (in my mind at least) be using to fully train staff and to ensure that every individual who processes reports does so in exactly the same way, to exactly the same standards. If the people reporting the posts know that this is content not suitable for Facebook – and, let’s be honest, often content not suitable anywhere for human viewing – then how is it that the staff at Facebook do not?
Facebook says that it has improved its system since an investigation by the BBC last year which found that "secret" groups were being used by paedophiles to meet and swap images.
Of the 100 images reported, a mere 18 were taken down. According to Facebook's automated replies, the other 82 did not breach community standards.
Facebook's rules prohibit convicted sex offenders from having accounts, yet the BBC found five convicted paedophiles with profiles and reported them to Facebook; none of the accounts were taken down.
The NSPCC also voiced concern.
"Facebook's failure to remove illegal content from its website is appalling and violates the agreements they have in place to protect children. It also raises the question of what content they consider to be inappropriate and dangerous to children," said a spokeswoman.
Facebook later responded:
"We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards. This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures."
It is clear that this is a ‘special case’, and that this content would not have been reviewed again were it not for the BBC’s involvement.
As Facebook users, I consider it our duty to keep reporting such content for review by Facebook’s moderators, and I will personally be doing my bit. I just hope I can count on others to do the same.