Facebook Business: What Content Facebook Deletes and What You Have to Delete Yourself
Memes, cat pictures and private photos are allowed to stay; you can’t call someone an egghead, loser or moron; but comparing a minaret to a penis is okay as long as Islam isn’t mentioned. Training documents provide some surprising insights into the work of Facebook’s content moderators. Here is what Facebook deletes, what it doesn’t, and why you should apply tougher standards to your own social media accounts...
Content Moderation: Clean Websites Require a Lot of Work
Anyone who runs their own website or Facebook page knows how much work it takes to keep it clean and professional. Sometimes you have to make difficult decisions, especially when it comes to content moderation. While on your own website you can choose whether to allow comments and posts from third parties at all, Facebook enables these forms of interaction by default. That makes sense, because interaction is what social networks are all about. To ensure that browsing Facebook remains safe, the company employs so-called content moderators who decide which posts may remain on the platform and which are deleted for violating the guidelines.
Deleting Posts: Facebook Relies on AI
However, before one of the approximately 15,000 moderators receives a post for review, it is checked by Facebook's AI (artificial intelligence). Given the countless posts published on the platform every day, the workload would otherwise be impossible to manage. As soon as a user or an AI-based content filter classifies a post as problematic, the system flags it.
Flagged posts are then analysed in chronological order by another AI system and, in clear cases, deleted; only when there are doubts are they passed on to humans. In itself, it’s a good system that is constantly being developed further. For example, the social network announced that in future it would no longer process flagged posts in the order they were received, but by relevance. Facebook didn’t reveal which factors the new AI uses, but posts with particularly incendiary content or a particularly large reach are given higher priority.
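To picture how relevance-based prioritisation might work in practice, here is a minimal sketch in Python of a review queue that ranks flagged posts by a simple severity-times-reach score instead of arrival time. The scoring factors, weights and function names are purely illustrative assumptions; Facebook hasn’t disclosed how its system actually ranks posts.

```python
import heapq

# Queue of flagged posts, ordered by an illustrative "urgency" score
# instead of arrival time. heapq is a min-heap, so we store the negative
# score to pop the most urgent post first.
review_queue = []

def flag_post(post_id, severity, reach):
    """Queue a post flagged by users or an automated filter.

    severity: 0..1 score from a (hypothetical) content classifier
    reach: number of users who have seen the post
    """
    urgency = severity * reach  # assumed ranking factors, for illustration only
    heapq.heappush(review_queue, (-urgency, post_id))

def next_post_for_review():
    """Hand the most urgent flagged post to a human moderator."""
    if review_queue:
        return heapq.heappop(review_queue)[1]
    return None

# Example: the incendiary, widely shared post jumps the queue.
flag_post("cat-meme", severity=0.1, reach=500)
flag_post("incitement", severity=0.9, reach=20000)
print(next_post_for_review())  # prints: incitement
```

The point is not the exact formula but the principle: whatever is most likely to cause harm is shown to a human moderator first.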
Become a Professional Content Moderator with 80 Hours of Training?
For the content moderators, this means that they only work on the problem cases the AI can’t handle. If you think that such a responsible job requires years of training, you are wrong: according to Facebook, 80 hours of training with internal materials explaining how to apply its policies to specific posts is enough. However, if you take a closer look at those documents, as Christian Erxleben, editor-in-chief of Basic Thinking, did, you might be surprised ...
Delete Posts: Between Freedom of Expression and Discrimination
According to the documents, disparaging terms for ethnic groups have no place on Facebook. But a caricature of a minaret in the shape of a penis is protected as freedom of expression, as long as the caption doesn’t make derogatory comments about Muslims or Islam. It’s hard to understand where the line is drawn here.
The Final Supervisory Authority: On your Business Page your Rules Apply
Christian Erxleben’s article ends with a terrifying insight: Facebook has created a network that it can no longer control. The fact that Facebook doesn’t delete something doesn’t mean it isn’t discriminatory or offensive. If questionable posts land on your Facebook business page, you as the operator share responsibility. If a user’s post violates applicable law, you become liable for it and the associated damages, just as you would on your own website. Since by providing the page you created the conditions for the infringement in the first place, you are partly to blame in the eyes of the legislator. The solution to the problem is obvious: your Facebook business page also needs content moderation.
Tips for Content Moderation on Your Own Facebook Business Page
In principle, the same rules apply on Facebook as on the rest of the internet. This means that content that violates applicable law must be deleted from your own page straight away, i.e. immediately after it becomes known. Typical examples include the following posts (a simple automated pre-screen is sketched after the list):
- Copyright infringement (also in the form of memes or quotations)
- Hate speech, agitation and insults
- Discrimination
- Data protection violations (including customer inquiries in which users post their own email addresses or telephone numbers)
- Links to unsafe websites (malware, phishing, etc.)
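If your page receives a lot of posts, a rough automated pre-screen can help you spot the most common problem cases before you check them manually. The following Python sketch flags posts containing email addresses, phone numbers or links to domains outside an allow list; the patterns, the allow list and the function name are illustrative assumptions, not a complete legal filter, and flagged posts still need a human decision.

```python
import re

# Rough patterns for personal data that users sometimes post publicly,
# e.g. in customer enquiries - illustrative, not exhaustive.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d\s/().-]{6,}\d")
URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

# Hypothetical allow list of domains you consider safe on your page.
TRUSTED_DOMAINS = {"example.com", "facebook.com"}

def screen_post(text):
    """Return a list of reasons why a post should be reviewed by a human."""
    reasons = []
    if EMAIL_RE.search(text):
        reasons.append("contains an email address (data protection)")
    if PHONE_RE.search(text):
        reasons.append("contains a phone number (data protection)")
    for domain in URL_RE.findall(text):
        host = domain.lower()
        if host.startswith("www."):
            host = host[4:]
        if host not in TRUSTED_DOMAINS:
            reasons.append("links to an unchecked domain: " + domain)
    return reasons

# Example: a customer enquiry with personal data is flagged for review.
print(screen_post("Please call me on +49 821 123456 or mail jane@mail.com"))
```

A script like this won’t catch hate speech or copyright infringements, but it reliably surfaces the data protection cases that users create themselves, such as posting their own contact details in a public comment.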
Strong Professional Indemnity Insurance Also Covers Your Social Media Accounts
With so many regulations and decisions, it’s often difficult to keep track of things. That’s why insurance for your social media presence is included in Professional Indemnity Insurance through exali.com, along with your other professional risks. If a claim is made against you because of your social media activity, the insurer’s claims specialists will investigate it for you: justified claims are paid, excessive claims are reduced and unfounded claims are rejected, all at the insurer’s expense.
And you can take out exali Professional Indemnity Insurance completely online in just five minutes. If you have any questions about the ideal insurance product for your profession, please contact the exali insurance experts. They look forward to your call and can be reached without going through a call centre or waiting queue.
Online Editor
Daniela has been working in (online) editing, social media and online marketing since 2008. At exali, she focuses in particular on the following topics: risks posed by digital platforms and social media, cyber dangers for freelancers and IT risk coverage.
In addition to her work as an online editor at exali, she works as a freelance editor and therefore knows the challenges of self-employment from her own experience.