Facebook and Instagram to remove disinformation that leads to violence
Facebook will remove posts containing misleading information if those posts can lead to physical violence. The policy applies to both Facebook and Instagram, but not to WhatsApp, presumably because WhatsApp chats are end-to-end encrypted and therefore cannot be scanned.
The policy will take effect in the coming months, Facebook told CNBC. The new rule is a response to violence in countries such as Myanmar and India, where misleading posts on Facebook reportedly played a role; Facebook hopes the measure will help prevent physical violence. Misleading posts will still be allowed if there is no link to physical violence; Facebook users may, for example, continue to deny the Holocaust, Zuckerberg said in an interview with Recode.
Under the new rule, moderators on Facebook and Instagram will work with local and international organizations to determine whether a post should be removed. It is not yet clear what the exact criteria will be or where the line will be drawn.
Facebook appears to be acknowledging its responsibility to address the consequences of misleading posts where possible, something it has been reluctant to do in recent years. Facebook also owns WhatsApp, where forwarded messages have led to dozens of lynchings in India in recent months after users spread false rumors, for example that certain people had kidnapped children. WhatsApp now labels messages to indicate that they have been forwarded.