The social network has been accused of helping to spur violence in Myanmar, Sri Lanka and India, including lynchings of several people, through its role in spreading misinformation. In recent days, Facebook executives, including founder Mark Zuckerberg, have been adamantly defending the company's policy of letting fake news remain on the platform. On Wednesday (July 18), however, the company announced that it will start to take down some fake news—specifically, content that sparks violence and results in physical harm. The policy covers both written posts and manipulated images and videos.
Reporter Olivia Solon tweeted, “At a @facebook Q&A about misinformation. They have announced they’ll delete misinformation that causes real-world harm, but no clarity on what constitutes real-world harm or whether they’ll do this in the US”.
A Facebook spokesperson said, “Reducing the distribution of misinformation—rather than removing it outright—strikes the right balance between free expression and a safe and authentic community. There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down.”
Currently, Facebook bans content that directly calls for violence. The new policy will extend that ban to fake news that has the potential to incite physical harm.