Facebook adjusts ‘fake news’ policy after unwanted effects


Facebook is changing the way it handles fake news. Links that fact-checkers have judged to be fake news will no longer get red flags; instead, they will be displayed smaller. It previously emerged that the red flags had undesirable effects.

The social networking site unveiled its new policy at the Fighting Abuse @Scale event last week. In one of the sessions, Facebook employees discussed a change in how the company handles fake news, TechCrunch reports. Initially, the policy was to place red flags on links to articles that fact-checkers had verified to be fake news. However, that did not reduce the circulation of such articles on Facebook; it had the opposite effect.

Because the desired effect was not achieved, Facebook is now taking a different approach to fake news. The red flags give way to other visual cues. For example, the social networking site wants to make it easier to scroll past fake news unnoticed, which it will try to achieve by shrinking links to such articles. Facebook also wants to ensure that the related content shown alongside these links points to reliable articles as much as possible.
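As a rough illustration of how such a policy could look in a feed-ranking pipeline, the sketch below demotes and shrinks a flagged link and attaches reliable related articles. All names, weights, and fields here are illustrative assumptions; Facebook has not published its implementation.

```python
from dataclasses import dataclass, field

# Hypothetical feed item; field names are illustrative, not Facebook's schema.
@dataclass
class FeedItem:
    url: str
    base_score: float                 # relevance score from normal ranking
    flagged_false: bool = False       # set when fact-checkers rated the link false
    display_scale: float = 1.0        # 1.0 = normal link-preview size
    related_links: list = field(default_factory=list)

DEMOTION_FACTOR = 0.2   # assumed: flagged items keep 20% of their ranking score
SHRINK_SCALE = 0.5      # assumed: flagged link previews render at half size

def apply_fact_check_policy(item: FeedItem, reliable_links: list) -> FeedItem:
    """Demote and shrink a flagged link, and attach reliable related articles."""
    if item.flagged_false:
        item.base_score *= DEMOTION_FACTOR
        item.display_scale = SHRINK_SCALE
        item.related_links = reliable_links
    return item

item = apply_fact_check_policy(
    FeedItem(url="example.com/story", base_score=8.0, flagged_false=True),
    reliable_links=["example.com/fact-check"],
)
print(item.base_score, item.display_scale)  # 1.6 0.5
```

The design idea is that a demoted score and a smaller preview make the link easier to scroll past, while the attached related links steer readers toward better coverage.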

At the same time, Facebook wants to detect fake news earlier using machine-learning algorithms. These algorithms work in tandem with user reports to push potentially fake news to the top of the priority list of the fact-checkers Facebook works with, who can then verify whether it is actually fake news.
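One plausible way to combine those two signals is a weighted score feeding a priority queue for fact-checkers. The sketch below does exactly that; the weights, the scoring formula, and the function names are assumptions for illustration, not Facebook's published method.

```python
import heapq

# Illustrative weights; Facebook has not disclosed its actual formula.
MODEL_WEIGHT = 0.7
REPORT_WEIGHT = 0.3

def priority(model_score: float, user_reports: int, max_reports: int = 100) -> float:
    """Combine a classifier's fake-news probability with user-report volume."""
    report_signal = min(user_reports, max_reports) / max_reports
    return MODEL_WEIGHT * model_score + REPORT_WEIGHT * report_signal

def build_queue(candidates):
    """Build a max-priority queue of (priority, url) for fact-checkers.

    `candidates` is an iterable of (url, model_score, user_reports).
    heapq is a min-heap, so priorities are negated to pop the highest first.
    """
    heap = []
    for url, score, reports in candidates:
        heapq.heappush(heap, (-priority(score, reports), url))
    return heap

queue = build_queue([
    ("example.com/a", 0.92, 40),   # strong model signal, many reports
    ("example.com/b", 0.35, 5),    # weak signal, few reports
])
print(heapq.heappop(queue))  # highest-priority candidate is reviewed first
```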

Facebook has recently faced heavy criticism of its policies. The platform has been used in several attempts to influence public opinion, including efforts to sway the outcome of the American elections, as in the Cambridge Analytica scandal. Facebook CEO Mark Zuckerberg has since apologized.
