False news is a “major concern” around the world, Facebook acknowledged yesterday, announcing changes to the ways in which it alerts users to misinformation. The social media giant will take another step in the direction of editing what it publishes, while continuing to resist the idea that it is, in fact, a publisher with editorial responsibilities.
In particular, Facebook admitted that the “disputed flags” system had a number of problems, and said it will be withdrawn:
- Although a flag indicated that fact-checkers disputed an article, finding out why required multiple extra clicks
- Disputing information could actually entrench some users’ convictions that the information is accurate
- Flagging disputed items was a slow process, because it required sign-off by two independent fact-checkers
- Only false news, not partly false or unproven assertions, got flagged
Instead, Facebook will be leaning on its Related Articles feature, which surfaces articles on the same topic from different sources, and will now include fact-checked articles. In tests, Facebook found that although providing context through Related Articles didn’t reduce clicks on hoax news items, it did reduce users’ propensity to share them.
The new initiative also removes barriers to finding out the reasons a news item is disputed, doesn’t require the involvement of multiple fact-checkers, and isn’t restricted to news items that are outright false. Facebook will continue to send notifications to users who choose to share articles disputed by fact-checkers, using “language that is unbiased and non-judgmental [to help] build products that speak to people with diverse perspectives.”