Facebook on Fact-Checking

Facebook responds to the Guardian on Fact-Checking

Facebook’s Newsroom responded to an article in the Guardian on the effectiveness of its fact-checking. Both the article and Facebook’s response are worth a read:

Sam Levin’s December 13th, 2018 article in the Guardian, ‘They don’t care’: Facebook factchecking in disarray as journalists push to cut ties. Some highlights:
– Facebook began building its partnerships with news outlets with the goal of relying on journalists to flag false news and limit its spread, but research and anecdotal evidence have repeatedly suggested that the debunking work has struggled to make a difference.
– Facebook has over 40 media partners across the globe and has said false news on the platform is “trending downward.”
– Some factcheckers believe this collaboration has produced minimal results – “They’ve essentially used us for crisis PR.”

Facebook responded the same day, December 13th, 2018, with its post Fact-Check on Fact-Checking. Some highlights:
– Facebook’s process: (1) machine learning queues potentially false news for third-party fact-checkers; (2) fact-checkers go through the queue and choose what to fact-check; (3) as soon as something is rated “false,” it is automatically de-prioritized in News Feed, and where it does appear, Facebook shows Related Articles, including the fact-checker’s article, below it. These processes are automated. (A simplified sketch of this flow appears after this list.)
– Labeling something false can reduce future impressions of that content by about 80%. Facebook also de-prioritizes all content from actors who repeatedly get “false” ratings on content they share, and removes their advertising and monetization rights.
– Misinformation is an ever-evolving problem. Facebook states that it is committed to fighting it globally, and that the work third-party fact-checkers do to help review content on Facebook is a valued and important piece of this effort.
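
To make the three-step process in the bullets above concrete, the sketch below walks through the same flow in Python. It is only an illustrative approximation under stated assumptions, not Facebook’s actual systems or code: the names (Post, FactCheckPipeline, flag_potentially_false, apply_rating, feed_weight), the 0.5 queueing threshold, the repeat-offender cutoff of three “false” ratings, and the 0.2 demotion factor (derived from the “about 80% fewer impressions” figure) are all hypothetical.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Hypothetical sketch of the flow described above; all names, thresholds,
    # and numbers are illustrative assumptions, not Facebook's actual code.

    FALSE_RATING_DEMOTION = 0.2      # stands in for the "about 80% fewer impressions" figure
    REPEAT_OFFENDER_THRESHOLD = 3    # assumed cutoff for "repeatedly" sharing false content


    @dataclass
    class Post:
        author: str
        text: str
        rating: Optional[str] = None                 # set by a third-party fact-checker
        related_articles: List[str] = field(default_factory=list)


    class FactCheckPipeline:
        def __init__(self) -> None:
            self.review_queue: List[Post] = []       # (1) ML-flagged posts awaiting review
            self.false_counts: dict = {}             # per-author count of "false" ratings

        def flag_potentially_false(self, post: Post, model_score: float) -> None:
            """(1) Machine learning queues potentially false news for fact-checkers."""
            if model_score > 0.5:                    # assumed threshold
                self.review_queue.append(post)

        def apply_rating(self, post: Post, rating: str, fact_check_url: str) -> None:
            """(2)/(3) A fact-checker rates an item; a "false" rating attaches the
            fact-checker's article as a Related Article and counts against the author."""
            post.rating = rating
            if rating == "false":
                post.related_articles.append(fact_check_url)
                self.false_counts[post.author] = self.false_counts.get(post.author, 0) + 1

        def feed_weight(self, post: Post, base_weight: float = 1.0) -> float:
            """(3) De-prioritize rated-false content; repeat offenders have all of
            their content demoted (and, per Facebook, lose ad/monetization rights)."""
            weight = base_weight
            if post.rating == "false":
                weight *= FALSE_RATING_DEMOTION
            if self.false_counts.get(post.author, 0) >= REPEAT_OFFENDER_THRESHOLD:
                weight *= FALSE_RATING_DEMOTION
            return weight

In this toy version, a single “false” rating cuts a post’s ranking weight to 20% of its baseline, mirroring the roughly 80% drop in impressions Facebook cites, and authors who cross the assumed repeat-offender threshold see everything they share demoted as well.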

IPA Permalink: http://information-professionals.org/facebook-on-fact-checking/