Continuing from yesterday’s blog on Anti-Fake News Laws…
When it comes to Fake News, some governments are acting, but one component we left out of yesterday’s blog (read here) is the companies facing a barrage of criticism for allowing sham articles to circulate and flourish by way of their sites, and how they are helping to fix the problem.
Facebook has been working through various ideas for containing the crisis engulfing its reputation. One was the Disputed Flag, a tool introduced a year ago to make it easier for Facebook users to identify hoax articles in their News Feeds. Though the idea seemed promising, the company recently found that the red Disputed Flag icon could send a different message to users, possibly increasing the spread of inaccurate news. The team at Facebook believes the red flag can entrench a reader’s view of the content through reverse psychology, in effect promoting Fake News, even popularizing it.

Instead, Facebook is turning to a key component of meaningful journalism: working with third-party fact-checking groups to vet stories. Red flags are being replaced with a “Related Articles” section, which presents reports from other news outlets so users can immediately fact-check stories that appear in their feed. For instance, if an article is published and shared in your Facebook feed, the related articles will show whether it comes from the news source it is normally linked to; if it does not, it can be considered a fake. The feature has notably been around since 2013, but tests to expand this section began in April, and more on how Facebook will use the new tool is expected in the future.
Some influential people say the spread of such stories on Facebook tipped the election, a charge Facebook’s CEO Mark Zuckerberg initially dismissed as “a crazy idea.” Regardless of its actual influence, Fake News is a problem that many, like Facebook, will continue to battle.