Social media is a tool for becoming and staying informed. Despite its roots in benign posts from friends about parties and babies on the way, social media has grown into a place where news of every kind is freely shared, including balanced and fair reporting. But because anyone can share anything on the internet, it also spreads misinformation.
Facebook is constantly in the throes of battle with what many call “fake news,” and now the massive social media platform is updating its tools to better balance the playing, er, sharing field.
In an update to its Related Articles rollout, Facebook will now send more articles to third-party fact-checkers to determine whether they are hoaxes, then share the findings below the original article.
Machine learning will flag potentially fake or grossly misinformed articles and send them to a group of fact-checkers, who can then confirm or dispute the information they contain. Any findings, positive or negative, may then be posted below the original piece in a user's News Feed to peruse at their leisure.
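The flow described above — a model flags suspect articles, human fact-checkers record a verdict, and the verdict is attached below the post — can be sketched in a few lines of Python. Everything here (names, the scoring threshold, the verdict labels) is illustrative, not Facebook's actual system:

```python
# Hypothetical sketch of a flag-and-review pipeline: an ML suspicion score
# routes articles to a fact-checking queue; checkers' findings and related
# coverage are then attached to the original post.
from dataclasses import dataclass, field
from typing import List, Optional

FLAG_THRESHOLD = 0.8  # assumed cutoff for "potentially false"


@dataclass
class Article:
    url: str
    suspicion_score: float  # output of a (hypothetical) classifier
    fact_check_verdict: Optional[str] = None
    related_articles: List[str] = field(default_factory=list)


def triage(feed: List[Article]) -> List[Article]:
    """Route high-suspicion articles to the fact-checking queue."""
    return [a for a in feed if a.suspicion_score >= FLAG_THRESHOLD]


def attach_findings(article: Article, verdict: str, related: List[str]) -> None:
    """Record the fact-checkers' verdict and related coverage on the post."""
    article.fact_check_verdict = verdict
    article.related_articles = related


feed = [Article("example.com/cure", 0.93), Article("example.com/local", 0.12)]
queue = triage(feed)  # only the high-scoring article is flagged
attach_findings(queue[0], "disputed", ["factcheck.example/cure-review"])
```

The low-scoring article is left untouched, which mirrors the point in the text: only content the model deems suspicious ever reaches human reviewers.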
“Since starting this test, we’ve heard that Related Articles helps give people more perspectives and additional information, and helps them determine whether the news they are reading is misleading or false,” wrote Sara Su, product manager for News Feed, in the update. “So we’re rolling this out more broadly.”
An example given in the Facebook newsroom shows someone posting an article about a recent medical advancement. Below the article are other takes on the same advancement from different publications, prompting readers to widen their views and, ideally, fact-check things themselves.
Many believe this rollout was meant to battle fake political stories. Those on the extreme left or right often exist in an echo chamber, only reading and sharing information they already agree with. The rollout of Related Articles could help introduce partisans to discussions they would otherwise never encounter.
This update isn’t free from its own problems. Even though these balanced articles may be right there to click and read, many people will probably gloss right over them. Facebook is trying to do its part by making them accessible, but it takes two to stay informed.
Another possible conflict lies in what Facebook’s third-party fact-checkers deem appropriate and informative. Everyone has some form of bias, and regardless of machine learning or human interaction, things will slip through or be unnecessarily censored. Facebook has already seen this firsthand: the platform repeatedly reworked its Trending section amid accusations of liberal favouritism and fake stories becoming worldwide trends.
Either way you post it, Facebook is trying to create an informed community and appears to be on the right track. Teaching machines to recognize and react to news dissemination will be difficult, especially if politics and countries divide even further, but it is a mission worth undertaking.