Google has added fact checking, and now it's time for Facebook to move against fake stories
Facebook has stepped into the role of being today’s newspaper: that is, it’s a single destination where a large selection of news articles are displayed to those who visit its site. Yes, they appear amidst personal photos, videos, status updates, and ads, but Facebook is still the place where nearly half of American adults get their news, and we Filipinos spend a great deal of time checking our Facebook feeds.
Google yesterday announced it will introduce a fact check tag on Google News in order to display articles that contain factual information next to trending news items. Now it’s time for Facebook to take fact-checking more seriously, too.
Facebook has a responsibility to do better, then, when it comes to informing this audience what is actually news: what is fact-checked, reported, vetted, legitimate news, as opposed to a rumor, hoax or conspiracy theory.
It’s not okay that Facebook fired its news editors in an effort to appear impartial, deferring only to its algorithms to inform readers what’s trending on the site. Since then, the site has repeatedly trended fake news stories, according to a Washington Post report released earlier this week.
The news organization tracked every news story that trended across four accounts during the workday from August 31 to September 22, and found that Facebook trended five stories that were either “indisputably fake” or “profoundly inaccurate.” It also regularly featured press releases, blog posts, and links to online stores, like iTunes – in other words, trends that didn’t point to news sites.
Facebook claimed in September that it would roll out technology that would combat fake stories in its Trending topics, but clearly that has not yet come to pass – or the technology isn’t up to the task at hand.
In any event, Facebook needs to do better.
It’s not enough for the company to merely reduce the visibility of obvious hoaxes from its News Feed – not when so much of the content that circulates on the site is posted by people – your friends and family – right on their profiles, which you visit directly.
Outside of Trending, Facebook continues to be filled with inaccurate, poorly sourced, or outright fake news stories, rumors, and hoaxes. Maybe you’re seeing fewer of them in the News Feed, but there’s nothing to prevent a crazy friend from commenting on your post with a link to a well-known hoax site, as if it’s news. There’s no tag or label. They get to pretend they’re sharing facts.
There is a difference between a post that’s based on fact-checked articles, and a post from a website funded by an advocacy group. There’s a difference between Politifact and some guy’s personal blog. Facebook displays them both equally, though: here’s a headline, a photo, some summary text.
Of course, it would be a difficult job for a company that only wants to focus on social networking and selling ads to get into the media business – that’s why Facebook loudly proclaims it’s “not a media company.”
Except that it is one. It’s serving that role, whether it wants to or not.
Google at least has stepped up to the plate and is trying to find a solution. Now it’s Facebook’s turn.