Facebook users will be able to flag fake news articles. (Photo: Facebook)
SAN FRANCISCO — Facebook is taking steps to weed out fake news and hoaxes, addressing the growing controversy over its role in the spread of misinformation on the Internet that sharpened political divisions and inflamed discourse during the presidential election.
The giant social network said Thursday it plans to make it easier to report a hoax and for fact-checking organizations to flag fake articles. It's also removing financial incentives for spammers and plans to pay closer attention to other signals, such as which articles Facebook users read but then don't share. Last month, Facebook barred fake news sites from using its ad-selling services.
"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," Adam Mosseri, vice president of product management, said in a blog post.
Facebook users can also alert their friends to articles they think are fake. (Photo: Facebook)
Facebook took heat after the election for not doing enough to remove fake news reports, such as a widely shared but erroneous article claiming Pope Francis had endorsed Donald Trump. Some 170 million people in North America use Facebook every day. Nearly half of all adults in the U.S. say they get their news from Facebook.
Facebook has been reluctant to put itself in the position of judging which content is misleading, resisting arguments that it has become a de facto news publisher, exercising editorial judgment with the power to sway the minds of billions.
BuzzFeed News found that people who say they rely on Facebook as a major source of news were more likely to believe politically slanted fake news stories. An earlier BuzzFeed News analysis found that top-performing fake news articles on the election generated more engagement on Facebook than articles from major news outlets in the last months of the presidential campaign.
Fake news creates significant public confusion about current events, with nearly one-fourth of Americans saying they have shared a fake news story, according to a Pew Research Center survey.
While saying it was "extremely unlikely" that phony stories shared on Facebook changed the election outcome, Facebook CEO Mark Zuckerberg said last month that work had begun to help the nearly 1.8 billion users of the social-media service "flag fake news and hoaxes."
"Our goal is to show people the content they will find most meaningful, and people want accurate news," Zuckerberg wrote on his Facebook page.
CUNY journalism professor Jeff Jarvis says he's pleased to see Facebook take the scourge of fake news seriously. There's more that Facebook will need to do to eradicate fake news from the platform but, he says, it's "headed in the right direction."
Facebook is reaching out to users in those crucial moments when they are deciding whether to share an article, Jarvis says. That will improve the experience of Facebook, with fewer opportunities to be fooled by fake news and fewer fake articles flowing through the News Feed.
"It's a fairly crappy experience to go in and see ridiculously stupid lies in your feed. Some people say people want to believe what they want to believe. I'm not so cynical about mankind. I actually believe that people want to be correct given the opportunity to be correct," Jarvis said.
An article whose veracity is being disputed by fact checkers will carry a warning label. (Photo: Facebook)
When Facebook users try to share an article whose veracity has been disputed by fact checkers, a message will pop up. (Photo: Facebook)
Facebook says it will now make it easier to report fake news. "We've relied heavily on our community for help on this issue, and this can help us detect more fake news," Mosseri said. To flag a fake news article, users will be able to click on the upper right-hand corner of a post.
News articles flagged by users will be sent to third-party fact-checking organizations that are part of Poynter's International Fact-Checking Network, Facebook says. If an article is identified as fake by those organizations, it will be flagged as "disputed," with a link to an article explaining why. Stories that have been disputed will also get pushed down in News Feed.
"It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share," Mosseri said. Once the article is flagged, it cannot be turned into an ad and promoted.
Facebook says that when users read an article but don't share it, that pattern may indicate the article is misleading, so it will pay closer attention to the signal. "We're going to test incorporating this signal into ranking, specifically for articles that are outliers," Mosseri said.
And Facebook says it's going to remove the financial incentives for spammers. Fake news sites lure people from Facebook to show them ads. Facebook says it has eliminated the ability to spoof Web domains so that fake news sites can no longer masquerade as real publications. It says it's also analyzing publisher sites for misleading or deceptive content and ramping up enforcement.
That's welcome news to Lisel Laslie, a 48-year-old mother of two and avid Facebook user from Tallahassee, Fla. Over the summer, bogus articles trashing both presidential candidates began flooding her News Feed, and she began posting less frequently and spending less time on Facebook. Even now that the election is over, Laslie says she's reluctant to share any articles on Facebook. It's too much work figuring out what's real.
"It's like I am an investigative reporter and I have to check eight sources before sharing anything," Laslie said.