
Facebook is fighting back against fake news with PolitiFact as a partner

 
This photo combo of images provided by Facebook demonstrates some of the new measures Facebook is taking to curb the spread of fake news on its huge and influential social network. The company is focusing on the "worst of the worst" offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories that play to people's passions and preconceived notions. (Facebook via AP)
Published Dec. 16, 2016

Facebook announced changes on Thursday to curtail the spread of fake news on its online platform, making it easier for users to report potential hoaxes while highlighting independent fact-checking in news feeds.

The social media giant is partnering with fact-checkers who are part of the Poynter Institute's International Fact-Checking Network — including the Tampa Bay Times' PolitiFact — to flag bogus content, it said. Among the changes, users who try to "share" a suspect post will be alerted that the story may be inaccurate.

"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," said Adam Mosseri, Facebook's vice president for news feeds. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third-party organizations."

The announcement comes just days after PolitiFact named fake news its "Lie of the Year" because of its powerful symbolism in an election year filled with exaggerations and falsehoods.

Fake news typically consists of fabricated stories dressed up to look like credible journalistic reports and then spread online to large audiences. The makers of the phony posts are motivated by quick and easy profits from automated online advertising.

Facebook has been on the receiving end of complaints about fake news since Election Day. Most of the fake news posted during the election catered to fans of President-elect Donald Trump.

Facebook said it was selecting fact-checkers who are members of the International Fact-Checking Network, a global group hosted by the Poynter Institute in St. Petersburg. (The Poynter Institute owns the Tampa Bay Times.) The network, among other criteria, requires members to be transparent in their fact-checking process and funding.

In addition to PolitiFact, other fact-checkers participating in Facebook's pilot project are the Associated Press, Factcheck.org, Snopes and ABC News.

Facebook emphasized in its announcement that it was testing the concept and would be seeking input to refine and improve the program. Users will be able to flag content as a potential hoax, and fact-checkers will be able to see what users have flagged. After the fact-checkers publish their reports, Facebook will tag fake news as "disputed by 3rd party fact-checkers." Users will receive a warning before they share disputed content.

The company is also taking steps to discourage fake news creators from operating on Facebook by eliminating the ability to spoof domains, thereby reducing the prevalence of sites that pretend to be real news organizations.

Under the program, PolitiFact and other fact-checkers will be able to see posts that users have flagged as fake. PolitiFact will choose which reports it wants to fact-check and proceed with its rigorous reporting process. After publishing on its own site, PolitiFact will alert Facebook to the Web address of its fact-check report.

"People and organizations are getting more sophisticated in how they spread false information. So it's great news that Facebook is getting more sophisticated in its efforts to fight back," said PolitiFact executive director Aaron Sharockman. "PolitiFact, which has been fact-checking claims by politicians and pundits since 2007, is excited to be a part of this new project to help readers and voters sort out fact from fiction."

Facebook's move received mixed responses on Twitter and on Facebook itself. Some hailed it as a necessary corrective to fictitious reports, while others expressed concern that it could lead to censorship or the suppression of unpopular views.

Brendan Nyhan, a professor at Dartmouth College who studies fact-checking, said that he expected Facebook's program to reduce the number of times that fake news is shared and viewed, and that the move was overdue.

"Their actions are a belated recognition of the lucrative market in fake news they helped create," Nyhan said. "With that said, the steps Facebook announced seem potentially effective. At this point, I'd like to see how well they work."

However, he said, Facebook needs to be careful about crossing fundamental lines in public debate.

"We can agree which stories are 100 percent fake, and those should be discouraged from being shared," he said. "But I worry about Facebook policing the sharing of politically disputed content that is misleading but within the boundaries of normal political discourse. They play such an outsized role in the online news ecosystem that significant caution is required. We don't want Facebook determining what content is widely shared online if it is normal political speech."