Facebook Exec Says Technical Updates Are More Effective Than Fact Checkers To Fight Fake News

Facebook’s head of News Feed says he was “caught off guard” by public outcry over the spread of misinformation on its platform during the election.

Six months ago, Facebook was under siege.

The surprise election of Donald Trump resulted in a backlash against the company over the amount of misinformation spread on its platform during the campaign. The initial response from CEO Mark Zuckerberg was to dismiss concerns, which only generated more outcry. As BuzzFeed News reported at the time, a group of rogue employees initially took it upon themselves to begin working on ways to reduce the spread of hoaxes and misinformation on the platform. But soon the company's leadership changed its tone and began thinking about ways to attack the problem.

That job has largely fallen on the shoulders of Adam Mosseri, the VP of News Feed. He is increasingly the face of the company’s mission to rid Facebook of clickbait, false news, and what the head of the company’s news partnerships team recently referred to as “garbage” content. It’s been an ongoing process, one that’s meant a steady stream of changes to how News Feed ranks and surfaces content.

To ensure publishers aren’t surprised (meaning: unexpectedly downranked) by News Feed’s ongoing changes, Mosseri today published a blog post offering publishers “basic guideposts” for staying on the platform’s good side. He advises publishers to avoid clickbait headlines and attempts to “deliberately try and game News Feed,” while pushing them toward sharing “meaningful” and “informative” links. Those that don’t heed the advice risk being caught up in the ever-expanding News Feed dragnet.

Mosseri presents these guideposts in the context of the changes Facebook has made in the six months since the election, during which time the company has rolled out a series of updates to News Feed ranking that target clickbait headlines and links to sites riddled with junky ads and little content. It made the ability to flag false or misleading content more prominent for users, and also partnered with third-party fact-checkers in the United States and two other countries to place a warning label on content that’s deemed false.

Along with the product changes — which, given Facebook’s core as an engineering organization, are perhaps the best expression of its commitment to this work — the company has adopted a new, more humble tone in talking about the fight against misinformation and its relationship with news organizations. It also poured millions into the creation of a News Integrity Initiative.

All of this is a result of the election and its aftermath, which Mosseri acknowledges took him and others by surprise.

“I do think we were caught off guard by how much attention there was put on false news specifically,” he said in an interview with BuzzFeed News.

“There’s no way I could have expected, or any of us would have expected, that to be the main focal point of scrutiny of the election. That definitely surprised us.”

Mosseri told BuzzFeed News he believes the recent updates targeting ad farms and clickbait headlines will have the most positive impact on the quality of the links people see in their News Feed. But those updates have been overshadowed by Facebook’s partnership with third-party fact-checkers that attaches a “disputed” label to links containing false content. That program came under scrutiny earlier this week when The Guardian reported that disputed links are sometimes shared more than those that haven’t been debunked. Mosseri said Facebook’s data, which has not been shared publicly or with its fact-checking partners, shows that’s not the case. (Facebook declined to share this data with BuzzFeed News.)

“We look at what gets flagged and generally they get less distribution almost immediately,” he said. “The shape of their distribution curve changes almost immediately after they get labeled.”
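Facebook hasn’t published how that demotion works, but the effect Mosseri describes, an immediate change in a flagged link’s distribution curve, is consistent with a simple ranking penalty applied once a label lands. Here is a minimal sketch of the idea in Python; the demotion factor and every name in it are invented for illustration and are not Facebook’s actual system.

```python
# Hypothetical sketch: demoting flagged links in a ranking pass.
# Facebook has not disclosed its mechanism or numbers; the 0.2
# demotion factor and all names here are assumptions.
from dataclasses import dataclass

DISPUTED_DEMOTION = 0.2  # assumed multiplier applied once a link is flagged


@dataclass
class Story:
    url: str
    base_score: float       # score from ordinary ranking signals
    disputed: bool = False  # set when third-party fact-checkers flag the link


def ranked_feed(stories: list[Story]) -> list[Story]:
    """Order stories by score, demoting any disputed links."""
    def score(s: Story) -> float:
        return s.base_score * (DISPUTED_DEMOTION if s.disputed else 1.0)
    return sorted(stories, key=score, reverse=True)
```

A multiplicative penalty like this would produce exactly the pattern Mosseri describes: the moment the label is applied, the story’s reach drops and the shape of its distribution curve changes.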

Mosseri said the fact checking partnership often receives more attention and scrutiny because it’s easier to understand than an algorithm tweak.

“It’s much more difficult to understand or take a screenshot of all of the work we do to build ranking infrastructure and modeling … than it is to see ‘Oh, there’s a big piece of text on top of a link that says it was disputed by Snopes or PolitiFact.’”

Right now the fact-checking program is unique among Facebook’s “integrity” measures because it often elicits two contradictory takes: that it’s the biggest part of the company’s package of efforts to fight misinformation, and therefore dictates whether the overall effort is effective; or that it’s nothing more than a fig leaf meant to generate good PR. The latter view suggests the actual efficacy of the “disputed” label is irrelevant as long as it’s there for the company to point to in communications.

“I wouldn’t say it’s irrelevant — I think that we are always looking to find new ways to better identify all forms of problematic content,” Mosseri said. “So this was one thing that we found that was of incremental value.

“Honestly the work goes well, well beyond the third-party program,” he continued. “And though I think the program’s important, it’s — well, I’ll say it this way: we need to make sure this [effort to reduce false news] works in countries where there are no third-party fact-checking organizations, which by the way is a lot of countries around the world.”

One way to potentially increase the number of checkers would be for Facebook to invest money in fact-checking around the world. (Currently Facebook enlists the aid of fact-checkers in the United States, France, and the Netherlands.) Mosseri says it’s an option the company would consider, but for now the priority is technical efforts.

“There are over 2 billion things posted on Facebook a day, and maybe 400 to 500 million links posted a week,” he said. “So there’s no version of the third party fact checking program that can check everything. By necessity we always look to supplement those kind of programs or systems with technical solutions that can scale, just because we have to.”
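Taken at face value, Mosseri’s numbers make the scale argument easy to check. A back-of-envelope calculation (the size and throughput of the hypothetical fact-checking workforce are assumptions) shows why human review alone can’t keep up:

```python
# Back-of-envelope math using the figures Mosseri cites; the
# workforce size and per-checker throughput are assumptions.
links_per_week = 450_000_000  # midpoint of "400 to 500 million links a week"
checkers = 1_000              # hypothetical global fact-checking workforce
checks_per_week_each = 100    # assumed stories reviewed per checker per week

coverage = checkers * checks_per_week_each / links_per_week
print(f"Coverage: {coverage:.4%} of weekly links")  # Coverage: 0.0222% of weekly links
```

Even under those generous assumptions, human checkers would touch roughly one link in every 4,500, which is why Mosseri keeps pointing back to ranking changes that apply to everything.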

That’s not to suggest people aren’t involved. Along with the engineers who write the code that instructs the algorithms how to rank content, Mosseri said the company’s decision to make it easier for users to report misinformation has “dramatically increased” the number of people sending reports.

“We do find that a lot of people will report things as false that aren’t false — it’s just that they don’t agree with it,” he said. “But if you look at the stories that get the most reports, they are very consistently either false news or another form of really problematic content.”
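The pattern Mosseri describes, noisy individual reports that become a reliable signal in aggregate, is a standard crowd-sourcing technique. A toy sketch of how report volume might surface candidates for review follows; the threshold and function names are hypothetical, not Facebook’s pipeline.

```python
# Toy sketch: individual reports are noisy (often just disagreement),
# but heavy aggregate report volume reliably marks problematic stories.
# The threshold and names here are hypothetical.
from collections import Counter


def review_candidates(reported_urls: list[str],
                      min_reports: int = 50) -> list[tuple[str, int]]:
    """Return URLs whose report count crosses a review threshold,
    most-reported first."""
    counts = Counter(reported_urls)
    return [(url, n) for url, n in counts.most_common() if n >= min_reports]


# One entry per user report:
reports = ["example.com/hoax"] * 120 + ["example.com/news"] * 3
print(review_candidates(reports))  # [('example.com/hoax', 120)]
```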

With close to 2 billion active users sharing billions of links a month, the challenge for Facebook is to make sense of a massive amount of information and the social interactions resulting from it in order to determine what people actually want.

“We use things like clicking and liking and commenting as signals that people care about things,” Mosseri said. “But because people’s behaviours and actual interests don’t always align we also try to ask people questions. We actually run tens of thousands of surveys every day all over the world because those things aren’t always aligned.”
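Mosseri’s point that behavior and stated interests don’t always align suggests ranking can’t lean on engagement alone; survey answers have to be blended in. A toy illustration of such a blend (the weights and signal names are invented, and all inputs are assumed normalized to the same scale):

```python
# Toy sketch of blending engagement signals with survey responses.
# Weights and names are invented; inputs assumed normalized to [0, 1].
def blended_score(clicks: float, likes: float, comments: float,
                  survey_quality: float,
                  behavior_weight: float = 0.7) -> float:
    """Combine what people do (engagement) with what people say
    they value (surveys); the two don't always agree."""
    engagement = 0.5 * clicks + 0.3 * likes + 0.2 * comments
    return behavior_weight * engagement + (1 - behavior_weight) * survey_quality
```

With a blend like this, a story that racks up clicks but scores poorly in surveys (clickbait being the classic case) gets pulled down relative to its raw engagement.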

Clickbait content — wherein the headline withholds information or overpromises what’s in the actual story — is a perfect example of the contradictions highlighted by a platform like Facebook. This content can elicit strong engagement signals in the form of clicks and shares, but in surveys the same people who click on these links also say they hate clickbait.

“People click on clickbait but almost always when you ask them they say they don’t like clickbait, which just means that we’re not doing our job properly if clickbait is doing really well on our platform,” Mosseri said.
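Facebook has said its clickbait work relies on classifiers trained against labeled headlines, though the details aren’t public. For flavor, here is the simplest possible heuristic version, keyed to the two traits described above, withholding information and overpromising; the phrase lists are illustrative, not Facebook’s.

```python
# Heuristic sketch of clickbait detection; the real system is a
# trained classifier, and these phrase lists are illustrative only.
WITHHOLDING = ("you won't believe", "what happened next",
               "the reason why", "this one trick")
OVERPROMISING = ("will change your life", "mind-blowing",
                 "everything you need to know")


def looks_like_clickbait(headline: str) -> bool:
    """Flag headlines that withhold information or overpromise."""
    h = headline.lower()
    return any(phrase in h for phrase in WITHHOLDING + OVERPROMISING)


print(looks_like_clickbait("You Won't Believe What Happened Next"))  # True
print(looks_like_clickbait("Senate Passes Budget Resolution"))       # False
```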

Right now the News Feed team’s main focus is on helping ensure the quality and relevance of links shared on Facebook. But Mosseri says they’re aware that video and photos could emerge as new fronts in the battle against spammers and misinformation.

“We are more sophisticated in our understanding of text than we are at photos and video, but we are spending a lot of time and energy trying to figure out how to get better at understanding what’s in front of you in photos and videos as well,” he said.

The election and its aftermath have given Mosseri and his teams a new sense of urgency when it comes to News Feed quality and integrity. It’s also made him think about how he and Facebook can ensure they aren’t again surprised by something happening on their platform.

“One thing we care a lot about is making sure we can map out what we don’t know,” he said. “So I think that you can always make progress on the problems that you’ve identified — but I want to make sure that we don’t have any blind spots.”
