Twitter is a growing source of news, but its Trending Topics algorithm remains mysterious. Photograph: Kacper Pempel/Reuters

Strictly algorithm: how news finds people in the Facebook and Twitter age

Many people assume important news will find them online, but don’t understand the algorithms defining how

From Spotify and Netflix recommendations to Facebook’s news feed or Google’s search results, algorithms play a pivotal role in our media consumption.

But what does this mean for journalism, and democracy in general? That was the focus for one of the sparkiest sessions at the SXSW conference in Austin this weekend.

Gilad Lotan from startup incubator Betaworks and Kelly McBride from journalism school The Poynter Institute chewed over the question of how algorithms can help us understand the world better, or distort our perceptions of reality.

“I really think journalism ethics right now are really concerned with whether democracy can function,” said McBride.

“When we talk about democracy, we’re really talking about the marketplace of ideas, and whether your idea can surface in a marketplace of ideas,” she said, harking back to the physical Agora gathering places in ancient Greek cities.

“This is why democracy worked in Greece, because this space existed. We take the marketplace of ideas for granted, because we live in a place where it functions for the most part,” she said.

“We don’t think about it. If you lived in Egypt, you would think about it a lot more, where you’re not allowed to express certain kind of political ideas, even now.”

McBride claimed that in the 20th century, the marketplace of ideas was the professional press, complete with gatekeepers to those ideas in the form of journalists. “You either had to be an editor, or had to have access to an editor, or once television came along you had to have access to the means of production,” she said.

“The modern marketplace of ideas has completely changed in just the last six or seven years. You can be the first one to publish information,” she said, referring to the famous first photograph of the plane that landed on New York’s Hudson River, as well as to blog posts that have gone viral.

In theory, then, we’re in a time when anyone can have an idea, publish it and have it float up to be encountered and considered by the wider population, without the permission of those 20th-century gatekeepers.

The challenge: the modern marketplace of ideas is “a very noisy place: so noisy that you yourself don’t get to just sort through all of the ideas,” said McBride.

“If you look at the research on how people get their news now, you often hear this phrase: ‘If news is important, news will find me’ – particularly for millennials. But behind that statement is something really important: if news is going to find you, it’s going to find you because of an algorithm.”

This was the basis for McBride and Lotan’s talk: the step-by-step calculations whirring away in the background on Facebook, Google and big news websites are what define the ideas (or stories) that people encounter.

“When we think about these algorithms… we have to think about the power that they encode. Effectively it’s power to draw people’s attention,” said Lotan.

“You can’t expect people to be attentive at a certain point in time to read your article… These algorithms’ embedded power is that they can draw attention, and they can attain attention, and we don’t exactly know how they work.”

McBride and Lotan's SXSW session played to a packed room. Photograph: Stuart Dredge/The Guardian

He also criticised the often-held notion that these algorithms are neutral, noting that engineers have to make a number of choices when building algorithms, whether it’s the EdgeRank system that defines what stories are displayed in people’s Facebook news feeds, or those pulling together the ‘10 hottest’ lists on news sites.
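
To make those choices concrete, here is a minimal sketch of the kind of feed-ranking calculation EdgeRank was publicly described as performing around 2010: each interaction (“edge”) on a story contributes affinity × weight × time decay, and the story’s score is the sum. The specific weights and decay formula below are invented for illustration; Facebook never published the real constants.

```python
import time

def edgerank_score(edges, now=None):
    """Toy news-feed score in the spirit of Facebook's EdgeRank, as it
    was publicly described: each interaction ('edge') on a story adds
    affinity * weight * time decay. All constants here are invented."""
    now = now or time.time()
    # Deciding that a share outweighs a comment, which outweighs a like,
    # is exactly the kind of engineering judgement the panel discussed.
    edge_weights = {"like": 1.0, "comment": 4.0, "share": 6.0}
    score = 0.0
    for edge in edges:
        affinity = edge["affinity"]           # closeness of viewer and poster, 0..1
        weight = edge_weights[edge["type"]]
        age_hours = (now - edge["timestamp"]) / 3600.0
        decay = 1.0 / (1.0 + age_hours)       # newer interactions count for more
        score += affinity * weight * decay
    return score
```

Nudge any of those numbers and every user’s feed reorders itself: that is the “embedded power” Lotan describes.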

“You’re constantly tweaking these systems to generate this top content or these lists, but you’re doing it and relying on your intuition as an engineer, because the results look good, or because people are clicking on it, so it must be ok,” he said.

McBride continued: “There is this idea that is falsely propagated that we’re in a superior market of ideas because the algorithms are neutral. They’re not neutral: they’re all based on these mathematical judgements that the engineers have made behind the algorithm.”

By way of an example, Lotan talked about Twitter’s Trending Topics list, which in its early days was dominated by Justin Bieber thanks to his enormously active fanbase on the social network. In response, Twitter’s engineers tweaked the algorithm to normalise the data used to draw up the list.

“They normalised content… meaning it was much harder for Justin Bieber to trend,” he said. “Until he gets arrested in Miami and has to pee in front of a camera, and the video is then released,” added McBride. “Then Justin Bieber will trend!”

The pair talked about the weight given to spikes in activity around specific topics, noting that events like a Kardashian wedding will always bubble up higher than steadier, longer-term stories, even if they feel just as popular to the people involved in them.

For example: the reason the #occupywallstreet hashtag had problems trending on Twitter in 2011 wasn’t because Twitter was censoring it, but because it wasn’t quite “spiky” enough for the Trending Topics algorithm.

“If you ask what should constitute a trend, one thing journalism companies are looking at is what is spiking in the marketplace,” said McBride. “Things that spike, like Kim Kardashian’s wedding, will always take precedence over any conversation that’s long-term,” added Lotan.
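
A minimal sketch makes the normalisation point concrete. Assume, purely for illustration, that a topic’s trend score is its current volume divided by its own historical baseline; Twitter has never published its actual formula, but a score of this shape produces exactly the behaviour described above.

```python
def trend_score(current_volume, history, epsilon=1.0):
    """Toy 'normalised' trending score: a topic trends on how sharply it
    exceeds its own baseline, not on raw volume. The formula is an
    assumption for illustration only."""
    baseline = sum(history) / len(history)
    return current_volume / (baseline + epsilon)

# Justin Bieber: enormous but steady volume -> low score after normalisation
print(trend_score(100_000, [95_000] * 24))   # ~1.05, not trending

# A sudden arrest video: volume far above the baseline -> high score
print(trend_score(100_000, [2_000] * 24))    # ~50, trending

# #occupywallstreet: large but slowly growing volume never 'spikes'
print(trend_score(30_000, [28_000] * 24))    # ~1.07, not trending
```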

McBride suggested that these algorithms are having a direct influence on news values, as many journalists and outlets in turn give more priority to spiky events because they can see the greater amount of traffic coming in. The algorithms can call the tune for their editorial decisions.

“The marketplace of ideas actually has a real marketplace effect to it, because journalism companies for the most part are economic engines: they have to make money,” said McBride. “The journalism companies are responding to these algorithms because these algorithms have an effect on the economics of the journalism companies.”

Problems? Maybe. McBride noted that in the 20th century marketplace of ideas, the gatekeepers were editors, who tended to be white middle-class men reinforcing their own ideas about what stories and ideas were important – a problem, initially, for areas like the civil rights and women’s rights movements.

The question now is whether it’s white, middle-class software engineers whose decisions when building their algorithms are having unintended consequences on the kind of news that finds an audience.

“These people are not badly intentioned… it’s not like they have no concern at all for democracy, the same as those editors who controlled the marketplace in the 20th century weren’t badly intentioned people,” said McBride. “But we know there were unintentional consequences from letting them control the marketplace of ideas.”

Ideas aren’t solely the property of journalists, though. Lotan and McBride talked about Facebook’s recent “Look Back” videos, constructed by an algorithm to show individual Facebook users their key moments from their time on the social network, to be shared with friends as a schmaltzy video.

The algorithm gave priority to babies and children, to smiling faces and groups of people, and to photos and announcements that got a big response from friends.

“The Facebook movie ended up working for a certain kind of Facebook user,” she said, while suggesting that for people whose Facebook lives didn’t “fit” this algorithm, their Look Back movies were much less satisfying.
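
Facebook never published how Look Back chose its material, but the behaviour McBride describes is consistent with a simple scoring pass over a user’s photos. The field names and weights below are hypothetical, reconstructed only from the priorities reported above.

```python
def pick_lookback_photos(photos, n=10):
    """Illustrative guess at a Look Back-style ranking: favour faces,
    children and well-liked posts. All fields and weights are invented."""
    def score(photo):
        return (
            3.0 * photo.get("num_faces", 0)        # groups and smiling faces
            + 5.0 * (1 if photo.get("has_child") else 0)  # babies rose to the top
            + 0.1 * photo.get("likes", 0)          # big response from friends
            + 0.1 * photo.get("comments", 0)
        )
    return sorted(photos, key=score, reverse=True)[:n]
```

A user whose timeline is mostly links, text posts or landscapes scores near zero on every term, which is why the resulting movie “didn’t work” for them.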

“We understand who we are as a community by the stories we tell ourselves, but there’s this gap between who you really are, and the story that the algorithm will let you tell,” she said. “Thank God those Facebook stories have gone away! For the people it worked for, it really really worked, but the people it didn’t work for were totally disenfranchised.”

Kim'n'Kanye will always be spikier than a Wall Street occupation. Photograph: Theo Wargo/WireImage for Gabrielle's Angel

Other non-news examples of algorithms having unintended consequences included Google’s Android Market app store (as it was known at the time) suggesting the Sex Offender Search app as a related recommendation for gay dating app Grindr, or Apple’s Siri virtual assistant’s initial inability to find abortion clinics or morning-after pill retailers.

“It’s not like there’s a conspiracy at Apple or Apple isn’t pro-life. It’s because Planned Parenthood doesn’t call it an abortion clinic. The algorithm for Siri doesn’t understand that language… the word abortion clinic wasn’t linked to Planned Parenthood,” said McBride.
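
The failure McBride describes is a classic synonym gap in keyword search. A toy index shows it: a venue listed only under its proper name is invisible to any query using a different term until someone explicitly links the two. The data and synonym table below are invented for illustration.

```python
# Toy local-search index with a synonym gap, illustrating the Siri case.
places = {
    "Planned Parenthood": {"services": ["family planning", "health screening"]},
}

synonyms = {}  # empty: the mapping the assistant initially lacked

def find(query):
    query = synonyms.get(query, query)
    return [name for name in places if query.lower() in name.lower()]

print(find("abortion clinic"))   # [] -- the term isn't linked, nothing is found

synonyms["abortion clinic"] = "Planned Parenthood"
print(find("abortion clinic"))   # ['Planned Parenthood'] once the link exists
```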

The pair expressed concern about the ability to game these algorithms: using certain keywords, publishing at certain times of the day and trying other tactics to clear the threshold that defines whether algorithms will pay attention to a story or piece of content.

“Any time there is this ability to game the system, what happens is people with more power, and particularly with more money, have more ability to game the system,” said McBride. “And it creates an imbalance of power in a democracy.”

Unintended consequences? See the flurry of websites a few years ago that scraped mugshots and arrest details from law enforcement sites and, through canny search engine optimisation (SEO) tactics, ensured they ranked high in Google’s search results when people looked for specific names, creating a business model based on the arrestees paying to be removed from these sites.

Also see the question of whether searches asking if vaccines cause autism, if climate change is real, or if Barack Obama has an American birth certificate bring up stories that are true, or stories that are false but popular and/or have gamed Google’s algorithms.

“The reason that these people can get these to come up is because they have done some type of SEO to make them rise to the top of Google,” suggested McBride.

What should the response be from journalists, from wider society and from the companies building these algorithms? Lotan pointed out that it’s impossible for Google, Facebook and others to be completely transparent about how their algorithms work.

Why? If they did, it would be even easier to game them. He noted that these companies share a little information when they roll out a big change to their algorithms – Facebook and Google will both publish blog posts explaining key changes, providing “just enough information to understand and trust them, but still not enough to game the system”, according to Lotan.

McBride said this is good, and must be encouraged. “The same as the 20th century newspaper editors: we asked them to explain how they made their decisions, and they had columns to do so. It gave you a window in,” she said.

“But we really have to shift the burden of transparency from the creators of the algorithm to other organisations and entities. We have to shift the burden to journalists, to citizen journalists, to activists and really to every news consumer, every information consumer, because we still need checks and balances… This is a power that needs a balance, and needs to be checked.”

Lotan pointed to a recent report by Nicholas Diakopoulos – Algorithmic Accountability Reporting: On the Investigation of Black Boxes – published by the Tow Center for Digital Journalism as pointing the way towards this kind of oversight.

He called for journalists to spend more time examining these “black box” algorithms: what goes in, what comes out, and the patterns that emerge as a result.
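
Diakopoulos’s report suggests treating such a system like any other source: probe it with controlled inputs and compare what comes out. A minimal sketch of that input/output auditing, with a made-up ranker standing in for the real black box:

```python
def audit_black_box(rank, probes):
    """Minimal input/output audit in the spirit of Diakopoulos's report:
    feed a black-box ranking function controlled inputs and record how
    the outputs differ. The probe design encodes the journalist's
    hypothesis about where bias might hide."""
    return {label: rank(stories) for label, stories in probes.items()}

# Hypothetical stand-in for an opaque system under investigation.
def opaque_rank(stories):
    return sorted(stories, key=lambda s: s["spike"], reverse=True)

# Hypothetical probe: does the ranker favour 'spiky' celebrity stories
# over steady civic ones when everything else is held equal?
probes = {
    "equal_engagement": [
        {"title": "Kardashian wedding", "spike": 9.0},
        {"title": "City budget hearings", "spike": 1.2},
    ],
}
print(audit_black_box(opaque_rank, probes))
```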

“The more we do this, the more we understand these unintended biases in these algorithms, the better we can write about them, and hold them accountable,” he said, while McBride called for “algorithmic literacy” to be given even more priority in the education system.

“We need to get it into our curriculum,” she said. “News literacy is a thing, and they are teaching it in some classrooms in some cities… We should be demanding that it become part of our curriculum in all of our education systems.”

Lotan hoped that people will not “blindly click” on lists of stories without questioning how they were put together, while McBride brought the conversation back to the often-expressed belief that “news will find me” if it’s important.

“The next question is how? How does news find us?” she said. “What you need is a certain critical literacy about the fact that you are almost always subject to an algorithm. The most powerful thing in your world now is an algorithm about which you know nothing.”
