The Rise of the Like Economy

Once a cursed project, the tiny feature has had a huge effect not only on the site, but on us

(Getty Images/Ringer illustration)

Leah Pearlman has grown wary of Facebook. The 35-year-old illustrator uses the social network to promote her business, Dharma Comics, but has set up various safeguards to avoid becoming too emotionally invested in the happenings on the site. She uses a web browser plug-in called News Feed Eradicator, which replaces the social network’s endless stream of status updates, auto-playing videos, and advertisements with a single inspiring quote. (An example: "If we don’t discipline ourselves, the world will do it for us." — William Feather.) She compares compulsively refreshing the site to see how many Likes a new post has accrued to eating bad potato chips. Last August, she hired a social media manager to promote her comics on Facebook because she could no longer stand the addictive feedback loop the site thrives on — a constant dispensing of positive affirmation that decays into self-doubt nearly as soon as a notification alert has disappeared. "I check and I feel bad," she says. "Whether there’s a notification or not, it doesn’t really feel that good. Whatever we’re hoping to see, it never quite meets that bar."

Pearlman is a practicing Buddhist who meditates every day. Through Dharma Comics, she publishes drawings of stick figures who capture her emotions by waxing philosophical on the best ways to live a fulfilled life. ("I am the source of my own happiness," proclaims one scribbled character, whose leg has morphed into a watering hose splashing Technicolor joy.) She began drawing on a trip to India during a six-month sabbatical from her high-intensity Silicon Valley job. She had lost track of herself as she became increasingly absorbed in her work and needed a clean break. "I was so caught up in the external validation of life, I’d never realized what was happening inside," she says. A year later, she quit her job in the tech industry. She had been a manager at Facebook.

Even if you’ve never seen one of Pearlman’s comics, you’re familiar with her work. Shortly after arriving at Facebook in 2006, she began evangelizing for a new feature on the nascent website: a Like button. Launched in 2009, the Like button has become the low-hanging digital fruit for human connection, not only on Facebook but across the social web. YouTube switched from a star-rating system to a Like/Dislike format in 2010. Twitter changed its neutral star symbol into an ebullient heart-shaped Like in 2015. For years on Instagram, doling out the 11th Like on a post (the threshold at which a boring list of names became a more appealing number) was viewed as a form of charity. On the new bubblegum rap song "iSpy," artist Kyle picks out a girl he can woo because "she don’t get too many Likes." Sometime in the past few years, we transitioned from a world where we could laugh at the MeowMeowBeenz episode of Community with arm’s-length ironic distance to one where the episode of Black Mirror about a society controlled by social media metrics has a "jarring sense of hyperreality."

This is not exactly what Pearlman had in mind. "Have you seen that episode of Black Mirror?" she asked me during our interview. "I just watched that about a month ago, and that haunts me on a pretty regular basis. Because it’s not that far off."

Like many user interface changes, the Like button was introduced to solve a problem, but it ended up creating new ones. Certainly, it’s offered people a lightweight way to express support for one another and helped build Facebook into a tightly honed machine that knows what users want to view before they do. But by its mere presence, it has also set the rules for engagement on the social network and implicitly guided us toward a specific online discourse. Most urgently, it may be contributing to the proliferation of misinformation on a platform where nuance must be supplied by the user, not the interface. The story of the rise, and rise, and rise of the Like button illustrates how visual design can influence our activities with the same quiet power as a secret algorithm. But in this case, the invisible hand is hiding in plain sight, in the form of a blue thumbs-up.

"I think the Like button did a lot of things it set out to do," says Justin Rosenstein, who was also part of the Like button’s creation. "And had a lot of unintended consequences."

The Like button was supposed to make our lives easier. That’s the mission of virtually every tech product today, but it’s not an idea that was birthed in Silicon Valley. Though "user friendliness" wasn’t coined as a design term until the 1970s, the celebration of ease of use in consumer products dates back to World War II and industrial designers such as Henry Dreyfuss, according to Cliff Kuang, a design journalist and head of product for Fast Company.

"There is a very, almost religious equation of ease of use with social progress," says Kuang, who is writing a book on the history of product design called User Friendly. "In the 1930s and the 1940s, the idea was that a washing machine in your house shouldn’t gather too much dust. Its corners should be rounded. It should be easy to clean. A toaster oven shouldn’t be too hard to work. A thermostat should be easy to understand."

These concepts persisted in the business world and became especially potent in the tech sector around the turn of the century. Google was dead simple compared to Yahoo or AltaVista. The iPod was less cumbersome than competing MP3 players. Facebook was less cluttered than Myspace. That was one reason Pearlman was attracted to the company in the first place. "I was immediately impressed," she says of her first time using Facebook as a college senior at Brown University. "It was clean and easy to use, and it felt exciting to be on there."

Two years later, she would join the startup as an early product manager. Facebook at the time was moving very fast and breaking many things; Pearlman arrived shortly after a new feature known as News Feed had angered Facebook users by putting all their activity on the site for public consumption. News Feed, in fact, made people more hooked on Facebook than ever, but as the site grew more popular, posts were being overrun with redundant comments that expressed some variant of "I like this." If the site had a button that conveyed affirmation, Pearlman thought, the quality of the comments would be improved. "My motivation was really about cleanliness and consolidation," Pearlman says.

She and a coworker developed the button concept, giving it the internal code name "Props." Pearlman posted it on an internal message board for new product ideas in 2007, where Rosenstein, an engineering lead who had recently joined Facebook from Google, noticed it. "I think at the time it was this really small kernel of an idea of ‘Wouldn’t it be nice to be able to more easily just give someone kudos?’" he recalls. "That dovetailed with some thinking I had been doing myself at that time, which was, we had this big network of all these people interconnected in all these ways and communicating with each other. Ultimately, there was an opportunity there to use design to make the path of least resistance to engage in certain kinds of interaction."

Pearlman, Rosenstein, and a few other early Facebook employees (including current vice president of ads Andrew Bosworth) began work on the "Awesome button." Rosenstein recalls writing the user-facing front end of the button’s script. "It was a very classic hackathon of: We stayed up all night and at like six in the morning demoed what we had made," he says.

The Awesome button was built in 2007 but wouldn’t roll out to users until 2009. Because it was such a small design change, it naturally attracted opinions from a huge number of Facebook employees. Some people thought it was unnecessary given the presence of comments. Others thought it should be called "Love" or "Like." The team experimented with using symbols like a star or a plus sign rather than a thumbs-up. There was even a small test where users were able to give both positive and negative feedback to a post, perhaps teeing up the years-long campaign by users to get a "Dislike" button. Through it all, the Awesome button could not manage to earn final approval from CEO Mark Zuckerberg. Bosworth has referred to it as a "cursed project."

"I spent a lot of time internally evangelizing it because the project kept failing," Pearlman says. "My memory is that at some point Mark just decided that [Like] is what it would be called, despite some resistance, which is how many things happened on the site. And he was usually right."

In February 2009, the Like button was launched. Pearlman wrote the initial blog post. "Your friends, and their photos, notes, statuses and more are what make Facebook great," she wrote. "When your friends share something great, let them know you like it."

The button accomplished many things for Facebook at once. As Pearlman had wanted, it reduced the need for users to write comments indicating they liked something. Likes also gave the News Feed algorithm a strong signal for deciding which posts to prioritize. The button was portable, too: it could be embedded on external websites, which made it easier for users to share content on Facebook. And because it also extended to brand Pages, Likes helped power the company’s fast-growing advertising machine.

But there was something more abstract that Rosenstein (these days the cofounder of the workflow software company Asana) was hoping the Like button would accomplish. "Something I had been thinking about was, is there a way to increase positivity in the system?" he says. "Not force it, but to increase the likelihood that Facebook is contributing to creating a world in which people uplift each other rather than tear each other down." In some ways, Like was almost too good at its mission.

The Like button was not an immediate hit — at least, not on my Facebook profile. A survey of my 2009 Facebook activity as a college freshman and sophomore turns up only a handful of posts that earned Likes, even though I was posting on Facebook nearly every day. ("My mom is on Facebook AHHHHHHH" got one Like; "There’s no comfort like Southern Comfort" got five Likes; "Newton’s Fourth Law: Don’t start no shit, won’t be no shit" got five Likes.)

But over time, the habit of Liking became instinctual, not only for me, but for millions of other Facebook users. It’s not clear just how many times we’ve collectively Liked something — Facebook wouldn’t disclose the data — but the figure is well into the trillions. An analysis by the social analytics company Socialbakers of 522 Facebook Pages with more than 10 million fans found that the number of Likes on their posts had collectively grown more than 500 percent between 2012 and 2016, climbing to 21 billion (the figure also includes the Like alternatives, known as Reactions, that Facebook introduced in early 2016).

Like’s growth is a result of both design decisions and the human craving to be, well, liked. Originally, the button appeared to the right of the "Comment" option below posts. Now it’s left-aligned directly below the content of a post, serving as a "mental roadblock" that users must consider before moving on to the next status update, Kuang notes. Facebook added the ability to Like comments in 2010 (my quip "All these liking options on Facebook are approaching an Xzibit level of absurdity" got four Likes). Nested comments, which can also be Liked, debuted in 2013.

All of these different kinds of Likes feed into Facebook’s notifications system, which is a key driver in bringing users back to the site. If someone Likes your post, or your comment, or your comment within a comment, a red-and-white number appears in the top right corner of the Facebook UI (or the bottom right of the iPhone app). Click it and the alert disappears as a list of notifications pops up. The details of the notifications are often meaningless (especially if you haven’t updated your settings to block Candy Crush invites from your aunt). The promise of Facebook is in the potential of that bright-red number.

"When I give talks and ask people, ‘When you go into Facebook, what’s the first thing you look at?,’ fairly universally everyone says that notification icon space in the upper right," says Ben Grosser, an artist and professor of new media at the University of Illinois at Urbana-Champaign. "It’s not surprising. This is a reflection of who paid attention to us while we were gone."

In 2012, Grosser released a browser plug-in called the Demetricator, which strips Facebook of all numbers, including the notification alert, Friend counts, Like counts below posts, and timestamps. The goal of the project, he says, is to "take something that is familiar and perhaps so familiar that we aren’t looking at it, and remove it temporarily, so we can get a sense of what it was doing in the first place."
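
The mechanism behind a tool like the Demetricator is conceptually simple: a browser content script rewrites the page after it loads, stripping the numbers out of metric text, then keeps watching as the feed injects new posts. The sketch below is a minimal illustration of that approach in TypeScript, not Grosser’s actual code; the regular expression is a hypothetical stand-in for Facebook’s real (and frequently changing) markup.

```typescript
// A minimal, illustrative sketch of a demetricator-style content script.
// The pattern below is hypothetical; the real plug-in targets Facebook's
// actual markup, which changes often.

// Matches counts like "1,204 likes", "3.4K comments", "12 shares".
const METRIC_PATTERN = /\b[\d.,]+[KMB]?\s+(likes?|comments?|shares?)\b/gi;

// Strip the number from a single text node, keeping only the label.
function scrub(node: Text): void {
  const stripped = node.data.replace(METRIC_PATTERN, (_match, label) => label);
  if (stripped !== node.data) node.data = stripped;
}

// Walk every text node under a root and scrub each one.
function demetricate(root: Node): void {
  if (root.nodeType === Node.TEXT_NODE) {
    scrub(root as Text);
    return;
  }
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  let node: Node | null;
  while ((node = walker.nextNode())) scrub(node as Text);
}

// The feed loads new posts continuously, so re-scrub anything added later.
new MutationObserver((records) => {
  for (const record of records) record.addedNodes.forEach(demetricate);
}).observe(document.body, { childList: true, subtree: true });

demetricate(document.body);
```

Because the feed updates continuously, the observer matters as much as the initial pass; a one-time scrub would leave every newly loaded post fully metricated.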

What were all the numbers on Facebook doing? Grosser has heard from about 100 people who have used the Demetricator over the years, and many of them spoke about having subconsciously established rule sets for how they use the site based on Like counts. "People have written me and said, ‘You know, with Demetricator installed, I don’t know if I can Like something anymore because it turns out that I had a rule for myself, without knowing it, that if it’s older than, say, two days, I won’t Like it.’ ‘It turns out I have a rule that if it has fewer than one or two Likes, I don’t want to Like it because what if it doesn’t become popular?’ They don’t want to be that person who was the only Liker." One user told Grosser that racking up Likes was similar to earning experience points in a role-playing game.

Research bears out Grosser’s anecdotes. A recent study out of UCLA found that teenagers are more willing to give Likes to a photo that already has lots of Likes, even if it’s a mundane image of a plate of food. The researchers discovered that images with more Likes tended to trigger greater brain activity in the neural regions tied to reward processing, social cognition, imitation, and attention, according to a New York Times article about the study. This neural activity is heightened even further if the user posted the much-Liked image themselves.

In addition to influencing what we Like, Facebook’s metrics deeply shape the kinds of content we post. "We have learned almost from birth … to pay attention to how many of something we get," Grosser says. "We get to school and instantly we’re assessed every year, and we get these scores, and we’re told higher numbers are better. If you think about capitalism more generally, accumulation is the goal. Growth is kind of the necessary factor for maintaining a capitalist society. As a result of all of these different things acting upon us, when all of a sudden Facebook puts a count of reactions of some sort on anything we post or say, I think we can’t help but become attuned to it and in a sense desire those numbers to be higher."

None of this is all that different from the ways people use and absorb social cues in the physical world to augment their demeanor, Pearlman says. "You go to a party and people are putting on a certain outfit, and they want to show a certain face, and they want to hang out with certain people. I think we’re kind of doing what we do on Facebook all the time. Everybody else is smiling, even if they’re judging them in their head, which is kind of the same as pressing the Like button. I think it’s just human behavior. I think we can just see it a little more clearly."

But the place where Like diverges from typical human vanity is the way it powers Facebook’s increasingly omniscient News Feed algorithm. Facebook takes into account thousands of factors to determine what posts to prioritize in people’s feeds, but Like is one of the most straightforward ways that users convey positive sentiment to the company’s algorithms. A Like isn’t just a digital pat on the back — it’s an unambiguous upvote that drives a piece of content to more eyeballs. Like is presented as a simple, rewarding interaction point, but the ways in which it dictates what we see are opaque.
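
Facebook has never published how that ranking works. But purely as a toy sketch of the loop described above, with every field name and weight below invented for illustration, a Like-driven ranker might look something like this:

```typescript
// A toy, hypothetical feed ranker. Facebook's real algorithm weighs
// thousands of proprietary signals; this sketch only illustrates how
// Likes, as one straightforward signal, can compound into exposure.

interface Post {
  id: string;
  likeCount: number;      // explicit positive signal from users
  ageHours: number;       // hours since the post went up
  authorAffinity: number; // 0..1: how often the viewer interacts with the author
}

function score(post: Post): number {
  const engagement = Math.log1p(post.likeCount); // diminishing returns per Like
  const freshness = 1 / (1 + post.ageHours);     // newer posts score higher
  return engagement * freshness * (0.5 + post.authorAffinity);
}

// Most-Liked recent posts surface first, earning more views and, in turn,
// more Likes: the loop that makes the signal self-reinforcing.
function rankFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => score(b) - score(a));
}
```

Even in this crude version, a post that collects early Likes keeps outranking one that doesn’t, and the viewer never sees why.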

"We’re collectively voting on what other people like us should see," Kuang says. "You’re actually getting into this thing where pushing a button is no longer just pushing a button on a toaster and watching your toast toast. What does that button actually do and what does that connect to? That’s harder to know."

We often think of the web as a dichotomy between personalized and nonpersonalized experiences. Google serves users filtered search results, but if you jump through some hoops, you can get a neutral service that isn’t biased toward your supposed interests. On Facebook, though, there is no normal to revert to. Users can trawl through their Activity Log and un-Like every piece of content they’ve ever Liked, but it’s unclear whether the ranking algorithm would view this action as a counterweight to the initial Like. On the desktop version, users can view posts in chronological order, but the feed eventually reverts to being filtered. On the mobile app, there’s no way to show posts chronologically by default. In effect, we’re stuck with the Likes we’ve made forever.

This model has made Facebook an irresistible platform for highly personalized viral content, which, according to Rosenstein, was not the Like button’s original intent. "I do think it also led to the rise of clickbait," he says. "The Like button has had a lot of the positive benefits that we originally intended for it to have, but I think it’s also caused the distribution of things that, even if people Like them, aren’t necessarily time well spent."

Facebook’s clickbait problem is well known, and the company has tweaked its algorithm over the years to address it. But it wasn’t until last year that people began wondering whether Facebook’s fundamental design could lead to even bigger problems as the site became a go-to destination for news. "‘Liking’ something, having an affinity for something, is an almost instantaneous gut reaction, and it’s supposed to be," says Michael Caulfield, the director of blended and networked learning at Washington State University Vancouver. "If you look at the interface of Facebook, it’s completely designed to get you to share as quickly as possible with as many people as possible without leaving the site, without doing any deep analysis. That model, which works so well for ‘Charlie bit my finger,’ works horribly for anything information-based."

Some writers exploited this aspect of Facebook’s design in the run-up to the 2016 presidential election. A BuzzFeed analysis found that the 20 top-performing fake election stories, led by the lie that the Pope had endorsed Donald Trump, gained more reactions, shares, and comments than the 20 top-performing stories from 19 major news outlets in the last three months of the campaign (Facebook said the stories represented a tiny part of total news consumption on the platform). The fake stories were overwhelmingly partisan, playing to Facebook’s tendency to extend the reach of content that elicits extreme reactions.

But it’s not just cynical Macedonian teenagers out for a buck who play into this polarizing system. Individual users, implicitly angling for Likes, are seeking to fill Facebook with the most provocative content possible. "You kind of get this view of the world where it’s like highlights and powerful emotions and nothing in between," says Kuang. "That’s what the algorithm is putting in front of you. It’s putting this very intense, distilled version of life in front of you, and that’s a kind of stress. You’ve just created this user experience that emphasizes intensity."

A system that rewards appealing content with Likes must also punish unappealing content with fewer Likes, and this dynamic may help silence people who hold contrarian views. "If we find an article that has an interesting point that kind of contradicts most of what we think … if our peer group is broadly against the point of view of the article, we’re much less likely to post it," says Caulfield. "Subtly, I think, we have the same underlying fears of rejection, no matter how old we get. It subtly influences what we post and makes us want to post material that will get a good crowd reaction."

Every day Facebook gets better optimized, it gets harder to disentangle what you "Like" from the concepts that might challenge you, frustrate you, or open your mind to new ideas. Encountering those ideas was, in Rosenstein’s words, the original promise of the open internet and one of the most exciting things about stumbling around the web in the days before social media.

At the same time, the site’s interface has become so lightning fast and so eager to reward posters with virtual kudos that helping to spread extreme content has its own viral appeal. "It’s so easy to use that it could be easy to just be like, ‘Oh, I read this news article,’ and just propagate it without giving it much thought," says Pearlman. "If things took a little longer, a little more effort, I think maybe some of this stuff wouldn’t propagate so fast. It’s almost too easy [to allow] for critical thinking."

Facebook realizes that its fake-news issues are at least in part a design problem. In December the company announced several interface changes meant to deter the spread of misinformation. If a story has been deemed false by multiple news organizations that are working with Facebook, users will see a warning below the post noting that it has been "disputed by third-party fact checkers" (the warning also includes a hazard sign with the same color scheme as a notification alert). Users who try to post a story that’s already been labeled fake will be prompted with a pop-up noting that the story’s veracity has been challenged.

These tweaks are in some ways anathema to everything that Facebook stands for. The company, along with everyone else in Silicon Valley, has long championed seamless design, which works to hide complex digital machinery behind easy-to-understand interfaces. Long ago, Facebook allowed users to adjust which types of posts appeared in their News Feed via an equalizer-like interface. Now an algorithm automatically makes such adjustments (though today users can prioritize which users appear first in their feeds via a preferences menu).

But in recent years, more designers and researchers have been advocating what is known as seamful design, which trades ease of use for greater user clarity about how a complex system works. The pop-up warning about fake news is an example of seamful design, according to Karrie Karahalios, a computer science associate professor at the University of Illinois at Urbana-Champaign. "Traditionally, seams highlight ‘mistakes,’" she said in an email. "The idea of seamful design is more of a process than a prescriptive doctrine. It is about an experience."

The designers I spoke to floated plenty of potential changes to Facebook’s interface that could help curb fake news and expose more seams in the site’s framework. What if users could see the original source for a photo that’s gone viral? What if the age of a publication were displayed in the News Feed in addition to a headline? What if Facebook inserted stories from opposing viewpoints directly below links to partisan pieces? All such efforts could lower engagement by adding complexity to the platform. But driving toward engagement at all costs is a fundamentally unethical way to design products. "Unethical design in the technology sector is almost always caused by commercial pressures," Cennydd Bowles, a former designer at Twitter who focuses on ethics in the profession, said in an email. "Datafication and the pressure of short-term targets can coax even a well-meaning product manager into compromises. I think this tension is particularly pronounced in companies that rely heavily on experimentation and A/B testing. In some of these companies, the framing of users shifts — they become no longer raisons d’être but means to hit targets. In these cultures, unethical design is the natural result."

No matter how Facebook proceeds in its efforts to fight fake news, there are business risks. Users could grow resentful of the social network if it’s perceived as being biased or untrustworthy. "I think it is important to start slow because as scary as fake news is, it would be even scarier to have the [Facebook] infrastructure itself be opinionated," Rosenstein says. "It’s a very hard problem."

On the other hand, becoming a confusing cesspool of lies poses its own risk. In its annual earnings report, Facebook identified "misinformation or news hoaxes" as new financial risk factors that could lead the company to face lawsuits or increased government regulation. In the company’s last earnings call, Zuckerberg highlighted the company’s recent efforts to combat fake news, saying his goal was to "make our community stronger and a more positive force for good in the world."

There’s also a more abstract risk that Facebook’s self-image as a moral "force for good," in Zuckerberg’s words, could shatter. Silicon Valley is being roiled by the realization that its efforts to fix the world via code may be backfiring. According to a BuzzFeed report, it was a rogue group of employees within the company who launched the first efforts to fight fake news, not executives. As Facebook grapples with its ever-growing power, it must retain the trust of both users and employees if it wants to maintain its dominance.

Regardless of how Facebook chooses to combat these issues, one thing is certain: The Like button and the power it wields aren’t going anywhere. It’s hardwired into the way Facebook powers its site and the way we frame our discussions there. Our only hope is that they — and we — can try to use the tool in a more thoughtful way. "I do think it’s important for designers to continue to think about how we can make the software itself bias us toward playing to our highest selves rather than our lower selves," Rosenstein says. "If you design a system and you don’t watch it, it’s as likely that you’re going to get bad emergent behavior as good emergent behavior. Sometimes people can just think, ‘Well, it’s just what the platform computes and we don’t have any control over it.’ Well, that’s not true. The way you make these design decisions deeply impacts the results, and small tweaks can make huge differences."
