What Gender Stereotypes and Sexism Have to Do With Algorithms and Robots


As technologists, and no small number of films, frequently remind us, the singularity, the moment when the realization of smarter-than-human computers irrevocably alters our future, is nearer every day. Futurists take this prospect very seriously. They gather to discuss what it means at the annual Singularity Summit, a meeting hosted by Singularity University dedicated to exploring the "disruptive implications and opportunities" of the evolution of artificial intelligence. Based on what we know today, the prospects for a robotic future are pretty disturbing.

We don't have to wait for Data-like robots to think about how discriminatory norms manifest themselves through technology. Algorithms that we use every day, such as Google Instant's predictive search, provide ample illustrations. Predictive search saves users 2-5 seconds by making the most likely suggestions "based on popular queries typed by other users." Last year, a study conducted by Lancaster University concluded that Google Instant's autocomplete function creates an echo chamber for negative stereotypes regarding race, ethnicity and gender. When you type the words "Are women ..." into Google, it might predict that you want one of the following: "... a minority," "... evil," "... allowed in combat," or, last but not least, "... attracted to money." A similar anecdotal exercise by BuzzFeed's Alanna Okun concluded that anyone curious about women would end up with the impression that they are "crazy, money-grubbing, submissive, unfunny, beautiful, ugly, smart, stupid, and physically ill-equipped to do most things. And please, whatever you do, don't offer them equality." In effect, algorithms learn negative stereotypes and then teach them to people who consume and use the information uncritically.
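To see how that echo chamber can arise without anyone intending it, here is a minimal sketch of a frequency-based completion model. This is deliberately naive and emphatically not Google's actual system; the query log, the `suggest` function and its contents are all invented for illustration. But the feedback loop is the same in kind: completions are ranked purely by how often other users typed them.

```python
from collections import Counter

# Hypothetical toy query log standing in for aggregated user searches.
# A real log would hold billions of queries; the point is only that the
# model has no notion of which completions are harmful.
QUERY_LOG = [
    "are women a minority",
    "are women allowed in combat",
    "are women attracted to money",
    "are women attracted to money",
    "are women evil",
    "are women evil",
    "are women evil",
]

def suggest(prefix, log, k=3):
    """Return the k most frequent logged queries that begin with prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [query for query, _ in counts.most_common(k)]

print(suggest("are women", QUERY_LOG))
# -> ['are women evil', 'are women attracted to money', 'are women a minority']
```

Nothing in the ranking step knows, or cares, whether a popular completion is a stereotype; popularity is the only signal, so whatever users ask most gets suggested to everyone next.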

In the last month, two other startling and egregious cases made the news. In the first instance, Jacky Alciné was looking at one of his photos and saw that he and a friend, both of whom are black, had been automatically tagged "gorillas" by Google Photos' automatic recognition software. Google quickly apologized. In the second, a recent analysis showed that searching for the word "CEO" surfaces page after page of photos of white men. The first image of a woman is of Barbie and, as Tech Times noted, it's even worse because the picture actually comes from the satirical news site The Onion. While white men do make up the vast majority of CEOs, the search results grossly disproportionately favor their images: women are 27 percent of CEOs but only 11 percent of the images. The problem is even worse when you consider the impact of stereotypes on advertising. As with search, Google's predictive targeted-advertising algorithms use aggregated user data to make what appear to be sexist assumptions based on gender, for example, inferring from a woman's searches and interests that she is a man because she is interested in technology and computers. A recent study showed that men searching for jobs are shown ads for higher-paying positions six times as often as women are. Researchers used hundreds of thousands of fake accounts that shared all attributes in terms of education, skill and work experience. The only thing that changed was the system's determination of gender.
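The logic of that kind of paired audit is straightforward to sketch. The simulation below is a hypothetical re-creation, not the researchers' code: the profile fields, function names and the injected 18-percent-versus-3-percent serving rates are all assumptions, chosen only so the measured gap lands near the reported six-fold difference.

```python
import random

# Hypothetical re-creation of the audit design: paired profiles that are
# identical in every attribute except the gender the ad system infers.
def make_profiles(n):
    profiles = []
    base = {"education": "BS", "experience_years": 5, "field": "engineering"}
    for i in range(n):
        for gender in ("male", "female"):
            profiles.append({**base, "id": i, "inferred_gender": gender})
    return profiles

# Stand-in for the opaque ad system under audit. The 18% vs. 3% serving
# rates are invented here so the measurement has a gap to detect; in the
# real study, the system's behavior was the unknown being measured.
def shown_high_paying_ad(profile):
    rate = 0.18 if profile["inferred_gender"] == "male" else 0.03
    return random.random() < rate

def audit(n=100_000):
    shown = {"male": 0, "female": 0}
    total = {"male": 0, "female": 0}
    for profile in make_profiles(n):
        g = profile["inferred_gender"]
        total[g] += 1
        shown[g] += shown_high_paying_ad(profile)
    ratio = (shown["male"] / total["male"]) / (shown["female"] / total["female"])
    print(f"high-paying ads, male rate / female rate: {ratio:.1f}x")

audit()  # prints roughly 6.0x
```

Because the paired profiles differ only in the gender field, any systematic gap in what they are shown can be attributed to that field, which is exactly the inference the study drew.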

Stereotypes are central to what's going on here. Sean Munson, a University of Washington assistant professor of human-centered design and engineering, commented on the impact of searches like these earlier this year, after releasing the results of a study designed to examine how images affect perceptions. "You need to know whether gender stereotyping in search image results," he explained, "actually shifts people's perceptions before you can say whether this is a problem. And, in fact, it does -- at least in the short term."

One of the most gendered stereotypes concerns agency, an important concept in considerations of humanity. Implicit bias means that many, if not most, people associate agency with masculinity, not femininity. This idea is informing the design of elegantly anthropomorphized robots. Ask almost any scientist involved in artificial intelligence and they will tell you that in order to make robots socially acceptable, they need to be humanlike. Which means, most likely, they will have gender. Last year, scientists at Bielefeld University published a study in the Journal of Applied Social Psychology that might have profound implications. They found that human users thought of "male" robots as having agency -- being able to exercise control over their environments. "Female" robots, on the other hand, were perceived as having communal personality traits -- being more focused on others than on themselves. Believing that male robots have agency could turn into a belief that, when we employ them, they should have agency and autonomy. "Females," not so much. In essence, male robots are Misters; female robots are Mrs. and Misses.

The Bielefeld University researchers also found that people relied significantly on hair length to assign a gender to a robot. The longer the robot's hair was, the more likely people were to think of it as female. And once gender was effectively assigned by the participants in the study, it colored their choices of what the robot should do. "Male" robots were considered better choices for technical jobs, like repairing devices, and "female" robots were thought to be "better" at stereotypical household chores.

What do gendered stereotypes in robots look like? Robotic natural-language capabilities and voices substantively affect human interactions. Consider Siri. Today, unlike at launch, iPhones give users in the U.S. the option of choosing a male or a female voice. Apple gave no reason why prior versions of Siri were female in the U.S. but male in the U.K. Having voice options may sound like a step toward gender equality, allowing people to think of assistants as either male or female. But Siri's purported sexism was not only a matter of voice selection; it was also a matter of content. Sexist biases seemed to be embedded in the functionality itself. Answers were markedly skewed in favor of meeting the needs of straight male users. Siri couldn't answer basic questions about female-centered oral sex, contraception and health.

However, there may have been another reason for the voice choice -- namely, while people are more likely to like female voices, they actually trust male voices more and are more likely to think of them as intelligent and authoritative. Women, on the other hand, are fundamentally thought of as assistants to men in central roles. Consider that, today, the number one job for women in the U.S. remains what it was in the early '60s: administrative assistant. Nursing and teaching are similarly culturally understood.

At the 2008 Singularity Summit, Marshall Brain, author of "Robotic Nation," described the predictable, potentially devastating effects of "second intelligence" competition in the marketplace. The service industry will be the first affected. Brain described a future McDonald's staffed by attractive female robots who know everything about him and can meet his every fast-food need. In his assessment, an attractive, compliant, "I'll get you everything you want before you even think about it" female automaton is "going to be a good thing." He went on to talk about job losses in many sectors, especially the lowest-paying, with emphasis on service, construction and transportation. Brain noted that robotic competition wouldn't be good for "construction workers, truck drivers and Joe the plumber." Yet nine out of 10 women are employed in service industries. The idea that women will be disproportionately displaced as a result of long-standing sex segregation in the workforce did not factor into his analysis.

I don't mean to pick on Brain, but the fact that male human experiences, expectations and concerns are normative in the tech industry and at the Singularity Summit is clear. The tech industry is not known for its profound understanding of gender or for producing products optimized to meet the needs of women (whom the patriarchy has cast as "second intelligence" humans). Rather, the industry is an example of a de facto sex-segregated environment in which, according to sociologist Philip Cohen, "men's simple assumption that women don't really exist as people" is reinforced and replicated. Artificial intelligence is being developed by people who benefit from socially dominant norms and have little vested personal incentive to challenge them.

The Bielefeld researchers concluded that robots could be positively constructed as "counter-stereotypical machines" that could usefully erode rigid ideas of "male" and "female" work. However, the male-dominated tech sector may have little interest in countering prevailing ideas about gender, work, intelligence and autonomy. Robotic anthropomorphism is highly likely to result in robotic androcentrism.

Years of social-psychology studies show that humans default to organizing each other by gender, race and age and use stereotypes to create, define and perpetuate organizational norms. Implicit, unconscious biases are well understood and, while individual people might not set out to consciously act in sexist and racist ways, systematized biases, left unexplored and unaddressed, yield sexist and racist results. Joan Williams, who has written about these issues for decades, suggests that companies and organizations genuinely committed to understanding this problem consider a whole range of what she calls "bias interrupters."

Singularity University, whose mission is to challenge experts "to use transformative, exponential technologies to address humanity's greatest challenges," has 22 core faculty on staff, three of whom are women. The rest, with the exception of maybe one, appear to be a very homogeneous group of men. This ratio does not suggest an appreciation of the fact that one of humanity's greatest challenges right now is misogyny.

This post is updated from an earlier version first published in Salon.
