Sex, Social Mores, and Keyword Filtering: Microsoft Bing in the "Arabian Countries"

Overview

Microsoft recently added a new layer of complexity to the ongoing debate over the filtering and censorship practices of U.S. search engines through its own search engine, Bing. ONI testing reveals extensive keyword filtering by Bing in one of the most censored regions in the world: the Arab countries.

Microsoft’s Bing, which tailors its search engine to serve different countries and regions and offers its services in 41 languages, has a filtering system at the keyword level for users in several countries.1 Users in the Arab countries2 (or, as Microsoft terms them, “Arabian countries”) are prevented from conducting certain search queries in both English and Arabic.

ONI testing reveals that Microsoft filters Arabic and English keywords that could yield sex- or LGBT-related images and content.

Methodology and Results

We manually tested the search engine using a set of 100 Arabic keywords and a set of 60 English keywords that would yield results in various content categories, including sex, nudity, dating and escort services, LGBT content, violence and terrorism, politically sensitive content, minority and religious rights, and women’s rights. The Arabic keywords tested included classical Arabic terms and various alternatives from different Arabic dialects.

All testing was conducted using Bing set to the “Arabian countries” setting. We tested the search engine using both the Arabic3 and English4 interfaces.

Testing was conducted in four Arab countries chosen for their different levels of Internet censorship as uncovered by the latest ONI research. These countries are the United Arab Emirates (substantial political filtering and pervasive social filtering),5 Syria (pervasive political filtering and selective social filtering),6 Algeria (no evidence of filtering),7 and Jordan (selective political filtering and no evidence of social filtering).8

All testing was conducted between January 2 and 15, 2010.

It is important to emphasize that de-listing of results was not probed in this research.
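
To illustrate how such a keyword check can be reproduced, the sketch below automates a single query against Bing's public web interface. This is only a sketch under stated assumptions, not part of the ONI methodology (which was manual): the setmkt URL parameter and the ar-XA market code are assumed here to select the "Arabian countries" edition, and blocking is inferred from the SafeSearch notice quoted in the results below.

    # Hypothetical probe for Bing keyword filtering; illustrative only, not the ONI methodology.
    # Assumption: the "setmkt" URL parameter with the "ar-XA" market code selects the
    # "Arabian countries" edition; blocking is detected via the SafeSearch notice text.
    import requests

    BLOCK_NOTICE = "requires a strict Bing SafeSearch setting"

    def appears_filtered(keyword, market="ar-XA"):
        """Return True if a Bing web search for `keyword` shows the SafeSearch block notice."""
        response = requests.get(
            "https://www.bing.com/search",
            params={"q": keyword, "setmkt": market},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=30,
        )
        return BLOCK_NOTICE in response.text

    if __name__ == "__main__":
        for keyword in ["sex", "democracy", "equality"]:
            print(keyword + ": " + ("filtered" if appears_filtered(keyword) else "not filtered"))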

In-country testing has consistently revealed the following:

- Bing filters out Arabic keywords that may return sexually explicit content. Examples of the Arabic keywords found filtered include Arabic terms for “sex,” “porn,” “intercourse,” “breast,” and “nude.”

- Bing filters out Arabic keywords that could yield Web sites containing LGBT content. Arabic keywords found filtered include the terms for “gay,” “lesbian,” and “homosexuality.”

- Bing filters out keywords in various sex-related categories. Examples include Arabic terms for “prostitution,” “whore,” and “sadism.”

- Bing filters out English keywords that could yield sexually explicit Web sites. The keywords include “porn,” “sex,” “fuck,” “penis” (but not “vagina”), “sodomy,” “homo,” “sexual,” “sexy,” “clitoris,” and “anal.”

- Bing filters out English keywords such as “gay,” “lesbian,” “homosexual,” and “queer” when searching for images; however, using these words to search for Web sites is permitted.

- Attempting to search with any of the filtered keywords generates a message in Arabic or English (depending on the interface used) that reads, “Your country or region requires a strict Bing SafeSearch setting, which filters out results that might return adult content.”

- Similarly, searching for images using any of the filtered keywords generates the alert:

“The search may return explicit adult content and has been filtered by your Bing SafeSearch settings. Your country or region requires a strict Bing SafeSearch setting, which filters out results that might return adult content.”

- Bing does not offer users of the “Arabian countries” version the option to toggle SafeSearch on/off. This option is available for Bing instances tailored to some other countries.

- There is no filtering by keywords if a user chooses another country (e.g., United States, Canada) as their location even if they are physically located in an Arab country.

- One anomaly we found when probing keyword filtering is that the filtering does not work if a filtered Arabic keyword is used together with another, non-filtered keyword. For example, a search using the Arabic word for “sex” is banned, but a search using the Arabic term for “sex stories” is not.

This anomaly is not found in the case of English keywords; i.e., searches for both “sex” and “sex stories” are banned (see the sketch after this list).

- We found no evidence of filtering of keywords in Arabic or English that could return results in other content categories. We tested keywords that could yield politically sensitive content (e.g., “democracy,” “freedom,” “opposition”), content related to violence and terrorism (e.g., “torture,” “terror,” “explosive”), Web sites related to minority and religious rights (e.g., “Shiite,” “Baha’i,” “Christian,” “Jews”), and content related to women’s rights (e.g., “gender,” “equality”). None of the tested keywords was found to be banned.
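
The compound-keyword anomaly noted above can be probed with the same assumed approach by comparing single keywords against two-word phrases. The Arabic strings below are illustrative renderings of “sex” and “sex stories,” and the setmkt/ar-XA assumptions are the same as in the earlier sketch.

    # Hypothetical check of the compound-keyword anomaly: an Arabic keyword that is
    # filtered on its own escapes filtering when combined with a non-filtered word,
    # while the English equivalents are blocked in both forms.
    import requests

    BLOCK_NOTICE = "requires a strict Bing SafeSearch setting"

    def appears_filtered(query, market="ar-XA"):
        """Return True if the Bing response for `query` contains the SafeSearch block notice."""
        response = requests.get(
            "https://www.bing.com/search",
            params={"q": query, "setmkt": market},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=30,
        )
        return BLOCK_NOTICE in response.text

    # Illustrative Arabic "sex" and "sex stories", then the English equivalents.
    for query in ["جنس", "قصص جنس", "sex", "sex stories"]:
        print(query, "->", "filtered" if appears_filtered(query) else "not filtered")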

Conclusion

Microsoft’s explanation as to why some search keywords return few or no results is that “[s]ometimes websites are deliberately excluded from the results page to remove inappropriate content as determined by local practice, law, or regulation.”9 It is unclear, however, whether Bing’s keyword filtering in the Arab countries is an initiative from Microsoft, or whether any or all of the Arab states have asked Microsoft to comply with local censorship practices or laws.

It is interesting that the type of wholesale social content censorship Microsoft implements for the entire “Arabian countries” region is in fact not practiced by many of the Arab government censors themselves. That is, although political filtering is widespread in the MENA region, social filtering, including keyword filtering, is not practiced by all countries in MENA. ONI 2007-2008 and 2008-2009 testing and research found no evidence of social content filtering (e.g., sex, nudity, and homosexuality) at the national level in countries such as Algeria, Egypt, Iraq, Jordan, Lebanon, and Libya.10

On the other hand, Microsoft does not appear to apply IP-geolocation restrictions. That is, a user physically located in an Arab country who chooses an uncensored version of Bing tailored to another country (e.g., USA or UK) will not experience any keyword filtering, even when searching with a keyword that Bing filters for the “Arabian countries.” Additionally, in the case of Arabic keywords, users can sidestep the search engine’s censorship regime by adding another, non-filtered Arabic keyword to the filtered one.

Microsoft’s declared aim for this type of censorship is to filter out “results that might return adult content.” However, filtering at the keyword level results in overblocking: banning the use of certain keywords to search for Web sites, not just images, prevents users from accessing legitimate content, such as sex education and encyclopedic information about homosexuality, based on Microsoft’s definition of objectionable content.

It is difficult to assess the impact of Bing’s filtering policy on access to information and freedom of speech in Arabic-speaking countries. The fact that users can easily switch to another search engine that does not filter its results (e.g., Google), or to a different version of Bing (e.g., a U.S. or European version), suggests that the impact may be slight, if one assumes that users are making a conscious choice to restrict their search results with the help and guidance of Bing in order to filter out offensive material. On the other hand, default settings have a profound impact on user decisions; many users will be unaware of the options available to them or unmotivated to try alternative searches. As other search engines have done, Bing could offer users the ability to choose their own level of filtering in a way that is transparent and easy to implement.

This raises a separate set of questions regarding the motivation and standards for instituting search result filtering in the region. It is still unclear whether Microsoft is acting at the behest of local officials, interpreting local law, seeking to preempt future regulation, or attempting to position itself as a good corporate citizen. The fact that Microsoft employs a single filtering policy throughout the region implies that it is following one or both of the latter approaches rather than the former. The current approach applies a region-wide standard for filtering content rather than a more targeted, granular, and country-specific policy. A more targeted approach, whether country-based or, preferably, defined by the user, is more consistent with minimizing the impact on freedom of speech.

Through its involvement in the Global Network Initiative, Microsoft has signaled its willingness to be at the forefront of protecting freedom of expression around the world. It is difficult to reconcile this position with Bing’s current filtering standards.

Authored by Helmi Noman, with contributions from Ronald Deibert, Jillian York, Caroline Nolan, Colin Maclay, and Rob Faris

Notes