Google’s search algorithms and autocomplete suggestions have caused offense, prompting Google to take action.
Google recently removed autocomplete suggestions from its search engine that appended the word “evil” to search phrases beginning with “Are Jews…” and “Are women…”. The change came after an article by an Observer columnist highlighted that Google autocomplete was suggesting anti-Semitic, sexist and racist entries. In both cases, when users clicked on the suggestion that popped up, they were routed to a number of sites that “proved” that Jews and women were evil.
Some of the results for “are Jews evil” included pages claiming that Jews are demonic souls from a different world, and a listing of the top reasons why people hate Jews. A link to a YouTube video purporting to show why Jews are evil was also among the results. A similar search for women returned equally ludicrous results, for example, that every woman has a little evil and a level of prostitute in her. Another site claimed to explain exactly how all women are evil.
On the same day, Google said that it had taken action in response to the claims in the Observer. Soon after, autocompletes and searches for Jews and women no longer returned those results; the searches yielded different results altogether. A Google spokesperson said that autocomplete suggestions are generated by an algorithm based on the search activity and interests of users. He said that phrases appearing in autocomplete may sometimes seem unpleasant or unexpected because, on any given day, about 15% of searches are new, and these searches span a wide range of material. He added that as a company, Google strongly values a diversity of ideas, perspectives, and cultures, and that results do not reflect the opinions or beliefs of Google. He acknowledged, however, that autocomplete is not an exact science; the company does its best to prevent offensive terms from appearing and is working to improve its algorithms.
This is not the first time search algorithms and autocompletes have caused offense. Although Google's suggestions and answers are not pre-programmed but generated by algorithms designed to be helpful, they have often backfired. In June, Google's image search for “three black teenagers” automatically returned pictures of criminals. Similar problems have surfaced on Google Maps.
Google has had to delete autocomplete suggestions in the past. In one case, “torrent” was suggested whenever users googled a popular artist’s name, and a court order required the suggestion to be removed. According to its policies, Google removes autocomplete suggestions that include sexually explicit language, hate speech or abusive language, or that run afoul of the law.
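The basic mechanism the spokesperson describes, suggestions ranked by aggregate search frequency and then filtered against a policy blocklist, can be sketched in a few lines. This is a minimal illustration only, not Google's actual system; the query log, blocklist, and function names here are all hypothetical:

```python
from collections import Counter

# Hypothetical query log; in reality this would be aggregate search activity.
QUERY_LOG = [
    "are apples healthy", "are apples healthy", "are apples healthy",
    "are apples red", "are apples red",
    "are apples evil", "are apples evil",
    "are apples fruit",
]

# Hypothetical policy blocklist (offensive or disallowed terms).
BLOCKLIST = {"evil"}

def suggest(prefix, log=QUERY_LOG, k=3):
    """Return up to k completions for prefix, most frequent first,
    skipping any query that contains a blocklisted term."""
    counts = Counter(q for q in log if q.startswith(prefix))
    allowed = [q for q, _ in counts.most_common()
               if not BLOCKLIST & set(q.split())]
    return allowed[:k]

print(suggest("are apples"))
```

Even in this toy version, the ranking step is driven entirely by what users have searched before, which is why offensive queries can surface unless the filtering step catches them.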
Google's site handles about 63,000 searches a second and 5.5 billion a day. Its stated mission is to organize the world's information and make it universally accessible and useful. The fear among scholars of the subject is that such issues with search algorithms may influence people's views on particular subjects, and perhaps their actions, especially when suggestions are defamatory of a particular race, gender, religion or person.