
Google Autocomplete Faces New Lawsuit for “Jewish” Autocomplete Suggestions

Google’s autocomplete feature, which has faced numerous legal challenges over the past several years, is the subject of a new lawsuit in a French court over suggestions tied to searches for Rupert Murdoch. When a Google user types “Rupert Murdoch,” the search engine suggests completing the search as “Rupert Murdoch Jewish.” The lawsuit, filed by the French anti-racism organization SOS Racisme, accuses Google of mislabeling celebrities, connecting people to an often-persecuted religion, and creating the “biggest Jewish file in history.”

On the webpage describing the autocomplete algorithm, Google states that the algorithm determines the autocomplete content without human intervention:

“Just like the web, the search queries presented may include silly or strange or surprising terms and phrases. While we always strive to algorithmically reflect the diversity of content on the web (some good, some objectionable), we also apply a narrow set of removal policies for pornography, violence, hate speech, and terms that are frequently used to find content that infringes copyrights.”

In addition to the Rupert Murdoch queries, Google’s autocomplete feature has faced several other legal battles. For an unfortunate Japanese man, the autocomplete algorithm suggested searches linking him to crimes he did not commit. Although he ultimately won the cyber-defamation case in a Japanese court, the feature tarnished his reputation and cost him several job opportunities. In addition, Google was fined $65,000 last December by a French court for an autocomplete suggestion that hurt the reputation of an insurance company by appending the French word for “crook” to the company’s name.

Since Google already filters out terms related to pornography, racism, and violence, it clearly has the technology to apply filters to specific types of queries. Is it time for Google to update its autocomplete algorithm to prevent reputation damage and ensure compliance with each country’s laws?

The initial lawsuit hearing is scheduled to take place in a French court on Wednesday.

Sources Include: Google, Web Pro News, & PC Mag


David Angotti

After successfully founding and exiting an educational startup in 2009, I began helping companies with business development, search engine marketing (SEM), search engine optimization (SEO), conversion rate optimization (CRO), online marketing, mergers and acquisitions, product development, and branding. Now, I am focused on a new startup in the travel and tourism niche.



5 thoughts on “Google Autocomplete Faces New Lawsuit for “Jewish” Autocomplete Suggestions”

  1. I don’t think Google should be forced to change their algorithm for every country every time someone gets upset. In the case of the French insurance company, there is obviously content out there that people created labeling them as “crooks,” and for it to come up as a suggested search term there must be enough of it. It’s not as if Google decided to randomly apply the term “crook” to that company.

  2. For crying out loud, let Google do its job. Come on and be a grown-up; so what if people get to know that Rupert M is Jewish? There are black sheep in every community, and he happens to be a big one.

  3. Ridiculous. Google autocomplete helps you find pertinent information about the subject you are searching for. Just because some information doesn’t seem relevant to one person does not mean it’s not helping someone else find the information they are looking for.

  4. This seems ridiculous. I have to ask, in the case of the Japanese man, did a search on his name yield results similar to what the autocomplete was suggesting? I’m sure there is at least one other Casey Anthony in the world. Google’s auto-suggest is likely to come up with some disturbing suggestions for that person, as would the actual results. Besides, the feature is not auto-fact; it is auto-suggest: suggestions based on what other people are actually looking for. Also, I have never seen just ONE suggestion. There are always multiple suggestions from which you can choose, or not choose and do your own search. I mean, good Lord, it’s hard enough getting accurate search results; now they want accurate search suggestions too?

  5. I have to wonder if the folks making the comments above would feel the same way if they or someone very close to them were the victim of something threatening or humiliating in Google autocomplete, web search results, image results, or any other product. Just try to imagine that your name combined with something truly embarrassing showed up in autocomplete every time someone started to search for you. Trust me, you would be mortified, you would try very hard to get Google to take the offending suggestion down, and you would most likely be unsuccessful.

    Google would take the position that it’s not their job to police the web, and they would be mostly correct. We can make comparisons to gun manufacturers. We can say it’s not the gun manufacturer’s fault if someone uses their product to murder someone. That’s a true statement; there’s really nothing the gun manufacturer can do to prevent misuse. But Google CAN do something when it comes to their products. They can easily control what appears in autocomplete or any of their other products, and they demonstrate this constantly. For them, this isn’t an issue of capability; it’s a business issue of scope and scale. If they do it for one, they fear they’ll have to do it for many. And that, too, would be a true statement.

    But why are people so willing to give Google a pass on these types of issues? I personally love Google products and use many of them on a daily basis, but I don’t want to let Google off the hook here. Sometimes crowdsourcing (which is core to how many of these algorithms work) is very effective; other times, it’s easily gamed, malicious, and unforgiving. Are we really the type of society that’s willing to say, “tough luck to the few people who get trampled on; it works the other 99% of the time”?

    I think we can do better than that.