DuckDuckGo says Google's methodology may be inadvertently causing bias - Business Insider

On Tuesday, Google CEO Sundar Pichai was interrogated by the House Judiciary Committee over a number of topics including the company's data collection and potential bias in the search results it serves up to users.

Republican House members — including Rep. Lamar Smith (TX) — didn't hold back on the topic of conservative bias.

"It will require a herculean effort by the chief executive and senior management to change the political bias now programmed into the company's culture," Smith said, citing "irrefutable" studies on the subject. "Google could well elect the next president with dire implications to our democracy."

Pichai responded: "With respect… we don't agree with the methodology [of the studies]."


One person watching closely was Gabriel Weinberg, co-founder and CEO of DuckDuckGo — a privacy-focused search engine company that competes with Google Search — which last week revealed a study of its own (not referenced by Rep. Smith).

The study, among other things, found that participants saw vast differences in search results when searching for the same keywords (like "gun control" or "immigration") from different locations across the country. The study controlled for other potential factors of personalization by having its participants log out of their Google accounts and search from an incognito state.

"What [our study] does reveal, or at least suggests, is that Google's collection and use of personal data, including location, which is then used to filter specific search results, is having an effect akin to the effects of a political bias," Weinberg told Business Insider on Tuesday. "That is an important nuance often missed in these discussions."

You're putting a whole ZIP code in a filter bubble

Essentially, Weinberg is saying that even if Google does not create its products with the intent of having a political bias, the fact that location information is used to filter results creates its own form of bias.

"If you live in this ZIP code, we're going to show you the NRA. But if you live in this other ZIP code, we're not going to show you the NRA," Weinberg says could explain the results his team discovered. "If that's what [Google's] doing, then you're putting a whole ZIP code (or whatever the location boundaries) in a filter bubble."
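To make the mechanism Weinberg describes concrete, here is a toy sketch of how a location signal used as a ranking filter could produce entirely different top results for the same query. Everything here — the candidate pages, the region preferences, the ranking rule — is hypothetical and purely illustrative; it is not Google's actual algorithm or data.

```python
# Toy illustration of location-based result filtering (hypothetical logic,
# NOT Google's actual ranking algorithm or data).

# A shared candidate pool for one query, e.g. "gun control".
CANDIDATES = [
    {"url": "nra.example/gun-rights", "topic": "pro-gun"},
    {"url": "safety.example/stats", "topic": "gun-safety"},
    {"url": "news.example/debate", "topic": "neutral"},
]

# Hypothetical per-region preference signal, e.g. inferred from aggregate
# behavior of users in each ZIP code.
REGION_PREFERENCE = {
    "75001": "pro-gun",     # one hypothetical ZIP code
    "94110": "gun-safety",  # another hypothetical ZIP code
}

def rank_results(candidates, zip_code):
    """Rank candidates, floating the region's preferred topic to the top."""
    preferred = REGION_PREFERENCE.get(zip_code)
    # Stable sort: pages matching the preferred topic come first. Everyone
    # in the same ZIP code gets the same reordered list -- a "filter
    # bubble" drawn along location boundaries.
    return sorted(candidates, key=lambda c: c["topic"] != preferred)

print(rank_results(CANDIDATES, "75001")[0]["url"])  # pro-gun page first
print(rank_results(CANDIDATES, "94110")[0]["url"])  # gun-safety page first
```

The point of the sketch: no individual personalization is needed — a single location-keyed rule applied uniformly is enough to show different users different realities for the same search terms.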


Filter bubbles occur when users get trapped in a cycle of being served the content that interests them most. The personalization may sound appealing, but rarely seeing content that contradicts one's beliefs can have major consequences.

"That's the problem with these algorithms," Weinberg said. "You make these things, you don't even realize what's going on, and then all of a sudden you're potentially influencing tens of millions of people."

A Google spokesperson disputed the results of DuckDuckGo's study, saying: "This study's methodology and conclusions are flawed since they are based on the assumption that any difference in search results are based on personalization. That is simply not true. In fact, there are a number of factors that can lead to slight differences, including time and location, which this study doesn't appear to have controlled for effectively."

Weinberg said he anticipated Google's rebuttal, which is why he and his team controlled for both time and location in their research. He also argues that the findings were far from trivial.

"On the surface level, [Google] said there were slight differences, and that is just totally different from what we saw," he explained. "We saw vast differences."

Weinberg told us that even if Google refutes DuckDuckGo's findings, he hopes it will at least inspire others — especially academics — to dig into the issue further.

"To date, no one has been really doing these studies to hold [Google] accountable," he said. "I don't think they want to acknowledge that it can have the manipulative effect that it can."

