In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to google "BP." They're pretty similar—educated, white, left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw news. For one, the first page of results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP. Even the number of results returned by Google differed—about 180 million results for one friend, and 139 million for the other.
Most of us assume that when we google a term, we all see the same results—the ones that the company's famous PageRank algorithm suggests are the most authoritative based on other pages' links. But as of December 2009, that is no longer true. Now you get the result that Google's algorithm suggests is best for you in particular—based on everything from your browser to your search history. Someone else may see something entirely different. There is no standard Google result anymore. If the results were that different for two progressive East Coast women, imagine how different they would be for my friends and, say, an elderly Republican in Texas (or, for that matter, a businessman in Japan).
With Google personalized for everyone, the query "stem cells" might produce diametrically opposed results for scientists who support stem cell research and activists who oppose it. "Proof of climate change" might turn up different results for an environmentalist and an oil company executive. In polls, a huge majority of us assume search engines are unbiased. But that may be just because they're increasingly biased to share our own views. More and more, your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.
For a time, it seemed that the Internet was going to redemocratize society. Bloggers and citizen journalists would single-handedly rebuild the public media. Politicians would be able to run only with a broad base of support from small, everyday donors. Local governments would become more transparent and accountable to their citizens. And yet that era of civic connection hasn't come. Democracy requires citizens to see things from each other's point of view, but instead we're more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we're presented with parallel but separate universes.
And it's not just Google. My sense of unease crystallized when I noticed that my conservative friends had disappeared from my Facebook page. I like to hear what conservatives are thinking, but their links never turned up in my Top News feed. Facebook was apparently doing the math and noticing that I was still clicking my progressive friends' links more than my conservative friends'—and links to the latest Lady Gaga videos more than either. So no conservative links for me. With little fanfare, the digital world is fundamentally changing.
What was once an anonymous medium where anyone could be anyone—where, in the words of the famous New Yorker cartoon, nobody knows you're a dog—is now a tool for soliciting and analyzing our personal data. According to one Wall Street Journal study, the top 50 Internet sites each install an average of 64 data-laden cookies and personal tracking beacons when you visit them. Search for a word like "depression" on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. The new Internet doesn't just know you're a dog; it knows your breed and wants to sell you a bowl of premium kibble.
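To make the tracking mechanism above concrete, here is a minimal Python sketch of what a third-party cookie looks like under the hood. The domains and cookie values are invented for illustration—they are not the actual trackers any site installs—and real tracking also involves beacons and scripts that this sketch ignores. The point is simply that a cookie's `Domain` attribute can point somewhere other than the site you visited, which is what lets outside companies follow you across the Web.

```python
from http.cookies import SimpleCookie

# Invented Set-Cookie headers, standing in for what a page's embedded
# trackers might send back alongside the page you actually asked for.
set_cookie_headers = [
    "uid=a1b2c3; Domain=.tracker-one.example; Path=/",
    "session=xyz; Domain=dictionary.example; Path=/",
    "seg=depression; Domain=.ad-network.example; Path=/",
]

# The site the user believes they are visiting (hypothetical domain).
FIRST_PARTY = "dictionary.example"

def third_party_cookies(headers, first_party):
    """Return (name, domain) pairs for cookies scoped to domains
    other than the first-party site -- i.e., the trackers."""
    found = []
    for header in headers:
        cookie = SimpleCookie()
        cookie.load(header)
        for name, morsel in cookie.items():
            domain = morsel["domain"].lstrip(".")
            if domain and domain != first_party:
                found.append((name, domain))
    return found

print(third_party_cookies(set_cookie_headers, FIRST_PARTY))
# Only the two cookies scoped to outside domains are flagged.
```

A browser extension or privacy tool does essentially this bookkeeping at scale: comparing each cookie's domain against the page's own domain and blocking or counting the mismatches.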
While Google has (so far) promised to keep your personal data to itself, other popular Web sites and apps—from the airfare site Kayak.com to the sharing widget AddThis—make no such guarantees. Behind the pages you visit, a massive new market for information about what you do online is growing, driven by low-profile but highly profitable personal data companies like BlueKai and Acxiom. Acxiom alone has accumulated an average of 1,500 pieces of data on each person in its database—which includes 96 percent of Americans—along with data about everything from their credit scores to whether they've bought medication for incontinence.
All of this personalization isn't just shaping what we buy. Personalized news feeds like Facebook's are becoming a primary news source—36 percent of Americans under thirty get their news through social networking sites. Web sites from Yahoo News to the New York Times–funded startup News.me tailor their headlines to our particular interests and desires. Personalization influences what videos we watch on YouTube and a dozen smaller competitors, and what blog posts we see. It affects whose e-mails we get, which potential mates we run into on OkCupid, and which restaurants are recommended to us on Yelp. The algorithms that orchestrate our ads are starting to orchestrate our lives.
Ultimately, the proponents of personalization offer a vision of a custom-tailored world, every facet of which fits us perfectly. It's a cozy place, populated by our favorite people and things and ideas. We're never bored. We're never annoyed. Our media is a perfect reflection of our interests and desires. But it comes at a cost: by making everything more personal, we may lose some of the traits that made the Internet so appealing to begin with.
This post is an edited excerpt from Eli Pariser's new book The Filter Bubble: What the Internet Is Hiding from You.