5 Questions with Eli Pariser, Author of ‘The Filter Bubble’

Eli Pariser is no enemy of the Internet. The 30-year-old online organizer is the former executive director and now board president of the online liberal political group MoveOn.org. But while Pariser understands the influence of the Internet, he also knows the power of online search engines and social networks to control exactly how we get information—for good and for ill. In his new book The Filter Bubble, Pariser explores the ways that personalization—the growing practice by which Facebook and Google craft our online experiences according to our supposed interests—can cloud our ability to see the world clearly. Pariser spoke with TIME’s Bryan Walsh about the book, the politics of personalization and how to ensure that you don’t end up in a search engine ghetto.

TIME: What started you on the journey to writing The Filter Bubble?

Pariser: I was taking a couple of days to get my head around how the flow of information online was changing, and I came across a post from Google about personalized search at the end of 2009. Immediately I went to Google and started tinkering around, seeing how different the search results were. I was really shocked by the degree of difference. This was like a completely different world from one person to another. That got me interested. At first I just wrote down some notes, but it kept gnawing at me that this was kind of a big deal, and then I started to notice what Facebook was doing. The New York Times was investing in this News.me site that would do it for news. I realized that all of the profit incentives point in the direction of doing this as much as possible. There’s no reason to expect we wouldn’t keep seeing more and more of this. And it got me worried.

TIME: What’s the downside to personalization? After all, the search engines and social networks are doing it because they say it will deliver a more useful Internet experience, one that helps automatically cut through all the data out there.

Pariser: For one thing, it’s invisible. People have always sought out news that fits their own views. But when you turn on MSNBC or Fox News, you know something is being left out. The problem with the way this is all happening is that most people don’t even know this kind of filtering is happening at all. The idea that these companies are deciding and editing out some results isn’t obvious, so you don’t know what’s being left out, and you don’t have a good picture of the world. The second problem is what I call “autopropaganda.” You are basically indoctrinating yourself with your own views and you don’t even know it. You don’t know that what you see is the part of the picture that reflects what you want to see, not the whole picture. And there are consequences for democracy. To be a good citizen, it’s important to be able to put yourself in other people’s shoes and see the big picture. If everything you see is rooted in your own identity, that becomes difficult or impossible.

TIME: What about the privacy ramifications here? If they’re going to personalize your search, they need to have personal data in the first place. Are these companies trustworthy?

Pariser: I would say no. They haven’t really grappled with the real responsibilities they have to the people who depend on them to provide these services. For example, given that all of these services rely on the data that a customer reveals to these companies, it is only reasonable to allow customers to see what data they have given and to have some control over it. Whether it’s Facebook or Google or the other companies, that basic principle, that users should be able to see and control information about themselves that they have revealed to the companies, is not baked into how the companies work. But it’s bigger than privacy. Privacy is about what you’re willing to reveal about yourself. Here the question is what is revealed to you about the world, based on who you are. It’s even more pernicious in a way. You are seeing essentially an edited worldview based on personal information that you have no control over.

TIME: I’m often struck by the way techies talk almost in the passive voice about these advances. Marissa Mayer says, “Search will be personalized,” as if no one’s really doing this, as if it’s just happening on its own. There’s a sense of, well, things go up and then they fall down—as opposed to the idea that this is a human-designed system that could presumably be changed.

Pariser: That always frustrates me, because there is a strong strand of thinking that technology is going in this direction and we are just helping it along its way. And I think that kind of argument is really dangerous, because it absolves people of the responsibility of thinking about the consequences of what they’re doing. Morally we know that’s a problematic place to be. We need these folks to recognize that there are big forks in the road here and very different ways this can all play out. What they do in the next few years could make a huge difference.

TIME: So what can you do to escape your own filter bubble?

Pariser: I think the first part is to understand and to notice when this is happening and where it’s happening. One of the scariest things about the filter bubble is the unknown unknown: because you don’t know on what basis you’re seeing stuff, you don’t know what you’re missing. As you become conscious of that, you can keep an eye out for the things you’re missing. The second thing is that there is certainly some individual responsibility here to really seek out new sources and people who aren’t like you. The more you do that, the more you evade these filters. If you’re on Google News and you click on MSNBC and Fox, you get a better picture. But in the end a lot of this does come down to these companies accepting responsibility for the editing that they are doing and doing it in a better way. Letting Google and Facebook know that you want them to do that is, I think, the pressure that will in the end make them decide to take this seriously.