
The Facebook Friend in the Plastic Bubble

The Filter Bubble: What the Internet Is Hiding From You, by Eli Pariser, Penguin, 304 pages, $25.95

Every one of us has our perceptions filtered by the thousands of stories and assumptions and rituals that constitute our culture. Every one of us has held beliefs that seemed self-evidently accurate but were actually contingent elements of the time and place that produced us. This is true not just of the people reading this article, but of every person, in every era, who has been capable of perceiving anything at all. You can stretch those perceptions, expose yourself to new worldviews, learn new things, but you'll always be embedded in a cultural matrix. That's Anthropology 101.

So it's bizarre to hear pundits speaking as though these filters were invented with the Internet. Somehow the Net, the medium that has probably done more than any other to open the channels of communication between cultures, stands accused of encasing us in cocoons. This year the accuser is Eli Pariser, board president of the liberal group MoveOn.org, who levels the charge in his much-discussed book The Filter Bubble. Thanks to changes in our online environment, he writes, "we're more and more enclosed in our own little bubbles. Democracy requires a reliance on shared facts; instead we're being offered parallel but separate universes."

Such complaints have dogged the Web for years. Way back in 1995, the attorney-activist Andrew Shapiro argued in The Nation that the online world was entering an era where "you interact only with people of your choosing and with information tailored to your desires. Don't like antiabortion activists, homeless people, news reports about murders? No problem—you need never encounter them." This was happening, Shapiro explained, because the U.S. was privatizing its part of the Internet backbone. The "crucial step," he warned, came on April 30, 1995, "when the National Science Foundation shut down its part of the Internet." Henceforth we would be at the mercy of corporate giants like AOL, CompuServe, and Prodigy.

To prevent such a future, he suggested, the government should "establish forums in cyberspace dedicated explicitly to public discourse.… These public forums must be visible, accessible and at least occasionally unavoidable—they must be street corners in cyberspace."

By 1999, it was clear that AOL, CompuServe, and Prodigy were not going to control our online experiences. It was also clear that there was far more debate online than before; indeed, debate may already have been more common and more vibrant on the Net than on the street corners that Shapiro had held up as his model. Undeterred, he published a book, The Control Revolution, that plucked one of the background complaints from his Nation article—that "speech in cyberspace can be shut out by unwilling listeners too easily"—and put those unwilling listeners rather than CompuServe at center stage. Now the great threat was "total filtering," a "new level of personal control over experience" that could "solidify a trend toward the elimination of spaces where citizens can confront and engage one another." The argument was echoed by the legal scholar Cass Sunstein, who now runs the White House Office of Information and Regulatory Affairs, in his 2001 tome Republic.com. Sunstein's book arrived just as the blogosphere was exploding into mainstream consciousness, creating a whole new array of "spaces where citizens can confront and engage one another."

You can see the problem here. It's not enough to observe that people might want to establish ideological and cultural cocoons online, nor even to point out that some of them have done just that. You have to remember your Anthro 101 and show that the average Net user is more cocooned now than before. Not only does that not seem to be the case, but a much bigger chrysalis—the mainstream media and its allegedly objective preconceptions—has been falling apart.

Now the fear has taken a new form. Eli Pariser's book offers a more sophisticated version of the argument, one where we're sealing ourselves in bubbles without realizing that we're doing it. Companies like Google will keep track of where we click online, companies like Facebook will keep track of who we know online, and gradually they'll tailor our online experience to what the algorithmic gnomes think we want. We won't know what we're missing, Pariser warns, because the things we'd miss will have been silently swept away. The argument neatly combines the 1995 and 1999 versions of Shapiro's story: The villain here is a quiet conspiracy between big corporations and our inner impulses.

Pariser's picture is wrong, but a lot of his details are accurate. Facebook's algorithms do determine which of your friends' status updates show up in your news feed, and the site goes out of its way to make it difficult to alter or remove those filters. Google does track the things we search for and click on, and it does use that data to shape our subsequent search results. (Some of Pariser's critics have pointed out that you can turn off Google's filters fairly easily. This is true, and Pariser should have mentioned it, but in itself it doesn't invalidate his point. Since his argument is that blinders are being imposed without most people's knowledge, it doesn't help much to say that you can avoid them if you know they're there.)
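To make the mechanism concrete, here is a toy sketch in Python (my illustration, not Google's or Facebook's actual code; the topics, scores, and weighting rule are all invented for the example) of how weighting results by a user's click history could quietly reorder the same search results for two different people:

# Toy illustration of click-history personalization; the data,
# topics, and scoring rule here are assumptions for the example.
from collections import Counter

def personalized_rank(results, click_history):
    """Re-rank (title, topic, base_relevance) results, boosting
    topics the user has clicked on before."""
    clicks = Counter(click_history)
    return sorted(results,
                  key=lambda r: r[2] * (1 + clicks[r[1]]),
                  reverse=True)

results = [("BP oil spill live coverage", "news", 0.9),
           ("BP stock price and dividends", "finance", 0.8)]

# Same query, different histories, different front pages.
print(personalized_rank(results, ["news", "news"]))   # spill coverage first
print(personalized_rank(results, ["finance"] * 3))    # investment info first

Neither user in the sketch asked to be sorted; the divergence comes entirely from what each happened to click before, which is exactly the silent quality Pariser objects to.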

It is certainly appropriate to look into how these new intermediaries influence our Internet experiences, and there are perfectly legitimate criticisms to be made of their workings. One reason I spend far less time on Facebook than I used to is that I'm tired of the site's ham-fisted efforts to guess what will interest me and to edit my news feed accordingly. Of course, that isn't a case of personalization gone too far; it's a case of a company that won't let me personalize as I please.

Above all, Pariser is right that we learn more when we encounter more surprises. He just doesn't make a compelling case that those stray signals are disappearing. "Google is great at helping us find what we know we want," Pariser writes, "but not at finding what we don't know we want." The only conceivable response to this is: Have you ever used Google in your life? It's the world's greatest serendipity machine.

That's the big problem with The Filter Bubble. It does a decent job of discussing the ramifications of its core assumptions, but it never establishes that those assumptions are true. Most importantly, it doesn't establish that we're being herded into ever-tighter filter bubbles.

Pariser contrasts the age of personalization with the days of the mass audience, when editors could ensure that the stories we needed to know were mixed in with the stories we really wanted to read. Set aside the issue (which Pariser acknowledges) of how good the editors' judgment actually was; we'll stipulate that newspapers and newscasters ran reports on worthy but unsexy subjects. Pariser doesn't take the obvious next step, which is to look into how much attention people paid to those extra stories in the old days and how much they informally personalized their news intake by skipping subjects that didn't interest them. Nor does he demonstrate what portion of the average Web surfer's media diet such subjects constitute now. Nor does he look at how many significant stories that didn't get play in the old days now have a foothold online. If you assume that a centralized authority (i.e., an editor) will do a better job of selecting the day's most important stories than the messy, bottom-up process that is a social media feed, then you might conclude that those reports will receive less attention now than before. But barring concrete data, that's all you have to go by: an assumption.

Yes, our media consumption is increasingly personalized. But personalized does not mean isolated. Pariser imagines the Internet becoming a stagnant "city of ghettoes" where "connections and overlap between communities" disappear. But how many people belong to just one online community? A personalized Internet is an Internet geared toward your particular combination of interests, and therefore to your particular combination of human networks. If you're a Methodist Democrat in South Baltimore who watches birds, follows basketball, and loves Elvis, you might be in touch online with people who share your faith but not your politics, and vice versa; your neighborhood but not your hobby, and vice versa; your taste in sports but not in music, and vice versa. That isn't a city of ghettoes. It's a city of crossroads.

And while there may be many good reasons to hate Facebook, an insufficient diversity of views isn't one of them. One of the chief effects of using the site, after all, is to discover your friends' horrifying opinions.

In political terms, that means it's easier, not harder, to break out of those longstanding Red Team and Blue Team bubbles. It's rare for real people's politics to be an exact fit with the standardized boxes provided by the traditional media; Crossfire-style shows might not have much room for pro-life liberals or conservationist conservatives, but the Web does. Our political maps—not just the conventional left-right spectrum, but all the alternatives that people have proposed—can never describe the full range of our perspectives. No matter how you map our political philosophies, someone somewhere will have fused two ideas that you've put on opposite sides of your chart. In a world of hyperlinks, everything is adjacent to everything else. "Left" and "right" become as meaningless as "up" and "down" in outer space.

Nor is it clear that politics are the most important factor in the new filters. At the beginning of the book, Pariser tells us about two friends who searched simultaneously for "BP" during last year's oil leak. Google gave one woman a page of links about the situation in the Gulf, while the other friend received a page of investment information. "If the results were that different for these two progressive East Coast women," Pariser writes, "imagine how different they would be for my friends and, say, an elderly Republican in Texas." I'd be a lot more impressed if he had actually included an old Texas Republican in the experiment. Instead all he's established is that two people with the same politics are being sorted in different ways, a result that actually cuts against the idea that we're being autofiltered into ideological bubbles. Either that, or one of his friends did the search wrong.

Even rigid partisans like to visit the other team's outlets. Republican and Democratic blogs scour one another for posts they can link and mock; rumbles break out in the comment threads. Last year Matthew Gentzkow and Jesse Shapiro, two economists at the University of Chicago, did a formal study of the levels of ideological segregation online. Their paper, to be published in an upcoming issue of the Quarterly Journal of Economics, noted that the Net "makes it easy to consume news from multiple sources." People who get their information from one source "tend to be light users, and their sole source tends to be one of the large relatively centrist outlets"; meanwhile, "people who visit sites like drudgereport.com or huffingtonpost.com, by contrast, are heavy Internet users with a strong interest in politics. Although their political views are relatively extreme, they also tend to consume more of everything, including centrist sites and occasionally sites with conflicting ideology." Not surprisingly, the scholars found "no evidence that the Internet is becoming more segregated over time."

A decade ago, the most quoted cartoon about life online said, "On the Internet, nobody knows you're a dog." Today that's been displaced by a different cartoon, one where a man won't come to bed because "someone is wrong on the Internet." If we're living in bubbles, they're bubbles that sure like to ram into each other. And bubbles that collide are bubbles that are more likely to burst.
