Eli Pariser, former Executive Director of MoveOn and current President of its board, has recently written a book, The Filter Bubble: What the Internet Is Hiding from You, and has been speaking out about its central message: search engines keep track of our preferences and tend to filter out information and views that diverge from the tendencies suggested by our previous internet searches. [ http://networkeffect.allthingsd.com/20110520/eli-pariser-on-the-downsides-of-personalization-video/?mod=googlenews ]
In my view, Eli Pariser has called our attention to a current and powerful phenomenon, one that strongly suggests that our internet searches, and those of others in our society and around the world, are biased and narrowing in ways that none of us realize. This not only limits our own inquiry; it also prevents those who might be even less curious from being exposed to ideas and information that differ from their current inclinations. We all have biases and tend to filter out information.
One of the ways to engage in “better” inquiry is to consciously seek out a breadth and diversity of perspectives and information on whatever topics we’re currently concerned with. “Science” at its best does this, and the methods of action-research promoted at WISR aim to give us criteria and methods for broadening our experiences and our exposure to varied ideas.
Indeed, we might not even become aware of some topics we could investigate and become curious about, because our Google searches, for example, now tend to perpetuate our past inclinations and interests—not unlike the way Fox News covers only certain news items and, furthermore, covers them from a particular perspective. Arguably, even progressive media can limit our access to some information and new ideas, but since mainstream media stand in such continual contrast to progressive media, we at least have the benefit of those more conventional perspectives—unless we make strong efforts to avoid corporate-controlled media completely.
As Pariser states, “It’s one thing when you turn on MSNBC or Fox News. When you do that, you know what the editing rule is — what kind of things you’d expect to see there and what kind of things you’d expect to be edited out. But with a Facebook news feed or Google News, you don’t know who they think you are. You don’t know what’s been edited out. It can really distort your view of the world.
Sometimes the unexpected, serendipitous articles or discoveries are some of the very best moments when you learn about some whole new process or way of thinking or topic. It’s sad if we lose that just so a few companies can get more ad clicks.”
[ http://articles.cnn.com/2011-05-19/tech/online.privacy.pariser_1_google-news-facebook-internet?_s=PM:TECH ]
In his TED talk, Pariser goes on to argue that transparency, ethics, and responsibility need to be coded into search-engine algorithms. [ http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.html ]
So, what kinds of steps can we take to continue exposing ourselves to diverse ideas and sources of information? And what can we do to increase the odds that others will also be exposed to more fertile soil, rich in the nutrients necessary for the kind of inquiry that promotes curiosity and social justice? Questions such as these deserve our serious attention. Thank you, Eli Pariser.