photo by scion_cho
Over at Beyond the Times, Walter wrote about the inevitable echo chamber effect that would follow the introduction of a news aggregator into Facebook update streams. Given all the “likes” assigned to the variety of content on the site, it would be an easy feat to develop an algorithm that directs users to news fitting their individual world view, eliminating any challenges to that perspective.
It’s not as though such formulas aren’t already pervasive on the intertubes. Netflix regularly recommends films within subgenres I frequently view, and Amazon.com is constantly tweaking its suggestions to me based on my purchases and on the titles I view and rate.
Eli Pariser recently discussed the filter bubble phenomenon with Lynn Paramore of the Roosevelt Institute:
Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google “BP,” one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill. Presumably that was based on the kinds of searches that they had done in the past. If you have Google doing that, and you have Yahoo doing that, and you have Facebook doing that, and you have all of the top sites on the Web customizing themselves to you, then your information environment starts to look very different from anyone else’s. And that’s what I’m calling the “filter bubble”: that personal ecosystem of information that’s been catered by these algorithms to who they think you are.
This technology-induced bubble is particularly problematic in that it is human nature to accept facts and opinions that align with personal beliefs and disregard information that clashes. A recent Yale Law School study published in the Journal of Risk Research found that regardless of political leanings,
Individuals systematically overestimate the degree of scientific support for positions they are culturally predisposed to accept.
Social technology is making it effortless to find and follow preferred sentiment, and these sites are increasingly becoming the go-to places for news. Forty-two percent of respondents in a Retrevo Gadgetology study admit to checking and updating their Twitter and Facebook feeds first thing in the morning, with 23 percent of iPhone users identifying these feeds as their morning news source. In a recent Oxygen Media study, more than one third of women 18-34 years old reported checking Facebook before getting out of bed in the morning.
What happens to society when people can no longer have informed discussions of reality and data because of a refusal to acknowledge the very existence, let alone the validity, of information that conflicts with their own world view? Does it heighten the notion of an “Other” that could destroy a preferred way of living? Should marginalized religions, races and cultures expect increased persecution for being outliers of mainstream thought?
And most importantly, how do we find ways to be more receptive to ideas that challenge our own? New solutions to old problems could emerge from the discussion that follows.