Breaking through our Filter Bubbles

For the last month, Fox News has been my default landing page. To those who know me, this may come as a surprise - especially for a Daily Show-watching, Guardian-reading, latte-sipping person like myself. And especially as someone who truly hates Bill O'Reilly.

I made this change because of a problem I've noticed over the last year - my internet life is lived within a pre-defined, personalized bubble. The systems I use to view content - from Facebook to Google, Twitter to The Globe and Mail - have been customized to my previous click behaviour and reshaped around the links I find most interesting. In an Amazon 'if you like this, you'll like that' way, my streams have become so personalized that I'm predictable. If I could really see who 'my' algorithm thought I was, it would probably be as simple as my current Twitter description.

This phenomenon is not restricted to me and it is not new.

In The Big Sort, Bill Bishop provides an in-depth analysis of the physical redistribution that has been occurring in the US for the last 30 years. Because mobility has become much easier, people are choosing to live in communities with people like themselves. Democrats want to live with other Democrats, and the same goes for Republicans. A look at presidential voting data by county shows that the swings are getting larger with every election, not smaller. This physical sorting has left America divided, with each side unable to relate to the other - someone who voted for Obama can't understand why another would vote for Palin, and vice-versa.

Like these bubble-communities, the same process occurs every day on the web. Every site tries to tailor itself to each user - tracking their interests, their past histories and the actions that help it learn a bit more about what content it should be serving.

Google is the dominant example here. Before 2009, search results were relatively consistent for most users in a specific region. Today, they are almost individualized based on over 200 'signals' that Google tracks - your IP address, history, location and browser, among others.

Eli Pariser, author of The Filter Bubble, shows these large discrepancies in an excellent TED talk. His book is an excellent - and scary - look at the effects of personalization on individual users and what could happen as these bubbles grow stronger over time.

Consider this - not only do a few corporations control massive amounts of user-specific information (Google Search, Gmail and Plus, along with Facebook, Amazon and Apple), but other data companies actively track users on these sites and then sell that information to third-party ad networks. Ever noticed that certain sites follow you around the web through banner ads after you've visited? This isn't a coincidence.

The problem with personalization is that it's really attractive - I get shown content that I know I will like, every day. I find a few topics that I think are interesting (Apple news, ads, etc.) and I self-program to hear more and more about them. Pariser describes this in relation to TV:

'In TV network circles, there's a name for the passive way in which Americans make most of those viewing decisions: the theory of least objectionable programming. Researching TV viewers' behaviour in the 1970s, pay-per-view innovator Paul Klein noticed that people quit channel surfing far more quickly than one might suspect. During most of those thirty-six hours a week, the theory suggests, we're not looking for a program in particular. We're just looking to be unobjectionably entertained.'

I believe that this theory is consistent with web content consumption today. We find something we like and continue down the rabbit hole:

'Your identity shapes your media,' continues Pariser, 'and your media then shapes what you believe and what you care about. You click on a link, which signals an interest in something, which means you're more likely to see articles about that topic in the future, which in turn prime the topic for you. You become trapped in a you loop, and if your identity is misrepresented, strange patterns begin to emerge, like reverb from an amplifier.'

There are many issues with our bubbles, and I think the largest one is that what most users find interesting isn’t what is culturally important. Year after year, the most-searched terms are usually celebrities, and the most popular articles on sites like The Huffington Post are random ‘fail’ videos or 100-bikini-photo slideshows. While this might be engaging content, it’s the junk food of content, not the healthy stuff (to steal another Pariser analogy).

So how do we break out of this stream?

For me, it was the act of reading a news site from the opposite end of the spectrum. It’s helped, but it is only a beginning. In the coming weeks, I’ll be thinking about ways in which we can proactively broaden our streams so that we get a mix of ‘junk’ content and culturally important bits as well – and so that we don't lose people with differing perspectives from our social streams, but instead give those different voices equal weight to the links that we click on.

In any event, I highly recommend picking up The Big Sort and The Filter Bubble – especially for the digital evangelists out there.