Facebook ‘Filter Bubble’ Is A Sinister Phenomenon

Related info:

Leaving Facebook? ‘Evil Genius’ Social Network Won’t Make It Easy (Daily Mail)

Facebook Loses 6 Million US Users In May (Computerworld)

Facebook Quietly Switches On Facial Recognition Technology By Default (The Register)

Facebook Hires Former Bush Aides As Washington Lobbyists (LA Times)

Facebook founder called trusting users ‘DUMB FUCKS’ (The Register)



Eli Pariser delivers his TED talk (Photo: TED)

The ‘filter bubble’ is a sinister phenomenon. But Eli Pariser’s alternative sounds even worse (Telegraph, June 21, 2011):

Eli Pariser was looking at Facebook one day when he noticed something peculiar. On his news feed, where he usually enjoyed reading through his friends’ comments and links, there was something missing. “I’ve always gone out of my way to meet conservatives,” says Pariser, a liberal tech entrepreneur from New York. “So I was kind of surprised when I noticed that the conservatives had disappeared from my Facebook feed.”

Facebook had quietly scrubbed the feed clean of anything Right-wing – nothing Republican, nothing anti-Obama, was getting through. So what was going on?

“It turns out,” says Pariser, “that Facebook was looking at which links I clicked on. And it noticed that I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.”

In other words, Facebook decided that Pariser’s conservative friends weren’t relevant. It didn’t matter that he liked to hear their point of view occasionally; because he clicked on their links less frequently, they had been exiled from his online world.
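What Pariser is describing is, in effect, a relevance filter trained on clicks. As a rough illustration only (Facebook's actual news feed code is not public, and the data, names and threshold below are invented), the logic might look something like this:

```python
# Hypothetical sketch of click-based feed filtering. Facebook's real
# ranking code is not public; this only illustrates the principle
# Pariser describes: friends you click on less often get filtered out.

from collections import Counter

def filter_feed(posts, click_history, min_clicks=3):
    """Keep only posts from friends the user has clicked on often enough.

    posts         -- list of (friend, post_text) tuples
    click_history -- list of friend names whose links the user clicked
    min_clicks    -- assumed threshold below which a friend is hidden
    """
    clicks_per_friend = Counter(click_history)
    return [(friend, text) for friend, text in posts
            if clicks_per_friend[friend] >= min_clicks]

# Example: liberal friends get clicked often, conservative friends rarely.
posts = [("liberal_friend", "Link to a liberal blog"),
         ("conservative_friend", "Link to a conservative blog")]
clicks = ["liberal_friend"] * 5 + ["conservative_friend"] * 1

print(filter_feed(posts, clicks))
# Only the liberal friend's post survives; the other one "disappears".
```

The point is that nothing in such a filter asks whether you *want* to keep seeing the people you click on less.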

In his new book, it is this world that Eli Pariser names the “filter bubble” – a place where a hidden code decides what you can and can’t see.

Pariser speaks matter-of-factly about this sinister phenomenon. “There’s a shift in how information is flowing online,” he told the TED conference in California in May. “It’s invisible. And if we don’t pay attention to it, it could be a real problem.”

It is not always a problem, though. On some websites, “algorithmic editing” – the hidden code at work – is reasonably harmless. You might be shopping on Amazon, looking at books written by Malcolm Gladwell and then a few biographies of Mozart.

At this point, Amazon’s algorithm decides that you might enjoy Alex Ross’s books, and it suggests them in the “More items to consider” section. Why? Because Ross writes about the history of music and – like Gladwell – also writes for the New Yorker. Amazon joins the dots so you don’t have to.

But take a different example. Let’s say you run a Google search for “Egypt” on your PC – and a friend of yours does exactly the same, using their MacBook in Edinburgh. You’ll get the same results, won’t you?

Well, surprisingly no. Google can tell where you are, what internet browser you’re using, and the make of your computer. The website picks up dozens of signals that you don’t even know you’re sending – and it engineers what it thinks is the best result.
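A crude way to picture this (the signal names and weights here are invented for illustration, not Google's) is a re-ranking function that scores the same results differently depending on what it knows about you:

```python
# Illustrative sketch only. Google's ranking signals and weights are
# proprietary; the names and numbers below are invented to show how a
# handful of user signals can reorder the "same" query for two people.

def personalise(results, user_signals):
    """Re-rank generic results for one user.

    results      -- list of (title, topic) tuples for a query, e.g. "Egypt"
    user_signals -- dict of assumed signals about this user
    """
    def score(result):
        title, topic = result
        s = 0
        if topic in user_signals.get("recent_topics", []):
            s += 2                      # you clicked this kind of story before
        if topic == "travel" and user_signals.get("searched_flights"):
            s += 1                      # intent inferred from other activity
        return s

    return sorted(results, key=score, reverse=True)

results = [("Protests in Tahrir Square", "news"),
           ("Egypt holiday packages", "travel")]

print(personalise(results, {"recent_topics": ["news"]}))
print(personalise(results, {"searched_flights": True}))
# Two users, one query, two different orderings.
```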

When Pariser compared the results from two friends who entered the “Egypt” Google search, he was staggered. One saw information about the political crisis and the protests in Tahrir Square. The other, a list of travel agents and factbooks about the country.

It is here – in the political sphere – that invisible algorithms have disturbing implications. Websites like Yahoo News are already “personalising” their coverage, and others like the Huffington Post are flirting with similar technology. Imagine this: you search for “David Cameron” on a supposedly impartial news site. But it knows you have just been browsing LabourList.org, so the results – though you may not realise it – are massively biased. The website knows what you want to read, so it spoon-feeds you more of the same.

It might spoon-feed you some junk, too, just for the fun of it. Your “future aspirational self”, as Pariser calls it, may want to learn about David Cameron’s health policy. But the website knows your “impulsive present self” would like to read about Justin Bieber. So it diverts your attention to his latest interview. More Bieber, less Cameron – that’s the filter bubble at work.

So is there a solution to the filter bubble? Pariser thinks so. But his solution is perhaps more sinister than the phenomenon itself.

The problem, he thinks, is that we’ve removed the gatekeepers to the world’s information and news. Before the invention of the internet, we had human editors with “embedded ethics”. Now we’ve passed the torch from human gatekeepers to algorithmic ones – but they, of course, lack the ethics.

But here is his proposed solution (aimed directly at Google and Facebook):

to make sure that these algorithms have encoded in them a sense of the public life. A sense of civic responsibility. We need you to make sure that they’re transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn’t.

Along with what is relevant, then, the hidden codes must also surface what is important, uncomfortable and challenging – other points of view.
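A toy sketch of what that might mean in practice (the weights and the “importance” field below are assumptions for illustration, not anyone’s real system) is a feed that ranks by a blend of predicted interest and some civic score, rather than by clicks alone:

```python
# Toy sketch of an "ethical" ranking of the kind Pariser seems to ask for:
# a feed that does not rank purely by predicted clicks but mixes in items
# judged important or challenging. The weights and fields are assumptions,
# not anyone's real system.

def rank(items, relevance_weight=0.6, importance_weight=0.4):
    """Rank items by a blend of personal relevance and civic importance.

    items -- list of dicts with 'title', 'relevance' (predicted interest,
             0-1) and 'importance' (some editorial/civic score, 0-1)
    """
    def blended(item):
        return (relevance_weight * item["relevance"]
                + importance_weight * item["importance"])
    return sorted(items, key=blended, reverse=True)

feed = [{"title": "Justin Bieber interview", "relevance": 0.9, "importance": 0.1},
        {"title": "David Cameron's health policy", "relevance": 0.4, "importance": 0.9}]

for item in rank(feed):
    print(item["title"])
# With these weights the policy story edges out the celebrity one. But
# someone still has to decide what "importance" means and how much it
# counts, which is exactly the objection raised below.
```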

Pariser received a roar of approval and a standing ovation from his Californian audience when he concluded with these thoughts in May. But though his plea for transparency is commendable, is there not something deeply troubling about the notion of ethical algorithms? Whose duty would it be to embed civic responsibility in these codes? And exactly whose idea of civic responsibility would be imposed? It shrieks “thought police” to me.

Arguably, Facebook, Google and others deserve Pariser’s thorough treatment. And their online consumers should demand more transparency. But if the alternative to the filter bubble is an internet edited by the New York Times, you can count me out.
