This “invisible algorithmic editing of the web,” as [Eli] Pariser describes it, “moves us to a world where the Internet shows us what it thinks we need to see, but not what we should see.” Beyond Facebook, Pariser notes how widely Google search results can diverge among his friends for the same topic: searching for Egypt, one friend sees news about the recent protests and Lara Logan, while another sees results about travel and vacations.
As a result, Pariser believes we’re each constructing what he calls a personal “filter bubble,” which is also the title of his book on the subject due out in May. And while he stops short of arguing that the trend toward personalization must end, he says the likes of Facebook and Google need to “have a sense of civic responsibility, and let us know what’s getting filtered … [and offer] controls to let us decide what gets through and what doesn’t.”
What he’d like to see is an information world that “gives us a bit of Justin Bieber and a bit of Afghanistan,” with controls that let us filter content by its relevance, importance, comfort (how difficult a topic is to discuss or read about), challenge level, and point of view (with an option to see “alternative” perspectives).
Of course, much of that runs counter to the history of the Internet, which has been defined by its ability to connect like-minded people, for good and for ill. In other words, Google and Facebook could build such controls and even bake more human editing into their algorithms, but do people even want them?
Is the Personalization of the Web Making us Dumber?
Published in Digital Humanities