What is Hollywood, you ask, dear children? A quorum of whores babbling endlessly on about fucking while the bordello is razed for a penny arcade -- Paul Bern

Monday, November 14, 2011


But the debate about tools like Twitter Trends is, I believe, a debate we will be having more and more often. As more of our online public discourse takes place on a select set of private content platforms and communication networks, and as these providers turn to complex algorithms to manage, curate, and organize these massive collections, an important tension is emerging between what we expect these algorithms to be and what they in fact are. Not only must we recognize that these algorithms are not neutral, that they encode political choices, and that they frame information in a particular way; we must also understand what it means that we are coming to rely on them, that we want them to be neutral, we want them to be reliable, we want them to be the effective means by which we come to know what is most important.

Twitter Trends is only the most visible of these tools. The search engine itself, whether Google or the search bar on your favorite content site (often the same engine, under the hood), promises to provide a logical set of results in response to a query, but is in fact an algorithm designed to take a range of criteria into account so as to serve up results that satisfy not just the user but also the aims of the provider: their vision of relevance, newsworthiness, or public import, and the particular demands of their business model. As James Grimmelmann observed, "Search engines pride themselves on being automated, except when they aren't." When Amazon, YouTube, or Facebook offers to report, algorithmically and in real time, on what is "most popular" or "liked" or "most viewed" or "best selling" or "most commented" or "highest rated," it is curating a list whose legitimacy rests on the presumption that it has not been curated. And we want these lists to feel that way, even to the point that we are unwilling to ask about the choices and implications of the algorithms we use every day.

CIVIC INFORMATION: an epistemological model (in this case, an algorithm) is also a polity. That is, it does to information what the city or the village does to the individual human -- makes it behave according to its noble vision of the citizen. In no way can it be allowed to be unruly.

The above lament is an almost perfect illustration of the modern gap: we want information to be perfectly MACHINED, that is, godlike, free of human interference, so that we can better worship it; and at the same time we recognize that it is but a social creation, tainted by our sinful human culpability in shaping it, so that its magic crumbles before our eyes.
