
Internet Law/Privacy: Fascinating Article on the Ongoing Dangers Encompassed by the “Googlization” of Our Lives

Frank Pasquale’s review of The Googlization of Everything captures a work that addresses the dangers of control over our identities, our locations, the bare facts of our existence.

This is not merely Orwellian in dimension but, per the book’s author, Siva Vaidhyanathan, Kafkaesque.

Full credit to Pasquale and the wonderful blog Concurring Opinions:

http://www.concurringopinions.com/archives/2011/03/vaidhyanathans-googlization-a-must-read-on-where-knowing-is-going.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+ConcurringOpinions+%28Concurring+Opinions%29

Google’s been in the news a lot the past month. Concerned about the quality of their search results, they’re imposing new penalties on “content farms” and certain firms, including JC Penney and Overstock.com. Accusations are flying fast and furious; the “antichrist of Silicon Valley” has flatly told the Googlers to “stop cheating.”

As the debate heats up and accelerates in internet time, it’s a pleasure to turn to Siva Vaidhyanathan’s The Googlization of Everything, a carefully considered take on the company composed over the past five years. After this week is over, no one is going to really care whether Google properly punished JC Penney for scheming its way to the top non-paid search slot for “grommet top curtains.” But our culture will be influenced in ways large and small by Google’s years of dominance, whatever happens in coming years. I don’t have time to write a full review now, but I do want to highlight some key concepts in Googlization, since they will have lasting relevance for studies of technology, law, and media for years to come.

Cryptopticon

Dan Solove helped shift the privacy conversation from “Orwell to Kafka” in a number of works over the past decade. Other scholars of surveillance have first used, and then criticized, the concept of the “Panopticon” as a master metaphor for the conformity-inducing pressures of ubiquitous monitoring. Vaidhyanathan observes that monitoring is now so ubiquitous that most people have given up trying to conform. As he puts it,

[T]he forces at work in Europe, North America, and much of the rest of the world are the opposite of a Panopticon: they involve not the subjection of the individual to the gaze of a single, centralized authority, but the surveillance of the individual, potentially by all, always by many. We have a “cryptopticon” (for lack of a better word). Unlike Bentham’s prisoners, we don’t know all the ways in which we are being watched or profiled—we simply know that we are. And we don’t regulate our behavior under the gaze of surveillance: instead, we don’t seem to care.

Of course, that final “we” is a bit overinclusive, for as Vaidhyanathan later shows in a wonderful section on the diverging cultural responses to Google Street View, there are bastions of resistance to the technology:

One search engine professional, Osamu Higuchi, posted an open letter to Google staff in Japan on his blog in August 2008. The letter urged Google staff to explain to their partners in the United States that Street View demonstrates a lack of understanding of some important aspects of daily life in Japan. Osamu urged Google to remove largely residential roads from Street View. “The residential roads of Japan’s urban areas are part of people’s living space, and it is impolite to photograph other people’s living spaces,” Osamu wrote. . . .

A person walking down the street peering into residents’ yards would be watched right back by offended residents, who would consider calling the police to report such dangerous and antisocial behavior. But with Google Street View, the residents can’t see or know who is peeping. Osamu’s pleas and concerns were shared by enough others in Japan that by May 2009, Google announced it would reshoot its Street View images of Japanese cities with the cameras mounted lower, to avoid peering over hedges and fences.

There are a number of other examples in the book of technology being modified to adapt to cultural norms. But the dominant story is of cultural norms being reshaped by the deployment of new technologies.

Public Failure

Progressives often cite “market failure” as a reason for regulation. But the term itself has a hidden laissez-faire bias, implying that markets generally succeed and that intervention is extraordinary. Vaidhyanathan levels the playing field by introducing the idea of the “public failure,” which itself is parasitic on a larger vision of endeavors naturally performed or sponsored by government or civil society. As he explains,

[N]eoliberalism . . . had its roots in two prominent ideologies: techno-fundamentalism, an optimistic belief in the power of technology to solve problems . . . and market fundamentalism, the notion that most problems are better (at least more efficiently) solved by the actions of private parties rather than by state oversight or investment.

Neoliberalism [included] . . . substantial state subsidy and support for firms that promulgated the neoliberal model and supported its political champions. But in the end the private sector calls the shots and apportions (or hoards) resources, as the instruments once used to rein in the excesses of firms have been systematically dismantled. . . .

Google has deftly capitalized on a thirty-year tradition of “public failure,” chiefly in the United States but in much of the rest of the world as well. Public failure, in contrast, occurs when instruments of the state cannot satisfy public needs and deliver services effectively. This failure occurs not necessarily because the state is the inappropriate agent to solve a particular problem (although there are plenty of areas in which state service is inefficient and counterproductive); it may occur when the public sector has been intentionally dismantled, degraded, or underfunded, while expectations for its performance remain high.

Vaidhyanathan’s call for a “Human Knowledge Project” in response to this trend is one of the few tech policy proposals that is bold, ambitious, and comprehensive enough to address the challenges posed by privatized knowledge systems. I will address that in more detail in a future post; for those dying to know my thoughts, here is a video where I a) analogize Google’s role in the knowledge system to private insurers’ role in the health system, and b) propose a Medicare-like alternative, or “public option,” to assure a transparent baseline of access to knowledge for those not served by Google and similar systems.
