Technology and the Commons

By Josh Tenenberg

“Jens had to cultivate a strong, unified mind to counteract the disparate landscapes, societies, conditions. He jumped from a monthlong spring hunt to a helicopter that would take him to Nuuk to testify in front of Parliament. On behalf of the Hunters’ Council, he was working hard to ban the use of snowmobiles and prohibit fishing boats in Inglefield Sound, where the narwhal calve and breed in summer” (Ehrlich 2003). This episode concerning a Greenlandic hunter in the early 21st century encapsulates the main theme of this paper: technologies, the policies that govern them, and their use in particular settings all contribute in dynamic and complex ways to their socio-political effects. Yet intended technological effects are not inevitable, as they are subject to resistance, adaptation, and appropriation by the actors within social settings. Technologies can both exacerbate commons dilemmas and contribute to their solutions. A keener awareness of the socio-political implications of technologies will increase the likelihood that people design and use technologies to improve the human condition.

The politics of technology

The relationship of technology to politics and social order has interested philosophers, historians, and technologists during the last 50 years. One view, exemplified by Lewis Mumford (1964), asserts that technologies have inherent political qualities: they structure social relations in their very design. “My thesis, to put it bluntly, is that from late neolithic times in the Near East, right down to our own day, two technologies have recurrently existed side by side: one authoritarian, the other democratic, the first system-centered, immensely powerful, but inherently unstable, the other man-centered, relatively weak, but resourceful and durable.” Under this view, rather than being used differently in different socio-political settings, technologies exert their own political stamp on society regardless of context of use: technology determines subsequent social development. Society is thus “engineered” through technology.

Winner expresses an alternative view: that most technologies are not immanently political, but that “the design or arrangement of a device or system could provide a convenient means of establishing power and authority in a given setting” (1980). Technologies are the means by which social actors achieve political ends. Winner provides the example of Cyrus McCormick’s employment of pneumatic molding machines in his manufacturing plant in the mid-1880s, not because they were more efficient, but because they displaced skilled workers, thereby shifting more power into the hands of managers within the political economy of the plant. Another way that social actors affect technology is by pursuing policies that serve their interests. For example, in a careful legal and historical analysis, Litman (2000) documents how the Hollywood studios, music companies, and content industries have been the main actors in crafting copyright policy in the U.S. over much of the 20th century, with the public largely unrepresented.

A third view, promoted by Friedman and Kahn (2002), recognizes that technologies can have political effects, but that these effects are only partly a result of intentional design. Just as importantly, users of technology in local settings assert their agency to shape, resist, appropriate, and adapt technologies to their own intentions. So, for instance, though the planners of Brasília (constructed in the late 1950s) may have aimed to create a thoroughly regularized and rationalized modern city through the very structure of the built environment – its immense (and largely empty) plazas, rectangular apartment blocks, separation of traffic from pedestrians, and segregation of places of work, commerce, and home – the actual residents had other plans. Originating as squatter settlements of laborers on the outskirts of the “built” Brasília, this unplanned “other” Brasília came to contain 75 percent of the city’s population, winning political recognition and city services only through ongoing political action (Scott 1998).

I take the view that all of these elements – technology, policy, powerful social actors, and technology users in local settings – have complex and reciprocal influences on one another that unfold over time as the different actors plan, take action, and respond to the actions of others. Technologies neither operate autonomously and inevitably to shape social arrangements, nor do individuals and groups simply succumb to the social arrangements enforced by the technologies and policies designed and developed by powerful social actors.

Focal and non-focal effects of technology

Technologies are intentionally designed for particular purposes: nets for catching fish, saws for felling trees, telephones for communicating with others at a distance. These proximal, intended effects are what Sclove (1995) calls the “focal” effects of a technology. Yet it is easy to underestimate the complex ways in which people and technologies are intertwined, so that changes to technology sometimes have far-reaching effects that extend beyond these immediate, focal effects. Technologies also produce what Sclove calls non-focal effects, the “pervasive, latent tendencies” of technologies to “shape patterns of human relationship” (Sclove 1995). These effects are often unintended and lie outside the focus of most actors in a setting when a new technology is introduced.

Sclove provides the example of Ibieca, Spain, where residents had indoor plumbing installed during the 1970s, replacing their mutual dependence on a village fountain. As a result, “women stopped gathering at the washbasin to intermix scrubbing with the politically empowering gossip about men and village life”. And because piped water eliminated the need to haul water from the fountain, donkeys were no longer needed and were more likely to be replaced by tractors for work in the fields, which in turn increased the villagers’ dependence on outside jobs. Thus, Sclove claims, social bonds were weakened, reducing the possibility for collective political action. Both focal and non-focal effects are important to consider when technological choices (about designs, about policy) are being made.

Complex interactions between technology and people

In order to better understand the dynamics of the settings in which social groups interact (including but not limited to commons), Ostrom and colleagues have identified several key elements that can be thought of as the “working parts” of these settings (Ostrom 2005).1 “These are: (1) the…participants, (2) the positions to be filled by participants, (3) the potential outcomes, (4) the set of allowable actions …(5) the control that an individual has…(6) the information available to participants …and (7) the costs and benefits – which serve as incentives and deterrents – assigned to actions and outcomes” [emphasis added] (Ostrom 2005). These elements can be thought of as having political import, i.e., they affect power and authority within a particular setting, because the actors involved craft rules that affect one or more of these elements, for example: who can access a commons, who can sanction, what the actors know about the state of their commons and one another’s actions, how preferences are combined (such as “the elected leader decides” or “majority rules”), and the penalties associated with rule noncompliance.

These same elements offer an analytic means for examining the impact of technology – both focal and non-focal – on the social order: we can inquire how technologies affect each of them. For reasons of space, I provide illustrative examples of technologies that affect participants, control, and information.2

The participants in a commons are the actors who can derive benefit from the commons (such as access to resources), participate in governance (such as rule making and enforcement), and/or be required to contribute to the maintenance of the commons.3 New technologies of transportation, such as the snowmobile in Greenland mentioned earlier, can have the direct effect of bringing new participants into a setting. Non-focally, this can strongly affect a commons, since new participants may place increased demand on resources. In addition, they may not share the norms that long-standing participants have evolved for sustainably managing the commons.

Who can participate in a commons is strongly influenced by technologies of exclusion.4 The historical record indicates the importance of these technologies. “Between 1870 and 1880, newspapers in the region [the western prairies of the US] devoted more space to fencing matters than to political, military, or economic issues” (Basalla 1988). The growth of the barbed wire industry provides a dramatic illustration: from 10,000 pounds of barbed wire in 1874, the first year of commercial production, production jumped to 600,000 pounds in 1875, to over 12 million pounds in 1877, and to 80 million pounds in 1880 (Basalla 1988). Technologies of exclusion, such as fences or digital rights management,5 are used to enclose physical or virtual spaces. Such enclosures can create new commons, by providing commoners a means to exclude others from despoiling a resource, or they can destroy commons, by allowing powerful elites to capture what had previously been held in common. Technologies of circumvention can likewise be used to nullify technologies of exclusion: ladders and wire cutters can overcome fences, for example, and software programs for descrambling commercial DVDs can gain access to encrypted content (Touretzky 2001). The latent effects are technological arms races that pit those trying to exclude against those trying to circumvent, a dynamic that we see playing out with digital information on the Internet (Committee on Intellectual Property Rights 2000).

Whether exclusion is accomplished by policy or by technology, compliance that might previously have required human monitoring can be delegated to technologies that dramatically reduce the costs of enclosure. For example, razor wire and electric fences not only “monitor” access, but “sanction” the trespasser, replacing human intervention with automation. Thus, technologies of exclusion embody specific rules of exclusion. The ubiquitous “No trespassing!” sign might signal the rule, but it is the fence that enforces it.

Technology also affects the way in which control is distributed among the actors in a particular setting. For instance, cutting guides and mechanical linkages directly constrain the motion of the human body when people use physical tools such as saws and lathes. Through the study of this kind of ergonomic micro-structure of technologically enabled work in the early part of the 20th century, Frederick Taylor built sociotechnical processes to further structure human labor in the industrial factory. The new technologies and workplace policies were systematically designed to deskill labor – so that it could be purchased less expensively – and to move control over production from the shop floor to management (Braverman 1974). And yet, non-focally, as Kusterer (1978) illustrates from his study of “non-skilled” labor in a variety of workplaces, workers are never as compliant as these Taylorist designs suggest: they use ingenuity and learned expertise to increase production and quality, as well as their own autonomy, by working around the managerial and technical constraints of the formal policies that management puts in place. Just as with the case of Brasília, powerful actors may attempt to structure and control sociophysical worlds through idealized, technically shaped visions that fail to take account of on-the-ground realities.

Technologies associated with voting – from paper ballots to punch cards, optical scanners, and graphical user interfaces – are increasingly recognized for their role in political control, particularly following the contested outcome of the 2000 presidential election in the United States. “Election processes are inherently subject to errors and are also historically subject to manipulation and fraud. … Voting is in fact a paradigmatic example of an end-to-end security problem representing a very broad spectrum of technological and social problems that must be systematically addressed – from registration and voter authentication to the casting of ballots and subsequent tallying of results. Each of the current technologies has its own set of vulnerabilities; none is infallible” (Neumann 2004).

Different voting technologies affect the amount of error in vote counting (The Caltech/MIT Voting Technology Project 2001), the security and reliability of the voting process (Felten 2003), and its transparency and auditability (Felten 2003). Focally, a change of technology from paper ballots or punch cards to computer ballots may seem a relatively minor matter of implementation detail. But this overlooks a characteristic of computers: “Most of the time and under most conditions computer operations are invisible” (Moor 1985). Part of the controversy surrounding the use of computerized voting systems is that this computational invisibility, when coupled with legally enforced ownership of program source code that excludes all but the owners and their agents from examining the program internals, makes these systems inherently incapable of being audited by disinterested third parties (Massey 2004).

This invisibility of technical operations is one of the ways that technologies affect information, and it is not limited to computer technology. Technologies that do not provide transparency for such things as monitoring other participants’ resource use may end up being abandoned or destroyed by those participants.

Lansing’s (2006) example of rice irrigation in Bali, Indonesia, during the “Green Revolution” of the 1970s is telling:

This method [the flooding of rice fields on a careful schedule] depends on a smoothly functioning, cooperative system of water management, physically embodied in proportional irrigation dividers, which make it possible to tell at a glance how much water is flowing into each canal and so verify that the division is in accordance with the agreed-on schedule. … Modernization plans called for the replacement of these proportional dividers with devices called “Romijn gates.” … The use of such devices makes it impossible to determine how much water is being diverted.

Despite the $55 million that the government spent on installing the Romijn gates, “new irrigation machinery installed in the weirs and canals at the behest of the consultants was being torn out by the farmers as soon as they felt that it was safe to do so” (Lansing 2006).

To summarize from these examples, technology is designed for particular purposes, often by social actors with the economic power to direct resources. Focally, the technologies affect who can participate within a setting, the relative control among the different participants, and the information available. The engineering of materials does affect social structure. This is often the explicit intention of its architects: to shape physical activity, to enable patterns of mobility and communication, to divide, to join, to enclose, to circumvent. And yet, individuals within particular settings are not simply acted upon, not powerless in the face of technically enforced regimes of control. Non-focal effects, sometimes far-reaching and unintended by the technology designers, arise from the complexity of the sociophysical world and from the ways that actors on the ground actively adapt and work around sociotechnical systems.

Institutional and technological change

Social scientists since Max Weber (1895/1994) have underscored the importance of rules and rule-like mechanisms (often called institutions) for ordering social life, many of which change over time to adapt to new circumstances. As Douglass North indicates in his Nobel Prize lecture, “Economic Performance through Time” (1993), “It is the interaction between institutions and organizations that shapes the institutional evolution of an economy. If institutions are the rules of the game, organizations and their entrepreneurs are the players.” But what many social scientists (including North) overlook is the importance of technology as it affects institutional and social development. To extend North’s sports metaphor, we can consider technologies to be the equipment of the game. People not only change the rules by which they play, they also change the equipment. And sometimes, changes to the equipment change the very nature of the game.

Technologies and the policies related to them interact over time. Stable technologies provide time for the crafting of social policies that are fitted to the technology, the people, and the material environment of use. As Schlager (1994) illustrates in her comparative study of 33 subgroups of fishers worldwide, “Twenty-two groups (67 percent) limit access to their fishing grounds on the basis of type of technology used.” She continues, “For example, the cod fishers of Fermeuse, Newfoundland, described by K. Martin (1973, 1979), have ‘divided their own fishing grounds, as have many inshore fishing communities, by setting aside certain fishing areas (usually the most productive) for the exclusive use of certain technologies’” (Schlager 1994).

Yet, as technology changes – not as a natural process, but intentionally, often enabled by policy – changes occur throughout the setting in which the technology is used: new participants enter, new information becomes available, different outcomes arise, and costs and benefits are apportioned differently. Because of this, actors in the setting may pursue adaptive responses in policy. These new policies constrain (though do not determine) the development of subsequent technologies, which in turn constrain (though do not determine) further policy responses, continuing in this fashion iteratively and indefinitely.

It would be strange to inquire whether rules and policies are political – how could they not be? And yet it is easy to overlook the political nature of technologies, and their resulting impact on commons, despite the fact that, like rules, they affect the same elements: the participants, the control, the information. But technologies are particularly important to take into account in the construction and maintenance of commons, since they can change quickly and have pervasive effects. And like rules, they are subject to intentional human design. Technologies are political, but not in a way determined solely and inevitably by their design. Nor are technologies shaped only by powerful social actors through how they direct capital and pursue social policies. Just as importantly, particularly for the future of commons, individuals and collectives of non-elite actors both comply with and resist technology policy, and they adapt, appropriate, and alter the technologies at hand to fit the contingencies that they face.

Jens, the Greenlandic hunter mentioned at the start of this paper, does not take a uniform anti-technology stance; he himself depends on technology to survive and to hunt. Nor does he simply accede to the intrusion of snowmobiles into Inglefield Sound, a commons that he shares with several of his countrymen. Rather, he acts in the policy domain (one of the degrees of freedom available to him) to try to prohibit snowmobiles in Inglefield Sound. To ask whether snowmobiles are good or bad, democratic or authoritarian, or whether, in general, snowmobiles are political, is beside the point. Rather, the point is how Jens and his fellow Greenlanders will respond to the specifics of a new technology that impacts the commons that they share. Such a response might be technological (e.g., exhaust mufflers); legal (e.g., a general ban that is enforced); social (e.g., organized vigilante action); or some combination (e.g., a law that requires the use of mufflers at certain times and dates, monitored both by the state and by citizens). Whatever choice they make is political in nature, as are the future technological and policy actions that this choice will give rise to.

References

  • Basalla, G. 1988. The Evolution of Technology. Cambridge: Cambridge University Press.
  • Braverman, H. 1974. Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century. New York: Monthly Review Press.
  • Committee on Intellectual Property Rights, Computer Science & Telecommunications Board. 2000. The Digital Dilemma: Intellectual Property in the Information Age. Washington, DC: National Academy Press.
  • Ehrlich, G. 2003. This Cold Heaven: Seven Seasons in Greenland. New York: Vintage.
  • Eisenstein, E.L. 1983. The Printing Revolution in Early Modern Europe. Cambridge: Cambridge University Press.
  • Felten, E. 2003. “A Skeptical View of DRM and Fair Use.” Communications of the ACM 46(4):56-61.
  • Friedman, B. and Kahn Jr., P.H. 2002. “Human Values, Ethics, and Design.” In Human Factors and Ergonomics, 1177-1201.
  • Kusterer, K. 1978. Know-How on the Job: The Important Working Knowledge of “Unskilled” Workers. Boulder, Colorado: Westview Press.
  • Lansing, S. 2006. Perfect Order: Recognizing Complexity in Bali. Princeton: Princeton University Press.
  • Litman, J. 2000. Digital Copyright. Amherst, NY: Prometheus Books.
  • Massey, A. 2004. “But We Have to Protect Our Source: How Electronic Voting Companies’ Proprietary Code Ruins Elections.” Hastings Communications and Entertainment Law Journal 27:233.
  • Moor, J.H. 1985. “What Is Computer Ethics?” Metaphilosophy 16(4):266-275.
  • Mumford, L. 1964. “Authoritarian and Democratic Technics.” Technology and Culture 5(1):1-8.
  • Neumann, P. 2004. “Introduction to the Special Issue on the Problems and Potentials of Voting Systems.” Communications of the ACM 47(10):28-30.
  • North, D. 1993. Nobel Prize Lecture, “Economic Performance through Time.” http://nobelprize.org/nobel_prizes/economics/laureates/1993/north-lectur....
  • Ostrom, E. 2005. Understanding Institutional Diversity. Princeton: Princeton University Press.
  • Schlager, E. 1994. “Fishers’ Institutional Responses to Common-Pool Resource Dilemmas.” In Ostrom, E., Gardner, R. and Walker, J., Rules, Games, and Common-Pool Resources. Ann Arbor: University of Michigan Press. 247-266.
  • Sclove, R.E. 1995. “Making Technology Democratic.” In Resisting the Virtual Life: The Culture and Politics of Information, 85-101.
  • Scott, J.C. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven: Yale University Press.
  • The Caltech/MIT Voting Technology Project. 2001. “Residual Votes Attributable to Technology: An Assessment of the Reliability of Existing Voting Equipment.”
  • Tenenberg, J. 2008. “The Politics of Technology and the Governance of Commons.” The 12th Biennial Conference of the International Association for the Study of Commons, Cheltenham, England.
  • Touretzky, D.S. 2001. “Viewpoint: Free Speech Rights for Programmers.” Communications of the ACM 44(8):23-25.
  • Weber, M. 1895/1994. Political Writings. P. Lassman and R. Speirs, editors and translators. Cambridge: Cambridge University Press.
  • Winner, L. 1980. “Do Artifacts Have Politics?” Daedalus 109(1):121-136.
  • 1. See also Ryan T. Conway’s essay on Institutional Analysis and Development (IAD).
  • 2. See Tenenberg 2008 for a discussion of all seven of the elements indicated above.
  • 3. Editors’ note: In this volume, several articles on concrete practices describe the relationship between common-pool resource management and a careful use of technology. See, for instance, Papa Sow and Elina Marmer, and Gloria Gallardo and Eva Friman.
  • 4. See also Silke Helfrich’s essay.
  • 5. Editors’ note: The term digital rights management (DRM) is used to describe any technology that inhibits uses of digital content that are not desired by the content provider. Sony, Amazon, Apple Inc., Microsoft, AOL, the BBC and others use DRM technologies. In 1998 the Digital Millennium Copyright Act (DMCA) was passed in the United States to impose criminal penalties on those who circumvent encryption, i.e., the DMCA enforces DRM. Because the use of digital rights management inhibits user freedoms, some critics have dubbed it “Digital Restriction Management.”