At present, there are two great parallel traditions of research: one in the science and engineering of digital information and communications networks, the other in social (that is, ethical, political, and cultural) analyses of these technologies. Recognizing the importance of research that integrates the two, various models have evolved over the past few decades to stimulate cross-disciplinary research collaboration and understanding.
In the project Values in Design of Future Internet Architecture, I have proposed a novel model for sparking such collaboration, one that overcomes barriers to success inherent in some of the other approaches. An EAGER grant would support experimentation with this model, using the Future Internet Architecture (FIA) Initiative projects as an exciting test-bed. In particular, a grant would support the constitution of a multi-disciplinary team of experts in the social analysis of digital information technologies to work alongside recipients of the FIA large project grants. The team would serve as analysts and consultants to the FIA projects: helping to identify junctures in the design process at which values-critical technical decisions arise; locating design parameters and variations that differentially call relevant values into play; developing, for and with respective projects, rich conceptual understandings of relevant values; operationalizing, for and with project investigators, those values to enable the transition from values conceptions into design features; examining, with FIA investigators, the interplay between values embodied in design and values embodied in law and policy; and, where possible, verifying values in design through prototyping, user testing, and other empirical analyses. These activities are all components of Values-in-Design (VID), an approach to analyzing and designing technology that recommends that social values serve as design requirements alongside more traditional technical constraints.
Within the extensive community of researchers and scholars there are diverse, hotly contested views on what it means to say that technology is political. There are far fewer demonstrations of the claim that features of technology can actively be adjusted to yield morally and politically relevant outcomes. This project promises to make significant and more convincing contributions to this debate by providing opportunities for experts in social analysis and in technology to collaborate on gritty design features that are politically and ethically suggestive. Beyond potential contributions to the general understanding of the social dimensions of technology, the project could yield insights into how to operationalize abstract concepts such as security and privacy, and how to implement them in architectures, protocols, and mechanisms. Not only could these contributions directly affect the FIA projects, they could have significance for IT design generally and, beyond that, for other technologies.
Living with the consequences of technology, it is inevitable that we applaud some and bemoan others. Such evaluation can be retrospective, but approaches in the Values-in-Design family are committed to pro-action, that is, to the thesis that adjustments in the material elements of systems and devices can systematically affect socially relevant outcomes. Even for those aware of the pitfalls of naïve technological determinism, the potential payoff, measured in terms of systems and artifacts that promote, or at least do not subvert, the attainment of societal values, is great. The opportunity to test these ideas in a research context as ambitious as the FIA Initiative holds promise for better technology and for a deeper understanding of the mutual shaping of material design, on the one hand, and values, on the other.
I am applying for an EAGER grant to support a small multi-disciplinary team of experts in values-in-design to work alongside recipients of the Future Internet Architecture (FIA) large project grants. As outlined above, the team would serve as analysts and consultants to the FIA projects: identifying junctures in the design process at which values-critical technical decisions arise; developing, with project investigators, rich conceptual understandings of relevant values and operationalizing them as design features; examining the interplay between values embodied in design and values embodied in law and policy; and, where possible, verifying values in design through prototyping, user testing, and other empirical analyses. These activities are all components of Values-in-Design (VID), an approach to analyzing and designing technology.
For many decades, the National Science Foundation has supported research on the ethical and political implications of technology. It has done so through dedicated programs, such as Societal Dimensions of Engineering, Science, and Technology, as well as through programs primarily focused on targeted scientific and engineering outcomes that require the integration of selected social scientists, ethicists, or policy and legal analysts into teams of scientists and engineers. The former model has produced a tremendous body of high-quality work useful to those within the fields but, unfortunately, largely ignored by the technical community. Although the latter model would seem to hold greater promise in this regard, it has generally proven difficult to sustain the interest of the technical community in non-technical matters, or for multidisciplinary teams to locate problems that are research-worthy in both technical and non-technical arenas.
The difficulties confronting both models, in my view, have little to do with the quality of the work or of the investigators themselves, who generally pursue these efforts with the best intentions and integrity. The first model, however, often produces work that is abstracted (one step removed) from the details of the technology and couched in parochial terminology that forms a barrier to outsiders, including technical colleagues. This is partly because the researchers in question are guided by the demands of internal research trajectories and have insufficient contact (perhaps insufficient opportunities for contact) with the development of the baseline technologies themselves. In the second model, constraints of timing, lack of mutual familiarity with respective research requirements, and reward structures that neglect efforts by technical and non-technical personnel to develop working relationships frequently leave the non-technical aspects of projects relatively underdeveloped or, to the extent they are developed, not integrated into the technical lines of work. Sometimes this results from an after-the-fact mismatch of focus, owing to insufficient mutual understanding of work and methodologies at the (sometimes frantic) time research teams are formed.
This EAGER proposal aims to support experimentation with a third approach, one that seeks to combine the best of these models while avoiding some of the pitfalls inherent in both. The goal is to structure the interactions between technical and non-technical personnel at points where “the rubber hits the road,” with researchers from both sides self-selecting according to their interests in particular research problems. The strategy for enabling this match of interests and expertise will be discussed below, but first I provide some background on the methodological approach broadly described as “values-in-design,” or VID.
The idea that societal values – political, ethical, and cultural – may be embodied in technology has found voice in a particularly influential line of work that can be traced from Lewis Mumford, writing in the 1960s about democratic and authoritarian “technics” [Mumford 1964], through Langdon Winner’s landmark “Do Artifacts Have Politics?” [Winner 1986]. By now, the range of work and opinion addressing the politics of and in technology extends across several disciplines, including information law (famously, Lessig’s Code is Law [Lessig 1999]), philosophy of technology, science and technology studies (STS), history of science and technology, innovation studies, media studies, and CHI/usability studies (famously, [Norman 1989]). Each of these fields has absorbed this idea in its own characteristic ways, expressing it in its own terminology and integrating it into its own fundamental controversies. For example, in STS and the philosophy of technology, the research literature reflects an ongoing debate over what it means to speak of “the social, ethical, or political implications, or impacts, of a technology” (see, for example, [MacKenzie and Wajcman 1985]). While some might allow that these outcomes are determined, even if tempered by contextual factors, by material system characteristics, others disagree, arguing that social outcomes are “socially constructed” and offering a variety of theories to explain how technologies are socially constructed and how these constructions yield certain outcomes. In a parallel fashion, within information law, scholars split (in multiple ways) over whether technology can regulate autonomously, in defiance of policy, or whether it can be no more than an instrument of policy.
The launch point for the approaches loosely gathered under the label Values-in-Design is that material characteristics of technical devices and systems are systematically related to social outcomes (thus eschewing an extreme form of social constructivism), and that some of these outcomes bear on ethical and political values, such as security, privacy, freedom, and sociality. A characteristic activity of the VID approach is to locate and write about these systematic relationships, identifying specific aspects or features of design, on the one hand, and values-relevant social outcomes, on the other.
But the characteristic of a VID approach most relevant to this proposal is its pragmatic and normative turn. It not only recognizes values as standing in a systematic relationship to design decisions; it sets values among design aspirations: if we accept that architectures, protocols, and mechanisms can embody values, let those who design and produce systems take values into consideration, that is, engineer them among the functionalities and constraints of their systems. In the same way that creators of technical systems, devices, and mechanisms set forth technical and functional specifications and constraints as requirements, the VID framework recommends extending that list to include ethical and political values. In some cases, few values may be relevant to the artifact in question; in others, the values implicated may be numerous and complex.
Historically, digital information technologies, including computers, information systems, information infrastructures, mobile devices, and digital media, have excited great interest among those who study politics in technology. The case of the Internet – its architecture, protocols, and applications – has been a lightning rod of interest, producing an enormously rich literature addressing its influences on and consequences for society. It is one thing to connect the Internet with an astonishing array of societal impacts, such as individual freedom and autonomy, the enhancement of democracy, and the more equitable distribution of voice, opinion, and influence; it is another to roll up one’s sleeves, so to speak, and pinpoint specific technical (or material) characteristics and account for how these are responsible for certain outcomes. Even more challenging, but at the heart of VID, is to advocate for certain design characteristics or alterations, arguing that they will have, or are likely to have, certain desired outcomes, that is, to promote values to which a society is committed.
In computer science, there is already clear interest in linking technical decisions with social ones. For example, positions staked out in the active debate over how to model identity online may draw on technical points, but also on political ones. An Internet that supports anonymous communication may, some argue, afford greater freedom and self-expression, values to which liberal democracies are committed. Another case is the burgeoning work on theories, systems, and mechanisms touted as “privacy preserving.”
Accordingly, introducing a VID approach into the FIA research arena is neither foreign nor entirely novel. It does, however, add something significant to the picture, namely, expertise and experience with systematic methods for taking values into consideration in design, such as Value Sensitive Design [Friedman et al. 2006] and Values-at-Play [Flanagan et al. 2007]. Both of these, and others like them, have evolved methods and pursued demonstration projects in which those methods have been honed and applied to particular systems, from digital games to Firefox extensions. A particular advantage of VID approaches is their recognition of the variety of inputs to a design process and a commitment to take all of them seriously: a deep grasp of values, consideration of users and others affected by the technology in question, and the details of the technology itself. Computer scientists and engineers are clearly committed to technical excellence; VID insists that values considerations not be compromised either. Well-intentioned approaches risk compromising their final product with mistaken assumptions about components they have neglected to study. For example, a privacy-preserving system might be compromised by a faulty conception of privacy, or by mistaken assumptions about how users interact with the system.
A VID approach is a promising vehicle for cross-disciplinary collaboration because of its pragmatic turn, as it articulates methods for bringing values into consideration at the point where “the rubber hits the road.” At the same time, the obstacles sketched above, difficulties of cross-disciplinary communication, identifying mutually exciting research topics, timing, reward structures, institutional barriers, and so forth, remain in play.
Seeking to overcome or mitigate some of the obstacles discussed in Section 2, the project I am proposing conceives of the FIA projects as an ambitious test-bed for a broad-based, cross-disciplinary collaboration in which those trained in the social sciences, law and policy, and the humanities will connect, on the fly, with engaged computer scientists and engineers. To my knowledge, this is a novel model and, as such, should be considered experimental, with lessons to be learned as we proceed.
The FIA Initiative offers fertile ground for this experiment because it engages researchers in ground-up redesign of a system whose current form stands as one of the spectacular technological achievements of the era, not only for its technical prowess but also for its multi-faceted integration (saturation) into every aspect of social life. Already, in the existing Internet, viewed at all levels, countless features, protocols, mechanisms, and aspects of architecture have spawned outcomes relevant to values and have been recognized as such. One can therefore anticipate that FIA projects will confront many equivalent phenomena along the way.
Documents describing FIND and the FIA Initiative calls for proposals reveal a pre-existing commitment to a set of values and, as such, an interest in combining technical innovation with the deep integration of value considerations. These documents speak of trustworthiness, security, privacy, reliability, and usability as requirements of the architectural design. They list scalability, openness, ubiquitous access, innovation-enabling, manageability, evolvability, and economic viability as desired characteristics, and acknowledge that technical decisions about how location and identity are managed will have ramifications for the system as a whole. These documents exhort designers to take into consideration the larger context of a free and open society.
While scientists and engineers may read these terms – security, access, and so forth – as technical requirements, they are in fact terms drawn from social life, with rich and contested meanings. How one interprets them can have far-reaching effects on the significance of new generations of networks for society. The FIA Initiative offers an exciting opportunity to rethink the technology of the Internet while taking into consideration this richer understanding of values requirements and the significance of the Internet in its social context.
I am requesting an EAGER grant to support the creation of a team of VID expert consultants. The constitution of the team would be developed in discussion with cognizant NSF program officers. At each of the FIA site visits, a sub-group of VID team members (3-4 at a time) would attend. Some parts of these meetings would be devoted to presentations by FIA investigators of ongoing work and accomplishments, geared to a non-technically trained audience in an effort to find a common language with VID team members. There would also be reciprocal (short) presentations by VID team members explaining VID approaches and their particular areas of expertise, likewise aimed at locating overlapping themes and expertise.
The goal of these discussions would be to identify potential technology-to-values correlates. The familiar case of identity management is one example in which scientists and engineers have already seen that both technical and values issues involve challenges and tradeoffs. When such issues arise at an FIA meeting, the co-presence of technical and social expertise would allow the technical and social challenges and tradeoffs to be viewed side by side. We can anticipate similar opportunities when dealing with other complex relationships among values, such as access, openness, and security, and, as before, with the tradeoffs inherent within each value and between each value and the others.
In addition to reflecting on values-to-technology relationships, we anticipate that discussions would also range over questions of enforcement, in particular, which enforcement mechanisms can be embedded in technology and at what point enforcement needs to be “handed off” to societal mechanisms, such as law and policy.
The VID team would include approximately 15-20 experts drawn from wide-ranging disciplines and areas, including media studies, policy studies, law, humanities, social sciences (including history, economics, etc.), information science, and STS. The concern is less with disciplinary origin than with past work and known expertise of individuals who will be approached.
Short Term: For the two years of the EAGER, the ambition is to establish connections among a wide range of scholars with intersecting work and interests devoted to the development and analysis of digital network technologies, but approaching them from a variety of perspectives. Meetings would yield short reports posted to a project website.
Medium/Long Term: An ideal outcome of this project would be the development of cross-disciplinary research collaborations devoted to issues within the general area of VID. These could constitute contributions both to technology and to social analysis.
Although the consulting-group model is common in the business world, I have not encountered anything like it in academic research. An EAGER would allow for experimentation with this general model. The format described in this proposal, with the potential for variation over the course of the two years and six meetings, allows us to try out different configurations and modes. We expect to learn along the way and, in iterative cycles, to feed what we have learned back into the organizational details of each meeting. I do not propose, nor do I anticipate, that this model will replace other predominant models of cross-disciplinary collaboration supported by NSF. Nevertheless, it holds promise as an alternative that could be extended to other scenarios involving the design and creation of complex technical systems with ethical and political dimensions.
Flanagan, M., D. Howe, and H. Nissenbaum. “Values in Design: Theory and Practice.” In Information Technology and Moral Philosophy, edited by J. van den Hoven and J. Weckert. Cambridge: Cambridge University Press, 2007.
Friedman, B., P. Kahn, and A. Borning. “Value Sensitive Design and Information Systems.” In Human Computer Interaction in Management Information Systems: Applications. New York: M.E. Sharpe, 2006.
Lessig, L. “The Law of the Horse: What Cyberlaw Might Teach.” Harvard Law Review 113 (1999): 501.
MacKenzie, D., and J. Wajcman, eds. The Social Shaping of Technology. Milton Keynes: Open University Press, 1985.
Mumford, L. “Authoritarian and Democratic Technics.” Technology and Culture 5.1 (1964): 1-8.
Norman, D. The Design of Everyday Things. New York: Doubleday, 1989, 1-33, 81-104.
Winner, L. “Do Artifacts Have Politics?” In The Whale and the Reactor. Chicago: University of Chicago Press, 1986, 19-39.
Hosted by the Department of Media, Culture and Communication at NYU under agreement with NSF.