Topics in Digital Media: Contextual Integrity: Technology, Philosophy & Policy*
MCC-GE 2130-001
Thursdays 2:00-4:10 PM
Fall 2015
Helen Nissenbaum, Department of Media, Culture, and Communication
*Course developed with generous funding from the Intel University Program Office


This graduate-level course welcomes students with varied backgrounds and skills, but prior coursework in either computing (e.g., programming, website creation, data science) or social, political, and ethical analysis is required.

Intended Learning Outcomes

Familiarity with the contemporary landscape of privacy threats from digital and information technologies, landmark cases, and literatures
Deep knowledge of the selected case areas
Appreciation of philosophical and ethical concepts relating to privacy
Insight into legal, policy, and business issues at stake in privacy threats and proposed solutions
Thorough grasp of the theory of contextual integrity
Working knowledge of key technologies and sociotechnical systems, particularly those relating to the selected cases


The course will be conducted in seminar style, with the professor initiating discussion topics and students contributing actively. Because privacy is an active issue in the public sphere, students and the professor will engage with unfolding and ongoing questions. Students are also expected to undertake individual research into topics of interest, sometimes going beyond the materials given in the syllabus.
This course is about privacy in its relation to technology. It approaches the topic through the deep study of a handful of cases selected from contemporary areas in which the interplay of technology with politics, policy, ethics, and society has yielded conflicts over privacy that have been difficult to resolve. The cases are distinctive because they pose equally difficult challenges to technologists, policy makers, and ethicists, who -- the course will demonstrate -- must understand the mutual impacts of these diverse arenas in order to make progress. The cases to be studied in Fall 2015 are online tracking, big data, the Internet of Things (including the quantified self), and biometrics. Four modules, each devoted to one of these cases, will include a session on relevant technologies in addition to a session on ethical, social, and policy analysis. In considering the origins of the respective threats to privacy, as well as possible ways to address them, the course will evaluate a range of responses, including technology, policy, ethics, or some combination, and in what measure each is needed.
The theory of contextual integrity will form the analytic framework for our analyses of the four case areas. To establish common ground and gain familiarity with the theory, the first three sessions will be devoted to learning its basic concepts, the analytic framework, and the design heuristic.


Students are expected to attend all class meetings and complete reading assignments beforehand. Students should take notes and bring these to class. Readings from disciplinary sources will vary in difficulty, particularly in relation to students' backgrounds. Don't be surprised if you have to read some of the articles slowly, or even multiple times. Participation is an important element of the course, both in-class and online; careful reading contributes significantly to the quality of your contributions. You will also be evaluated on the basis of written, oral, and practical assignments, a collaborative project, and a term paper based on your project.


Participation (in-class and online): 20%
Miscellaneous homework: 20%
Group project and presentation: 30%
Term paper/project write up: 30%


9/3 Introduction to the course

Franzen, Jonathan. "Imperial Bedroom." New Yorker, October 12, 1998: 48-53.

9/10 Readings
Privacy in Context: Introduction, and Chapters 1-3

9/17 Readings
Privacy in Context: Chapters 4-6
U.S. Department of Health, Education & Welfare. "Records, Computers and the Rights of Citizens." Report of the Secretary's Advisory Committee on Automated Personal Data Systems (July 1973): Summary and Recommendations
Warren, Samuel D. and Louis D. Brandeis. "The Right to Privacy." Harvard Law Review 4, no. 5 (December 15, 1890): 193-220.

Auxiliary Readings
Solove, Daniel J. "A Taxonomy of Privacy." University of Pennsylvania Law Review 154, no. 3 (January 2006): 477-564.
Solove, Daniel J. "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy." San Diego Law Review 44, no. 4 (Fall 2007): 745-772.

9/24 Readings
Privacy in Context: Chapters 7-9, and Conclusion

Auxiliary Readings
Calo, Ryan. "The Boundaries of Privacy Harm." Indiana Law Journal 86, no. 3 (July 2011): 1131-1162.
Dourish, Paul and Ken Anderson. "Collective Information Practice: Exploring Privacy and Security as Social and Cultural Phenomena." Human Computer Interaction 21, no. 3 (2006): 319-342.

A medium of communication, transaction, and individual, institutional, and group activity in virtually unending variety, including creativity, sociality, political activism, and production, the Web has served as a spectacular medium of human life. It has also provided the means for monitoring, tracking, data collection, eavesdropping, and surveillance, posing unprecedented challenges to privacy. The failed effort to establish a Do-Not-Track standard demonstrates how deeply entrenched disparate commercial and political stakeholders are in shaping underlying technological affordances and the supporting policy landscape.

10/1 Web Tracking and Online Privacy: Ethics, Law, and Policy

Gellman, Robert. "Fair Information Practices: A Basic History." Bob Gellman, February 11, 2015: Sections I-V.
McDonald, Aleecia M. and Cranor, Lorrie F. "The Cost of Reading Privacy Policies." I/S: A Journal of Law and Policy for the Information Society 4, no. 3 (2008): 543-568.
Hoofnagle, Chris Jay, Soltani, Ashkan, Good, Nathaniel, Wambach, Dietrich J. and Mika D. Ayenson. "Behavioral Advertising: The Offer You Cannot Refuse." Harvard Law & Policy Review 6, no. 2 (2012): 273-296.
Madejski, Michelle, Maritza Johnson, and Steven M. Bellovin. "A Study of Privacy Settings Errors in an Online Social Network." In SESOC '12: Proceedings of the 4th IEEE International Workshop on Security and Social Networking, 2012.
Angwin, Julia (Ed.). "What They Know: The Business of Tracking You Online." Wall Street Journal, 2010. (Selections TBD)

Auxiliary Readings
Hoofnagle, Chris Jay and Jan Whittington. "Free: Accounting for the Costs of the Internet's Most Popular Price." UCLA Law Review 61 (2014): 606-670.

10/8 Web Tracking and Online Privacy: Technology

Guest technologist: Arvind Narayanan, Princeton University

Mayer, Jonathan R. and John C. Mitchell. "Third-Party Web Tracking: Policy and Technology." The Center for Internet and Society at Stanford Law School, March 13, 2012.
Schunter, M. and P. Swire, "What Base Text to Use For the Do Not Track Compliance Specification?" World Wide Web Consortium, Tracking Protection Working Group.

The term refers to an extension of the Internet to devices that have not traditionally been conceived of as "computers," including mobile phones but, more broadly, home appliances, TVs, motor vehicles, and more. Self-tracking, one application area of the Internet of Things, has burgeoned, from fitness to mood to transportation. The Federal Trade Commission has noted potential privacy concerns in how the devices in question are connected and designed.

10/15 Internet of Things

Guest technologist: Yan Shvartzshnaider, Princeton University

Federal Trade Commission. "Internet of Things: Privacy & Security in a Connected World," FTC Staff Report, January 2015.
Bogost, Ian, "The Internet of Things You Don't Really Need," The Atlantic, June 23, 2015.

10/22 Self-tracking

Swan, Melanie. "Sensor Mania! The Internet of Things, Wearable Computing, Objective Metrics, and the Quantified Self 2.0." Journal of Sensor and Actuator Networks 1, no. 3 (December 2012): 217-253.
Regalado, Antonio. "Stephen Wolfram on Personal Analytics."
Nafus, Dawn and Jamie Sherman. "This One Does Not Go Up to 11: Quantified Self Movement as an Alternative Big Data Practice." International Journal of Communication 8 (2014): 1784-1794.
Connected cars. See for example: http://www.theverge.com/2015/10/6/9460471/porsche-911-carrera-apple-carplay-google-android-auto and https://www.automatic.com/

10/29 Privacy Engineering, Tools, and Technology

Guest technologist: Seda Guerses, Princeton University

Spiekermann, Sarah and Lorrie Faith Cranor. "Engineering Privacy." IEEE Transactions on Software Engineering 35, no. 1 (January/February 2009): 67-82.
Swire, Peter and Annie Anton. "Engineers and Lawyers in Privacy Protection: Can We All Just Get Along?" Privacy Perspectives, January 13, 2014.
Guerses, Seda and Claudia Diaz. "Two Tales of Privacy in Online Social Networks." IEEE Security & Privacy 11, no. 3 (May/June 2013): 29-37.
Surden, Harry. "Structural Rights in Privacy." SMU Law Review 60, no. 4 (Fall 2007): 1605-1629.
Parra-Arnau, Javier, Rebollo-Monedero, David and Jordi Forne. "Privacy-Enhancing Technologies and Metrics in Personalized Information Systems." Studies in Computational Intelligence 567 (2015): 423-442.

Big data offers great promise in key aspects of life, including healthcare, marketing, governance, and education. This module will provide an overview of the key areas of science and technology that have come together to promote this paradigm, e.g. database hardware and software, machine learning, networked sensors, and statistics. It will consider some of the literature critical of big data from an epistemological perspective, but will spend most of its time on ethical issues, focusing on privacy challenges. The "Report to the President on Big Data and Privacy: A Technological Perspective," released in May 2014 by the Executive Office of the President and the President's Council of Advisors on Science and Technology, will be among the readings.

11/5 Big Data: Privacy, Ethics, Law, and Policy

Review Privacy in Context: Chapter 2 and pp. 201-206.
boyd, danah and Kate Crawford. "Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon." Information, Communication & Society 15, no. 5 (2012): 662-679.
Tene, Omer and Jules Polonetsky. "Big Data for All: Privacy and User Control in the Age of Analytics." Northwestern Journal of Technology and Intellectual Property 11, no. 5 (April 2013): 239-273. [Parts I and II]
Zimmer, Michael. "More on the 'Anonymity' of the Facebook Dataset - It's Harvard College (Updated)." Michael Zimmer, October 3, 2008. Accessed August 25, 2015.
Hays, Constance L. "What Wal-Mart Knows About Customers' Habits." New York Times, November 14, 2004. Accessed August 24, 2015.
Duhigg, Charles. "How Companies Learn Your Secrets." New York Times Magazine, February 16, 2012.
White House. "Big Data: Seizing Opportunities, Preserving Values." Executive Office of the President (May 2014): 1-79. [Chapter 1, quickly scan other chapters]

Auxiliary Readings
Ohm, Paul. "Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization." UCLA Law Review 57 (2010): 1701-1777.

11/12 Big Data: Technology

Guest technologist: Seda Guerses, Princeton University

Barocas, Solon and Andrew D. Selbst. "Big Data's Disparate Impact." Forthcoming. SSRN draft, August 14, 2015: 1-62.

Auxiliary Readings
Vedder, Anton. "KDD: The Challenge to Individualism." Ethics and Information Technology 1, no. 4 (1999): 275-281.
Kosinski, Michal, Stillwell, David and Thore Graepel. "Private Traits and Attributes are Predictable from Digital Records of Human Behavior." Proceedings of the National Academy of Sciences of the United States of America 110, no. 15 (April 9, 2013): 5802-5805.
Yakowitz, Jane. "Tragedy of the Data Commons." Harvard Journal of Law and Technology 25, no. 1 (Fall 2011): 1-67.

The course will survey the state of the art in biometric technologies currently considered most promising for practical applications, including fingerprinting, iris scanning, and facial recognition. What claims may legitimately be made about their efficacy in both authenticating and identifying? In what arenas are we seeing the most uptake: governmental security and law enforcement, or commercial settings (e.g., Facebook)? What are the most serious privacy concerns? We will review material from the Multi-stakeholder Process currently being managed by the NTIA, looking at uses of facial recognition in the private sector.

11/19 Biometric Identification - Technology and Policy

Guest technologist: Nasir Memon, NYU Computer Science & Poly Engineering

Dickinson, Casey J. "New Technology Spots Casino Cheats, Crooks." The Business Journal (Central New York) 15, no. 17 (April 27, 2001): 2.
Williams, Timothy. "Facial Recognition Software Moves Overseas Wars to Local Police." New York Times, August 12, 2015. Accessed August 24, 2015.
GAO. "Information Security: Challenges in Using Biometrics." Government Accountability Office, September 9, 2003: 1-23.
GAO. "Facial Recognition Technology: Commercial Uses, Privacy Issues, and Applicable Federal Law." Government Accountability Office, July 2015: 1-49.
NTIA Multi-Stakeholder Guidelines for Facial Recognition in the Commercial Sector.

Auxiliary Readings
Gates, Kelly, "Identifying the 9/11 'Faces of Terror'," Cultural Studies 20, no. 4-5 (2006): 417-440.

11/26 Thanksgiving

12/3 Notice and Consent (Revisited)
12/10 Project Presentations

Mobile Data Flows and Privacy
Examining How Third Party Trackers Interact with a Web Page
Privacy in the Library Context: An Examination
Privacy in Online Music Streaming
Privacy Analysis of IBM Watson Health
Big Data Challenges the Privacy Protection of U.S. Census
Music Streaming and Privacy: Differential Privacy, Contextual Integrity, and Music Business

Last Updated: December 23, 2015