Title: Designing Collective Action Systems for User Privacy

Date: Tuesday, June 27, 2023

Time: 12:00PM to 2:00PM EDT

Location: Virtual at https://gatech.zoom.us/j/94660607042?pwd=L25XOS9nc1FOTk84eXNlekZMRHRsZz09


Yuxi Wu

PhD Student in Computer Science

School of Interactive Computing

Georgia Institute of Technology


Committee:

Dr. Sauvik Das (co-advisor) – Human-Computer Interaction Institute, Carnegie Mellon University

Dr. Keith Edwards (co-advisor) – School of Interactive Computing, Georgia Tech

Dr. Joseph Calandrino – Office of Technology Research and Investigation, Federal Trade Commission

Dr. Christopher Le Dantec – School of Interactive Computing, Georgia Tech

Dr. Richmond Wong – School of Literature, Media, and Communication, Georgia Tech

Dr. Ellen Zegura – School of Interactive Computing, Georgia Tech


Abstract:

People feel concerned but powerless about the ways that large tech institutions have violated their privacy. In other domains, collective action has been successful in effecting change. Computer-supported collective action (CSCA) is a framework proposed by Shaw et al. that identifies five key stages of people using computing tools to engage in collective action—(1) Identify a problem; (2) Generate, debate, and select ideas; (3) Coordinate and prepare for action; (4) Take action; and (5) Document, assess, and follow up—as well as the challenges that make it difficult to transition between stages.

 

In my work, I explore and scope design spaces that support transitions between these stages for user privacy. First, I show that it is possible to gather and distill a collective’s privacy concerns and their demands for mitigations (Stages 1 and 2); however, these demands do not mesh well with the evaluations of security and privacy experts, who dismiss them as unrealistic. To better scaffold a dialogue between users and experts (Stages 2 and 3), I qualitatively analyze users’ negative experiences with online behavioral advertising, contextualizing them against legal literature to form a typology of privacy harms. I propose that a systematic, collaborative process for collecting such data as “evidence of privacy harm” can be a viable path toward privacy collective action. In my proposed work, through design fiction and story completion methods, I aim to understand (1) the ways that users can collectively participate in this process (Stage 4) and (2) the potential impacts of such collective evidence-gathering (Stage 5).