By Emily F. Keller
Community advocates concerned about the impacts of surveillance technologies and algorithmic decision systems (ADSs) will soon have a new set of examination tools. A team at the eScience Institute’s Data Science for Social Good (DSSG) program at the University of Washington (UW) is developing a package of tools designed to support conversations with elected officials, policymakers, and fellow community members about understanding and critiquing technologies that use data about people.
The project responds to concerns about government officials’ increasing use of “smart” technologies to monitor, track, and make decisions about members of the public. It promotes public accountability by equipping stakeholders with strategies for revealing and interrogating these technologies’ functions and features. Civil rights groups are being consulted in the design of the toolkit to ensure that it benefits their communities. Stakeholder organizations, which spoke to the team anonymously, conveyed concerns that these tools could be used to target, interrogate, or detain Muslims, immigrants, and other marginalized community members. For example, community advocates worry that license plate readers, designed for public safety purposes, could be used indiscriminately, such as at houses of worship or political protests, potentially intruding on constitutionally protected activities.
Examples of technologies that could be evaluated by the toolkit include gunshot detectors, social media mining, stingray cell phone monitors, and hacking tools that extract data from mobile devices. “How would a community member, when faced with a new technology that’s being announced or an up-and-coming technology that might be concerning, how can they evaluate that technology?” explained Bernease Herman, the project’s Data Science Lead, and a Data Science Fellow and Research Staff member at the eScience Institute.
The Algorithmic Equity Toolkit will have three primary components:
- A flowchart of technical capabilities to help users identify ADSs, distinguish them from other technological systems, and understand their features. For example, the flowchart could prompt a user to determine whether a camera system is equipped with facial recognition technology. Accompanying discussion papers will use examples to explain how ADSs work.
- A checklist of questions to ask about such tools, covering privacy and data-security measures; potential negative impacts, such as the denial of services or the reinforcement of biases against minority or low-income communities; whether the tool treats all users fairly; documentation about the data used to create the tool, including the sample population, how the data was analyzed, and any known limitations; and how to interpret models and outputs. A glossary is included.
- An interactive demo examining predictive policing technologies and facial recognition software with closed-circuit television images, which community advocates can use to educate people about potential harms such as disparate impacts on specific groups. The demo will be created using the tools OpenFace, Docker and Dash.
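The flowchart component above is a discussion aid rather than software, but its branching logic can be sketched in code to show how a user might work through it. This is a minimal illustrative sketch only; the questions and field names are hypothetical paraphrases of the components described above, not the toolkit’s actual content.

```python
# Hypothetical sketch of the flowchart's branching questions for
# classifying a technology. Field names are illustrative, not from
# the actual Algorithmic Equity Toolkit.

def classify(system: dict) -> str:
    """Walk the (hypothetical) flowchart questions for one technology."""
    if not system.get("uses_data_about_people"):
        return "out of scope"
    if system.get("automates_or_informs_decisions"):
        return "algorithmic decision system (ADS)"
    if system.get("monitors_or_tracks_people"):
        return "surveillance technology"
    return "other technology"

# Echoing the article's example: a camera system becomes an ADS once
# it is equipped with facial recognition, which informs decisions
# about the people it observes.
cctv = {"uses_data_about_people": True, "monitors_or_tracks_people": True}
cctv_with_face_rec = {**cctv, "automates_or_informs_decisions": True}

print(classify(cctv))                 # surveillance technology
print(classify(cctv_with_face_rec))  # algorithmic decision system (ADS)
```

The point of encoding it this way is the one the flowchart itself makes: the same physical device can land in different categories depending on the capabilities it is configured with, which is exactly what a community member is prompted to find out.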
Fellow Aaron Tam noted the importance of examining technologies from multiple angles. “There’s a common misconception that if you just improve the training data, that the algorithm will be more accurate. The questionnaire tries to confirm this – that there are issues that will be inherent when we use automated decision systems that won’t be fixed just by adding better training data,” he said. Aaron, a master’s student at the UW Evans School of Public Policy and Governance, said his work is informed by his background in political organizing, program evaluation in public policy, graphic design and app design, such as developing personas, questionnaires, and testing to create a prototype.
The toolkit is being developed in partnership with the American Civil Liberties Union (ACLU) of Washington, with community advocates as the primary target users. Other potential users include policymakers, elected officials, government agencies, and tech companies. The team is testing the accuracy of the toolkit’s definitions with data scientists and its usefulness with community groups. Surveys are underway to assess stakeholders’ knowledge of ADSs and their comfort using the toolkit to engage politically, and to identify areas where informational clarity could be improved.
Fellow Daniella Raz said that in meetings with organizational leaders, the team was “trying to gauge what their community members and stakeholders were confused about or would want from a tool like this. What are their experiences? What are the important elements and features it should have?” Daniella, a master’s student in the University of Michigan’s School of Information, said her work is informed by her experience with a human rights NGO, with Muslim communities, and at social service centers, as well as her background in Arabic.
Fellow Vivian Guetler, a Ph.D. student in the Department of Sociology & Anthropology at West Virginia University, said her social sciences background has been helpful, particularly her understanding of empowering community stakeholders. “Instead of us coming up with solutions, they get to tell us the problems and then we work together to come up with a solution,” she said.
Fellow Corinne Bintz, an undergraduate student in computer science at Middlebury College, is utilizing her background in data visualization and analysis, software engineering and implementing machine learning algorithms. She also reflected on her work with nonprofit organizations, which reminded her of the need to make the toolkit concise. “When you meet with them, that’s a huge chunk of their time, so this tool needs to be quick and easy to use. It can’t be a long, intensive process to just get a basic understanding of this topic,” she said.
The Project Lead is Mike Katell, a Ph.D. candidate at the UW Information School. The project was inspired by findings from an examination of the creation and implementation of the Seattle Surveillance Ordinance, which suggested that city officials, including technology and policy professionals, lacked clarity about the capabilities and features of city-owned technologies. “The solution is to arm community members with the knowledge necessary to press public officials to do more in explaining and justifying the use of public sector technologies, as intended by the surveillance ordinance,” he said. Mike is working with Community Engagement Lead Meg Young, a Ph.D. candidate at the UW Information School, and Faculty Advisor Peaks Krafft, a Senior Research Fellow at the Oxford Internet Institute.
Toward the end of the program, the team will host 2-3 panels of community stakeholders to solicit feedback on the toolkit through the Diverse Voices method developed by the UW Tech Policy Lab. Each panel will focus on a specific community, including race and social justice activists, immigrants, and people who have been incarcerated. The Critical Platform Studies Group and any available fellows will continue to refine the toolkit once the program ends.