Austin Adams
Communications Manager, Open Technology Institute
This story is part of PIT UNiverse, a monthly newsletter from PIT-UN that shares news and events from around the Network. Subscribe to PIT UNiverse here.
Algorithms and machine learning tools are some of the most influential yet least well understood technologies in use today. Across the public and private sectors—from social media and online advertising to housing, banking, government, criminal justice, and more—organizations increasingly rely on algorithms to parse massive amounts of data and make decisions at scale.
Often, these decisions have life-altering consequences. Today, algorithms can help decide whether someone is considered for or gets a job, receives housing, is granted a loan, gets accepted to college, and even whether someone is released from prison. And while technology has a veneer of objectivity and scientific accuracy, artificial intelligence (AI) is often trained on data that reflect historical biases and injustices. This creates a significant risk that ingrained biases will perpetuate historical patterns of marginalization.
Advocates, including New America's Open Technology Institute, are pressing companies and governments to offer greater transparency into how their AI systems are developed and trained, as well as into the outcomes those systems generate. Without this information, many influential algorithms remain "black boxes," with little to no accountability for potential harms.
There is a clear need for more public interest researchers who examine algorithms and machine learning tools to determine their influence and detect biased outcomes. However, such complex systems require a great deal of expertise to approach. The Algorithmic Fairness and Opacity Group (AFOG) at the University of California, Berkeley is dedicated to this research and to understanding the real-world effects of algorithms, regardless of their creators' intentions. AFOG also engages students and the public through courses and events, encouraging people to "think deeply and critically about technology, human values, and the social and political implications of technical systems."
Professors Deirdre Mulligan and Jenna Burrell lead AFOG, which won a 2020 Network Challenge grant from PIT-UN focused on creating a career pipeline for PIT scholars pursuing research into algorithmic fairness. AFOG will partner with two other UC Berkeley PIT programs, Cal NERDS (New Experiences for Research and Diversity in Science) and the D-Lab, to create educational programming and opportunities for PIT students to explore careers in algorithmic fairness.
In a statement announcing the grant, Professor Mulligan previewed hands-on workshops, public lectures, and lunch talks to connect students to the PIT field and grow their understanding of the space. "By centering issues of justice," Mulligan says, "rather than technology or specific approaches and methods from STEM fields, this program will develop and train diverse students and scholars with the knowledge and skills to create, use, assess, and critique technologies in service of the public interest."
Claudia von Vacano, Executive Director of the D-Lab, says students and researchers face a number of challenges working in the AI space.
"As rigorous researchers, the contexts which our students enter are sometimes not as concerned with the questions of diversity and ethics in AI that are fomented in our programs," says von Vacano. "So they innovate and have to push against workflows that don't routinely allow them to interrogate and audit issues of bias in ML."
The programming and education provided by the D-Lab, von Vacano says, will help prepare students for the field. "Our scholars are using and thereby expanding their data science skills at the same time that they are proposing solutions to complex problems in the field. In other words, they're applying their knowledge in sophisticated ways with hands-on and collaborative projects."
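To make the idea of "auditing issues of bias in ML" concrete, here is a minimal, illustrative sketch of one common audit: comparing a model's positive-outcome rates across demographic groups and applying the "four-fifths" rule of thumb. The data, group labels, and threshold below are hypothetical examples, not drawn from any program described in this article.

```python
def selection_rates(predictions, groups):
    """Return the positive-prediction rate for each demographic group."""
    totals = {}
    for pred, group in zip(predictions, groups):
        pos, n = totals.get(group, (0, 0))
        totals[group] = (pos + pred, n + 1)
    return {g: pos / n for g, (pos, n) in totals.items()}

def disparate_impact(predictions, groups):
    """Ratio of the lowest to highest group selection rate (1.0 = parity)."""
    rates = selection_rates(predictions, groups)
    return min(rates.values()) / max(rates.values())

# Toy example: yes/no outputs of a hypothetical loan-approval model.
preds  = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"]

ratio = disparate_impact(preds, groups)
print(f"disparate impact ratio: {ratio:.2f}")  # group b: 0.4, group a: 0.6
if ratio < 0.8:  # four-fifths rule of thumb
    print("potential adverse impact -- investigate further")
```

Real audits go well beyond this single metric (considering error rates, calibration, and the provenance of the training data), but the sketch shows the basic shape of the question researchers ask: do outcomes differ systematically by group?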
One student who served as a data science fellow with the D-Lab had high praise for the program and said it helped them secure a data science internship this year.
"The experience has been amazing," the student says. "People have an open-minded approach to science and are always eager to help you out."
The partnership between AFOG, Cal NERDS, and the D-Lab has already begun to hold events, opening a lecture series with a talk by Ethan Zuckerman. The next lecture in the series is scheduled for April 14.
One PhD student who attended Zuckerman's talk called it "an important conversation challenging the assumption that scale—in the context of social media, and technology more generally—is preferable, instead raising bold questions about what small social media ecosystems might be able to provide to foster communities."
The lecture series followed a January bootcamp that drew 120 registered students and provided an overview of STEM fields, an introduction to coding, and conversations with role models in the field. One attendee who had little knowledge of data science beforehand says the bootcamp "really broadened my perspectives of technology and its community," adding that hearing speakers discuss their own experiences "has truly comforted me that even though I have not started coding early, I am still capable of great things!"
We asked the teams from AFOG, Cal NERDS, and the D-Lab a series of questions; their compiled answers are below. Also, interested in working with AFOG? They're hiring.
What challenges do students face in making a career researching the public interest issues around algorithms and machine learning?
Undergraduate STEM education provides little room for interdisciplinary engagement, yet women and students from historically marginalized and non-dominant communities (low-income, first-generation college, re-entry, transfer, student parents, etc.) are often profoundly aware of, and intensely interested in, the social and political impacts of science and technology. Likewise, graduate students in public health, social work, public policy, and law may have a strong interest in the implications of technology for their fields but little opportunity to explore these areas, particularly given that graduate students are encouraged to stay in their disciplinary lanes. The questions at the heart of public interest technology place justice and social welfare alongside data science, coding, algorithms, and machine learning in the design of sociotechnical systems, which often requires interdisciplinary collaboration.
What are you hoping students take away from the educational and career-focused programming you're planning?
PIT-UN at UC Berkeley runs collaborative programming aimed at building a diverse group of undergraduate and graduate students who identify as public interest technologists. We help students build the technical skills and critical thinking necessary for the field of public interest technology. Through hands-on and often peer-led workshops, students develop coding and data analysis skills alongside expertise in tools and methods for assessing how values are embedded in technical systems across society. Together, these workshops give students the ability to think critically about the possibilities and limits of technological "solutions," while contributing ideas that support legal regulation, social movements, and other means of generating social change.
Events with leaders in the public interest tech field, along with days when students can "shadow" public interest technology professionals, expose students to the wide range of exciting career opportunities in public interest technology. Our goal is for students to appreciate their own resilience, understand that their potential is limitless, and feel they have a like-minded community that creates a sense of belonging and validates their interest in pursuing public interest technology further.
For more information on programming, please check out AFOG's website.