PIT-UN Members Discuss Educating Future Data Workers about Ethics, Bias
This story is part of PIT UNiverse, a monthly newsletter from PIT-UN that shares news and events from around the Network. Subscribe to PIT UNiverse here.
There's no question that in analytics and big data work, the more data we have, the better our predictions can be. However, it's imperative for anyone working with data to understand that it's very easy, and dangerous, to get carried away. This was an important point made during the March 8 Public Interest Technology University Network panel at BetaNYC Open Data Week. (Yes, that's a mouthful!)
During the discussion moderated by PIT-UN's director Andreen Soley, Meredith Broussard, Kathleen M. Cumiskey, Mihir Kshirsagar, and Mona Sloane took on the idea of ethics and bias in data work and discussed why educating future data workers is so key. The conversation highlighted some of the inherent problems with data, including the fact that there's no escaping bias, something all of our experts agreed upon.
"If we use the idea as a default that technology is discriminating by default, it allows us to more accurately pinpoint exactly why technology is going wrong, who it is biased against, who the technology is not serving, and then that allows us to remediate those issues if it's possible," explained Meredith Broussard, data journalism professor at the Arthur L. Carter Journalism Institute at New York University and research director at the NYU Alliance for Public Interest Technology.
Kathleen M. Cumiskey, a professor in the Psychology department and the Women, Gender and Sexuality Studies Program at CUNY Staten Island, explained that a good way to keep bias out of data is by bringing everyone into a discussion about its use, saying, "We want the people essentially to have access to the tools and to the technology so that their presence is made known, right, to those who are the decision makers, the policymakers, to those who are making decisions that will impact their lives without them being a part of it." It's why her own program is "place-based," giving people who may ordinarily be underrepresented more representation in the process.
Mihir Kshirsagar, who runs Princeton University's interdisciplinary technology policy clinic at its Center for Information Technology Policy, says you can also avoid injecting bias into a data or tech project by identifying data bias risk before that project launches. This lets you get a better idea of its social impact before negative biases can harm a person or cohort. "If you're already done building your cool new tool [and then thinking about it], that's exactly the wrong way to go about it," he says.
The discussion culminated with concrete examples of how each PIT-UN member is doing the work of eliminating bias and starting the discussion about data ethics in their own institutions. Mona Sloane, an adjunct professor at New York University's Tandon School of Engineering and senior research scientist at the NYU Center for Responsible AI, for example, described her project with the Queens Public Library system, which she hopes will affect how policymakers work with data and how they make decisions.
Want to hear more? If you weren't able to attend the event, you can watch the entire program on YouTube, and check out the rest of the Open Data Week programming.