Research

3 min read

Identifying Ethics Gaps and Creating Solutions

Published on August 10, 2023

Tech pros addressing tech's harm to vulnerable groups. Centering marginalized voices, advocating change, and promoting ethics.

Research Abstract:

This research brief is about Cyber Collective, a group of tech professionals focused on addressing the unintended harmful consequences of technology for vulnerable communities. The group centers marginalized voices in tech policy discussions, educates its community, and conducts research. The goal is to improve outcomes by advocating for change and challenging biases in the tech industry. Its approach emphasizes diversity and inclusivity to create a safer, more ethical digital world.

“By pulling back the curtain and drawing attention to forms of coded inequity, not only do we become more aware of the social dimensions of technology but we can work together against the emergence of a digital caste system that relies on our naivety when it comes to the neutrality of technology… It includes an elaborate social and technical apparatus that governs all areas of life.”
- Ruha Benjamin, Race After Technology

We are Cyber Collective, a group of technology professionals with backgrounds in cybersecurity, privacy, public policy, and ethics. Through our collective industry, academic, and lived experience, we have become intimately aware of the ways in which technologies deployed in today’s landscape have produced, and continue to produce, unintended harmful consequences at scale for the vulnerable and marginalized¹ in society.

Our approach to this problem is to bring Black, Indigenous, people of color (BIPOC), women, and marginalized peoples to the center of tech product and policy development. We do this by educating our diverse community of technology users—@cybercollective social media followers and newsletter subscribers, including librarians, teachers, women, BIPOC in tech, and concerned citizens—teaching them how to challenge the digital world we live in and then using their insights to advocate for change.

We also recognize the work that tech companies and policymakers are doing to increase accountability to people impacted by technologies in our present-day landscape. Our goal is to provide some of the missing pieces of the puzzle in this work: by centering the experiences and voices of the historically marginalized in tech, we aim to identify harms and improve the outcomes technology produces, both for the marginalized and for society at large.

This research brief, coupled with community events we held in October 2020, is our first step in this work. In this document, we discuss the problem we’re tackling as an organization, detail the approach we have taken to understand our community’s knowledge and knowledge gaps, and share our findings.

Our Problem Space

While marginalized peoples are frequently left out of the product development process in the tech industry, technical fixes to problems in society often end up producing harms experienced by the most vulnerable². Decision-making algorithms, such as those that use credit scores to evaluate employment candidates, frequently penalize the poor, while search engine results often reinforce sexist and racist conceptions of women and BIPOC³.

The technologies many people encounter in wide-ranging aspects of life—social media platforms, search engines, and surveillance technologies, for example—have implications beyond simply connecting users with information and people. Ruha Benjamin, associate professor of African American Studies at Princeton University and author of Race After Technology, suggests that today’s technology “includes an elaborate social and technical apparatus that governs all areas of life” and reaches far beyond our screens, ultimately presenting a challenge to our autonomy and free will as human beings⁴.

The far-reaching effects of these technologies have also led to public policy efforts to regulate the collection, handling, and use of the mass amounts of consumer data that make their deployment possible⁵. While some ballot measures have provided tangible consumer protections, others have represented the interests of industry, increasing the likelihood that future technologies will continue to produce the harms with which the members of our team have become so familiar.

To its credit, the tech industry has begun working to understand and mitigate this impact, in large part as a response to outside pressure to make these considerations. Emanuel Moss and Jacob Metcalf of the Data & Society Research Institute have found that a number of players in the tech industry are intentionally working to resolve ethical dilemmas at the corporate level, assigning leaders to operationalize ethics and scale these practices⁶.

However, these discussions often lack considerations of race and racial equity, and many technologies continue to “operate within parameters that assume a universal, raceless default subject position synonymous with whiteness”⁷. When the development process lacks insight into the experiences of people who are disproportionately impacted, these technologies will continue to produce harms in similar ways.

By looking at these gaps, researchers have identified one key component for addressing these ethical dilemmas: centering the vulnerable and marginalized in discussions around tech and data ethics⁸. Our approach as an organization is to do just this: to bring into the spotlight the voices and experiences of the marginalized in the tech ethics and policy dialogue.

Our Approach

We combine traditional and creative research approaches, collecting empirical data through grounded theory. During virtual research events, including workshops, seminars, and community conversations, we gathered qualitative and quantitative data; throughout each event, our research team recorded participant feedback through dialogue, typed comments, and survey responses.

In the lead-up to the 2020 U.S. General Election, we worked with our community members to identify gaps in their knowledge of tech policy and to teach them how to think critically about our digital world more broadly.

Our event lineup for October 2020 included:

CyCo 101 - The more accessible the internet, data, and technology become, the greater the threats we face as individuals. Concepts in cybersecurity need to be digestible, relatable, and accessible to a broad audience in order to shape our dialogue and awareness at the individual, community, and global levels. Our first event was an introduction to our organization, National Cybersecurity Awareness Month, and a few best practices for securing our data.

Election Security: Digging Into Our Digital Democracy - During the 2020 U.S. Presidential election, we wanted to hold space for conversation around data exploitation, discuss what happened in the 2016 presidential election, and share actionable steps to ensure secure and fair elections.

Understanding Security/Privacy Policies On Your Ballot - Ballot measures are introduced not only to elect government candidates, but also to pass state and local laws. Three states had data privacy laws on the ballot in November 2020: Michigan, California, and Massachusetts. Our research team investigated the proposed legislation and held conversations around privacy policy awareness.

Election Security 101 - Misinformation and disinformation spread during the 2020 U.S. General Election sparked concern about election hacking and the security of the democratic process in the United States. Maggie MacAlpine, an election security specialist, joined us to debunk common myths about election security.

Big Tech Little Ethics: Why 11 States Are Suing Google - Earlier in 2020, the U.S. House antitrust subcommittee investigated the Big Four: Facebook, Google, Apple, and Amazon. Now, Google is being sued by 11 states over alleged antitrust violations. This event brought the headline down to the consumer level, discussing how these developments impact us.

What We Found

Our research team developed a series of questions to gauge our participants’ knowledge level before and after an event, what they learned, and what inferences they made after learning about the event topics. These questions were included in a survey we released during each event. The following section details and analyzes our participants’ responses.

During our events, we shared techniques for protecting ourselves online against common manipulation tactics and for challenging mis- and disinformation, and found that many participants had been unaware of these topics and their implications.

Prior to attending an event, participants on average rated their knowledge of the topic 2 out of 5; after attending, their average self-rating rose to 4 out of 5. Moreover, once made aware, participants began making their own inferences, seeing the connections between data privacy policy, the commodification of data, and their individual relationships with technology:

“Coming to the realization that as consumers we really need to take policy change and data protection into our own hands. And I came away with some resources and suggested reading.”

“How can we connect human rights and privacy law? As someone who is going into this area of law in about a year and a half, I have serious concern over this. I am wondering how we can make the changes even at a local level. I know the UN has been preaching this for years to no avail.”

“In a world where technology advances faster than the federal laws and regulations of data privacy and data capitalism, how do we help get those laws up to speed with the technology?”

Our surveys also asked the following:

  • When asked, “Do you know how your personal information is being handled on the internet?” participants could answer ‘somewhat’, ‘yes’, or ‘no’.
  • When asked, “Are you concerned with how your personal information is being handled on the internet?” participants could answer ‘yes’ or ‘no’.
  • When asked, “Should consumers have the right to know what happens to their data?” 100% answered ‘yes’.
  • When asked, “Do you think data protection laws exist in the U.S.?” 23% answered ‘yes’, 55% answered ‘no’, and 22% answered ‘unsure’.
  • When asked, “Do you think data protection laws should exist?” 100% answered ‘yes’.
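To make these tallies concrete, below is a minimal, hypothetical sketch (in Python) of how self-rating averages and per-question percentages like the ones above can be computed from raw survey responses. The record structure and sample data are illustrative assumptions, not Cyber Collective’s actual dataset.

    # Hypothetical survey-tally sketch; field names and sample data are
    # illustrative, not Cyber Collective's actual dataset.
    from collections import Counter

    # One record per participant per event: self-rated knowledge (1-5)
    # before and after the event, plus an answer to one policy question.
    responses = [
        {"pre": 2, "post": 4, "laws_exist": "yes"},
        {"pre": 1, "post": 4, "laws_exist": "no"},
        {"pre": 3, "post": 5, "laws_exist": "unsure"},
        {"pre": 2, "post": 3, "laws_exist": "no"},
    ]

    def mean(values):
        return sum(values) / len(values)

    # Average self-rated knowledge before and after attending an event.
    pre_avg = mean([r["pre"] for r in responses])
    post_avg = mean([r["post"] for r in responses])
    print(f"Self-rated knowledge: {pre_avg:.1f}/5 before, {post_avg:.1f}/5 after")

    # Percentage breakdown for a categorical question, e.g.
    # "Do you think data protection laws exist in the U.S.?"
    counts = Counter(r["laws_exist"] for r in responses)
    for answer, n in sorted(counts.items()):
        print(f"{answer}: {100 * n / len(responses):.0f}%")

Counting categorical answers this way is what yields splits like the 23% / 55% / 22% breakdown reported above.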

Next Steps

The ultimate goal of our research is to center diverse voices and experiences in the technology ethics dialogue and to improve outcomes produced by tech for all. After a month of connecting with, educating, and learning from our community, we have gauged both the level of awareness and concerns around data privacy and security and have found that, while many community members were not initially aware of how their personal information is handled online, they are inclined to take action as their awareness increases.

We have learned that creating a space for critical thinking around the intersection of technology, global policy, and ethics, led by marginalized voices, enables representation that is rare in this field and yields original research.

Our next steps in conducting this research include continuing to educate our community members about privacy and security topics as well as continuing to explore, investigate, and document the knowledge gap between our community members and industry experts.

Notes

¹ We use the word “marginalized” to describe folks historically pushed to the margins by decision makers in product and policy development—those who are not centered in the tech industry or tech policy space. Our reason for this is the feeling it evokes in people who know firsthand the experience of being pushed to the margins: it’s recognizable and familiar, and these are the people we are trying to center. This is our approach for now, and we may discover in future research that there’s a better word that captures this meaning more fully.

² Benjamin, 2019

³ O'Neil, 2016; Noble, 2018

⁴ Benjamin, 2019

⁵ Schwartz, Tien, Tsukayama, and Cyphers, 2019

⁶ Moss and Metcalf, 2020

⁷ Moss and Metcalf, 2020

⁸ Daniels, Nkonde, and Mir, 2019; Mohamed, Png, and Isaac, 2020

Sources

  • Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity Press.
  • Moss, E., & Metcalf, J. (2020). Ethics Owners. Data & Society Research Institute.
  • Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY: New York University Press.
  • O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York, NY: Crown Publishers.
  • Schwartz, A., Tien, L., Tsukayama, H., & Cyphers, B. (2019, December 31). Consumer Data Privacy in California: 2019 Year in Review. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2019/12/year-reviewconsumer-data-privacy-california