Research

Contributors
Tazin Khan

Research Abstract:

Cyber Collective advocates for stronger privacy protections in the US. Their petition calls for diverse subcommittees, an annual review of regulation, affirmative consent, enforcement power for the FTC, federal preemption, a shift in the burden of proof, and a definition of injury. The goal is to protect consumer privacy and increase transparency and accountability. This brief provides an in-depth look at the petition and aims to shift the culture around privacy and data rights.

Published on
August 2, 2023

Privacy And Protection For All

An analysis of Cyber Collective’s federal petition

“[D]ata rights are fundamentally about securing a space for individual freedom and agency while participating in modern society.”
- Martin Tisné

In recent years, the data economy has grown rapidly at the expense of individual autonomy, agency, and privacy, while legal mechanisms to protect consumer privacy in the United States vary by state. Home to tech giants like Google and Facebook, the U.S. has not yet enacted comprehensive privacy legislation at the federal level.

Legislators have been working on a national privacy law for several years, and many of its core components currently have bipartisan support. There is potential for a federal privacy bill to pass Congress and be signed into law in 2021.

We have introduced a petition to advocate for several mechanisms we believe will strengthen the protections for consumers in this bill. Our goal is to increase transparency and to center the protection and consideration of all people.

Cyber Collective's work focuses on the needs of the marginalized as they relate to technology, and this petition is our way of helping codify these needs at the federal level. We've partnered with Elroi, the company behind an upcoming privacy compliance tool, to launch this petition. They have also provided the legal expertise behind these asks.

The goal of this brief is to provide lawmakers the background behind the asks we make in the petition and to make this information accessible to an audience that may not have experience in the tech industry or in policy, though much of this report will be easier to follow for readers familiar with consumer technology.

We aim to illuminate the "why" behind each of the petition’s asks so that more consumers can become aware of what's possible with privacy legislation that centers their interests and data rights.

A note about our sources:

To conduct research for this brief, we collaborated with Rachel Cash, CEO and Founder of Elroi, who has deep expertise in privacy law and who provided the foundation for this petition. The subsections What this means and Why this matters under each ask are based heavily on an interview we conducted with Rachel in February 2021.

We also cite Targeted by Brittany Kaiser, Co-Founder of Own Your Data Foundation, with whom we have an ongoing working relationship, and whose book is one of the few primary sources about Cambridge Analytica’s involvement in the 2016 U.S. Election.

Asks We Make In This Petition

The petition outlines seven asks of lawmakers:

  1. Create diverse and public subcommittees as part of the regulation drafting.
  2. Conduct an annual review of regulation compared to current technology advancements and interpretations.
  3. Require affirmative consent from individuals prior to processing their sensitive data, and provide individuals the right to access, correct, and delete personal data.
  4. Grant enforcement power to the Federal Trade Commission (FTC) and state attorneys general, and allow individuals to institute a private right of action.
  5. Provide that federal regulation preempts state data privacy and security laws, while allowing state laws to afford greater protection.
  6. Shift the burden of proof in an individual’s right of action to the defendant. A defendant must prove that a violation of the regulation was not a willful or repeated violation.
  7. Provide a definition of injury that would justify actual damages.

This section covers the background of each ask: what the ask is, why it matters for consumers, and some real-life examples.

Diverse And Public Subcommittees

We ask that the national privacy legislation create diverse and public subcommittees as part of the regulation drafting.

What this means: Create subcommittees composed of domain experts in technology, technology ethics, human rights, and Science, Technology, and Society (STS) studies, along with community advocates, to advise lawmakers as the bill is drafted.

Why this matters: Experts in these fields can best guide lawmakers to develop legislation that works for all people, with deep understanding of various technologies and their significance in the context of broader society. With this input, lawmakers can better respond to the needs of the people as they relate to technology.

Breadth and depth of expertise should be represented in the development of this legislation, where technologists, human rights experts, tech ethicists, and community advocates provide depth, and where STS experts provide breadth and can connect the dots between these disciplines.

Without domain experts to drive the development of privacy legislation, laws can end up serving government and corporate interests at the expense of consumer autonomy and agency.

A real-life example: The EARN IT Act, introduced in the U.S. Senate in 2020, would have eliminated end-to-end encryption from platforms like iMessage, Signal, and WhatsApp. End-to-end encryption ensures that only the sender and recipient can read communications sent over these platforms, barring access by service providers, malicious actors, and the government².

Had it passed, the EARN IT Act would have threatened freedom of speech and security online. Domain experts serving on a subcommittee could have warned lawmakers about these serious social and technical consequences during the drafting process.
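
For readers who want a more concrete picture, the short sketch below is purely illustrative and not part of the petition: a minimal Python example, using the open-source PyNaCl library, of the end-to-end encryption model described above. Only the sender and the intended recipient hold the keys needed to read a message, so any service that merely relays it sees ciphertext it cannot decrypt.

    # Illustrative sketch only: end-to-end encryption via public-key cryptography (PyNaCl).
    from nacl.public import PrivateKey, Box

    # Each party generates a key pair on their own device; private keys never leave it.
    sender_private = PrivateKey.generate()
    recipient_private = PrivateKey.generate()

    # The sender encrypts with their private key and the recipient's public key.
    sender_box = Box(sender_private, recipient_private.public_key)
    ciphertext = sender_box.encrypt(b"meet at noon")  # all a relaying server would ever see

    # Only the recipient, holding their private key, can decrypt the message.
    recipient_box = Box(recipient_private, sender_private.public_key)
    assert recipient_box.decrypt(ciphertext) == b"meet at noon"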

Annual Review Of Regulation

We ask that the national privacy legislation include an annual review of regulation compared to current technology advancements and interpretations.

What this means: This ask would require that the privacy legislation be reviewed annually by a group of domain experts, so that they can revise it in light of new and changing technologies.

Why this matters: Technology evolves rapidly, and the legislation that regulates it should change with it. Technologies that leverage consumer data are becoming increasingly impactful to people’s lives, freedoms, and agency, so ideally this legislation is a living document, reviewed on a regular basis to mitigate potential risks and harms.

Without an annual review of legislation, data and technologies can be used to skirt existing regulation, intentionally or otherwise. For example, private companies can conceivably bypass discrimination laws by deploying machine learning models that reinforce systemic racism at scale.

If legislation does not change and evolve with the tech landscape, companies can absolve themselves of responsibility by shifting blame to the technologies they develop and may not take corrective action. Existing legislation may fail to provide its intended protection. Regulation must keep up with the complexities of evolving technologies to ensure that the outcomes they produce are equitable for all groups of people.

Affirmative Consent

We ask that the national privacy legislation require affirmative consent from individuals prior to processing their sensitive data and provide individuals the right to access, correct, and delete personal data.

What this means: This would require companies to gain voluntary and knowing acceptance from consumers before processing their sensitive data, as well as provide consumers the right to access, correct, and delete their personal information.

Why this matters: The information that allows consumers to knowingly and voluntarily consent to the use of their data (what information is being collected, how and with whom it will be shared, and how long it’s kept) is frequently obscured or otherwise inaccessible. Product interfaces often rely on opt-out defaults to collect, use, and share personal data, something consumers may not want even if they never change the default setting.

Without knowing what terms they’re agreeing to and why, consumers can’t make informed decisions about how to protect their privacy and may end up sharing more of their personal information than they would like.

This ask would make privacy policies opt-in: a user would have to take an explicit action to accept terms of use, rather than having acceptance implied by continued use of a service or product.

A real-life example: Cambridge Analytica, a political consulting firm that worked on Donald Trump’s campaign in 2016, claimed to have data points on every American. It used this information to create psychographic profiles of individual Americans, which were then used for behavioral microtargeting, a process designed to encourage targets to vote for its client³.

The issue with Cambridge Analytica’s work was the way the data was acquired—without permission from many Facebook users. Instead, Cambridge Analytica got permission from certain users to collect other users’ data, without informing those other users of the data collection. Affirmative consent required by law may have curbed this misuse of Facebook user data and the potential for election interference.

Enforcement Power

We ask that the national privacy legislation grant enforcement power to the Federal Trade Commission (FTC) and state attorneys general, and allow individuals to institute a private right of action.

What this means: This ask would enable the Federal Trade Commission (FTC) and state attorneys general to enforce compliance with the law, and would allow for a private right of action: the right of individuals to sue a company for mishandling consumer data.

Why this matters: Personal data belongs to the consumer, and companies that collect personal data act as controllers of that data. This makes companies responsible for properly handling, storing, and using the consumer data they collect. Granting the FTC the power to enforce proper handling, storage, and use would protect consumers and their privacy, because violating privacy rights would have real consequences.

A private right of action would give consumers the ability to sue companies that misuse personal data, where misuse might mean keeping consumer data longer than necessary, not returning data to consumers as requested, or selling the data without permission.

Because regulatory agencies can lack the resources to enforce laws and can also be swayed by industry interests, individual consumers should be given the power to enforce their privacy rights⁴. This serves to develop an accountability structure for enterprises when they misuse data or repeatedly violate consumer rights.

Federal Preemption

We ask that federal legislation preempt state data privacy and security laws, while also providing that state laws can afford greater protection.

What this means: This means that federal legislation would supersede state privacy laws unless the state laws were stricter.

Why this matters: Privacy legislation around tech has been fragmented, with different rights afforded to residents of states like California, Nevada, and Maine⁵. This can make a private right of action more difficult or impossible for residents of states where these protections are not afforded.

Where state laws provide more consumer protection than the federal legislation, however, state laws should take precedence. Each of California, Illinois, and Vermont has enacted some form of data privacy law, and many more states have introduced bills for comprehensive privacy laws. States should continue to consider and implement legal mechanisms that potentially provide more protection than a law at the federal level.

Burden Of Proof

We ask that the legislation shift the burden of proof in an individual’s right of action to the defendant. A defendant must prove that a violation of the regulation was not a “willful or repeated” violation.

What this means: In cases where an individual consumer sues a company for mishandling their data, the court should assume that the company violated the law and put the burden of proving otherwise on the company. The company would also be required to show that the violation of the law was not on purpose or a repeat violation.

Why this matters: Without this shift, individual consumers would be required to prove that the company violated the law. This is a burden that gives companies an advantage over individuals who may not have the resources to prove wrongdoing—a hurdle that further entrenches privacy as a privilege, as opposed to a right.

Definition Of Injury

We ask that the legislation provide a definition of injury that would justify actual damages.

What this means: The legislation should specify the costs associated with the misuse of consumers’ private data, which courts can use to determine what damages a violating company should pay a consumer.

Why this matters: This requirement standardizes a form of reparations for consumers when they are wronged by companies that misuse their data. A definition of injury would also start shifting the cultural conception of privacy from a privilege to a right, something that Cyber Collective has argued for extensively. This change may also shift the conception of data from something that companies can handle in ways that best serve them to something that consumers own for themselves.

A real-life example: Brittany Kaiser describes Cambridge Analytica’s work in the 2016 U.S. Election as “...a uniquely American opportunity. Data laws in countries such as the United Kingdom, Germany, and France don't allow such freedoms”⁶. Culturally, the United States is a place where capitalist pursuits have caused marginalized people to become externalities, where their lives and well-being are the cost paid to realize these pursuits.

Requiring companies to bear the cost of violating consumer privacy rights would place responsibility onto these companies and incentivize them to instead protect privacy rights, or at least move the needle in that direction.

Conclusion

This brief is an in-depth look into our petition, primarily speaking to those in industry and policy, as well as the people who will be affected by such legislation. The seven asks outlined in this brief will help protect important aspects of privacy rights in the United States, increasing transparency and accountability to consumers. With these mechanisms in place, consumers, enterprises, and lawmakers can start shifting the culture around privacy and data rights in social and industry spaces.

The current protections around privacy have fundamental failures and have proven insufficient to protect individuals’ data in a modern, technology-driven age. By making these asks, we hope that federal lawmakers and consumers will better understand how to respond effectively to the evolving technology landscape and the needs of society.

About Cyber Collective

At Cyber Collective, we work to demystify data ethics, privacy, and cyber security for consumers. While our briefs serve to communicate with those in industry and policy, our daily effort is to arm the public with digital literacy.

Our organization works in the space between the creators and consumers of technology. As professionals and domain experts, we understand the ways in which our digital lives affect our physical lives, agency, and autonomy, and we deeply believe every single person deserves the same understanding. That’s why we host monthly creative research events that serve both to educate consumers and to help us understand how data ethics, privacy, and cyber security affect them. Through our work and growing partnerships, we hope to effect positive change in society and technology.
