When Corporations Violate Privacy, They Do Concrete Harm

There are tangible safety implications to consumer privacy violations, says Lindsey Barrett of Georgetown Law.

Feb 15, 2020 at 11:30 a.m. UTC
Updated Dec 10, 2022 at 8:06 p.m. UTC

Lindsey Barrett (https://www.law.georgetown.edu/experiential-learning/clinics/communications-technology-law-clinic-ipr/faculty-staff/), a staff attorney at Georgetown Law, does not hold back when she sees bad actors in the tech space.

She recently vehemently opposed the Sprint-T-Mobile merger in Slate, warning that “further consolidating an already anti-competitive sector” would make “it easier for those companies to gouge their customers.” Social media companies are making our lives a privacy hell, she says.

Part of the Institute for Public Representation (IPR) Communications & Technology Clinic, Barrett advocates for the clinic's non-profit clients in areas relating to technology and the public interest, like consumer privacy, children's privacy, and media accessibility. We spoke with her as part of our Election 2020 package, covering big issues in tech, where the candidates stand, and what politicians can do about the data abusers. This interview has been edited and condensed. 

Ben Powers: What big questions around tech should we be paying more attention to?

Barrett: Corruption is a big one. There’s no major issue where the law isn’t either under-inclusive or deeply skewed towards allowing industry impunity. We can't change things unless we are able to tamp down on how lobbyists are able to shape policy, and ensure the expertise that Congress has access to is independent. Privacy, particularly consumer privacy, is a really big one under the tech umbrella. Privacy can unfortunately get siloed into talking about Facebook and Google and nothing else. But we're talking about data that's collected from us and that law enforcement has access to, in 50 different ways, and none of it is trivial. 


We've progressed a lot in how we characterize privacy problems and the real risks they pose. It’s less and less a tenable or serious position for companies to come out and say that a privacy law would cause the industry and its beautiful innovation to come crumbling down. We know that's not true. 

It’s also a less serious position to say that people “don't care about privacy” or “because they don't care, they don't deserve protections from it.” We've had visceral examples demonstrating why that idea isn’t true. We know that ad tech companies and data brokers hoover up every bit of information about us that they can, make assessments of us based on that, and sell them to the highest bidder. 

We know that those assessments can affect or determine whether we can rent an Airbnb, go to a bar, and afford health insurance or college. None of this is trivial. As the rhetoric moves in a positive direction, we need it reflected in meaningful privacy protections: laws that make it possible for people to sue to vindicate privacy violations, executive liability where appropriate, and measures that make privacy law something companies take seriously, rather than laugh off because the risk of violating it is so low.


Powers: What are ways that people are harmed by abuses of privacy and data?

Barrett: When a company has bad data security practices and lets your data get breached, you're subject to identity fraud, with all the anxieties about time, money and everything else that entails. Then you have actual safety risks. There's been a whole series of stories and investigations into telecom giants selling location data, and you can't come up with a more horrifying safety risk than a bail bondsman (who can have access to that data) deciding he wants to stalk his girlfriend that day. There are concrete and dangerous safety implications to consumer privacy violations. 

Other harms come in how the data or technology is used. We know many important life decisions are now made or mediated by algorithms. The information collected about you determines how you are characterized in ways that you can't see and won't have access to. These can impact everything from educational and job opportunities to being able to rent an Airbnb.

Powers: How do you give a privacy law teeth?

Barrett: A big start is understanding how consumer privacy laws are based on an outmoded understanding of privacy decision-making. If you frame privacy rights as a consumer nicety or as a privilege, then it's sufficient to have laws that presume someone will read the privacy policy and make an informed decision, even when we know they're poorly equipped to do that. Trivializing privacy makes it OK to mediate whether a practice is acceptable through meaningless privacy policies.

But in so many other consumer protection areas, we acknowledge when people are at a serious informational disadvantage, where they're not able to assess these kinds of risks. So we make allowances for your right to breathe air that is not full of mold, your right to not be poisoned. Your ability to protect yourself is limited, so we're not going to leave you at the mercy of having to protect yourself. We realize it's an artificial choice when we say, "Oh, well, you didn't read the privacy policy, so you deserve whatever happened to you." 

We need a basic level of privacy laws that treat privacy as a civil right and a human right. We need privacy laws that understand how privacy decision-making is constrained. We need a privacy law that understands how data uses can limit life opportunities. We need a privacy law with penalties that companies take seriously. After the FTC settlements with Facebook and YouTube were reported last year, you saw the companies' stock go up. That is a concrete demonstration of how the incentives of our current privacy laws are working. We need better enforcement, whether that's empowering the FTC or a new agency. And we need a private right of action.


Powers: So do privacy plaintiffs not have the right to sue companies that abuse their own privacy agreements?

Barrett: Big "it depends" here. The long answer: It depends on the kind of privacy violation, because many privacy laws do not provide individuals with the right to sue violators, but instead vest enforcement authority solely in an agency and/or state attorney general. Even with a privacy law that has a private right of action, the company might have buried an arbitration clause in its terms of service. Plaintiffs are shunted into a process with no transparency where the company is at a strategic advantage, including the choice of arbitrator and applicable rules. And where privacy plaintiffs are able to sue, courts have long been unduly parsimonious in their perception of privacy injuries for the purposes of standing doctrine. So, the short answer: rarely. Suing is expensive and it's hard.

Powers: How are campaigns addressing these areas?

Barrett: Some candidates have pushed ideas that have become popular and other candidates have seized on those. Elizabeth Warren's tech proposals have been subsequently embraced by other candidates, which is great because they're really good ideas. [Bernie] Sanders said that he supported a right to repair after she came out for one. [Andrew] Yang said he supported reviving the Office of Technology Assessment in Congress after she did. And the whole field has had to address the problems of anti-competitiveness and consolidation in the tech sector after she put out her plan to break up big tech. Whether they're committed to the full bones of the idea or just like the way it sounds is another question. Warren and Sanders appreciate the need for broad legal reforms and recognize a broad corruption problem. 

I find myself gravitating to Warren's tech-related proposals because of her precision, ambition, and her prioritization of rooting out corruption. Her plans reflect careful deliberation and consultation on niche, but crucial, issues — she was the first to suggest a national right to repair, the first to support reviving Congress's Office of Technology Assessment, and her push for antitrust reform has entirely reshaped the debate. Her anti-corruption reforms are crucial because at the end of the day, the biggest tech policy difficulty isn't figuring out how to draft effective laws, it's figuring out how to enact anything meaningful at all when industry has billions of dollars to burn on lobbying Congress, state legislatures, and the FTC and FCC.


Sanders has a number of exciting tech policy proposals, and exhibits a clear and necessary capacity to name villains and tackle the biggest policy problems at their root. I'm thrilled that he supports banning law enforcement’s uses of facial recognition; commercial uses are dangerous too, but he's helping to move the conversation in the right direction. His public broadband plan is a little sparse on detail but otherwise excellent. And I love that he supports a tax on digital advertising. The digital ecosystem is heavily skewed towards corporate profitability and against meaningful rights for consumers.

None of the other candidates have demonstrated a desire to constrain corporate power to the extent that Sanders and Warren have, which gives me little reason to think that their policies will be sufficient to restore any kind of equilibrium to our corporate-friendly tech policy ecosystem. 

[Pete] Buttigieg criticized Warren's antitrust plan as being inappropriate for targeting specific companies, which is, well, how antitrust works. His coziness with Silicon Valley, and his enthusiasm for a "freedom of choice" framing in healthcare (another area, like privacy, where "freedom to choose" functionally means "freedom to be taken advantage of by companies"), also bode poorly for the kinds of policies he would put forward or support. 

Yang has put out a number of tech proposals that strike me as ill-considered and unduly corporate-friendly. Framing privacy rights as property rights doubles down on the bad-faith bargaining structure created by the consent model of privacy governance, which is the last possible thing privacy policy should be doubling down on. A "department of the attention economy" based in Silicon Valley and designed to foster public-private partnerships is another proposal that reflects a desire to let the foxes keep writing the rules for the henhouse, rather than a basic, necessary understanding of how industry self-interest works. His faith in the unique inspiration of private industry ignores everything that the last 30 years of Silicon Valley companies moving fast and breaking things should've taught us by now. 

Powers: You argue that Silicon Valley is just one part of the attack on our privacy. Can you explain? 

Barrett: By siloing this conversation in Silicon Valley, we're giving short shrift to companies that are doing the same thing. When it comes to ad tech and tracking, AT&T and Verizon are both in the ad tech business. Verizon had the biggest COPPA fine assessed until it was topped by TikTok and YouTube. They were illegally tracking kids and making money off of them. AT&T is buying reams of location data, and apps like Grindr have shared people's sexual preference information. These companies engage in the same incredibly problematic practices as the tech companies, but they also have issues of their own. They're lobbying against municipal broadband, against any kind of meaningful competition reform, against broadband privacy rules, and against meaningful state and federal privacy legislation. Not to mention getting net neutrality murdered.



Disclosure


CoinDesk is an award-winning media outlet that covers the cryptocurrency industry. Its journalists abide by a strict set of editorial policies. In November 2023, CoinDesk was acquired by the Bullish group, owner of Bullish, a regulated, digital assets exchange. The Bullish group is majority-owned by Block.one; both companies have interests in a variety of blockchain and digital asset businesses and significant holdings of digital assets, including bitcoin. CoinDesk operates as an independent subsidiary with an editorial committee to protect journalistic independence. CoinDesk employees, including journalists, may receive options in the Bullish group as part of their compensation.
