“In truth, there is only one remedy for the violations of the right to privacy within the reach of the American public, and that is but an imperfect one. It is to be found in attaching social discredit to invasions of it . . . . At present this check can hardly be said to exist. It is to a large extent nullified by the fact that the offence is often pecuniarily profitable.”1E.L. Godkin, The Rights of the Citizen: To His Own Reputation, Scribner’s Mag. 58, 67 (1890).
Orrin Hatch, Senator: “How do you sustain a business model in which users don’t pay for your service?”
Mark Zuckerberg, CEO of Facebook, Inc.: “Senator, we run ads.”
Nineteenth-century remarks and a twenty-first-century exchange at a Senate hearing demonstrate that observers have viewed and continue to view US data privacy3Defined as “the use and governance of personal data.” What Does Privacy Mean?, Int’l Ass’n of Priv. Pros., https://perma.cc/LM22-PVKF. as a trade-off between personal liberty and corporate profit. Prescient for his time, E.L. Godkin, editor-in-chief of the New York Evening Post, announced an “imperfect” remedy4Godkin, supra note 1, at 67. for data privacy violations that remains elusive to this day. Following the 2018 Cambridge Analytica scandal,5Carole Cadwalladr & Emma Graham-Harrison, Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach, The Guardian (Mar. 17, 2018, 6:03 PM), https://perma.cc/MNF9-KUAC. the Federal Trade Commission (“FTC”) fined Facebook, Inc. an unprecedented $5 billion—the largest fine issued in the agency’s history.6Press Release, FTC, FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook (July 24, 2019), https://perma.cc/W6VV-QQ5C. But agency experts,7See, e.g., Dissenting Statement of Commissioner Rohit Chopra, Facebook, Inc., FTC File No. 182-3109, at 20 (July 24, 2019), https://perma.cc/LN3W-95SE (“When companies can violate the law, pay big penalties, and still turn a profit while keeping their business model intact, enforcement agencies cannot claim victory.”). politicians,8See, e.g., Harper Neidig, Critics Slam $5 Billion Facebook Fine as Weak, The Hill (July 16, 2019, 6:00 AM), https://perma.cc/M2UA-UPGM (surveying bipartisan criticism of the settlement’s terms); cf. Ryan Tracy & John D. McKinnon, Facebook Penalty Sends Message to Big Tech, Wall St. J. (July 24, 2019, 8:31 PM), https://perma.cc/TH5Y-UFQU (“I expect a lot of board members and CEOs are chatting and texting today about what exactly they need to do to ensure they are within spitting distance of these new best practices.”). But see Berin Szóka, The Facebook/Cambridge Analytica Settlement: How Far Can the FTC Go, Legally?, Tech Pol’y Corner (Mar. 27, 2019), https://perma.cc/3VJN-479Y (stating that the FTC’s settlement with Facebook was too punitive and stands to disincentivize smaller corporations from sharing data). and the press9See, e.g., Opinion, A $5 Billion Fine for Facebook Won’t Fix Privacy, N.Y. Times (July 25, 2019), https://perma.cc/3A9T-LRSW. remain skeptical that the massive fine—nine percent of Facebook’s 2018 revenue10See Press Release, Facebook Investor Rels., Facebook Reports Fourth Quarter and Full Year 2018 Results (Jan. 30, 2019), https://perma.cc/R3BM-SK7G.—will incentivize Facebook, or any other similarly situated technology company, to adjust a business model built on harvesting, aggregating, and using consumer data.11Olivier Sylvain, The Market for User Data, 29 Fordham Intell. Prop. Media & Ent. L.J. 1087, 1088–89 (2019). And as the recent exchange between Senator Hatch and Mr. Zuckerberg illustrates, Congress may lack the expertise necessary to effectively address data privacy.
Continuing a legacy of privacy protections against private entities,12See European Data Protection Directive, Council Directive 95/46/EC, 1995 O.J. (L 281) 31; see also The History of the General Data Protection Regulation, European Data Prot. Supervisor, https://perma.cc/46MV-U8YM. the European Union—following recommendations from the European Data Protection Supervisor,13See Opinion of the European Data Protection Supervisor on the Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions – “A Comprehensive Approach on Personal Data Protection in the European Union,” at §§ 1.1.1–1.1.6 (Jan. 14, 2011), https://perma.cc/L4J8-D6JQ; Opinion of the European Data Protection Supervisor on the Data Reform Package, at § 1.1.b.(i) (Mar. 7, 2012), https://perma.cc/2YYP-JMWU. the European Commission,14Commission Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation), at 1–2, COM (2012) 11 final (Jan. 25, 2012). and the Article 29 Working Party15Article 29 Data Prot. Working Party, Opinion 08/2012 Providing Further Input on the Data Protection Reform Discussions, WP 199 (Oct. 5, 2012); Article 29 Data Prot. Working Party, Opinion 01/2012 on the Data Protection Reform Proposals, WP 191 (Mar. 23, 2012); Article 29 Working Party, European Data Prot. Bd., https://perma.cc/6LJC-RPL6 (describing the Article 29 Working Party as an independent body charged with advising on privacy and personal data protection until the adoption of the GDPR, when it was replaced by the European Data Protection Board).—enacted omnibus legislation protecting EU citizens’ fundamental right to personal data privacy.16See Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) 1 [hereinafter General Data Protection Regulation]. European law protects all personal data that can identify an individual: names, locations, online identifiers, and other factors specific to a person’s “physical, physiological, genetic, mental, economic, cultural or social identity.”17Id. art. 4(1). This legislation applies to data controllers and processors, both of which include private corporations.18Id. art. 4(7)–(8).
The United States takes a markedly different approach to personal data protection laws that apply to private corporations. With the exception of a handful of industry-specific federal statutes,19See, e.g., Fair Credit Reporting Act, 15 U.S.C. § 1681 (regulating consumer reporting agencies’ maintenance of customers’ credit information); Gramm-Leach-Bliley Act, 15 U.S.C. §§ 6801–6827 (regulating financial service companies’ maintenance of customers’ financial information); Stored Communications Act, 18 U.S.C. §§ 2701–2711 (regulating network service providers’ maintenance of account holders’ information). most data protection regulations applying to corporations originate at the state level.20See Ieuan Jolly, Data Protection in the United States: Overview, Westlaw Prac. L. (June 8, 2020) (“[T]he US has a patchwork system of federal and state laws and regulations that can sometimes overlap, dovetail, and contradict one another.”). And responding to massive data breaches,21See, e.g., Seena Gressin, The Equifax Data Breach: What to Do, FTC Consumer Info. (Sept. 8, 2017), https://perma.cc/M4SX-5FPH (describing the 2017 Equifax data breach that exposed the personal information of 143 million US consumers over a two-month period). state legislatures have begun to craft—and pass—trans-jurisdictional data privacy bills regulating corporate use of personal information.22For bills signed into law, see California Consumer Privacy Act, Cal. Civ. Code §§ 1798.100–1798.199 (West 2018); Me. Rev. Stat. Ann. tit. 35-A, § 9301 (2019); Nev. Rev. Stat. §§ 603A.010–603A.360 (2019). For bills introduced, see H.B. 1130, 58th Leg., 1st Sess. (Okla. 2021); H.F. 36, 92d Leg., Reg. Sess. (Minn. 2021); A.B. 680, 2021–2022 Leg., Reg. Sess. (N.Y. 2021); S.B. 567, 2021–2022 Leg., Reg. Sess. (N.Y. 2021); S.B. 5062, 67th Leg., Reg. Sess. (Wash. 2021). These bills23For a helpful overview, see Mitchell Noordyke, US State Comprehensive Privacy Law Comparison, Int’l Ass’n of Priv. Pros. (Apr. 18, 2019), https://perma.cc/4683-VKJ2. Other states—Connecticut, Hawaii, Louisiana, North Dakota, and Texas—substituted independent task forces for legislation. Id. set forth broad rights for consumers (e.g., accessing data shared with third parties24See Cal. Civ. Code § 1798.100(a); Minn. H.F. 36 § 3(2); N.Y. A.B. 680 § 1103(1); N.Y. S.B. 567; Wash. S.B. 5062 § 103(1). and private causes of action25See Cal. Civ. Code § 1798.150(a) (limiting a private right of action to consumers whose data was “subject to an unauthorized access” because of a “business’s violation of the duty to implement and maintain reasonable security procedures”); Minn. H.F. 36 § 9(1)(b); N.Y. A.B. 680 § 1109(3); N.Y. S.B. 567. But see Okla. H.B. 1130 § 1(E) (limiting enforcement to the state attorney general); Va. H.B. 2307 § 59.1-579(A) (same); Wash. S.B. 5062 § 111(1) (“A violation of this chapter may not serve as the basis for, or be subject to, a private right of action under this chapter or under any other law.”).) and obligations for corporations (e.g., prohibiting discrimination against consumers who exercise rights under the bills26See Cal. Civ. Code § 1798.125; Me. Rev. Stat. Ann. tit. 35-A, § 9301(3)(B); Minn. H.F. 36 § 8(a); N.Y. S.B. 567; Va. H.B. 2307 § 59.1-574(A)(4); Wash. S.B. 5062 § 107(1)(b)(7). and restricting the ability to process consumer data27See N.Y. A.B. 680 § 1103(4)(a); Va. H.B. 2307 § 59.1-574(A)(2), (4), (5); Wash. S.B. 5062 § 107(1)(b)(6), (8).). But despite their similarities, the bills use different definitions28The CCPA defines an annual revenue threshold—$25 million—for a covered “business” whereas one of the New York bills does not. See Cal. Civ. Code § 1798.140(c)(1)(A); N.Y. A.B. 680 § 1100(4), (5). But see N.Y. S.B. 567 (defining an annual revenue threshold—$50 million—for a covered “business”). Moreover, some bills expand the definition of “personal information” to include any “inferences drawn” from the data. Minn. H.F. 36 § 1(8)(10); N.Y. A.B. 680 § 1100(10)(a)(xii); N.Y. S.B. 567; Okla. H.B. 1130 § 9(a)(10). and assign different penalties,29The CCPA and Minnesota bill allow a civil recovery of the greater of $750 per incident or actual damages whereas one of the New York bills does not set a specific dollar amount. Compare Cal. Civ. Code § 1798.150(a)(1)(A), and Minn. H.F. 36 § 9(b), with N.Y. A.B. 680. But see N.Y. S.B. 567 (allowing a civil recovery of the greater of $1000 per incident or actual damages, and of the greater of $3000 per incident or actual damages for “knowing and willful violation[s]”). generating widespread confusion about compliance and enforcement.30See, e.g., Jennifer Huddleston, The Problem of Patchwork Privacy, Tech Liberation Front (Aug. 15, 2018), https://perma.cc/N7DJ-473C; Cathy McMorris Rodgers, Opinion, 4 Warnings about What a Patchwork of State Privacy Laws Could Mean for You, Morning Consult (May 3, 2019, 5:00 AM), https://perma.cc/3SW5-EB56.
Inspired in part by the European Union’s General Data Protection Regulation (“GDPR”), state data privacy legislation threatens to both consolidate markets among large firms capable of complying with regulations and impose contradictory standards on technology- and internet-based industries that span jurisdictions and are critical to the US economy.31See Christopher Hooton, Measuring the U.S. Internet Sector: 2019, Internet Ass’n (Sept. 26, 2019), https://perma.cc/3VJJ-47FX (estimating that the internet generated 10.1% ($2 trillion) of the US GDP—twice its contribution to the US GDP in 2014 ($966 billion)—and that between 2012 and 2018 the internet sector grew by 372%, compared to the information sector (59.3%) and the manufacturing sector (26.6%) over the same time period). This Comment proposes a regulatory solution to address personal data privacy concerns within the construct of technology companies’ business practices. The solution contemplates a self-regulatory body that would: (1) promulgate industry standards; (2) differentiate between large and small internet-based businesses; and (3) foster increased transparency surrounding the use of consumer data.
By deferring to businesses, the self-regulatory solution responds to the GDPR without the curbs on innovation and economic growth that have since plagued the European Union. The proposed solution instead leaves the path forward in the hands of those most familiar with the technology and the borderless industry in which they operate. The solution also leaves enforcement to the FTC—the body tasked with adjudicating violations of existing privacy laws. The solution’s ex ante self-regulatory parameters and ex post federal enforcement mechanism strike a balance lacking in the two major proposals for personal data privacy protections: comprehensive federal legislation and “patchwork” state legislation.
The Comment proceeds in four parts. Part I addresses the origins of data privacy laws in the United States: beginning with academic elucidations of a new common law right and extending to congressional legislation across different sectors. Part II then discusses the internet’s impact on consumer data privacy. Beyond its remarkable effect on global business, the internet has also facilitated companies’ greater access to consumer data, often without consumers’ full awareness. Part III presents two different regulatory responses to the internet’s ubiquity and corresponding threat to consumer data privacy: European omnibus legislation and US state-based patchwork legislation. Finally, Part IV articulates an alternative to the omnibus and patchwork responses: self-regulation. It describes models of successful self-regulatory bodies and lays out potential factors a newly formed body of industry-leading companies could consider to serve both consumer and corporate interests.
I. History of and Developments in US Privacy Law
Since the nineteenth century, state common and statutory laws have primarily regulated data privacy32See Jolly, supra note 20. with the exception of a few federal statutes regulating specific industries.33See, e.g., statutes cited supra note 19. This Part surveys data privacy law’s evolution in the United States by examining its nineteenth-century origins, twentieth-century developments particular to the Supreme Court’s Fourth Amendment jurisprudence, and the contemporary patchwork of federal and state statutes.
A. As Provided by the Common Law
The concept of legally protected data privacy in the United States emerged from the hands of attorneys Samuel Warren and Louis Brandeis, the latter a future Supreme Court Justice. Responding to the advent of commercial photography and the marked increase in its use, Warren and Brandeis published The Right to Privacy in 1890.34Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193 (1890). The authors’ work evaluated the common law’s capacity to adapt to social developments—for example, expanding the reach of torts to reputational injuries given society’s increased recognition of emotional harms.35See id. at 193–95. Just so, Warren and Brandeis urged a new common law right—the right “to be let alone.”36Id. at 195. The authors asserted that this right was essential to preserve an individual’s interest in solitude and privacy.37Id. at 196. And though the authors analogized this interest to the interests recognized in the common law torts of trespass and battery, Warren and Brandeis also argued that the right to be let alone protected an “inviolate personality” and operated as a “right as against the world.”38Id. at 205, 213.
This newly articulated right, while expansive, was not unlimited. Warren and Brandeis set forth six exceptions,39Id. at 214–19. two of which are particularly relevant to personal data privacy. First, the authors stated that the right to be let alone would not extend to matters in the public interest.40Warren & Brandeis, supra note 34, at 214–16. To balance collective and individual interests, disclosures in the public interest would not face liability.41Id. For example, the public could scrutinize a political candidate’s private affairs to determine her worthiness for office—the right to be let alone would then not apply.42See West v. Thomson Newspapers, 872 P.2d 999, 1013 (Utah 1994). Second, the authors stated that the right to be let alone would not extend to matters that the individual had consented to release.43See Warren & Brandeis, supra note 34, at 216–17. For example, opposing counsel could evaluate a defendant’s conversations following his waiver of attorney-client privilege—the right to be let alone would then, again, not apply.44See Buntin v. Becker, 727 N.E.2d 734, 741 (Ind. Ct. App. 2000).
Shortly after Warren and Brandeis’s article, state legislatures and courts began recognizing individuals’ data privacy interests. In 1903, New York passed a privacy statute providing a cause of action in tort for privacy violations.45See N.Y. Civ. Rights Law § 51 (McKinney 2003). In 1905, the Georgia Supreme Court recognized a common law tort of privacy invasion, reasoning that a right to privacy is based “in the instincts of nature . . . recognized intuitively.”46Pavesich v. New England Life Ins., 50 S.E. 68, 69, 71 (Ga. 1905). Other states affirmatively included the right to privacy in their constitutions.47For codified state constitutional rights to privacy, see, for example, Alaska Const. art. 1, § 22 (1972) (“The right of the people to privacy is recognized and shall not be infringed.”); Cal. Const. art. 1, § 1 (1974) (including privacy within its enumerated inalienable rights). In 1960, Dean William Prosser surveyed state court privacy decisions after the Warren and Brandeis article.48See generally William L. Prosser, Privacy, 48 Calif. L. Rev. 383 (1960). Dean Prosser divided the cases into four categories: (1) intrusion upon seclusion;49See Burns v. Masterbrand Cabinets, Inc., 874 N.E.2d 72, 74, 77 (Ill. App. Ct. 2007) (holding an employee asserted a prima facie case of intrusion upon seclusion when his employer hired a private detective to enter the employee’s home under false pretenses). (2) public disclosure of embarrassing, private facts;50See Doe v. Mills, 536 N.W.2d 824, 830 (Mich. Ct. App. 1995) (holding plaintiffs asserted a prima facie case of public disclosure of embarrassing, private facts when abortion protestors displayed plaintiffs’ names on signs as plaintiffs arrived at clinics for scheduled abortion procedures). (3) publicity placing one in a false light;51See Welling v. Weinfeld, 113 Ohio St. 3d 464, 2007-Ohio-2451, ¶¶ 5, 61 (holding neighbors asserted a prima facie case of false-light invasion of privacy when business owners circulated fliers requesting information about the neighbors’ vandalism of business owner’s property). and (4) appropriation of one’s image or likeness.52See Ainsworth v. Century Supply Co., 693 N.E.2d 510, 513 (Ill. App. Ct. 1998) (holding an employee asserted a prima facie case of appropriation of likeness when his employer used a video of the employee installing tile in an advertisement without employee’s consent). The Restatement (Second) of Torts adopted Dean Prosser’s typology, recognizing such torts collectively as “invasion of privacy.”53Restatement (Second) of Torts §§ 652B–652E (Am. Law Inst. 1977). While many state courts accepted all four privacy torts, they did so at different points and to varying degrees (only North Dakota has yet to recognize any of the four common law privacy torts).54See, e.g., Howard v. Aspen Way Enters., 2017 WY 152, ¶ 30, 406 P.3d 1271, 1279 (Wyo. 2017) (recognizing the tort of intrusion upon seclusion); Hougum v. Valley Mem. Homes, 1998 ND 24, ¶¶ 12–13, 574 N.W.2d 812, 816 (stating that North Dakota has not recognized tort actions for invasions of privacy).
As common law privacy torts became more widely recognized, the duty of protecting and interpreting the “right to be let alone” fell primarily on state legislatures and courts. Yet the federal judiciary separately and concurrently interpreted data privacy interests under the Constitution; as particularly relevant for this Comment, the Supreme Court addressed the constitutionality of emerging technologies—and the threat they posed to personal data privacy—in the hands of law enforcement.
B. As Against the Government: The Fourth Amendment
The Fourth Amendment to the Constitution—aimed at preventing unreasonable searches and seizures—provides, in pertinent part: “[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.”55U.S. Const. amend. IV. The Supreme Court has interpreted the objects to which the Fourth Amendment applies in light of the meaning the Founders ascribed to the Amendment at ratification.56See, e.g., Carroll v. United States, 267 U.S. 132, 149 (1925); see also Riley v. California, 573 U.S. 373, 403 (2014) (recognizing the Fourth Amendment’s origins in protecting against British general warrants and writs of assistance). Since ratification, however, the scope of the Amendment’s protections has significantly broadened with law enforcement’s use of electronic surveillance.57Compare Olmstead v. United States, 277 U.S. 438, 466 (1928) (refusing to recognize police warrantless use of a wiretap as an unreasonable search), with Carpenter v. United States, 138 S. Ct. 2206, 2220 (2018) (recognizing police warrantless use of cell phone location information as an unreasonable search).
Beginning in the 1920s, the Court adhered to a trespass-centric view of the Fourth Amendment. In Olmstead v. United States,58277 U.S. 438 (1928). the Court held that law enforcement’s tapping of an individual’s phone lines was not a search because the agents did not physically invade the individual’s property.59Id. at 466. Yet such interpretations were not without opposition. A strong dissent, penned by Brandeis—by then elevated to the Court—presciently stated that “[t]he progress of science in furnishing the government with means of espionage is not likely to stop with wire tapping.”60See id. at 474 (Brandeis, J., dissenting). He urged the Court to recognize the Founders’ intent to secure for citizens “the right to be let alone—the most comprehensive of rights . . . every unjustifiable intrusion by the government upon the privacy of the individual, whatever the means employed, must be deemed a violation of the Fourth Amendment.”61See id. at 478–79.
While the Court adhered to its trespass-based understanding of the Fourth Amendment for the next four decades, the advent of more sophisticated technology ultimately altered that understanding. In Katz v. United States,62389 U.S. 347 (1967). the Court reconsidered its Fourth Amendment jurisprudence, concluding that its myopic focus on trespass could no longer completely control.63See id. at 353. In Katz, government agents—without a warrant—attached an electronic listening device to a public phone booth.64Id. at 348. Though the Court explicitly rejected the proposition that the Constitution provides a right to privacy—maintaining that any such right is left to the states—it held that the Fourth Amendment does protect people and what they seek to keep private.65Id. at 350–51. The Court concluded that the government’s warrantless use of electronic listening devices violated the Fourth Amendment.66Id. at 359.
The lasting impact of Katz lies within Justice Harlan’s concurrence and its two-pronged “reasonable expectation of privacy”67Id. at 360 (Harlan, J., concurring). test, which determines what privacy interests the Fourth Amendment protects. To receive Fourth Amendment protection under Justice Harlan’s test, an individual must (1) manifest a subjective expectation of privacy over the relevant information, and that subjective expectation must (2) be one that society recognizes as reasonable.68Katz, 389 U.S. at 361 (Harlan, J., concurring). As applied to the facts in Katz, Justice Harlan noted that when Katz closed the phone booth door and paid to place a call, he subjectively manifested his expectation to have a private, undisturbed conversation.69Id. Justice Harlan then determined that society deemed Katz’s expectation reasonable.70Id. Justice Harlan’s concurrence became an additional analytical vehicle—beyond the trespass-centric approach—through which the Court assessed subsequent technology-based “searches” under the Fourth Amendment.71See, e.g., Carpenter v. United States, 138 S. Ct. 2206, 2213 (2018); United States v. Knotts, 460 U.S. 276, 280–82 (1983).
A significant limit to the “reasonable expectation of privacy” test—voluntary disclosure—arose from one of Warren and Brandeis’s original limits to the right to be let alone. In Smith v. Maryland,72442 U.S. 735 (1979). the Court considered whether attaching a pen register to a phone line was a search under the Fourth Amendment.73Id. at 736. A pen register collects information by recording the phone numbers dialed from and received on an individual phone line; it does not record calls’ contents. Pen Register, Black’s Law Dictionary (11th ed. 2019). The Court differentiated the capabilities of pen registers in Smith from the capabilities of listening devices in Katz.74Smith, 442 U.S. at 741–45. Relying on its prior decision in United States v. Miller,75425 U.S. 435 (1976). the Court held that individuals do not retain an expectation of privacy in information voluntarily conveyed to telephone companies.76Smith, 442 U.S. at 745. The holdings in Smith and Miller established the “third-party doctrine”: when a sharer willingly provides information to a third party, the sharer loses his or her privacy expectation in that information.77See Smith, 442 U.S. at 743–44; Miller, 425 U.S. at 443–45 (holding individuals have no expectation of privacy in records voluntarily conveyed to banks).
Continued technological development defined many of the cases that the Court subsequently considered under the Fourth Amendment. Information that law enforcement acquires often can and does fall within the third-party doctrine, removing it from Fourth Amendment protection.78See United States v. Davis, 785 F.3d 498, 511 (11th Cir. 2015) (en banc) (“[L]ike the bank customer in Miller and the phone customer in Smith, Davis has no subjective or objective reasonable expectation of privacy in MetroPCS’s business records showing the cell tower locations that wirelessly connected his calls . . . .”); In re Application of the U.S. for Historical Cell Site Data, 724 F.3d 600, 610 (5th Cir. 2013) (“Where a third party collects information in the first instance for its own purposes, the Government . . . can obtain this information later . . . .”). Even so, some justices have expressed reservations about the extent to which the third-party doctrine should apply.79See Carpenter v. United States, 138 S. Ct. 2206, 2217–20 (2018) (declining to extend Smith and Miller to government use of cell phone location information); United States v. Jones, 565 U.S. 400, 417 (2012) (Sotomayor, J., concurring) (urging the Court to reconsider the applicability of the third-party doctrine to the digital age where “people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks” through text messages, e-mails, and websites visited). For example, in Carpenter v. United States,80138 S. Ct. 2206 (2018). the Court outlined factors leading to its conclusion that the warrantless access of cell-site location information was a Fourth Amendment search to which the third-party doctrine did not apply.81Id. at 2220. The Court reasoned that individuals have a reasonable expectation of privacy in location information, noting that the information is “a detailed chronicle of a person’s physical presence compiled every day, every moment, over several years.”82Id. at 2219–20.
Tracing the Court’s Fourth Amendment data-privacy jurisprudence provides a useful lens—apart from state-based privacy torts—through which to consider US data privacy law. And the advent of the Court’s reasonable expectation of privacy test offers a potential avenue for addressing the impact of technology on the reasonableness of privacy intrusions.83See, e.g., id. at 2220; Riley v. California, 573 U.S. 373, 386 (2014) (police warrantless access of data stored on a cell phone was unreasonable); Jones, 565 U.S. at 403–04 (police access of twenty-eight days’ worth of GPS data to track a suspect’s physical location without a valid warrant was unreasonable); Kyllo v. United States, 533 U.S. 27, 34–35 (2001) (police warrantless use of thermal-imaging device to examine suspect’s home was unreasonable). Even so, because the Fourth Amendment’s proscriptions run against the government alone, Congress is the appropriate actor to consider next in the evolution of US data privacy law.
C. As Against Industries: A Patchy Statutory System
In addition to state common law and Supreme Court Fourth Amendment jurisprudence, Congress has actively legislated within the data privacy field. To secure reasonable protections for citizens and regulated entities, congressional data privacy legislation rests on five Fair Information Practice Principles: (1) notice and awareness; (2) choice and consent; (3) access and participation; (4) integrity and security; and (5) enforcement and redress.84See Sec’y’s Advisory Comm. on Automated Pers. Data Sys., Dep’t of Health Educ. & Welfare, Records, Computers, and the Rights of Citizens, at xxi (1973). As privacy issues have emerged, Congress has embraced a sectoral approach, promulgating individual statutes or subsections of statutes to proactively address or respond to public concerns.85See Daniel J. Solove & Paul M. Schwartz, Information Privacy Law 37 (6th ed. 2018). For a helpful chart, see W. Gregory Voss & Kimberly A. Houser, Personal Data and the GDPR: Providing a Competitive Advantage for U.S. Companies, 56 Am. Bus. L.J. 287 app. (2019). For example, following the Court’s decision in Olmstead, Congress promulgated section 605 of the Federal Communications Act,86Pub. L. No. 73-416, § 605, 48 Stat. 1103 (1934) (codified at 47 U.S.C. § 605). accepting the Court’s invitation to legislate on electronic surveillance.87See Olmstead v. United States, 277 U.S. 438, 465–66 (1928) (“Congress may of course protect the secrecy of telephone messages by making them, when intercepted, inadmissible in evidence in federal criminal trials, by direct legislation, and thus depart from the common law of evidence.”).
Two statutes exemplify Congress’s methods of legislating and enforcing data privacy protections in particular industries. The first, the Health Insurance Portability and Accountability Act (“HIPAA”),88Pub. L. No. 104-191, 110 Stat. 1936 (1996) (codified at 42 U.S.C. § 201). was not specifically passed to address data privacy concerns. Instead, because of HIPAA’s relatively narrow subject matter, Congress later delegated the promulgation of a specific data privacy rule to a federal administrative agency. By contrast, the second, the Children’s Online Privacy Protection Act (“COPPA”),89Pub. L. No. 105-277, § 1301, 112 Stat. 2681–728 (1998) (codified at 15 U.S.C. §§ 6501–6506). was specifically passed to address ongoing data privacy concerns. Despite the statutes’ differing origins, Congress left enforcement of both to federal agencies.
1. Health Insurance Industry: HIPAA
HIPAA provides minimum requirements for health data privacy in the United States. When Congress initially enacted HIPAA, it sought to streamline the transferability of employees’ healthcare coverage rather than specifically protect data privacy.90Solove & Schwartz, supra note 85, at 509. To that end, HIPAA established uniform codes to facilitate the information exchanges.91Id. In doing so, Congress noted the potential for data privacy concerns to arise.92Id. Instead of directly addressing the issue, Congress passed the Act, giving itself three years to either address data privacy concerns or authorize the Department of Health and Human Services (“HHS”) to do so.93Id. at 509–10.
Years later, HHS promulgated data privacy protections for health information. The agency issued its final version of HIPAA privacy regulations—the Privacy Rule9445 C.F.R. §§ 160.101–160.552.— in December 2000, and the regulations became effective in 2003.95Solove & Schwartz, supra note 85, at 510. The Privacy Rule “establishe[d], for the first time, a set of national standards for the protection of certain health information.”96Dep’t of Health & Hum. Servs., Summary of the HIPAA Privacy Rule 1 (2003). It specifically covers personally identifiable health information.97See 45 C.F.R. §§ 160.102(a), 160.103. The Privacy Rule applies to covered entities—namely, health plans, health care clearinghouses, and health care providers.98Id. § 160.102(a). It protects oral and recorded information about a patient’s past, present, or future ailments.99See id. § 160.103. Covered entities, schools, public health authorities, insurers, and employers can create this information.100See id. The information must either identify the subject to whom it refers or create in others a reasonable basis for ascertaining the subject’s identity.101See id. And the relevant information retains its HIPAA protections as it passes from the hands of covered entities to other entities—covered or non-covered.102Id. § 160.105.
As enacted, the Privacy Rule assigned certain responsibilities to covered entities and regulated the use and disclosure of the protected information. For example, under the Privacy Rule, covered entities must adopt written privacy policies and provide accompanying training to employees.10345 C.F.R. § 164.530(b)(1), (i)(1). And the Privacy Rule prohibits covered entities from disclosing protected health data without authorization.104Id. § 164.508. The person to whom the records refer must authorize the disclosure in writing and indicate the recipients of the information.105Id. § 164.508(c)(ii)–(iii), (vi). The authorization must have an expiration date after which the recipient can no longer access the information.106Id. § 164.508(c)(v). Lastly, the Privacy Rule provides individuals with the opportunity to file a complaint alleging HIPAA violations with HHS.107See id. § 164.520(b)(1)(vi). Even so, these provisions leave a fair amount of enforcement and regulatory control to the states. Under HIPAA, states can craft data privacy laws regulating health information so long as those regulations are “more stringent” than those provided by the federal statute.108See id. § 160.203(b).
2. Online Advertising to Children: COPPA
A federal statute intended to protect data privacy, COPPA regulates the type of data online service providers can collect and retain from children.109See Julia Jacobson & Heather Egan Sussman, Protecting Children Online: New Compliance Obligations for Digital Marketing to Children, Bos. Bar J., Summer 2013, at 1, 1. Unlike with HIPAA, Congress enacted COPPA specifically to address concerns about the effect of increased internet access and internet advertising on children.110See FTC, Privacy Online: A Report to Congress 31–32 & fig.7 (1998) (finding that eighty-nine percent of 212 websites surveyed collected personal information—including, in order of frequency, e-mail address, name, postal address, age, gender, and phone number—from children). In the years before COPPA, advertisers used “advergames” to present branded and gamified internet communications to children.111Elizabeth S. Moore, Kaiser Family Found., It’s Child’s Play: Advergaming and the Online Marketing of Food to Children 1–2 (2006). In exchange for a child’s personal data, companies offered sweepstakes and free games.112See id. at 13–14; Melanie L. Hersh, Note, Is COPPA a Cop Out? The Child Online Privacy Protection Act as Proof That Parents, Not Government, Should Be Protecting Children’s Interests on the Internet, 28 Fordham Urb. L.J. 1831, 1852 (2001). Psychological data indicated that children of certain ages were less likely to perceive content presented to them as advertisements.113See FTC, supra note 110, at 4–5 (stating that websites’ collection of children’s personal information online “present[s] unique privacy and safety concerns because of the particular vulnerability of children, the immediacy and ease with which information can be collected from them, and the ability of the online medium to circumvent the traditional gatekeeping role of the parent”); see also Children’s Online Privacy Protection Rule, 16 C.F.R. § 312.7 (“An operator is prohibited from conditioning a child’s participation in a game, the offering of a prize, or another activity on the child’s disclosing more personal information than is reasonably necessary to participate in such activity.”); Brian L. Wilcox, Dale Kunkel, Joanne Cantor, Peter Dowrick, Susan Linn & Edward Palmer, Report of the APA Task Force on Advertising and Children 5 (2004) (indicating that children below the ages of four or five cannot consistently differentiate between television programming and advertisements); Deborah Roedder John, Consumer Socialization of Children: A Retrospective Look at Twenty-Five Years of Research, 26 J. Consumer Rsch. 183, 188, 204 tbl.2 (1999) (stating that while children, by the age of four, can distinguish between television programming and advertising, ninety percent of four-year-old children cannot distinguish between the intent behind television programming (entertainment) and advertising (persuasion)).
COPPA regulations are wide-reaching. The statute applies to “operator[s] of a website or online service directed to children, or any operator that has actual knowledge that it is collecting personal information from a child.”11415 U.S.C. § 6502(a)(1). COPPA also applies to third-party sites operating on websites and imposes strict liability on host websites for third-party violations.115Children’s Online Privacy Protection Rule, 78 Fed. Reg. 3972, 3975 (Jan. 17, 2013) (codified at 16 C.F.R. § 312.2). Personal information restricted under COPPA includes street, city, and state location data; email addresses; screen names; and “persistent identifier[s] that can be used to recognize a user over time and across different internet sites or online services.”11616 C.F.R. § 312.2. With respect to prohibited methods of data collection, COPPA defines “collection” broadly, stating that it can occur “by any means”; the statute provides two non-exhaustive examples: first, requesting online information, and second, providing a form that would make identifiable information publicly available.117Id. Websites subject to COPPA must provide privacy policies indicating the type of information the website collects and the use and disclosure of such information.11815 U.S.C. § 6502(b)(1)(A)(i).
COPPA’s broad language has hindered enforcement and generated confusion among regulated parties. COPPA explicitly indicates that it preempts state law.119Id. § 6502(d). And while individuals are not provided a private cause of action for violations, states can sue for injunctive relief on behalf of their citizens.120See id. § 6504(a)(1). Because the statute defines violations as unfair or deceptive practices, the FTC is charged with enforcement.121See id. § 6502(c). But the FTC has issued only minimal guidance on how to determine whether content is “directed to children.”122Jacobson & Sussman, supra note 109, at 2. One of the FTC’s recent COPPA enforcement actions, brought against Google and YouTube (both owned by Alphabet, Inc.), resulted in a $170 million settlement—the largest since COPPA’s enactment.123Press Release, FTC, Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children’s Privacy Law (Sept. 4, 2019), https://perma.cc/BBP6-TQFY. The complaint, filed by the FTC and the New York Attorney General, alleged that YouTube gathered identifiable information from children under thirteen years old without receiving parental consent.124Complaint ¶¶ 49, 50(c), FTC v. Google LLC, No. 19-cv-2642 (D.D.C. Sept. 4, 2019). The complaint alleged that some YouTube channels directly targeted children and that the company knew it would reach children.125Id. ¶¶ 23, 28, 29 (alleging that YouTube promoted its advertising service to companies as “today’s leader in reaching children age 6-11 against top TV channels,” toy company Mattel used YouTube channels for well-known brands like Barbie and Hot Wheels, and channels’ “About” sections stated they were geared to children). The complaint further alleged that YouTube generated nearly $50 million in advertising on these child-targeted channels.126Id. ¶ 41.
Under the terms of the settlement between YouTube, Google, and the FTC, the companies agreed to “create a mechanism” for publishers to indicate when their videos are “directed to children.”127Kristin Cohen, YouTube Channel Owners: Is Your Content Directed to Children?, FTC (Nov. 22, 2019, 12:56 PM), https://perma.cc/F5GH-MW4C; see generally Stipulated Order for Permanent Injunction and Civil Penalty Judgment, FTC v. Google LLC, No. 19-cv-2642 (D.D.C. Apr. 25, 2019). Even after the settlement, widespread confusion remained about how to determine whether content is “directed to children.”128See, e.g., Allison Schiff, The FTC Gets an Earful on COPPA from YouTube Content Creators, AdExchanger (Dec. 13, 2019, 5:04 PM), https://perma.cc/Z37Q-L52H (summarizing scathing public comments the FTC received following the agency’s YouTube COPPA enforcement action).
While these two statutes represent two different approaches to data privacy regulation—indirectly through subsequently enacted rules (HIPAA) and directly (COPPA)—both responded to relatively uniform demands for privacy regulation. With HIPAA, members of Congress recognized that health insurance legislation could jeopardize patients’ privacy. Similarly, with COPPA, members of Congress responded to marketers’ practices, psychologists’ research, and parents’ concerns, all of which encouraged data privacy protections for children. The parties to whom both regulations apply (healthcare providers and online service providers), the beneficiaries of the regulations (patients and children), and the data regulated (health information and information about children) were fairly finite. Both data privacy regulations indicated broad agreement among members of Congress—and US citizens by proxy—that the country needed increased privacy protections in the two fields. By contrast, such broad agreement has yet to materialize within Congress and among US citizens about consumer data privacy.129See Susanne Barth & Menno D.T. de Jong, The Privacy Paradox – Investigating Discrepancies between Expressed Privacy Concerns and Actual Online Behavior – A Systematic Literature Review, 34 Telematics & Informatics 1038, 1039 (2017) (defining “the privacy paradox” as when “users claim to be very concerned about their privacy but do very little to protect their personal data”).
II. Global Climate Precipitating Regulatory Changes
An understanding of the historical foundations of privacy law, and of the bodies that promulgate and enforce it, provides necessary context for the increasingly technologized global landscape that now demands some level of responsive data privacy regulation within the United States. This Part addresses the expansion of internet usage and surveys the efforts of technology companies to develop data-generating, free-to-consumer programs that run on this architecture. While these programs have fostered economic growth, they have also contributed to concerns about the use and transfer of the data collected. These concerns have culminated in the passage of the GDPR in the European Union and various state-based privacy laws in the United States.
A. The Internet’s Ubiquity and Economy of Information Sharing
Since the inception of the internet,130Internet, Cambridge Dictionary, https://perma.cc/SNF5-F2XH (“[T]he large system of connected computers around the world that allows people to share information and communicate with each other . . . .”). the world has evolved into a digital ecosystem of interconnectivity. The internet is pervasive: whereas in 2000, fifty-two percent of adults in the United States used the internet, ninety percent do so today.131Internet/Broadband Fact Sheet, Pew Rsch. Ctr. (June 12, 2019), https://perma.cc/RT5P-G76P. Smartphones capable of accessing the internet have become today’s essential tool; as of February 2019, eighty-one percent of US adults use them.132Mobile Fact Sheet, Pew Rsch. Ctr. (June 12, 2019), https://perma.cc/VQT4-J2F6. Indeed, a recent study reveals that eighty-one percent of adults in the United States also go online at least once per day, with twenty-eight percent indicating that they are online “almost constantly.”133Andrew Perrin & Madhu Kumar, About Three-in-Ten U.S. Adults Say They Are ‘Almost Constantly’ Online, Pew Rsch. Ctr. (July 25, 2019), https://perma.cc/2QTW-ZGS9.
Widespread use of the internet has led to the emergence of the digital economy, “the digital-enabling infrastructure needed for an interconnected computer network.”134Initial Estimates Show Digital Economy Accounted for 6.5 Percent of GDP in 2016, Bureau of Econ. Analysis (Mar. 15, 2018), https://perma.cc/GK5B-UPGW. A recent United Nations report found that the US economy accounted for sixty-eight percent of the market capitalization value of the seventy largest digital platforms.135See U.N. Conf. on Trade & Dev., Digital Economy Report, at xvi, 2 (2019) (listing the major players in the digital economy space as including Microsoft, Apple, Amazon, Google, and Facebook). This report estimated 2017 e-commerce sales in the United States to be $8.9 trillion, forty-six percent of the nation’s 2017 GDP.136Press Release, U.N. Conf. on Trade & Dev., Global e-Commerce Sales Surged to $29 Trillion, U.N. Press Release UNCTAD/PRESS/PR/2019/2007 (Mar. 29, 2019). And the digital economy has facilitated a borderless ecosystem of goods and services with firms operating across the nation and world.137See U.N. Conf. on Trade & Dev., supra note 135, at xvi (“The economic geography of the digital economy does not display a traditional North-South divide.”).
Technology companies and consumers alike have benefitted from the internet’s ubiquity. Facebook and Google in particular have profited from increased internet use by offering free-to-consumer platforms and social media sites.138Gigi B. Sohn, A Policy Framework for an Open Internet Ecosystem, 2 Geo. L. Tech. Rev. 335, 340 (2018). As Gigi Sohn, a Georgetown fellow, states, “the user is paying [Facebook and Google] with his or her personal data and receiving ‘free’ online services in return.”139Id. While an estimated seventy-two percent of adults have an account on at least one social media platform,140Social Media Fact Sheet, Pew Rsch. Ctr. (June 12, 2019), https://perma.cc/H7F2-FZR4. the two most popular sites by percentage of adult users are Google’s YouTube (seventy-three percent) and Facebook (sixty-nine percent).141Id. Of those who use Facebook, seventy-four percent report that they check the website at least once per day.142John Gramlich, 10 Facts about Americans and Facebook, Pew Rsch. Ctr. (May 16, 2019), https://perma.cc/3HEK-YG7B.
Smartphone access to social media sites is facilitated in part by user-focused application programming interfaces (“APIs”).143Derrick Oien & Michael Umansky, Smartphone Wars: Data Ownership, Access and Storage in the New Era, 4 J. Legal Tech. Risk Mgmt. 34, 35 (2009). APIs integrate third-party sites within host applications (“apps”) to create seamless user experiences.144See id. at 36. On the front end, APIs enable user access to multiple sites.145Id. at 35. For example, the integration of Amazon’s free Product Advertising API on the Weather Channel app enables users of the app to easily access Amazon sites for products. On the back end, APIs facilitate data sharing.146Id. Amazon gains a wider pool of customers and data about those customers. The Weather Channel gains an additional source of revenue from users who buy Amazon products and limited access to Amazon data.147See Amazon Web Servs., Product Advertising API: Developer Guide 21–22 (2014), https://perma.cc/VR2E-PRSM; see also Jack Morse, Facebook Isn’t the Only One with Too Much of Your Data. Just Ask Google and Amazon, Mashable (Apr. 14, 2018), https://perma.cc/UYV8-T9SA. But see Louise Matsakis, Amazon Cracks Down on Third-Party Apps over Privacy Violations, Wired (Sept. 18, 2019, 1:02 PM), https://perma.cc/6Y66-ZZZX (discussing Amazon’s recent focus on limiting third parties’ access to its API due to privacy concerns arising from misuse of customers’ personal information for, among other purposes, targeted advertising).
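The two-sided exchange described above can be sketched in code. The following is a minimal, purely illustrative sketch: the endpoint, header name, and field names are hypothetical inventions for exposition, not Amazon’s actual Product Advertising API or the Weather Channel app’s integration.

```python
# Hypothetical sketch of an API integration: the front end fetches product
# listings for the user, while the back end quietly shares a persistent
# user identifier with the third party. All names here are illustrative.

def build_product_request(user_id: str, query: str) -> dict:
    """Front end: the host app asks the third party for product data."""
    return {
        "endpoint": "https://api.example-retailer.test/products",  # fictitious
        "params": {"q": query, "partner": "weather-app"},
        # Back end: a persistent identifier rides along with the request,
        # letting the retailer link this lookup to a user profile over time.
        "headers": {"X-Partner-User": user_id},
    }

def render_titles(response: dict) -> list:
    """The host app displays listings; the retailer retains the data trail."""
    return [item["title"] for item in response.get("items", [])]
```

The design point is that the same request that creates a seamless user experience also carries the identifier that makes cross-service data sharing possible.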
Data is a valuable asset crucial to a modern technology company’s financial success. In a May 2014 report, the FTC stated that “Big Data is big business.”148FTC, Data Brokers: A Call for Transparency and Accountability, at i (2014), https://perma.cc/57AY-LUBX; see Stacy-Ann Elvy, Commodifying Consumer Data in the Era of the Internet of Things, 59 B.C. L. Rev. 423, 426 (2018); Anne Logsdon Smith, Alexa, Who Owns My Pillow Talk? Contracting, Collateralizing, and Monetizing Consumer Privacy through Voice-Captured Personal Data, 27 Cath. U. J.L. & Tech. 187, 188 (2018); The World’s Most Valuable Resource Is No Longer Oil, but Data, The Economist (May 6, 2017), https://perma.cc/3QCP-6PGD (arguing that antitrust authorities should “take into account the extent of firms’ data assets when assessing the impact of deals”); see also Data is Giving Rise to a New Economy, The Economist (May 6, 2017), https://perma.cc/ZXU3-D3KN (stating that Uber’s then-$68 billion valuation was due in part to the company’s ownership of consumer data related to transportation). Accordingly, many large companies have business ventures dedicated exclusively to data brokering—the business of “selling information about companies, markets, [and consumers].”149See Data Broker, Cambridge Bus. Eng. Dictionary, https://perma.cc/8JVB-TW8M; see also Michal Wlosik, What Is a Data Broker and How Does It Work?, Clearcode (Feb. 4, 2019), https://perma.cc/SLM5-DGHR (stating that some data brokers “have up to 1,500 pieces of information about a person”). Acxiom, a data broker, claims to have “multi-sourced insight into approximately 700 million consumers worldwide.”150Acxiom, Annual Report 12 (2018), https://perma.cc/EW9F-P9W8. Information can be public (from, for example, social media sites or government programs) or private (from, for example, APIs within apps).151Max Eddy, How Companies Turn Your Data into Money, PCMag (Oct. 10, 2018), https://perma.cc/99HQ-AGHS.
Companies like Acxiom splice and combine data into more than 1,500 demographic, socio-economic, and lifestyle segments.152Acxiom, supra note 150, at 11. These segments are sold to marketers as audiences to which advertisements can be targeted.153Id.
B. Privacy Threats from Online Practices
While some remain confused about how companies profit from free-to-consumer programs,160See Transcript of Mark Zuckerberg’s Senate Hearing, supra note 2. Facebook, Amazon, and Alphabet primarily generate revenue through information collected and funneled into their advertising businesses.161See James L. Heskett, What’s the Antidote to Surveillance Capitalism?, Harv. Bus. Sch. Working Knowledge (Mar. 4, 2019), https://perma.cc/G8V5-7ML8 (defining surveillance capitalism as “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales”); Sylvain, supra note 11, at 1090–91. Alphabet has its own digital marketing platform—Enterprise Advertising and Analytics Solutions (previously known as DoubleClick Marketing)—which serves data-layered advertising placements that target segments of internet users.162See Todd Spangler, Google Killing Off DoubleClick, AdWords Names in Rebranding of Ad Products, Variety (June 27, 2018, 9:06 AM), https://perma.cc/P8UR-CDNK. Alphabet’s Enterprise Advertising and Analytics Solutions and similar tools like Amazon’s Marketing Services and Facebook’s Facebook Analytics are built on algorithms that compile users’ traits and behavior for marketing purposes.163See Sohn, supra note 138, at 339; Heskett, supra note 161. At a minimum, these companies collect IP addresses, cookies, and geolocation information.164See Sohn, supra note 138, at 351. For Facebook specifically, users voluntarily share additional information like pictures and brand preferences.165Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. Rev. 1701, 1725 (2010). Google and Amazon, each with extensive digital footprints, can combine data points across services and devices to create a persistent identity graph for each user.166See Chitra Iyer, What Is an Identity Graph? Definition, Why You Need It, and Examples, Toolbox (May 28, 2020), https://perma.cc/9GTC-F9RH (defining “identity graph” as “a database that creates linkages between all identifiers that are associated with an individual customer”); see also Alexander Tsesis, Marketplace of Ideas, Privacy, and the Digital Audience, 94 Notre Dame L. Rev. 1585, 1589–90, 1608 (2019); id. at 1602 (“Since the dawn of the internet, the default of U.S. internet companies has been to assume that they can make business use of all data acquired . . . .”); Chuck Moxley, It’s 8 AM. Do You Know Where Your Customers Are?, Toolbox (Apr. 11, 2018), https://perma.cc/L9KY-4P4J (reviewing the data points Amazon has on its consumers through consumers’ shopping behavior on the Amazon website and app, questions asked to Alexa, and reading, listening, and watching preferences on Kindle, Amazon Music, and Fire Stick, respectively). These persistent data sets allow the companies to identify a user across platforms, understand that user’s preferences, and craft personalized offers.167See Stacy-Ann Elvy, Paying for Privacy and the Personal Data Economy, 117 Colum. L. Rev. 1369, 1379 (2017) (“Companies can pervasively monitor consumers’ activities and behaviors and create a ‘device graph’ . . . [and] combine a consumer’s device graph with offline information for data analytics and targeted advertising purposes.”); Moxley, supra note 166.
In 2018, an explosive news story revealed that Facebook allowed researchers from Cambridge Analytica, a political consulting firm, to collect and study fifty million user profiles without the users’ knowledge or permission.168Cadwalladr & Graham-Harrison, supra note 5. Cambridge Analytica then sold this information to political candidates who used the data for campaigning.169Id. The public outcry against Facebook was swift and severe. As the news spread, seventy-four percent of surveyed adults in the United States said that they did not know the extent to which Facebook specifically used their interests and information to sell to advertisers.170Gramlich, supra note 142. More generally, fifty-one percent of that same group stated that they were not comfortable with Facebook using this information at all.171Id. As a whole, sixty-two percent believed that it is not possible to go through their days without companies collecting data about them.172John Gramlich, 19 Striking Findings from 2019, Pew Rsch. Ctr. (Dec. 13, 2019), https://perma.cc/3AVW-S8UQ; see Elvy, supra note 167, at 1380 (collecting recent statistics on consumers’ “self-help measures to protect their privacy” and concluding that “consumers have some level of awareness about the extent to which their online activities are being monitored”). Concerns about corporations’ data practices and a lack of regulatory accountability invigorated calls for comprehensive data privacy legislation.
III. Two Global Responses to the Current Climate
Internet-enabled technology has transformed the global economy, and threats to personal consumer privacy have emerged as collateral damage. Yet while technology continues to develop, corresponding regulation has not kept pace.173See Margaret Ryznar, Protecting Users of Social Media, 94 Notre Dame L. Rev. Online 141, 142 (2019); see also Stuart L. Pardau, The California Consumer Privacy Act: Towards a European-Style Privacy Regime in the United States?, 23 J. Tech. L. & Pol’y 68, 72 (2018). This Part surveys and contrasts two approaches to data privacy regulation. The first approach, that of the European Union, provides overarching data privacy protection through omnibus legislation. The second approach, that of the United States, provides sectoral data privacy protection through two separate sources of authority: the FTC and the states.
A. Privacy as an Intrinsic Right: European Omnibus Legislation
The European response to consumer data privacy concerns reflects the value Europeans place on personal privacy. European human rights law, from the 1950 European Convention on Human Rights to the EU Charter of Fundamental Rights, resolved to protect citizens’ private lives and data against both the government and corporations.174Charter of Fundamental Rights of the European Union, art. 8(1), 2000 O.J. (C 364) 1, 10 (“Everyone has the right to the protection of personal data concerning him or her.”); Case C-144/04, Mangold v. Helm, 2005 E.C.R. I-10013, I-10019 (indicating that EU citizens’ rights against the government apply equally against the private sector). Such protection is grounded in the European conception of privacy as a fundamental human right.175See Charter of Fundamental Rights of the European Union, art. 8. To Europeans, “privacy” entails an individual’s image, reputation, and name.176James Q. Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty, 113 Yale L.J. 1151, 1167 (2004). As Professor James Whitman states, Europeans view privacy “as a value primarily threatened by two forces: the excesses of the free press and the excesses of the free market.”177Id. at 1171. In this sense, then, the European concept of privacy cuts against two key guarantees deeply entrenched in US society: free speech and laissez-faire economics.178Id.
The GDPR reflects this fundamental nature of European privacy rights. Effective May 25, 2018, the GDPR gave EU citizens explicit rights to their personal data, as well as a means to ensure its protection.179See Paul M. Schwartz & Karl-Nikolaus Peifer, Transatlantic Data Privacy Law, 106 Geo. L.J. 115, 118 (2017); see also Chris Jay Hoofnagle, Bart van der Sloot & Frederick Zuiderveen Borgesius, The European Union General Data Protection Regulation: What It Is and What It Means, 28 Info. & Commc’ns Tech. L. 65, 93 (2019). The legislation is based on six “[p]rinciples relating to processing of personal data”: (1) fairness and transparency; (2) purpose limitation; (3) data minimization; (4) accuracy; (5) storage limitation; and (6) confidentiality.180General Data Protection Regulation, supra note 16, art. 5; see also The History of the General Data Protection Regulation, European Data Prot. Supervisor, https://perma.cc/D7M3-GN3M. Among the rights granted to EU citizens under the GDPR are the right to object to a company’s use of personal information, the right to know how the data is used, the right to request the information used, and the right to erasure (commonly known as the right to be forgotten).181Roslyn Layton & Julian Mclendon, The GDPR: What It Really Does and How the U.S. Can Chart a Better Course, 19 Federalist Soc’y Rev. 234, 234 (2018).
The GDPR restricts the activities of data controllers and data processors. Controllers are defined as natural or legal persons that “determine the purposes and means of the processing of personal data,” while processors are defined as bodies that “process personal data on behalf of” controllers.182General Data Protection Regulation, supra note 16, art. 4(7), (8); Lukas Feiler, Nikolaus Forgó & Michaela Weigl, The EU General Data Protection Regulation (GDPR): A Commentary 14 (2018). Controllers need not be based in the European Union to be subject to the regulation.183General Data Protection Regulation, supra note 16, art. 3(2). Indeed, if a company offers goods or services in Europe or uses EU citizens’ data for any purpose, that company falls within the GDPR’s purview.184Id. The GDPR covers all data that could objectively identify the individual.185Id. art. 4(1); Feiler et al., supra note 182, at 15. And to collect information, corporations need to demonstrate a legitimate interest—one that is not merely marginal.186General Data Protection Regulation, supra note 16, art. 6; Feiler et al., supra note 182, at 84. Violations of the GDPR can result in fines of up to €20 million or four percent of the offending company’s global annual revenue, whichever is greater.187General Data Protection Regulation, supra note 16, art. 83(5).
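To make the fine ceiling concrete, the two-part cap under article 83(5) can be expressed as a simple calculation. The function below is an illustrative sketch of that formula; the euro revenue figures in the comment are hypothetical examples, not drawn from any actual enforcement action.

```python
# Illustrative sketch of the GDPR art. 83(5) fine ceiling: the greater of
# EUR 20 million or four percent of worldwide annual revenue (turnover).

def gdpr_max_fine(global_annual_revenue_eur: float) -> float:
    """Return the maximum administrative fine for the most serious violations."""
    return max(20_000_000.0, 0.04 * global_annual_revenue_eur)

# A hypothetical firm with EUR 10 billion in global revenue faces a cap of
# EUR 400 million; for a firm with EUR 100 million in revenue, the flat
# EUR 20 million floor controls instead.
```

This structure explains why the fine exposure scales with firm size: for any company with more than €500 million in annual revenue, the four-percent prong, not the €20 million figure, sets the ceiling.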
To avoid GDPR violations and fines, many US-based companies have geo-targeted their content so that it is not viewable in the European Union.188See Amy Kristin Sanders, The GDPR One Year Later: Protecting Privacy or Preventing Access to Information?, 93 Tul. L. Rev. 1229, 1239–40 (2019); see also Alexander Tsesis, Data Subjects’ Privacy Rights: Regulation of Personal Data Retention and Erasure, 90 U. Colo. L. Rev. 593, 595 (2019). As companies choose to shut down operations rather than face potential compliance costs, there has been a marked decrease in the availability of news and entertainment services in the European Union.189Layton & Mclendon, supra note 181, at 244–45. As of March 2019, 1,129 US-based websites were unavailable in the European Union simply because their operators had not complied with GDPR regulations.190Websites Not Available in the European Union after GDPR, VerifiedJoseph (Mar. 20, 2019, 7:53 PM), https://perma.cc/U4GX-MAYU (noting also that many unavailable websites are local US-based newspapers). And the US-based companies that continued to operate within the European Union spent more than $7 billion to comply with the new regulations.191Jennifer Huddleston, An Analysis of Recent Federal Data Privacy Legislation Proposals, Mercatus Ctr. (2019), https://perma.cc/68EL-98MR; Oliver Smith, The GDPR Racket: Who’s Making Money from This $9bn Business Shakedown, Forbes (May 2, 2018, 2:30 AM), https://perma.cc/D75H-M44E. On the eve of the GDPR’s enactment, sixty-eight percent of 200 US-based companies surveyed budgeted between $1 million and $10 million for compliance costs.192Press Release, PwC, GDPR Compliance Top Data Protection Priority for 92% of US Organizations in 2017, According to PwC Survey (Jan. 23, 2017), https://perma.cc/8NC7-EUN7.
Significant economic effects have resulted from the GDPR. Within Europe, there has been increased consolidation among technology companies.193For a partial list of small- and medium-sized corporations that have entirely shut down or stopped operating in the European Union across blockchain, marketing, and video game industries, see Alec Stapp, GDPR after One Year: Costs and Unintended Consequences, Truth on the Market (May 24, 2019), https://perma.cc/C8RQ-V8R2. See also Andrea O’Sullivan, The EU’s New Privacy Rules Are Already Causing International Headaches, Reason (June 12, 2018, 12:01 AM), https://perma.cc/8VAF-VJUA. Many have attributed this consolidation to the hefty fines facing violators.194O’Sullivan, supra note 193. The advertising industry provides one pertinent illustration of market consolidation at work. Before the GDPR, Google took in fifty percent of EU advertising expenditures.195Id. The day after the GDPR went into effect, Google took in ninety-five percent of EU advertising expenditures.196Id. Moreover, in the four months following the GDPR, the number of advertising trackers placed on the top 2000 websites visited by Europeans decreased by four percent.197Natasha Lomas, GDPR Has Cut Ad Trackers in Europe but Helped Google, Study Suggests, TechCrunch (Oct. 10, 2018, 2:00 AM), https://perma.cc/NU7N-SCB8. A closer look at the players in the advertising-tracking landscape reveals that the number of trackers placed by smaller companies decreased by between eighteen and thirty-one percent.198Björn Greif, Study: Google Is the Biggest Beneficiary of the GDPR, Cliqz (Oct. 10, 2018), https://perma.cc/9NNX-39NM. Within the same period, Facebook’s trackers decreased by only seven percent while Google’s trackers increased by nearly one percent.199Id. Omnibus legislation has had a disproportionate effect on smaller corporations to the benefit of corporations able to establish compliance infrastructure.
Similarly, a recent study found that in the GDPR’s first year, external funding of European technology companies decreased by $3.38 million per week.200Jian Jia, Ginger Jin & Liad Wagman, The Short-Run Effects of GDPR on Technology Venture Investment, VOX CEPR Pol’y Portal (Jan. 7, 2019), https://perma.cc/EMC8-5VNU. Decreases appeared both in the number of deals made (seventeen percent) and in the level of investment within those deals (thirty-nine percent).201Id. Among new technology companies (defined as those in the market for zero to three years), the effect was more pronounced—the number of deals decreased nineteen percent.202Id. Researchers extrapolate the decrease in both deals and investment in new technology companies to a four to eleven percent reduction in job creation.203See id. Jia, Jin, and Wagman do caution that these short-term effects may not persist and may be a product of investors’ uncertainty about the GDPR’s effect on new companies. While the GDPR raised global consciousness about the degree to which consumer data can be protected, it has also served as a regulatory case study as policymakers across the Atlantic consider and debate the relative merits of regulating technology corporations.
B. Privacy as a Tradeoff: US Statutory and State-Based Regulation
In stark contrast to the European foundations of and approach to data privacy, US citizens do not have a federally recognized right to privacy. What legally cognizable privacy interest US citizens do have is largely limited in scope by the Supreme Court’s interpretation of Fourth Amendment protections.204See, e.g., Riley v. California, 573 U.S. 373, 403 (2014). And while some states have recognized an individual’s right to privacy,205See Prosser, supra note 48, at 401–02. that recognition is far from uniform. Relatedly, Congress’s reliance on a sectoral approach to privacy has not resulted in federal data privacy regulation,206See generally Consumer Online Privacy Rights Act, S. 2968, 116th Cong. (2019); Data Care Act of 2018, S. 3744, 115th Cong.; Information Transparency & Personal Data Control Act, H.R. 6864, 115th Cong. (2018); Social Media Privacy Protection and Consumer Rights Act of 2018, S. 2728, 115th Cong.; Balancing the Rights of Web Surfers Equally and Responsibility Act of 2017, H.R. 2520, 115th Cong.; Data Security Act of 2015, S. 961, 114th Cong.; Personal Data Privacy and Security Act of 2014, H.R. 3990, 113th Cong.; see also Cameron F. Kerry, Breaking Down Proposals for Privacy Legislation: How Do They Regulate?, Brookings (Mar. 8, 2019), https://perma.cc/XC9T-FUJZ; Opinion, Why Is America So Far Behind Europe on Digital Privacy?, N.Y. Times (June 8, 2019), https://perma.cc/AF2K-6TTR. despite some legislators maintaining that such a regulation is a priority, even in the midst of a global pandemic.207See Dwight Weingarten, House Members Renew Calls for Privacy Bill after Pandemic Pause, MeriTalk (July 13, 2020, 9:19 AM), https://perma.cc/9YCK-V3XL. Instead, the US approach to corporate use of consumer data is fractured between ex post FTC adjudications and ex ante state regulation.
1. Ex Post: Adjudication under Section 5 of the FTC Act
The FTC has jurisdiction over violations of privacy policies and subsequent consent decrees under its mandate to prevent unfair or deceptive trade practices.208See Federal Trade Commission Act, 15 U.S.C. § 45; FTC v. Wyndham Worldwide Corp., 799 F.3d 236, 245 n.4, 255–57 (3d Cir. 2015) (interpreting section 5 of the FTC Act as giving the FTC the authority to challenge a corporation’s data security procedures). Indeed, the FTC entered into privacy settlements as early as 1951.209Chris Jay Hoofnagle, Woodrow Hartzog & Daniel J. Solove, The FTC Can Rise to the Privacy Challenge, but Not Without Help from Congress, Brookings (Aug. 8, 2019), https://perma.cc/3NZQ-LKXG. Yet the FTC has conservatively interpreted its statutory authority and has opted to adjudicate individual cases rather than institute rules that prospectively regulate consumer data privacy.210See Christine S. Wilson, Comm’r, FTC, Privacy and Public/Private Partnerships in a Pandemic 18–19 (May 7, 2020), https://perma.cc/Y6VA-P936; Lindsey Barrett, Confiding in Con Men: U.S. Privacy Law, the GDPR, and Information Fiduciaries, 42 Seattle U. L. Rev. 1057, 1074–75 (2019); Hayes Hagan, Note, How to Protect Consumer Data? Leave It to the Consumer Protection Agency: FTC Rulemaking as a Path to Federal Cybersecurity Regulation, 2019 Colum. Bus. L. Rev. 735, 751; Jack Karsten, How Should the US Legislate Data Privacy?, Brookings (July 30, 2018), https://perma.cc/3EVM-T3MW. Section 5 of the FTC Act21115 U.S.C. § 45. requires a finding of a “clear theory of substantial likelihood of harm to consumers that is not outweighed by any countervailing benefits.”212Terrell McSweeny, Psychographics, Predictive Analytics, Artificial Intelligence, & Bots: Is the FTC Keeping Pace?, 2 Geo. L. Tech. Rev. 514, 522 (2018). And given the level of ambiguity inherent in many corporate privacy policies, the FTC retains “a fairly narrow sliver” for enforcement.213Barrett, supra note 210, at 1075.
2. Ex Ante: Differing State Legislation
Following the GDPR and building upon FTC adjudication, states have begun crafting their own consumer data privacy legislation. These bills, three of which are discussed below, subject businesses to contradictory regulations and may stifle competition—potentially mirroring the effect of the GDPR on the EU economy.220Contra Bilyana Petkova, The Safeguards of Privacy Federalism, 20 Lewis & Clark L. Rev. 595, 606 (2016) (maintaining that state experimentation in privacy laws stands to benefit US consumer data privacy interests).
a. California Consumer Privacy Act
Facially similar to the GDPR, the California Consumer Privacy Act221Cal. Civ. Code § 1798.100 (West 2018). (“CCPA”) was signed into law on June 28, 2018.222Id.; Pardau, supra note 173, at 72. The CCPA became effective January 1, 2020; it provides residents of California with a number of rights, including the right to: (1) know what personal data is collected and how it is shared; (2) “opt out” of a business’s data sales; and (3) force a business to delete data.223Cal. Civ. Code §§ 1798.100, 1798.105, 1798.110, 1798.120; see id. § 1798.140(g) (defining “consumer” as “a natural person who is a California resident”). Importantly, the CCPA provides Californians a private cause of action for data breaches from a “business’s violation of the duty to implement and maintain reasonable security procedures,” although not for violations of the Act’s enumerated rights.224Id. § 1798.150(a)(1). Damages under the private cause of action can be severe: the CCPA provides the greater of statutory damages between $100 and $750 per consumer, per incident, or actual damages.225Id. § 1798.150(a)(1)(A).
The CCPA’s broad definition of the companies to which its regulations apply extends the law’s reach beyond California’s borders. Businesses are potentially liable under the CCPA if they: (1) have annual revenues of more than $25 million; (2) annually “buy,” “receive,” “sell,” or “share” personal information from 50,000 or more California residents, households, or devices; or (3) obtain fifty percent or more of their annual revenue from selling California residents’ personal information.226See id. § 1798.140(c)(1)(A)–(C); Pardau, supra note 173, at 92. Unlike the GDPR, the CCPA does not require a covered business to provide a basis for its data collection or processing practices.227Compare Cal. Civ. Code § 1798.100(a)(1) (“A business shall not collect additional categories of personal information or use personal information collected for additional purposes that are incompatible with the disclosed purpose for which the personal information was collected . . . .”), and id. § 1798.100(c) (“A business’ collection, use, retention, and sharing of a consumer’s personal information shall be reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed . . . .”), with General Data Protection Regulation, supra note 16, art. 6(3) (“The basis for the processing . . . shall be laid down by (a) Union law; or (b) Member State law to which the controller is subject.”). Under the CCPA, the California Attorney General may bring a civil action against a covered business and recover up to $2,500 per violation or $7,500 per intentional violation.228Cal. Civ. Code § 1798.155(b) (detailing the enforcement scheme and a thirty-day cure period).
Critics of the CCPA argue that legislators did not realize the broad effect that the law, passed just one week after it was drafted,229See Marc Vartabedian, California Passes Sweeping Data-Privacy Bill, Wall St. J. (June 28, 2018, 9:36 PM), https://perma.cc/6TLQ-4LLQ. would have on businesses.230See Eric Goldman, An Introduction to the California Consumer Privacy Act (CCPA) 5 (July 1, 2020) (unpublished manuscript), https://perma.cc/72RV-2NQV (“[T]he law potentially applies to many legitimate activities and data transfers that are not straight cash-for-data.”). While the three-pronged definition of a “business” appears to limit the number of covered businesses, a mere 127 daily hits on a website could subject the owner of that website to liability.231Alan McQuinn & Daniel Castro, The Costs of an Unnecessarily Stringent Federal Data Privacy Law, Info. Tech. & Innovation Found. (Aug. 5, 2019), https://perma.cc/8TGM-6726. Local news sources—with far less scale than Google and Facebook—would fall into this definition, requiring expensive compliance infrastructures that could drive them out of the market.232See Goldman, supra note 230 (manuscript at 2); see also Rita Heimes & Sam Pfeifle, New California Privacy Law to Affect More Than Half a Million US Companies, Int’l Ass’n of Priv. Pros. (July 2, 2018), https://perma.cc/JKT3-WNQY (noting that the majority of businesses affected by the CCPA will be small- and medium-sized businesses).
In the days before January 1, 2020, companies scrambled to ensure their compliance with the new law,233See Companies Should Take California’s New Data-Privacy Law Seriously, The Economist (Dec. 18, 2019), https://perma.cc/7BD2-KYW7 (detailing that half a million US-based businesses spent $55 billion on CCPA compliance costs). even though the California Attorney General could not bring enforcement actions until July 1, 2020.234See Cal. Civ. Code § 1798.185(c) (West 2018); John Stephens, California Consumer Privacy Act, Am. Bar Ass’n (Feb. 14, 2019), https://perma.cc/34KZ-REVM. A survey of 250 businesses reported that seventy-one percent budgeted hundreds of thousands of dollars on compliance costs while nineteen percent budgeted at least $1 million.235Roslyn Layton, The Costs of California’s Online Privacy Rules Far Exceed the Benefits, Am. Enter. Inst. (Mar. 22, 2019), https://perma.cc/W88L-38RF. Since then, the California Attorney General has created a “Frequently Asked Questions” page that is continually updated to reflect the ongoing proposed regulations under the law.236See generally California Consumer Privacy Act (CCPA), Off. of the Att’y Gen. (2021), https://perma.cc/PY82-SH64. Recent regulations include, for example, specific requirements for businesses selling personal information of consumers younger than thirteen years old, Cal. Code Regs. tit. 11, § 999.330 (2020), and verification requirements when a consumer requests that a company delete the consumer’s personal information. Id. § 999.323.
b. Washington Privacy Act
In 2021, Washington state senators proposed a bill entitled the Washington Privacy Act (“WPA”).241S.B. 5062, 67th Leg., Reg. Sess. (Wash. 2021). The state’s third attempt at a data privacy law, the WPA provides Washington residents with many of the same rights established under the CCPA.242See id. § 101(7) (defining “consumer” as “a natural person who is a Washington resident acting only in an individual or household context”); see also Rick Morgan, State Lawmakers Take Another Crack at Protecting Consumer Data Privacy, Puget Sound Bus. J. (Jan. 20, 2021, 6:16 PM), https://perma.cc/2HWK-D2NT. Under the bill, Washington residents are provided the right to: (1) access categories of information processed about them; (2) correct inaccuracies; (3) request deletion; (4) receive their collected personal information; and (5) opt out of information processing for, among other purposes, targeted advertising and third-party sales.243Wash. S.B. 5062 § 103(1)–(5); Cynthia J. Larose & Christopher J. Buontempo, The Ongoing March toward Privacy Law in the US – A State Legislative Roundup, Nat’l L. Rev. (Feb. 16, 2021), https://perma.cc/9RT9-YLV8.
Yet compared to the CCPA, the WPA narrows the class of companies that fall within the statute’s purview. The bill applies to companies that either: (1) process personal information of more than 100,000 Washington residents; or (2) derive more than twenty-five percent of their annual revenue from selling personal information and process personal information of more than 25,000 Washington residents.244Wash. S.B. 5062 § 102(1)(a)–(b). Notably, unlike the CCPA, the WPA lacks a standalone annual revenue threshold.
And again compared to the CCPA, the WPA limits Washington residents’ ability to recover for violations. Unlike the CCPA, the WPA does not provide a private cause of action for Washington residents.245Id. § 111(1) (“A violation of this chapter may not serve as the basis for, or be subject to, a private right of action under this chapter or under any other law.”). But see Mike Hintze, Opinion, Washington Privacy Act: Amendments Jeopardize Protections for Consumers, Seattle Times (Mar. 4, 2020, 11:01 AM), https://perma.cc/4R6L-5ES8 (“[C]onsumers already have a private right of action under the existing Consumer Protection Act, and nothing in this new legislation will take that away.”). Only the Washington Attorney General may enforce violations of the WPA and recover up to $7,500 per violation.246Wash. S.B. 5062 § 112(1), (5). Critics of the WPA argue that the bill—which Amazon and Microsoft support—is “toothless [and] corporate-centric” with “vague language, a laundry list of exemptions and a provision that explicitly prohibits people from holding companies accountable when they violate people’s digital privacy rights.”247Jennifer Lee, Opinion, Con: The People’s Privacy Act, not the Washington Privacy Act, Is the Better Bill to Protect Consumers’ Civil Rights and Civil Liberties, Seattle Times (Feb. 5, 2021, 5:31 PM), https://perma.cc/BP5H-ZTXY. Meanwhile, supporters argue that the bill offers “clear, accessible and enforceable privacy protections”248Reuven Carlyle, Opinion, Pro: The Washington Privacy Act Empowers Consumers to Retake Control of Their Identity Online in Our Data-Fueled Economy, Seattle Times (Feb. 5, 2021, 5:28 PM), https://perma.cc/QP9K-P6P7. and is “one of the strongest globally”249Irene Plenefisch, 2021 Washington State Legislative Session Priorities, Microsoft: Microsoft on the Issues Blog (Jan. 27, 2021), https://perma.cc/VH6B-37UX. with a “robust privacy framework.”250Letter from Brian Huseman, Vice President, Pub. Policy, Amazon, to the Honorable Reuven Carlyle, Chairman, Energy, Env’t, & Tech. Comm., Wash. State Senate (Jan. 14, 2021), https://perma.cc/QP3W-NV5W.
c. New York Privacy Act
In January of 2021, New York state senators reintroduced the previous legislative session’s New York Privacy Act (“NYPA”).251N.Y. A.B. 680, 2021–2022 Leg., Reg. Sess. (N.Y. 2021). The NYPA is currently in the Committee on Consumer Affairs and Protection as Assembly Bill 680.252Bill No. A00680, N.Y. State Assembly, https://perma.cc/5A34-GBXK. Broader in scope than both the CCPA and WPA, the NYPA does not define the term “business” and lacks any revenue or consumer threshold that would limit which companies are exposed to liability.253See generally N.Y. A.B. 680; Jack Karsten & Raj Karan Gambhir, Proposed New York Bill Expands Scope of Data Privacy Debate, Brookings (June 24, 2019), https://perma.cc/D9DQ-7RCG. Indeed, upon introducing the bill, State Senator Kevin Thomas stated that he “want[ed] to capture as many businesses as possible.”254Jeff Stone, New York Could Soon Pass Its Own GDPR-Inspired Data Security Law, Cyberscoop (May 29, 2019), https://perma.cc/ZQ7J-38UH. Further, unlike the GDPR, CCPA, and WPA, the NYPA contains a “data fiduciary” clause.255N.Y. A.B. 680 § 1102. This clause establishes a duty of care that businesses owe to consumers in maintaining the consumers’ data.256Id.; see Joseph V. DeMarco, Implications of the ‘Data Fiduciary’ Provision in the Proposed New York Privacy Act, N.Y.L.J. (Feb. 28, 2020, 2:00 PM), https://perma.cc/KJL7-9AMH (“This term is . . . defined quite broadly to include direct or indirect financial loss, physical harm, psychological harm, significant inconvenience or time expenditure, adverse employment outcomes, stigmatization or reputational harm, disruption and intrusion from unwanted commercial communication, [and] price discrimination . . . .”). The clause states that businesses “shall act in the best interests of the consumer, without regard to the interests of the entity, controller or data broker.”257N.Y. A.B. 680 § 1102.
Notably, this standard of care exceeds that owed to corporations’ shareholders.258Karsten & Gambhir, supra note 253. Not only will businesses face a higher standard of care, but they may also need to determine each consumer’s “best interests.”259Id.
The NYPA also provides broad enforcement mechanisms that may lead to a flood of suits in state courts. The Act provides “any person who has been injured by reason of a violation of this article” with a private cause of action, and the NYPA does not limit the scope of damages.260N.Y. A.B. 680 § 1109(3); id. § 1109(4) (“When calculating damages and civil penalties, the court shall consider the number of affected individuals, the severity of the violation, and the size and revenues of the covered entity.”). While the NYPA defines “consumers” as “New York resident[s],”261See N.Y. A.B. 680 § 1100(3). it does not define “person,” further expanding the universe of potential plaintiffs. And the NYPA exposes businesses to tremendous liability, noting that “[e]ach provision of this article that was violated counts as a separate violation.”262Id. § 1109(3); Karsten & Gambhir, supra note 253. Observers have already noted that compliance with the NYPA as currently written “likely would prove impossible.”263Kyle Faith & Melinda McLellan, New York Legislature Introduces CCPA Clone with Private Right of Action, JDSupra (Jan. 8, 2021), https://perma.cc/QYL9-SB5T.
VI. A Third Response: Self-Regulation
Pending state legislation and FTC actions against companies like Facebook demonstrate the need for uniform regulation of consumer data privacy. Scholars largely agree that the solution lies in one of two options: federal legislation or state legislation.264Compare Carol Li, Note, A Repeated Call for Omnibus Federal Cybersecurity Law, 94 Notre Dame L. Rev. 2211, 2229 (2019) (“Companies may face overlapping state and federal actions with duplicative costs.”), and Nuala O’Connor, Reforming the U.S. Approach to Data Protection and Privacy, Council of Foreign Rels. (Jan. 30, 2018), https://perma.cc/7QYJ-3X9W (advocating for baseline federal privacy legislation), with Paul M. Schwartz, Preemption and Privacy, 118 Yale L.J. 902, 946 (2009) (“A federal omnibus information privacy law with strong preemption provisions would be an unfortunate development. It would limit further experimentation in federal and state sectoral laws . . . be difficult to amend, and would, therefore, become outdated as technological changes undermine such a statute’s regulatory assumptions.”). But neither option fully addresses the values at stake: innovation, free speech and transparency, personal privacy preferences, and state sovereignty.
The first solution—state-based legislation—secures personal consumer data privacy protections at the state level. A key advantage of state-based privacy protections—as opposed to federal protections—lies in the states’ comparatively greater ability to account for localized preferences. But a fifty-state patchwork is not a long-term solution. Recent scholarship has highlighted the potential unconstitutionality of state-based consumer privacy regimes.265See Jennifer Huddleston & Ian Adams, Potential Constitutional Conflicts in State and Local Data Privacy Regulations, Regul. Transparency Project 4 (Dec. 3, 2019), https://perma.cc/CM5L-344H (arguing that state and local consumer privacy protections are unconstitutional under the Constitution’s dormant Commerce Clause and First Amendment). Legislation in California, Washington, and New York differs in scope and application, subjecting both large and small businesses to conflicting standards.266See generally supra Part III.B.2. Most of these companies transact in multiple—if not all—states.267See generally sources cited at supra notes 134–37. While privacy torts, articulated at the state level, provide relief for individual injuries occurring in one location, the same cannot be said for data privacy harms. Faced with multiple standards, businesses may comply with the strictest regime, or they may forgo compliance with some state regulations altogether, subjecting their data practices to civil liability. Alternatively, regulations with inconsistent definitions and differing penalties may force large businesses to establish practices fulfilling the lowest common denominator.268Before the H. Comm. on Energy & Com., Subcomm. on Consumer Prot. & Com., 116th Cong. (2019) (Oral statement of Christine S. Wilson, Comm’r, FTC), https://perma.cc/TS6Q-ER68 (acknowledging the contradictions and confusion inherent in state-enacted privacy regulations).
And as more states legislate consumer data privacy, companies may face steep fines that jeopardize their financial viability. Some corporations may choose to forgo transacting in a given state altogether. A corporation’s choice to halt business in one state may frustrate consumers and shrink the body of available products and services, to the detriment of consumers and businesses alike.
The second solution—federal legislation modeled on the GDPR—is also not the answer for multiple reasons. First, the United States lacks an intrinsic or guaranteed right to privacy—the cornerstone of the GDPR—on which federal legislation could be based. Second, a GDPR clone is likely to produce many of the effects that have already hampered the EU economy: increased market consolidation, limited access to sources of news and entertainment, confusion among business practitioners, and barriers to entry for startup corporations.269See supra Part II.A. Additionally, potential economic costs of GDPR-modeled federal legislation are vast—scholars have estimated resulting compliance costs and market inefficiencies at nearly $122 billion per year in the United States.270See McQuinn & Castro, supra note 231 (including within compliance costs oversight requirements (data protection officers, privacy audits, data infrastructure) and consumer data rights (data access, data portability, data deletion, data rectification), and including within market inefficiencies duplicative enforcement, lower consumer efficiency, less access to data, and lower advertising effectiveness). While not catastrophic to the Googles and Facebooks of the country, compliance costs could obliterate startup corporations that generate jobs, contribute to national productivity, and increase innovation.271See, e.g., Elizabeth Gonzalez, Understanding the SBA’s Definition of Small Business and Why It Matters, The Blueprint (July 17, 2020), https://perma.cc/R5ZP-YHUT (“Small businesses make up more than 99 percent of businesses and create 65 percent of new jobs in the U.S. . . .”); Bryan Ritchie & Nick Swisher, The Big Small: The Economic Benefits of Startups, Univ. of Notre Dame IDEA Ctr. (July 15, 2018), https://perma.cc/86VA-QZJ7. And third, a federal solution jeopardizes state autonomy in matters highly personal to individuals. 
Any single privacy standard is potentially overinclusive—protecting privacy more than one group desires—or underinclusive—protecting it less than another group desires.272See Huddleston & Adams, supra note 265.
Assessing the strengths and weaknesses of the two data privacy solutions illustrates many of the desirable goals of future regulation. First, data privacy protections must be a product of robust discussion. Policymakers and legislators at both the state and federal level must collaborate with economists and industry participants. In working with economists, policymakers can examine the effects of the GDPR in Europe, estimate direct and indirect costs of regulation, and pursue cost-benefit analyses of various proposals. Likewise, in working with industry participants, policymakers can learn about the highly technical internet ecosystem and consider any regulation’s impact on future innovation.
Second, data privacy protections must distinguish between large, medium, and small businesses. Research has already suggested that small and medium-sized businesses disproportionately bear the costs of data privacy regulations.273See Jia, et al., supra note 200; Stapp, supra note 193. And many of the larger corporations that regulators seek to reach have actually benefitted from legislation.274See sources cited at supra notes 197–99. By tiering protections according to revenue, policymakers can narrow regulations to particular companies’ business practices and curb damaging effects on smaller actors.
Third, data privacy protections must account for existing state efforts and harmonize their contradictory standards. Current state-level data privacy bills and statutes articulate distinct areas of focus, liability standards, data subjects, duties of care, and damages. Some bills provide a private cause of action while others do not. Some bills articulate a revenue threshold for corporations while others do not. Some bills address minors’ online privacy protections while others address entire households’ privacy protections. For any company’s general counsel, navigating the maze of state standards, requirements, duties, and damages presents a daunting task.
And lastly, data privacy protections must consider the level of privacy and transparency US consumers need. Many consumers receive value from the convenience “free” internet-based products offer. For this subset of consumers, then, strong privacy regulations could eliminate much of the value derived from exchanging personal data for services.
While a perfect answer that achieves each of these four goals is undoubtedly difficult to craft, there is a third solution beyond federal and state-based legislation. A self-regulatory body, headed by industry leaders working in tandem with the FTC,275See Jennifer Huddleston, A Primer on Data Privacy Enforcement Options, Am. Action F. (May 4, 2020), https://perma.cc/5TU8-FZEF (“By focusing on specific ex post redress (reviewing the violation and giving fines after it has occurred) rather than a broad ex ante approach (scrutinizing all the possible data activities and violations in advance), the FTC does not limit both the direction of innovation and the options available to consumers.” (emphasis omitted)). would most adequately address many of the goals without unduly hindering innovation.276The Better Business Bureau does oversee a self-regulatory body of advertisers focused on privacy and internet protections through its Digital Advertising Accountability Program (“DAAP”). Digital Advertising Accountability Program, Better Bus. Bureau, https://perma.cc/PJR8-WZHQ. This program, however, lacks the appropriate enforcement mechanisms to ensure that companies remain accountable to consumers; corporations need not adopt the steps recommended. See DAAP Decisions and Guidance, Better Bus. Bureau, https://perma.cc/2K9G-9TU5 (“When DAAP determines that there may be a compliance issue regarding a company’s adherence to one or more of the Principles, DAAP may, in its discretion, send the company a letter of inquiry . . . .”). Since its inception, DAAP has undertaken only 120 formal reviews. Id. And DAAP does not provide information on the number of inquiries it has sent to potential violators. See generally id. Nor is DAAP focused solely on consumer data privacy: the body also oversees a “Political Advertising Transparency Project.” Political Ads 2020: A Tale of Two Platforms, Better Bus. Bureau, https://perma.cc/FG8P-WLT7.
Additionally, DAAP has not offered program guidance “to promote compliance with [its data privacy] Principles” since May of 2017. See DAAP Decisions and Guidance, supra. And unlike other self-regulatory bodies under the Better Business Bureau, DAAP does not include a list of corporate partners. See generally Digital Advertising Accountability Program, supra. And technology companies can emulate existing self-regulatory approaches. For example, the Children’s Food & Beverage Advertising Initiative (“CFBAI”), part of the Better Business Bureau, represents nineteen major companies that advertise to children.277Program Overview: Children’s Food and Beverage Advertising Initiative, Better Bus. Bureau, https://perma.cc/QX4U-S664 (listing partners such as Burger King, Coca-Cola, General Mills, Kellogg’s, KraftHeinz, McDonald’s, PepsiCo, and Unilever). In 2009, these companies accounted for eighty-nine percent of spending for children’s food marketing. FTC, A Review of Food Marketing to Children and Adolescents, at ES-7–8 (2012), https://perma.cc/9Y54-HCFU. In 2006, these corporations banded together to form a voluntary program marked by annual self-assessments.278FTC, supra note 277, at ES-1. Companies could also emulate the approach of the Distilled Spirits Council of the United States (“DISCUS”). Following the repeal of the Eighteenth Amendment, leading US-based distillers established a voluntary Code of Responsible Practices.279Code of Responsible Practices for Beverage Alcohol Advertising and Marketing, Distilled Spirits Council, https://perma.cc/GH6L-TU4S. To set forth a framework for a self-regulatory body on data privacy, both programs’ models are further discussed below.
A. Promulgating Industry Standards
The first goal of the self-regulatory body would be to establish standards for corporations that gather consumer data. Both self-regulatory examples provide such standards. Member companies of the CFBAI created the Initiative’s Core Principles280See generally CFBAI, Better Bus. Bureau, CFBAI Core Principles (2020), https://perma.cc/B3PK-DDW2. and Uniform Nutrition Criteria,281See generally CFBAI, Better Bus. Bureau, Category-Specific Uniform Nutrition Criteria (2011), https://perma.cc/Y7R3-7NRM. which regulate advertisements to children under the age of twelve. The Principles and the Criteria demonstrate the companies’ commitment to promoting foods with lower calorie, salt, sugar, and fat contents on children-directed television programs and mobile media.282Id. The Children’s Advertising Review Unit (“CARU”)—a separate body under the Better Business Bureau—monitors compliance with the voluntary program.283The Children’s Advertising Review Unit, Better Bus. Bureau, https://perma.cc/5S52-3N39. Similarly, DISCUS members established limits on content, target audiences, product placements, and locations of advertisements.284Distilled Spirits Council, Media ‘Buying’ Guidelines: Demographic Data/Advertisement Placement Guidelines (2011), https://perma.cc/LR74-WAH3. Further, DISCUS has a Code Review Board for consumer complaints and follows a review process.285Distilled Spirits Council, Code of Responsible Practices for Beverage Alcohol Advertising and Marketing 3, https://perma.cc/SR35-U96Y. The FTC monitors compliance with both self-regulatory approaches.286For DISCUS and the FTC, see generally Alcohol Advertising, FTC (2013), https://perma.cc/CQ27-HU2Z; Major FTC Report Commends DISCUS Advertising Code and Rigorous Standards, Distilled Spirits Council (Mar. 20, 2014, 9:12 AM), https://perma.cc/DDU5-TTR9 (summarizing a 2014 FTC report on DISCUS’s self-regulatory approach that found that seventy-two percent of advertisements conformed to DISCUS’s requirements).
For the CFBAI and the FTC, see generally Maureen Enright & Lauren Eskenazi, Better Bus. Bureau, The Children’s Food & Beverage Advertising Initiative and the Children’s Confection Advertising Initiative in Action (2019), https://perma.cc/P99T-KRM2.
In early September 2019, the chief executives of Amazon, AT&T, Comcast, Dell, Ford, IBM, Macy’s, Mastercard, Procter & Gamble, SAP, Salesforce, Target, Visa, and Walmart (among others) sent an open letter to Congress asking for “a comprehensive federal consumer data privacy law.”287Harper Neidig, 51 Major CEOs Ask Congress for Federal Data Privacy Law Blocking State Rules, The Hill (Sept. 10, 2019, 2:23 PM), https://perma.cc/7V9X-8D6E. Citing the many state-level bills, the companies emphasized a need for uniform data privacy laws to ease consumer confusion and enable continued cross-jurisdictional business.288Letter from Ajay S. Banga, President & CEO, Mastercard, et al., to Hon. Mitch McConnell, Majority Leader, U.S. Senate (Sept. 10, 2019), https://perma.cc/XX83-N5X3. In coming together to craft such a letter, these corporations have already begun a process of self-regulation akin to the CFBAI and DISCUS. To build upon this letter’s legacy, corporations small and large could set forth a code of responsible data privacy practices resembling the DISCUS code and the CFBAI core principles. Many of these corporations have already taken significant, industry-leading steps to establish appropriate protections for consumer data privacy.289See, e.g., Jason Cipriani, iOS 14 Just Made Your iPhone More Private and Secure: 3 Things That Changed, CNET (Oct. 6, 2020, 4:23 PM), https://perma.cc/N7GD-C2FV; Erin Egan, Making Data and Privacy Easier to Understand through People-Centered Design, Facebook (July 14, 2020), https://perma.cc/SEC4-XBPR (“Today, we’re publishing a white paper that highlights the need for companies to better communicate privacy information by putting people at the center of privacy design decisions . . . . [E]nsuring people understand their choices when it comes to data and privacy is something no individual company or government can solve alone. That’s why the paper also highlights the importance of collaboration among companies, policymakers and other experts.”). A set of core principles governing data privacy could be the corporations’ first step toward defining sustainable data practices.
B. Differentiating Based on Corporate Scale
An omnibus data privacy law modeled on the GDPR could produce consolidation at the expense of small companies.290See Jia et al., supra note 200; Stapp, supra note 193. To prevent this chilling effect, a self-regulatory body working with the FTC could establish a tiering system that tailors required data practices to a company’s revenue, size, and scale. Within each tier, the self-regulatory body could establish governance boards to promulgate differing regulations. Under this sliding scale of requirements, the body could set forth baseline protections for corporations entering the market without jeopardizing innovation and economic growth.
To start, the self-regulatory body could use existing frameworks to differentiate players in the consumer data privacy sphere. One helpful resource is the US Small Business Administration, which has laid out differing definitions of “small business” depending on a firm’s industry, revenue, and number of employees.291See 13 C.F.R. § 121.101 (defining small business size standards); Small Bus. Admin., Frequently Asked Questions (2012), https://perma.cc/3426-A9AT. To address lock-in concerns, the body could review corporate size annually, updating its tiers to reflect current market conditions. The Administration, like the US Census Bureau, further divides small businesses according to the type of business292Small Bus. Admin., supra note 291.—from corporation to sole proprietor. To begin differentiating the standards with which differently sized companies must comply, the body could adopt these definitions, protecting smaller businesses from inordinate regulatory burdens.
C. Increasing Transparency on the Use of Consumer Data
Just as voluntary self-regulatory bodies have reaped the benefits of corporate social responsibility and favorable public relations, so too could technology companies engender consumer trust and political respect through a self-regulatory approach.293See Schwartz, supra note 264, at 939–41. By working with consumer advocacy groups, the self-regulatory body could draft versions of privacy policies written for the average consumer. Instead of asking users to click a box signifying agreement to ambiguous terms, these privacy policies could be minimally intrusive, consistent across the internet, and comprehensible. Like DISCUS and the CFBAI, the self-regulatory body could publish white papers on technological best practices and research how consumers process consent notifications, thereby obviating the immediate need for federal data privacy legislation.
Tracing the history of privacy protections in the United States reveals that US citizens enjoy no fundamental right to data privacy at the federal level. The Supreme Court has articulated privacy interests protected by the Fourth Amendment, Congress has legislated sectoral data privacy protections, and state courts and statutes provide remedies for tortious interferences with privacy interests. Yet despite concerns arising from undisclosed data sharing, such as the recent Cambridge Analytica controversy, Congress has not passed federal legislation that would comprehensively protect consumer data privacy.
A fundamentally different approach to privacy exists in Europe, where citizens are guaranteed a right to privacy. The GDPR entrenched this right, but it also led to market stagnation and consolidation. Even so, the GDPR increased global consciousness about consumer data privacy and its fragility in the age of digitalization. The pace at which technology develops shows no signs of slowing, and the technology sector remains a key part of the US economy. With six of the world’s top ten companies by market capitalization based in the United States, a sweeping federal data privacy bill may stifle this engine of growth.294Luca Ventura, World’s Largest Companies 2020, Glob. Fin. (Nov. 30, 2020), https://perma.cc/8REL-MT78. And the exchange between Senator Hatch and Mr. Zuckerberg295Transcript of Mark Zuckerberg’s Senate Hearing, supra note 2. demonstrates that Congress may lack the expertise necessary to regulate the industry. Even worse, however, is the prospect of fifty competing state data privacy laws with differing revenue thresholds and enforcement mechanisms.296See, e.g., Huddleston, supra note 30; McMorris Rodgers, supra note 30.
Large companies urge Congress to act, but these companies already hold the solutions. In writing to Congress to jointly state their wishes for broad federal legislation, companies have demonstrated the capacity to band together to ensure—at the very least—continued growth and innovation. While state legislation sputters in committees and Congress debates the proper way forward, technology companies must consider self-regulation modeled on the alcohol industry’s DISCUS and the food and beverage industry’s CFBAI. Self-regulation strikes an appropriate balance between personal liberty and corporate profit, providing a twenty-first-century realization of Godkin’s “imperfect” nineteenth-century remedy.297Godkin, supra note 1, at 67.