The Potential Impact of Proposed Changes to Section 230 on Speech and Innovation

Jennifer Huddleston
Volume 28, Issue 4

Introduction

Section 230, a law that establishes critical liability protection for a range of online services that carry users’ content, has faced increasing criticism across the political spectrum. Users may know Section 230 from debates over decisions made by social media platforms such as YouTube, Facebook, and Twitter, but it applies to a wide range of “interactive computer services” that carry user-generated content, including review sites, message boards, comment sections, and even Wikipedia. The criticism of Section 230 has been part of a broader “tech lash” accompanied by calls for additional regulation of various tech companies, including calls to change Section 230 or limit its applicability. These criticisms are often incompatible with one another: some policy makers criticize interactive computer services for engaging in too much moderation by “censoring” certain voices, while others accuse these services of not engaging in enough content moderation and seek to hold them accountable for the bad behavior of their users. Ironically, Section 230 is not the problem; it may actually be part of the solution, allowing interactive computer services to carry user-generated content and engage in the content moderation practices that best serve their audiences while keeping market entry barriers low for new entrants. This flexibility allows a wider variety of platforms to meet the different preferences of consumers.

This Article examines some of the numerous proposals introduced over the course of the 116th Congress and the first four months of the 117th to change or repeal Section 230. Part I discusses Section 230, its development, and its current interpretation to lay a foundation for understanding the impact of potential changes. Part II categorizes various proposals and explains their potential consequences for online speech and innovation. In addition, Part II examines concerns regarding the increasing number of states that have sought to impose their own regulations on content moderation standards. Part III shows that, despite continuing dislike or distrust of specific content moderation decisions, congressional action remains unlikely, and that Section 230 remains the framework most likely to support continued innovation and stay aligned with free speech principles. The conclusion suggests that policy makers should be cautious about the consequences of changes to Section 230 and that underlying concerns may be better addressed in other ways.

I.     Understanding Section 230

In anger over content moderation decisions, critics are quick to blame Section 230 for nearly any action or inaction they find distasteful. In this heightened rhetoric, however, many of the core principles underlying Section 230 have been forgotten or misunderstood. To understand the impact that potential changes to Section 230 could have, it is important to first understand what Section 230 actually does rather than what discourse has made it out to be.

Section 230 began as a bipartisan bill called the Internet Freedom and Family Empowerment Act, co-sponsored by Republican Chris Cox and Democrat Ron Wyden.1CDA 230: Legislative History, Elec. Frontier Found., https://perma.cc/XGC8-K5TP. It sought to overturn the court ruling in Stratton Oakmont, Inc. v. Prodigy Services Co.,2No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. 1995). in which the court found that by engaging in content moderation, an interactive computer service could be held liable for user content that it did not create.3Id. at *13–14. Other early cases reached a different conclusion. Section 230 established that no interactive computer service would be treated as a publisher of user content, providing legal certainty that a service could engage in content moderation without changing its legal liability.4See Brent Skorup & Jennifer Huddleston, The Erosion of Publisher Liability in American Law, Section 230, and the Future of Online Curation, 72 Okla. L. Rev. 635, 649–50 (2020). To accomplish this, Section 230 has two key components. First, it specifies that no interactive computer service shall be treated as the publisher of a third party’s content.547 U.S.C. § 230(c)(1). Second, it clarifies that interactive computer services can engage in content moderation and are not liable for decisions to remove or leave up certain content.647 U.S.C. § 230(c)(2).

Far from being a special privilege for the tech sector, as critics allege, this legislation reflects norms that had already emerged for libraries and for newspapers carrying wire service articles by the time Section 230 was introduced.7Skorup & Huddleston, supra note 4, at 650–51. In fact, surveys have found that most Americans support the underlying principles of Section 230 that allow platforms to engage in content moderation and to set these standards themselves, even if they may not be aware of or support Section 230 directly.8See The Future of Tech Policy: American Views, Knight Found. (June 16, 2020), https://perma.cc/98E9-QTSD. While Section 230 or certain moderation decisions may be unpopular, its core principles have enabled an explosion of user-generated content and kept barriers low for smaller interactive computer services that may seek to disrupt existing giants.9Jennifer Huddleston, Section 230 as Pro-Competition Policy, Am. Action F. (Oct. 27, 2020), https://perma.cc/QS9X-RHXY.

While Section 230 was designed to encourage innovation and speech, it was inserted into the Communications Decency Act, which sought to target and restrict various materials on the internet. The Communications Decency Act became Title V of the Telecommunications Act of 1996, but in Reno v. ACLU its speech restrictions were found unconstitutional, leaving Section 230 intact.10See 521 U.S. 844 (1997). Some, including Justice Clarence Thomas, have recently argued that Section 230 should be more narrowly interpreted, in part because of its inclusion in the Communications Decency Act.11See, e.g., Zachary Evans, Clarence Thomas Suggests Section 230 Immunities Applied Too Broadly to Tech Companies, Nat’l Rev. (Oct. 13, 2020, 12:02 PM), https://perma.cc/S79X-6HLF. Such an approach, however, would neglect that the 1996 Telecommunications Act was broadly deregulatory.12See Stuart N. Brotman, Was the 1996 Telecommunications Act Successful in Promoting Competition?, Brookings Inst. (Feb. 8, 2016), https://perma.cc/TH6L-ZM2X. It also ignores the accompanying legislative history and further statements by the drafters indicating that Section 230 was intended to encourage a blossoming technology to achieve its potential, not to limit it.13See Christopher Cox, The Origins and Original Intent of Section 230 of the Communications Decency Act, Rich. J.L. & Tech. (Aug. 27, 2020), https://perma.cc/EJ2A-CJPK.

II.     Proposed Reforms to Section 230 and Their Potential Consequences

Twenty-five years after its initial passage, Section 230 is now at the center of many calls for further regulation of the internet. The rationales behind calls to change Section 230 vary and are often incompatible with one another. As a result, numerous proposals to change or repeal Section 230 have been put forward, though few, if any, have gained momentum. These proposals fall into a few different trends and categories. The categories discussed throughout Part II provide a framework for understanding the various impacts changes to Section 230 might bring, even if some legislative proposals do not fit perfectly within one category. Because numerous legislative proposals have been introduced in the current and previous Congresses, this Part should not be considered exhaustive.

A.     Proposals to Completely or Partially Revoke or Repeal Section 230

While some advocates for Section 230 reform claim it is hyperbolic to say these proposals would repeal Section 230 in its entirety, at least some legislative proposals have sought to do just that.14Cf. Kiran Jeevanjee, Brian Lim, Irene Ly, Matt Perault, Jenna Ruddock, Tim Schmeling, Niharika Vattikonda & Joyce Zhou, All the Ways Congress Wants to Change Section 230, Slate (Mar. 23, 2021, 5:45 AM), https://perma.cc/VR7V-BGX9 (noting the various reform proposals related to Section 230). Notably, a Republican draft of the December 2020 COVID relief bill unsuccessfully sought to include a complete repeal of Section 230.15S. 5085, 116th Cong. (2020). Additionally, a 2021 proposal, the Stop Shielding Culpable Platforms Act, by Representative Jim Banks (R-IN), would repeal the critical (c)(1) immunity for interactive computer services and instead treat them as distributors of their users’ content.16See Stop Shielding Culpable Platforms Act, H.R. 2000, 117th Cong. (2021).

These changes would likely recreate the problems that Section 230 sought to solve. Interactive computer services would face several disincentives regarding content moderation that would likely result in over- or under-moderation, often referred to as the Moderator’s Dilemma.17See Bobby Allyn, As Trump Targets Twitter’s Legal Shield, Experts Have a Warning, NPR (May 30, 2020, 11:36 AM), https://perma.cc/78L2-BEQU. If the hosts of user-generated content could not engage in content moderation without protection from liability, they would be more likely either to severely limit the content they allow or to engage in no moderation whatsoever.18See id. Even repealing only (c)(1) would have significant consequences. Making interactive computer services liable for users’ content would make them less likely to carry that content, and users would thus lose out on many uses of the internet that they find beneficial.19See Billy Easley, Revising the Law that Lets Platforms Moderate Content Will Silence Marginalized Voices, Slate (Oct. 29, 2020, 5:43 PM), https://perma.cc/4ZAF-QB8M. Such a change would impact not only social media services but a wide range of internet services, including Wikipedia and review sites like Yelp.20See, e.g., Hassell v. Bird, 420 P.3d 776 (Cal. 2018) (rejecting an injunction against Yelp); CDA § 230 Success Case: Wikipedia, Elec. Frontier Found., https://perma.cc/X9AN-ADZL (advocating that Wikipedia is impacted by changes to the statute).

Repealing Section 230 in whole, or significantly changing its key elements, would likely lock in existing tech giants rather than diminish their influence. This lock-in would happen for two key reasons. First, these services are more likely to have the resources both to bear the costs of continued moderation and to defend the lawsuits that would arise without Section 230.21See Evan Engstrom, Primer: Value of Section 230, Engine (Jan. 31, 2019), https://perma.cc/D62U-FJZN (illustrating the potential cost of litigation over user-generated content). Second, these services would have had the advantage of emerging with the protection granted by Section 230, giving them the legal certainty needed when seeking investors and determining the content moderation that meets their audiences’ needs.22See Derek E. Bambauer, What Does the Day After Section 230 Reform Look Like?, Brookings Inst.: TechStream (Jan. 22, 2021), https://perma.cc/3XYQ-ML8H. In contrast, even if courts eventually arrived at limitations on liability for interactive computer services, new entrants would face additional costs for carrying user-generated content and might find investors more skeptical of their products given the heightened liability.

B.     Proposals that Make Section 230 Protection Contingent on Certain Requirements

Several other proposals would not repeal Section 230 but would undermine the intent of its liability protection by imposing requirements that interactive computer services must meet to receive its benefits. In many cases, the result would be to insert government regulation into lawful speech and private decisions or to remove beneficial features such as encryption.

The changes necessary to receive Section 230 protection vary among the bills. Under some proposals, interactive computer services would be required to prove their neutrality to a government regulator to receive such protections.23Ending Support for Internet Censorship Act, S. 1914, 116th Cong. (2019). Another proposal initially sought to make Section 230 protection contingent on a service either leaving information unencrypted or providing a backdoor through which law enforcement could access encrypted information.24See Patrick Hedger, The Flawed EARN IT Act: Rights and Common Sense Should Not Have to Be Earned, Competitive Ent. Inst. (June 24, 2020), https://perma.cc/CX7B-Q6EP. More recently, the “See Something, Say Something Act” would strip interactive computer services of Section 230 immunity if they failed to report suspicious activity.25See Something Say Something Online Act of 2020, S. 4758, 116th Cong. § 2.

In the case of illegal activity, Section 230 already has a carve-out for federal crimes.2647 U.S.C. § 230(e)(1). Making Section 230 contingent on certain moderation actions would limit the ability of interactive computer services to adapt to novel issues in a timely fashion.27See Hedger, supra note 24. Many of these proposals ignore the difficulties of content moderation at scale and the sheer volume of posts that large interactive computer services must deal with on a daily basis.28See Claire Jenik, A Minute on the Internet in 2020, Statista (Sept. 21, 2020), https://perma.cc/XZR3-GD2X (“[A] single internet minute holds more than 400,000 hours of video streamed on Netflix, 500 hours of video uploaded by users on Youtube and nearly 42 million messages shared via WhatsApp.”). Even the best-intentioned interactive computer service runs the risk of a bad actor misusing its service and so would face a dilemma similar to the one it would face in a world without Section 230. Additionally, these proposals often overlook the benefits of technologies or current moderation decisions for average users in favor of putting all resources toward targeting bad actors.

Making Section 230 contingent on interactive computer services proving neutrality or otherwise obtaining bureaucratic approval of their products would result in more government intervention in speech. This contingency would likely run afoul of the First Amendment by allowing government bureaucrats to determine what speech was allowed on interactive computer services, potentially violating the rights of both users and the services.29See Riana Pfefferkorn, The EARN IT Act is Unconstitutional. First Up, The First Amendment, Ctr. for Internet & Soc’y (Mar. 9, 2020, 10:53 PM), https://perma.cc/9Y4D-PEJH.

C.     Proposals Narrowing Section 230’s Applicability

Some proposals seek to narrow Section 230’s applicability but are not as broad as the proposals discussed in Section A above, nor as specific as the carve-outs discussed in Section D below. These proposals tend to be based on the belief that Section 230’s original principles are beneficial but that the law has been applied too broadly. The best example of this approach is the 2020 Online Freedom and Viewpoint Diversity Act, which sought to remove the term “otherwise objectionable,” which serves as a broad catchall in Section 230, and replace it with a narrower list of bad acts that interactive computer services could moderate while retaining Section 230 protection.30Online Freedom and Viewpoint Diversity Act, S. 4534, 116th Cong. § 2(1)(B)(i)(II) (2020).

Such changes would make it particularly difficult for interactive computer services to deal with novel issues. First, the advantage of a catchall such as “otherwise objectionable” is that it is impossible to foresee all the issues an interactive computer service may face; without it, services would be unable to remove much of the content that users do not want to see. Second, this approach would require interactive computer services to leave untouched a great deal of “lawful” content that the average internet user would not want to see in his or her feed.31See Eric Goldman, Sen. Graham Cares More About Trolls Than Section 230 (Comments on Online Content Policy Modernization Act), Tech. & Mktg. L. Blog (Sept. 30, 2020), https://perma.cc/9YBA-JJ5F. This unwanted content includes racist and anti-Semitic language, doxing, or content that is merely off-topic from a service’s intended purposes.32See id.; Mike Masnick, GOP Senators Release Latest Truly Stupid Section 230 Reform Bill; Would Remove ‘Otherwise Objectionable’; Enable Spamming, Techdirt (Sept. 9, 2020, 9:37 AM), https://perma.cc/8C89-8AZL. In short, “otherwise objectionable” remains an important element in allowing interactive computer services to be the Good Samaritans that policy makers say they want them to be.

One could argue that some of these problems could be solved by including a more expansive list than the ones proposed in these bills, but it is unlikely that any list could anticipate all possible scenarios an interactive computer service would need to moderate for. Without such a catchall, there would be legal uncertainty around how to deal with complicated and novel content moderation issues. For example, consider a platform faced with an onslaught of videos during the early 2018 “Tide Pod challenge,” when teenagers began uploading videos of themselves biting, chewing, or cooking laundry detergent packets.33Lindsey Bever, Teens are Daring Each Other to Eat Tide Pods. We Don’t Need to Tell You That’s a Bad Idea, Wash. Post (Jan. 17, 2018), https://perma.cc/7P5Z-C75F. As Google CEO Sundar Pichai pointed out in a congressional hearing, Section 230 is critical to an interactive computer service’s ability to rapidly make decisions when faced with a harmful new trend such as the Tide Pod challenge.34Devin Coldewey, Section 230 Barely Rates a Mention in Senate’s Hasty Pre-Election Flogging of Tech CEOs, TechCrunch (Oct. 28, 2020, 6:55 PM), https://perma.cc/R78X-5UGV. Without such a catchall, these services may be uncertain whether such content is considered dangerous enough to fall under “self-harm” or health misinformation and thus less able to engage in content moderation of it.35See id. Even if a decision to remove such content or otherwise moderate around such an issue might be vindicated in court under the First Amendment,36Masnick, supra note 32. this recreates the litigation risks that would arise if the legal protection were removed more generally. Small interactive computer services would likely be particularly cautious about such decisions given the potentially business-ending cost of litigation.

While there are debates about whether “otherwise objectionable” has been too broadly interpreted under the rules of statutory construction,37See Evans, supra note 11. a catchall of some sort is necessary to allow interactive computer services to meet their consumers’ needs and respond to novel issues.

D.     Proposals that Create Additional Carve-outs to Section 230

Many proposals would not overhaul Section 230 in its entirety the way the proposals in the above categories would but would instead specify that interactive computer services can no longer claim Section 230 protections for certain categories of content. While these proposals may have the best of intentions, they may be unnecessary given existing exceptions for federal criminal law, they may silence legitimate speech, and they may undermine the certainty that the existing law provides.

One of the most notable examples of this approach was the 2018 Stop Enabling Sex Traffickers Act (“SESTA”), which created additional liability for content that could be linked to sex trafficking.38Stop Enabling Sex Traffickers Act of 2017, S. 1693, 115th Cong. § 3 (2018). Other policy makers have suggested similar carve-outs to Section 230 for everything from opioid sales39Samantha Cole, Senator Suggests the Internet Needs a FOSTA/SESTA for Drug Trafficking, Vice (Sept. 5, 2018, 2:47 PM), https://perma.cc/U4LX-QUR7. to illegal vacation rentals.40Bethany Patterson, The PLAN Act and Its Un-Planned Negative Consequences on American Families, Digit. Liberty (Jan. 7, 2020), https://perma.cc/D3WD-BWPU. In the case of many of these proposed carve-outs, including SESTA, the content is already covered by the federal criminal exceptions to Section 230.41Brent Skorup & Jennifer Huddleston Skees, Target Criminals Online, Not Tech Companies Caught in the Middle, Plain Text (Aug. 25, 2017), https://perma.cc/3M2Q-P3JD. However well-intentioned, such proposals are often drafted in ways that would also reach speech well beyond the harmful content.

A carve-out-based approach has a few key problems. First, as illustrated by SESTA, carve-outs are often drafted in a way that negatively impacts speech and may even cause additional harm. Second, by allowing a private right of action, carve-outs may raise cost-of-litigation concerns similar to those a full repeal would, particularly for smaller companies. Third, as the number of carve-outs increases, the certainty granted by Section 230 diminishes, potentially resulting in an even more complicated legal regime for startups than a full repeal would.

Carve-outs can impact far more than just “bad actor” websites frequently used for sex trafficking, such as Backpage.com. For example, following the passage of SESTA, Craigslist removed its personals section due to concerns that it would be held liable for users’ content under the new law.42Jennifer Huddleston Skees, A Cautionary Tale on Internet Freedom Carveouts, The Hill (Jan. 17, 2019, 2:30 PM), https://perma.cc/ZX77-6D5S; see also Aja Romano, A New Law Intended to Curb Sex Trafficking Threatens the Future of the Internet as We Know It, Vox (July 2, 2018, 1:08 PM), https://perma.cc/A9HG-FVYE. This was done out of an abundance of caution that this section of the website might open the company up to additional risk despite its use for legitimate purposes.43See Merrit Kennedy, Craigslist Shuts Down Personals After Congress Passes Bill On Trafficking, NPR (Mar. 23, 2018), https://perma.cc/7YJ4-JQ32. Additionally, the removal of conversations around sex work, for fear of increased liability, increased the risk to sex workers who, unable to share information about unsafe clients online, began looking for clients on the street.44See Cathy Reisenwitz, The SAFE TECH Act Will Make the Internet Less Safe for Sex Workers, OneZero (Mar. 22, 2021), https://perma.cc/3QDL-X8Y2. Similarly, the law made the work of groups helping victims of sex trafficking more difficult: by pushing perpetrators further into the dark web, it made victims harder to identify or assist.45See Siouxsie Q, Anti-Sex-Trafficking Advocates Say New Law Cripples Efforts to Save Victims, Rolling Stone (May 25, 2018, 7:01 PM), https://perma.cc/2GWF-ZQHZ. The potential to impact other speech would be even more pronounced for some proposed carve-outs. For example, a bill targeting opioids could also impact conversations and information sharing among those seeking addiction treatment and support or those with legitimate pain management concerns.

If they allow a private right of action, these carve-outs also raise the increased-litigation risks that would accompany a repeal of Section 230. While the government was able to take down the notorious Backpage.com before SESTA was signed into law, services like Salesforce and Mailchimp found themselves subjected to lawsuits over bad actors’ use of their services in connection with sex trafficking.46See Mike Masnick, Civil FOSTA Suits Start Showing Up in Court; Prove that FOSTA Supporters Were 100% Wrong About Who Would Be Targeted, Techdirt (Jan. 9, 2020, 9:25 AM), https://perma.cc/W5GP-4NHU. While in many cases companies have been able to defeat such lawsuits, without Section 230 protection such litigation can quickly reach tens or hundreds of thousands of dollars and could bankrupt smaller companies and startups.47Engstrom, supra note 21. Such concerns are worsened by drafting that lowers the requirements for bringing such claims. A better solution for going after harmful content already covered by federal criminal law would be for policy makers to ensure that law enforcement has the resources to respond and the training to properly utilize the information tech companies provide about misdeeds such as sex trafficking, illicit drug sales, or child exploitation.48Skorup & Huddleston Skees, supra note 41.

Some of the impacts of carve-outs will depend on how narrowly tailored they are and what changes they make to Section 230’s liability protection. However, a growing number of carve-outs would create a costly and burdensome system that undermines the certainty Section 230 provides. Uncertainty results because the same liability standards would no longer apply generally, subject only to narrow exceptions; instead, interactive computer services would have to determine what liability protection, if any, attaches to each individual piece of content. This uncertainty would carry significant compliance costs. Large platforms might have the resources and staff to navigate compliance with many different content-dependent standards, but small platforms would quickly encounter mounting legal questions. In some cases, platforms might find it easier to presume that the toughest rules (or, in this case, the least protection) apply to the content in question. If numerous carve-outs were passed, the result would be a de facto death by a thousand cuts for Section 230. And because of the nature of the carve-outs, it would be less likely that alternative protection could emerge at common law in the way it might if courts were able to apply existing intermediary standards to the online world.49See id.

E.     State Proposals Regulating Content Moderation

While not directly changing the federal law, several states, including Utah50See Stuart Adams & Mike McKell, Sens. Adams and McKell: Big Tech Poses a Major Threat to Free Speech, Deseret News (Mar. 2, 2021, 8:00 PM), https://perma.cc/F39S-AEDQ. and Florida,51See John Kennedy, Gov. DeSantis Says ‘Big Tech’ Looks Like ‘Big Brother’, Sarasota Herald-Tribune (Feb. 2, 2021, 4:39 PM), https://perma.cc/GH2A-9XSL. have considered proposals that would place additional burdens on interactive computer services and the way they engage in content moderation. These proposals typically respond to claims that social media platforms engage in anti-conservative bias or support “cancel culture.”52See id. The specifics of each state’s proposal vary, but many would prevent platforms from removing certain kinds of content53See Robert Winterton, Florida Governor’s Plan Exposes Children to More Sexual Content, Extremism on the Internet, Mia. Herald (Feb. 16, 2021, 10:21 AM), https://perma.cc/U3JZ-85EG. or would subject platforms to bureaucratic processes for reviewing their terms of service.54Cf. Trevor Wagener, The High Cost of State-by-State Regulation of Internet Content Moderation, Disruptive Competition Project (Mar. 17, 2021), https://perma.cc/ZGR5-A5S6 (noting the cost of compliance and reporting requirements for each moderated post under the Utah proposed legislation).

Regardless of how one feels about the need for revised internet regulation, state-level regulation of content moderation is likely unconstitutional. Not only do these proposals, like many other Section 230 reforms, raise significant First Amendment concerns,55See Ian Richardson, Can Iowa Legally Penalize Social Media Companies for ‘Censorship’? Here Are the Issues at Play, Des Moines Reg. (Mar. 11, 2021, 6:38 PM), https://perma.cc/3FJN-MY63. but they would also undermine the consistent federal legal framework created by Section 230, which, as currently drafted, likely preempts such state-level laws.56See Cathy Gellis, Utah Prematurely Tries to Dance on Section 230’s Grave and Shows What Unconstitutional Garbage Will Follow If We Kill It, Techdirt (Mar. 1, 2021, 3:30 PM), https://perma.cc/2DXU-ZD47. The internet, by its nature, crosses borders, and this has been one of its great advantages. State laws regulating content moderation, like similar proposals around net neutrality and data privacy, could result in a splintering of the internet in which the same services would not be available in every state.57See Jennifer Huddleston & Ian Adams, Regul. Transparency Project, Potential Constitutional Conflicts in State and Local Data Privacy Regulations 1, 8, https://perma.cc/F23N-7UE8 (discussing these issues in the context of state data privacy laws). While a piece of user content may be created in a single state, the interactions that occur around user-generated content and the services that host it are often in multiple other locations. As a result, these laws would likely impact interstate commerce significantly in a way that is not balanced by the state interest,58See id. at 7. raising constitutional concerns under the dormant commerce clause.59See id. Florida’s content moderation law almost immediately faced a legal challenge, and the U.S. District Court for the Northern District of Florida recently issued a preliminary injunction blocking the law from going into effect during the litigation.60Cat Zakrzewski, Federal Judge Blocks Florida Law that Would Penalize Social Media Companies, Wash. Post (June 30, 2021, 11:28 PM), https://perma.cc/G6SH-LBYP.

III.     Is Section 230 Reform Likely?

While there have been multiple proposals for Section 230 reform, legislative change at the federal level seems unlikely, although enough bipartisan overlap may exist for additional carve-outs to pass. Even if congressional action on Section 230 is unlikely, it remains plausible that concerning actions could come through the executive branch or at the state level.

While both Republicans and Democrats have called for changes to Section 230, their motivations are very different and typically incompatible with one another. Democratic proponents of Section 230 reform typically argue that Section 230 results in under-moderation by various services around issues such as hate speech, misinformation, or election interference.61See Mike Masnick, Now It’s the Democrats Turn to Destroy the Open Internet: Mark Warner’s 230 Reform Bill Is a Dumpster Fire of Cluelessness, Techdirt (Feb. 5, 2021, 10:55 AM), https://perma.cc/H7UQ-8Q25. Republican proponents typically argue that Section 230 results in wrongful over-moderation by platforms engaged in anti-conservative bias.62See Jennifer Huddleston, The Problem with Calls for Social Media “Fairness”, Tech. Liberation Front (Sept. 6, 2018), https://perma.cc/CG74-64H6. Between these differences and closely divided chambers of Congress, it is unlikely that a proposal would find bipartisan support. Additionally, some members have recognized in hearings that even if they dislike the decisions of these private platforms, allowing the government to intervene in those decisions could set a dangerous precedent and raise serious First Amendment concerns.63E.g., Reply Comments of Co-Authors of Section 230 of the Communications Act of 1934 at 19–21, In the Matter of National Telecommunications and Information Administration Petition for Rulemaking to Clarify Provisions of Section 230 of the Communications Act of 1934, RM-11862 (FCC Sept. 17, 2020) [hereinafter Reply Comments]. While heightened rhetoric and debate around the future of Section 230 and the appropriate level of regulation of tech companies will likely continue, it seems unlikely that a single proposal could overcome this dichotomy.

Some proposals have garnered bipartisan attention and support, but this support seems reserved for proposals creating additional carve-outs for content that is generally agreed to be harmful. If any action were to occur in the current Congress, it would likely be the passage of additional carve-outs. As discussed in Section II.D, these carve-outs can still significantly harm both speech and new or smaller entrants. A more successful approach to concerns about harmful content would target the underlying behavior, either by passing additional criminal law where necessary or by providing the needed resources to law enforcement or other appropriate programs.

The last few years have shown, however, that while Congress may be the most appropriate and logical place for debates around Section 230, significant changes or threats of changes may also come in other ways. Most notably, in 2020, following an executive order from then-President Donald Trump, the National Telecommunications and Information Administration petitioned the Federal Communications Commission (“FCC”) to engage in interpretative rulemaking regarding Section 230 and limit its current applicability.64See Makena Kelly, Donald Trump Signs Executive Order Targeting Social Media Companies, The Verge (May 28, 2020, 4:32 PM), https://perma.cc/VTW8-8EBJ. The FCC determined that it could engage in such rulemaking on the basis of precedents concluding that the Telecommunications Act of 1996, of which Section 230 is a part, modified the 1934 Act over which the agency has authority.65Thomas M. Johnson Jr., The FCC’s Authority to Interpret Section 230 of the Communications Act, Fed. Commc’n Comm’n (Oct. 21, 2020, 10:30 AM), https://perma.cc/8YDN-59KF. However, the authors of Section 230 submitted comments regarding the petition and noted that the FCC had not been delegated authority over Section 230.66Reply Comments, supra note 63, at 4. The FCC did not reject the petition, but it also did not engage in rulemaking before the end of the Trump administration. While the Biden Administration does not currently seem motivated to take executive action on Section 230 as the prior administration did, President Biden did express during his campaign his belief that Section 230 should be repealed, along with other concerns indicating a desire for additional regulation of the tech sector.67See Makena Kelly, Joe Biden Wants to Revoke Section 230, The Verge (Jan. 17, 2020, 10:29 AM), https://perma.cc/Z9GV-5YRH. Additionally, statements from the Biden Administration in July 2021 regarding health misinformation on social media platforms might indicate that it, too, is considering action regarding content moderation.68Adi Robertson, The Biden Administration Should Take the First Amendment as Seriously as Facebook Misinformation, The Verge (July 21, 2021, 9:17 AM), https://perma.cc/5VFV-J46H.

Additionally, as discussed in Section II.E, states have been increasingly active in attempts to impose additional regulations that would undermine Section 230. A proposal in Utah passed both chambers of the legislature but was vetoed by the governor.69See Bryan Schott, Utah Gov. Spencer Cox Vetoes Controversial Social Media Legislation, Salt Lake Trib. (Mar. 24, 2021, 9:49 AM), https://perma.cc/BQS9-562W. In his veto statement, the governor noted several concerns about the bill’s constitutionality.70See id. State legislatures can move quickly, and it is unclear whether other proposals will pass this term and, if so, whether other governors would recognize these issues. If a state proposal became law, it could disrupt both speech and innovation, absent appropriate steps such as an injunction, even if it were later found unconstitutional. A court challenge can take significant time, and a law goes into effect if an injunction is not granted. For example, California’s net neutrality law was challenged but became effective after an injunction was denied.71See Cecilia Kang, California Wins Court Victory for Its Net Neutrality Law, N.Y. Times (Feb. 25, 2021), https://perma.cc/NC6K-TK8B. Similarly, while SESTA continues to face legal challenges, its consequences have been felt since the law became effective.

While a congressional repeal of Section 230 is unlikely, there remains a significant risk that other actions could result in changes that are harmful to both speech and innovation.

Conclusion

The last few years have seen many policy makers and pundits express concerns about online content and the content moderation decisions of large platforms. But calls to change or revoke Section 230 are likely to have significant consequences for speech and to be particularly burdensome on smaller platforms. Section 230 has been critical in enabling speech and innovation to flourish online, to the great benefit of consumers in their ability both to create and to consume content. While the proposals range from full repeal to the creation of additional carve-outs, policy makers should recognize that the proposals offered so far would have consequences well beyond bad actors or big tech. Instead of seeking to reform Section 230, policy makers should continue the light-touch approach to tech policy that allows a wide array of voices and innovators to flourish and, when necessary, address the underlying concerns regarding harmful content in other ways.
