Moody v. NetChoice, LLC

LII note: The U.S. Supreme Court has now decided Moody v. NetChoice, LLC.

Issues 

Is the First Amendment violated when a state imposes content-moderation restrictions on social media companies’ ability to censor posts or users, or when a state imposes individualized-explanation requirements on social media companies that censor posts or users?

Oral argument: 
February 26, 2024

This case asks the Supreme Court to decide whether the First Amendment is violated when states impose content-moderation restrictions on social media companies and require them to provide individualized explanations when they censor posts or users. Florida Attorney General Moody argues that the content-moderation laws regulate only conduct, not speech, and that at most intermediate scrutiny applies. Moody also argues that social media companies are analogous to common carriers, which are subject to regulation, and that providing individualized explanations is not unduly burdensome for the well-funded social media companies. NetChoice counters that the content-moderation laws restrict editorial discretion, that its members are not common carriers, that strict scrutiny applies to the content-moderation laws, and that the individualized-explanation requirements are too burdensome. The outcome of this case has significant implications for the ability of social media companies to moderate content on their platforms.

Questions as Framed for the Court by the Parties 

Issues: (1) Whether the laws’ content-moderation restrictions comply with the First Amendment; and (2) whether the laws’ individualized-explanation requirements comply with the First Amendment.

Facts 

On August 1, 2021, Senate Bill (“SB”) 7072 took effect in the state of Florida. NetChoice, LLC v. Attorney General at 7. The Bill’s purpose is to protect Floridians from censorship on popular social media sites. Id. at 7. Specifically, Governor Ron DeSantis said that the Bill was created to “fight against big tech oligarchs that . . . censor if you voice views that run contrary to their radical leftist narrative.” Id. at 8. The overall goal of SB 7072 is to protect conservative political viewpoints from being censored on social media sites. Id.

The Bill defines social media platforms by size and revenue thresholds, covering any internet site that surpasses a certain number of users and a certain amount of revenue. Id. at 8. Despite the Bill’s intention to regulate only social media companies, its broad coverage sweeps in other websites such as Wikipedia and Etsy. Id. at 9. To pass SB 7072, Florida compared private social media platforms to public utilities, treating them as common carriers whose speech the state may regulate.

The Bill regulates three main categories: “1) content-moderation restrictions; 2) disclosure obligations; and 3) a user data requirement.” Id. at 9. In this case, the Supreme Court is concerned with the first two: content-moderation restrictions and disclosure obligations. The disclosure requirements in SB 7072 force social media companies to provide “thorough rationale[s]” for all their content-moderation decisions. Id. at 4. This means that any time a platform wishes to remove a post or user that violates its guidelines, it must provide the user with a written notice explaining the decision. Id. at 64. The content-moderation restrictions prohibit social media companies from deplatforming any candidate running for office for more than fourteen days and from censoring or deplatforming any journalistic enterprise. Id. at 11. In addition, the Bill requires platforms to censor or ban content in a consistent manner. Id.

The plaintiffs in this case, NetChoice and the Computer & Communications Industry Association (collectively, “NetChoice”), are trade associations whose members include the largest social media companies, such as Facebook, Google (owner of YouTube), and TikTok. Id. at 13. NetChoice filed suit in the Northern District of Florida to enjoin several of the Bill’s sections. Id. at 13. It argued that the Bill violated its members’ right to free speech and was preempted by federal law. Id. at 13-14. On June 30, 2021, the court found for NetChoice on both issues. Id. at 14. It concluded that the provision of SB 7072 that imposes liability on social media platforms when they remove content was preempted by 47 U.S.C. § 230(c)(2), part of the Communications Decency Act, and that the Bill infringed on the platforms’ protected editorial judgment. Id. The Florida Attorney General, Moody, then appealed this decision to the Eleventh Circuit. Id.

On May 23, 2022, the Eleventh Circuit affirmed the district court’s decision and struck down the Bill’s content-moderation and individualized-explanation provisions on First Amendment grounds. Id. at 66. According to the court, the Bill unlawfully limited the power of social media companies to moderate content on their platforms. Id. at 46. Meanwhile, NetChoice challenged a similar Texas law, and the Fifth Circuit upheld comparable provisions against the social media companies. Brief of Amicus Curiae United States at 2. This decision created a circuit split, and the Solicitor General of the United States, Elizabeth Prelogar, asked the Supreme Court to resolve it. Id.

On September 21, 2022, Moody filed a petition for a writ of certiorari, and the Court granted certiorari on September 29, 2023. Petition for a Writ of Certiorari.

Analysis 

THE SPEECH QUALITY OF CONTENT HOSTING

Moody, et al. (“Moody”) argue that Florida’s content-moderation restrictions regulate only NetChoice’s members’ conduct, not their speech. Brief for Petitioners, Moody et al. at 14-15. Specifically, Moody asserts that NetChoice’s members engage in the conduct of hosting their users’ speech. Id. at 15. Moody contends that the First Amendment protects only conduct that is “inherently expressive” and that determining whether conduct is expressive turns on both the regulated conduct and the regulating law. Id. at 15-16. In particular, Moody says that the Supreme Court has routinely allowed the government to “prevent[] private entit[ies] that generally open [their] doors to all speakers and speech from arbitrarily censoring those speakers.” Id. at 17. Moody points to PruneYard Shopping Center, where the Supreme Court found that a state could require a shopping mall to allow people to collect petition signatures on mall property. Id. Moody emphasizes that the Court supported this finding because the mall was not selective in whom it admitted and because the public would not misattribute the solicitors’ speech to the mall owner. Id.

Moody next turns to the common carrier doctrine, under which governments at common law have regulated the conduct of businesses that hold themselves out as providing common services to everyone, including inns, telegraphs, and telephone lines. Id. at 19-20. Moody argues that NetChoice’s members are analogous to these traditional common carriers because they generally allow users to post content freely without review, often in volumes that would be impossible to review. Id. at 23-24. Moody distinguishes NetChoice’s members from companies that engage in expressive conduct like “newspaper[s], cable television provider[s], publishing house[s], bookstore[s], or movie theater[s]” by pointing out that those companies more carefully select the content they make available and do not rely on content generated by their users. Id. at 24. In sum, Moody concludes that Florida’s requirement that NetChoice’s members apply their moderation policies in a neutral manner is a permissible regulation of their conduct in hosting users’ speech. Id. at 26.

NetChoice counters that Florida’s content-moderation restrictions are impermissible regulations of its members’ speech. Brief for Respondents, NetChoice, LLC, and the Computer & Communications Industry Association at 18. Specifically, NetChoice argues that the content-moderation laws restrict its members’ editorial decisions, which constitute speech. Id. at 18-19. NetChoice submits that the First Amendment protects editorial decisions because the choice to promote or make available certain information communicates a message, even if the promoter did not write the information itself. Id. at 19. NetChoice compares Florida’s content-moderation laws to the law struck down in Miami Herald Publishing Co. v. Tornillo. Id. at 20. NetChoice explains that the Court ruled that a law requiring newspapers to publish politicians’ rebuttals to their stories was unconstitutional because those kinds of editorial decisions constituted protected speech. Id. at 20. NetChoice asserts that protections extending to editorial decisions are not limited to press contexts, pointing to the Court’s past rejection of laws that would have required parade organizers to allow LGBTQ groups to march and utility companies to print third parties’ statements on their bills. Id. at 22.

Next, NetChoice contends that Moody’s common carrier argument is meritless because NetChoice’s members are not common carriers and, regardless, the First Amendment still protects common carriers’ editorial decisions. Id. at 46-47. NetChoice emphasizes the editorial decisions its members already make, submitting that Florida’s content-moderation laws attempt to “convert them into common carriers.” Id. at 47. NetChoice points out that common law accessibility requirements were never placed on entities that actively promoted their users’ speech. Id. It distinguishes its members’ services from telegraphs and telephones by pointing out that its members actively promote and organize their users’ communications, as opposed to the wholly passive role played by traditional communication utilities. Id. at 49. NetChoice also distinguishes Florida’s content-moderation laws from other common carrier laws by arguing that the law does not apply to all common carriers; it applies only to the largest carriers because of their perceived liberal political bias. Id. at 51.

TIERS OF SCRUTINY

Moody argues that if Florida’s laws “regulate expression,” they should only be subject to intermediate scrutiny, a standard that requires a law to further a “sufficiently important governmental interest” through means that are “unrelated to the suppression of free expression.” Brief for Petitioners, Moody et al. at 39. First, Moody contends that intermediate scrutiny applies because Florida’s content-moderation provisions do not limit what kind of content NetChoice’s members promote; they only require that moderation policies be applied identically across viewpoints. Id. at 39. Second, Moody submits that the thirty-day restriction on changing terms of service merely prevents NetChoice’s members from arbitrarily changing their rules to censor opinions with which they disagree. Id. at 40. Moody analogizes these provisions to laws that required cable companies “to carry local stations,” which the Supreme Court found to be content neutral and to pursue an important governmental interest; likewise, Florida’s law “[seeks] to protect the speech” of NetChoice’s members’ users. Id. Moody states that Florida is not seeking to suppress NetChoice’s members’ free expression, pointing to the fact that each of NetChoice’s members promotes unique messages while Florida’s law applies uniformly to all of them. Id. at 41. Moody asserts that if strict scrutiny applied to these provisions, then it should also apply to similar longstanding regulations of the Post Office, cable companies, broadcasters, and newspapers. Id. at 42-43. Moody concludes that Florida’s content-moderation laws pass intermediate scrutiny because the laws “assur[e] that the public has access to a multiplicity of information sources,” which is especially important given the large share of public discourse that occurs on members’ sites. Id. at 43, 45. The laws are “reasonably tailored,” Moody adds, because they cover only platforms large enough to distort speech through censorship. Id.

In contrast, NetChoice argues that strict scrutiny applies to Florida’s content-moderation laws, a standard which requires laws to be “narrowly tailored to serve compelling state interests.” Brief for Respondents, NetChoice, LLC, and the Computer & Communications Industry Association at 27. NetChoice contends that strict scrutiny applies because the content-moderation laws are “content-based” restrictions on speech that regulate speech according to its message. Id. In support of this proposition, NetChoice points to cases where the Supreme Court held that laws requiring clinics to provide information or parade organizers to include certain marchers were content based because they changed the messaging those organizations offered. Id. at 28. NetChoice submits that Florida’s content-moderation laws are similar because they require its members to endorse and promote speech that expresses ideas with which they disagree. Id. In addition, NetChoice contends that the laws warrant strict scrutiny because they single out specific speakers, namely large social media companies, in part for their perceived liberal bias. Id. at 31-33. NetChoice posits that Florida’s content-moderation laws fail strict scrutiny because Florida does not attempt to argue that the laws are “the least restrictive means of achieving a compelling state interest.” Id. at 35-36. Further, NetChoice says that the laws would also fail intermediate scrutiny because they are not “narrowly tailored to serve a significant interest.” Id. at 36. NetChoice submits that the state’s interest in exposing people to different opinions is not significant under Supreme Court precedent. Id. Further, the state’s interest in allowing information to spread, NetChoice maintains, is significant only in contexts like cable or broadcasting, not the internet. Id. at 36-37. NetChoice concludes that the laws are not narrowly tailored because they both overregulate speech that falls outside the state’s interests and underregulate targeted speech by exempting some online platforms. Id. at 38.

INDIVIDUALIZED DISCLOSURE REQUIREMENT

Moody argues that Florida’s individualized disclosure law is permissible under the First Amendment. Brief for Petitioners, Moody et al. at 46. Moody contends that Zauderer applies, a Supreme Court case that allowed required disclosures of factual information when the rules are not “unduly burdensome” and are “reasonably related to the State’s interest in preventing deception of consumers.” Id. Moody argues that the requirements are not unduly burdensome because the regulated companies have the technological capability to generate individualized responses with little cost in time or money, pointing to their compliance with a similar European law. Id. at 48-49.

NetChoice counters that the individualized disclosure law is prohibited by the First Amendment. Brief for Respondents, NetChoice, LLC, and the Computer & Communications Industry Association at 39. NetChoice maintains that Zauderer should not apply because the individualized disclosure provision regulates speech based on its content, which warrants strict scrutiny, and submits that the law fails Zauderer regardless. Id. at 39. NetChoice argues that the individualized disclosure rule unduly burdens its members’ editorial speech because they would have to expend so many resources sending those disclosures that they would likely choose not to moderate their users at all. Id. at 26, 39-40.

Discussion 

ECONOMIC AND COMPETITION ISSUES

The media conglomerate iTexasPolitics LLC, in support of Petitioners, argues that the Florida law promotes consistency in social media companies’ screening and removal policies, creating transparency between users and platforms. Brief of Amicus Curiae iTexasPolitics LLC, in Support of Respondents at 26. iTexasPolitics LLC argues that this enhanced transparency will create efficiencies. Id. at 27. It claims that user disputes will decline because the platforms’ accepted terms of use will be clear. Id.

Retired Professor Eric Rasmusen (“Professor”), in support of Petitioners, argues that the current lack of regulation of social media competition allows the federal government to censor opposition speech. Brief of Amicus Curiae Eric Rasmusen, in Support of Respondents at 30. The Professor argues that requiring moderation algorithms to be public will combat the federal government’s suppression of speech on social media. Id. Moms for Liberty, in support of Petitioners, argues that the States are best suited to regulate social media companies and to remove any federal government influence on their restrictions of posts or users. Brief of Amicus Curiae Moms for Liberty, in Support of Respondents at 32-33.

A group of small internet companies (“Internet Works”), in support of Respondents, counters that Florida’s law applies to numerous internet companies with no market power, not just large social media companies such as X or Facebook. Brief of Amicus Curiae Internet Works et al., in Support of Respondents at 6-8. Internet Works argues that the law will result in needless litigation over websites like Reddit and Trip Advisor banning individuals from message boards or deleting online reviews. Id. at 10-11. The U.S. Chamber of Commerce, in support of NetChoice, argues that the law reduces a platform’s ability to remove spam, “offensive content, and explicitly inflammatory language.” Brief of Amicus Curiae Chamber of Commerce, in Support of Respondents at 5-6.

The Internet Society, in support of Respondents, argues that this broad regulation will hinder new social media platforms’ growth because, unlike large existing social media companies, they lack the resources to comply with the regulations. Brief of Amicus Curiae Internet Society, in Support of Respondents at 14. The Internet Society also contends that social media companies compete on their content-moderation policies (for example, some users join Truth Social because it moderates less), so subjecting all platforms to the same rules will eliminate that competition from the market. Id. at 15.

IMPLICATIONS FOR USERS

A group of seventeen states (“States”), in support of Petitioners, argues that a concentrated few social media companies wield market power and can stifle the diversity of ideas through censorship. Brief of Amicus Curiae Missouri et al., in Support of Petitioners at 26. The States argue that these social media platforms primarily target conservative ideas for censorship. Id. at 33. They claim that this concentrated effort by the platforms to suppress speech leaves the public uninformed, which undermines our “constitutional project.” Id. at 36.

The World Faith Foundation, in support of Petitioners, similarly argues that blocking, censoring, or removing content and users on social media platforms reduces the amount of information available to the electorate and undermines our political process. Brief of Amicus Curiae The World Faith Foundation, in Support of Petitioners at 23. It argues that the community guidelines or goals of social media companies, such as YouTube’s aim “to create a welcoming community,” result in viewpoint-based censorship that attacks the views of those whose values differ from the company’s. Id. at 36.

The Anti-Defamation League, in support of neither party, argues that the law hurts internet users because it deprives social media companies of the ability to stop hate and harassment. Brief of Amicus Curiae Anti-Defamation League, in Support of Respondents at 14. The Anti-Defamation League contends that social media companies combat hate and violence against religious and racial minorities by purging their platforms of extremist and radical speech. Id. at 21-22. It argues that the law’s consistent-application requirement will force platforms to restrict all speech relating to an issue, not just harmful posts. Id. For example, a platform that wishes to censor racially intolerant and discriminatory posts would also have to restrict posts that promote racial equity. Id.

Professors Richard Hasen et al., in support of Respondents, argue that the law will increase election violence because platforms will be unable to remove dangerous political content. Brief of Amici Curiae Professors Richard Hasen et al., in Support of Respondents at 9-10. Public Knowledge, a consumer rights group, argues in support of Respondents that social media companies must set the parameters of speech on their platforms to foster an environment conducive to free speech, not an environment of intimidation. Brief of Amicus Curiae Public Knowledge, in Support of Respondents at 28.

Conclusion 

Written by:

John Orona

Alexander Strohl

Edited by:

Dustin Hartuv
