Private Platforms and Public Speech: Rethinking the First Amendment in the Social Media Age
By Eden Reynolds
Nearly every generation in America uses social media to connect, learn, or engage in public conversation.[1] Platforms like Facebook, X (formerly Twitter), Instagram, and TikTok function as the new town square, yet the government owns none of them.[2] This reality presents a constitutional paradox: while the First Amendment protects citizens from government censorship, it does not constrain the private platforms that now mediate most of our public discourse.[3] As social media becomes central to communication and politics, courts and legislatures grapple with a fundamental question: what does “free speech” mean when private companies control the space where speech happens?[4]
Two recent Supreme Court cases, Packingham v. North Carolina and Murthy v. Missouri, capture the tension between government regulation and private moderation.[5] State and federal lawmakers have proposed hundreds of bills to regulate how these platforms operate.[6] These developments suggest that the First Amendment is entering a new era, one defined by blurred boundaries between public and private control over speech.[7]
Constitutional Baseline: Government vs. Private Regulation.
The First Amendment restricts only government action.[8] Private entities are not state actors and therefore have their own First Amendment rights to curate, moderate, or remove speech.[9] Because social media companies are privately owned, the First Amendment does not bind them.[10] Not only may these private companies write community guidelines that limit users’ speech, but Section 230 of the Communications Decency Act also shields online platforms from liability both for user content and for their moderation choices.[11] This statutory immunity allows platforms to remove or restrict posts they find objectionable without fear of legal consequence.[12] For example, TikTok prohibits “hateful behavior or ideologies.”[13] The Supreme Court has twice declined to limit Section 230.[14] The Court suggested that any change should come from Congress, warning that revoking the law could destabilize the internet.[15] Supporters argue that Section 230 prevents an avalanche of lawsuits and encourages free expression online;[16] critics counter that it grants private corporations excessive power to silence certain viewpoints.[17]
The result is a constitutional divide: the government cannot censor, but private platforms can.[18] Yet, when the government pressures or collaborates with these platforms to suppress information, the First Amendment may reenter the picture.
Government Officials Online: When Public Accounts Become Public Forums.
As social media became a vital tool of communication for public officials, courts began addressing whether an official’s account qualifies as a “public forum.”[19] In Knight First Amendment Institute v. Trump, the court held that the President’s Twitter account was a public forum because he used it to conduct official business and interact with constituents.[20] Blocking users for their viewpoints therefore violated the First Amendment.[21] The Fourth Circuit reached the same conclusion in Davison v. Randall regarding a local official’s Facebook page.[22] Not all courts agree, however. In Morgan v. Bevin, a federal district court in Kentucky held that the Governor’s Facebook and Twitter accounts were not public forums because they were personal and not state controlled.[23] The doctrine turns on whether the space is government-controlled, not merely publicly accessible.
The Supreme Court’s decision in Packingham underscored the significance of online expression but stopped short of treating private social media as government forums.[24] The Court struck down a state law prohibiting registered sex offenders from accessing social media, calling such platforms “the most important places for the exchange of views.”[25] However, the Court also cautioned against treating private social media sites as constitutional public forums and recognized that ownership and control still matter.[26]
Government Regulation of Private Platforms.
Beyond public officials’ accounts, governments have sought to regulate the platforms themselves.[27] Since 2021, state legislatures have introduced more than 400 bills targeting social media regulation, ranging from content-moderation transparency requirements to outright bans.[28]
TikTok, for example, became a central focus.[29] In 2020, President Trump issued an executive order banning TikTok and WeChat from app stores, citing national security concerns.[30] The order never took full effect.[31] Montana later attempted to ban TikTok statewide, and in 2024, President Biden signed a federal law forcing TikTok’s Chinese parent company, ByteDance, to sell the app or cease operations in the United States.[32] Supporters defended the law as a data privacy safeguard;[33] opponents viewed it as a disguised speech restriction.[34]
Meanwhile, over a dozen states have enacted laws restricting minors’ access to social media and targeting “addictive” design features.[35] Courts are now deciding whether these measures infringe on free expression or fall within states’ traditional police powers.[36] California took a different approach in 2022, requiring platforms to disclose their content moderation policies.[37] When X challenged the law, a federal court upheld it, ruling that the disclosure requirement promoted transparency without compelling speech or restricting content.[38] These state laws highlight the growing desire to hold private platforms accountable, but they also reveal how the First Amendment limits direct government control over private moderation practices.[39]
Private Power Over Public Speech.
The Constitution restrains the state, but private platforms regulate most of the nation’s expressive life.[40] These companies determine what speech is amplified, suppressed, or monetized.[41] They enforce age restrictions; Utah’s and Arkansas’s parental consent laws mirror platform-driven initiatives like Instagram’s screen-time limits.[42] And they remove misinformation, hate speech, and nudity under their community guidelines.[43]
These questions parallel debates abroad.[44] In the European Union, the “right to be forgotten” allows individuals to request that search engines delete personal information.[45] United States courts have consistently rejected similar claims.[46] In Florida Star v. B.J.F. and Martin v. Hearst Corp., courts held that laws requiring the removal of truthful information violate the First Amendment as impermissible prior restraints.[47] Free expression remains paramount,[48] even when it collides with privacy or reputation interests.[49]
The Government–Platform Nexus: Murthy v. Missouri.
The Supreme Court’s recent decision in Murthy v. Missouri marks the next stage of this debate.[50] Plaintiffs alleged that federal officials coerced social media platforms into suppressing posts related to COVID–19 and election integrity, amounting to government-orchestrated censorship.[51] The Court ultimately dismissed the case on standing grounds, but it emphasized that First Amendment violations require strong factual records linking government acts to the suppression of speech.[52] Justice Alito’s dissent warned of the dangers of subtle government “pressure campaigns” that turn private moderation into state censorship by proxy.[53] Murthy leaves open a critical constitutional question: when does cooperation between the government and platforms become unconstitutional coercion? This gray area, where public influence meets private control, will shape the next generation of First Amendment law as digital speech regulation grows more complex.[54]
Conclusion
The First Amendment was written for a world of printing presses and town squares, not algorithms and app stores. Its core purpose, to preserve open debate and limit state control over expression,[55] remains as vital as ever.[56] Today, more speech moves to private platforms, and those companies increasingly determine what “free speech” means in practice.[57] Courts resist expanding constitutional duties to private entities and prefer to leave reform to Congress.[58] But as social media platforms govern public discourse for billions, the line between private and public power continues to erode.[59] The future of free speech depends less on government restraint and more on the transparency, fairness, and accountability of the private actors who now host our democracy’s conversation.[60] So, who guards the digital public square? Until the law adapts, corporate policy, algorithms, and private platforms’ community guidelines will draw the boundaries of free expression, because the Court and Congress have declined to do so.[61]
[1] Aishwarya Suresh & Sayan Nan, Social Media in America: 10 Stats that are Changing in 2025, Sprinklr (Feb. 17, 2025), https://www.sprinklr.com/blog/social-media-in-america/.
[2] Lata Nott, Free Speech on Social Media: The Complete Guide, FREEDOM FORUM (Oct. 12, 2023), https://www.freedomforum.org/free-speech-on-social-media/.
[3] Id.
[4] Id.
[5] Packingham v. North Carolina, 582 U.S. 98 (2017); Murthy v. Missouri, 603 U.S. 43 (2024).
[6] Kevin Goldberg, Does Government Regulation of Social Media Violate the First Amendment?, FREEDOM FORUM (Aug. 26, 2024), https://www.freedomforum.org/government-regulation-social-media/.
[7] Id.
[8] Nott, supra note 2.
[9] Id.
[10] Id.
[11] What Is Section 230 and Why Should I Care?, FREEDOM FORUM, https://www.freedomforum.org/what-is-section-230/ (last visited Oct. 12, 2025).
[12] Id.
[13] Nott, supra note 2 (quoting TikTok’s community guidelines, “We do not allow any hateful behavior, hate speech, or promotion of hateful ideologies. This includes content that attacks a person or group because of protected attributes.”).
[14] Id.
[15] Id.
[16] Id.
[17] Id.
[18] Nott, supra note 2.
[19] Isabel Farhi, Twenty-First Century First Amendment: Public Forums in the Digital Age, Yale L. Sch.: Media Freedom & Info. Access Clinic (Oct. 29, 2018), https://law.yale.edu/mfia/case-disclosed/twenty-first-century-first-amendment-public-forums-digital-age.
[20] Knight First Amend. Inst. v. Trump, 302 F. Supp. 3d 541 (S.D.N.Y. 2018), aff’d, 928 F.3d 226 (2d Cir. 2019), cert. granted, judgment vacated sub nom. Biden v. Knight First Amend. Inst. at Columbia Univ., 141 S. Ct. 1220 (2021).
[21] See Farhi, supra note 19.
[22] Davison v. Randall, 912 F.3d 666 (4th Cir. 2019), as amended (Jan. 9, 2019).
[23] Morgan v. Bevin, 298 F. Supp. 3d 1003 (E.D. Ky. 2018).
[24] Packingham v. North Carolina, 582 U.S. 98, 105 (2017).
[25] Id. at 104.
[26] Id.
[27] Goldberg, supra note 6.
[28] Id.
[29] Id.
[30] Exec. Order No. 13942, 85 Fed. Reg. 48673 (2020); Exec. Order No. 13971, 86 Fed. Reg. 1249 (2021).
[31] Goldberg, supra note 6.
[32] Protecting Americans from Foreign Adversary Controlled Applications Act, Pub. L. 118-50, 138 Stat. 895 (2024).
[33] Goldberg, supra note 6.
[34] Id.
[35] U.S. State Law Comparisons for Adult Content 2024, Age Verification Providers Ass’n (June 25, 2024), https://avpassociation.com/thought-leadership/us-state-law-comparisons-for-adult-content-2024/.
[36] Goldberg, supra note 6.
[37] Id.
[38] Id.
[39] Id.
[40] Id.
[41] Id.
[42] Id.
[43] Nott, supra note 2.
[44] Gene Policinski, The Right to Be Forgotten: Everything to Know About Erasing Digital Footprints, FREEDOM FORUM, https://www.freedomforum.org/right-to-be-forgotten/ (last visited Oct. 12, 2025).
[45] Id.
[46] Id.
[47] The Fla. Star v. B.J.F., 491 U.S. 524 (1989); Martin v. Hearst Corp., 777 F.3d 546 (2d Cir. 2015).
[48] Policinski, supra note 44.
[49] Id.
[50] See Murthy v. Missouri, 603 U.S. 43 (2024).
[51] Id. at 43.
[52] Id. at 57.
[53] Id. at 80.
[54] Goldberg, supra note 6.
[55] Farhi, supra note 19.
[56] Id.
[57] Nott, supra note 2.
[58] What Is Section 230 and Why Should I Care?, supra note 11.
[59] Id.
[60] Id.
[61] Id.