Online Safety Laws and Regulations in the US
The US regulatory framework governing online safety spans federal statutes, agency rulemaking, and a growing body of state legislation that collectively define legal obligations for platforms, service providers, and users across digital environments. This page maps the structure of that framework — covering the major laws, enforcement bodies, classification distinctions, and areas of active legal tension that shape how online safety obligations are assigned and enforced. Professionals navigating compliance requirements, researchers analyzing the policy landscape, and organizations assessing service-sector obligations will find this a structured reference for understanding where legal authority sits and how it operates.
- Definition and scope
- Core mechanics or structure
- Causal relationships or drivers
- Classification boundaries
- Tradeoffs and tensions
- Common misconceptions
- Checklist or steps (non-advisory)
- Reference table or matrix
Definition and scope
Online safety law, as a regulatory category, addresses the legal duties imposed on digital platforms, internet service providers, content hosts, and data processors to prevent harm arising from online activity. That harm can take multiple forms: exposure to illegal or harmful content, identity-based harassment, exploitation of minors, unauthorized data collection, cyberfraud, and nonconsensual distribution of private material.
At the federal level, "online safety" does not map to a single statute. Instead, it is distributed across sector-specific laws: children's privacy under the Children's Online Privacy Protection Act (COPPA), platform liability limits under Section 230 of the Communications Decency Act, child sexual abuse material (CSAM) prohibitions under 18 U.S.C. §§ 2251–2252A (with definitions at § 2256), and cyberstalking under 18 U.S.C. § 2261A. The Federal Trade Commission (FTC) and the Department of Justice (DOJ) hold primary federal enforcement authority across these areas.
State-level frameworks have expanded substantially since 2021, with at least 15 states enacting or advancing laws addressing social media safety for minors, age verification requirements, or platform transparency mandates as of 2023 (National Conference of State Legislatures).
Core mechanics or structure
Federal online safety regulation operates through four primary structural mechanisms:
Statutory prohibitions establish absolute floors — conduct that is criminalized regardless of platform. CSAM distribution under 18 U.S.C. § 2252 is a hard prohibition. Platforms cannot contract around it, and Section 230 immunity does not apply to federal criminal law.
Rulemaking authority delegates to agencies the power to set enforceable standards below the statutory ceiling. The FTC's COPPA Rule (16 C.F.R. Part 312) requires verifiable parental consent before collecting personal data from children under 13. Civil penalties under COPPA can reach $51,744 per violation as of the FTC's 2024 penalty adjustment (FTC Civil Penalty Adjustments).
Liability structuring through Section 230 defines when platforms bear third-party content liability. The statute grants interactive computer services broad immunity from civil claims based on user-generated content, while carving out exceptions for federal criminal law and for sex trafficking under the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA).
Sector-specific compliance frameworks apply where an industry vertical intersects with online activity. The Health Insurance Portability and Accountability Act (HIPAA), enforced by the HHS Office for Civil Rights, governs protected health information transmitted through online platforms. The Gramm-Leach-Bliley Act (GLBA) governs financial data in online contexts.
Causal relationships or drivers
The fragmentation of US online safety law is not accidental — it reflects three structural causes.
Federalism constraints mean federal preemption operates only through enacted federal law; where Congress has not legislated, states retain authority. In the absence of a comprehensive federal online safety statute (the US has no equivalent of the UK's Online Safety Act 2023), states have filled the legislative vacuum, producing divergent standards across jurisdictions.
Platform growth outpaces legislative cycles. The original Section 230 was enacted in 1996, before the commercial internet reached scale, and has not been comprehensively revised despite platforms now handling billions of daily interactions. The EARN IT Act (Senate Judiciary Committee) and the Kids Online Safety Act (KOSA, Senate Commerce Committee) represent ongoing legislative attempts to address this gap as of the 118th Congress.
Incident-driven rulemaking accelerates regulatory action. The FTC's 2019 settlement with Facebook for $5 billion — at that time the largest FTC penalty in history (FTC Press Release, July 2019) — followed documented evidence of user privacy violations, demonstrating how enforcement outcomes drive subsequent rulemaking priority.
Classification boundaries
Online safety obligations differ by entity type, content type, and user population served. Key classification distinctions include the following (a simplified code sketch follows these lists):
By entity type:
- Interactive computer services (platforms, social networks, app stores) hold Section 230 protections but face COPPA, FOSTA, and emerging state obligations.
- Internet access providers (ISPs) are regulated under FCC authority and have distinct obligations under network neutrality frameworks.
- Data brokers face separate state registration requirements (California's AB 1202, Vermont's Act 171 of 2018) and proposed federal regulation under the American Data Privacy and Protection Act (ADPPA).
By user population:
- Platforms directed at or with actual knowledge of users under 13 face COPPA obligations.
- Platforms used by minors under 17 face obligations under the proposed KOSA standards and under enacted state laws, including Utah's Social Media Regulation Act (S.B. 152, 2023) and Arkansas's Social Media Safety Act (Act 689, 2023).
By content type:
- CSAM: zero-tolerance federal criminalization, mandatory reporting to the National Center for Missing & Exploited Children (NCMEC) under 18 U.S.C. § 2258A.
- Terrorist content, violent extremism: no standalone federal removal mandate; addressed through voluntary platform policies and DOJ/FBI coordination.
- Nonconsensual intimate imagery: criminalized in 48 states as of 2023 (Cyber Civil Rights Initiative state law tracker).
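Read together, these axes behave like independent lookup dimensions: each classification an entity falls into adds obligations rather than replacing them. The sketch below is a simplified illustrative model of that additive structure, not a compliance tool; the dictionary keys, labels, and rule set are assumptions made for this example.

```python
# Illustrative sketch only: a simplified model of the classification axes
# described above. Real applicability analysis depends on statutory text,
# rulemaking, and case law; the labels and rules here are hypothetical.

APPLICABLE = {
    "entity": {
        "interactive_computer_service": ["Section 230 (immunity + carve-outs)", "COPPA", "FOSTA"],
        "isp": ["FCC authority", "network neutrality frameworks"],
        "data_broker": ["State registration (e.g., CA AB 1202, VT Act 171)"],
    },
    "user_population": {
        "under_13_directed_or_known": ["COPPA verifiable parental consent"],
        "minors_under_17": ["Proposed KOSA duty of care", "State minor-safety statutes"],
    },
    "content": {
        "csam": ["18 U.S.C. § 2258A mandatory NCMEC reporting"],
        "nonconsensual_intimate_imagery": ["State criminal statutes (48 states, 2023)"],
    },
}

def applicable_frameworks(entity: str, populations: list[str], content: list[str]) -> list[str]:
    """Collect the framework labels triggered by each classification axis."""
    hits = list(APPLICABLE["entity"].get(entity, []))
    for p in populations:
        hits += APPLICABLE["user_population"].get(p, [])
    for c in content:
        hits += APPLICABLE["content"].get(c, [])
    return hits

# Example: a social platform with known under-13 users hosting user imagery.
print(applicable_frameworks(
    "interactive_computer_service",
    ["under_13_directed_or_known"],
    ["nonconsensual_intimate_imagery"],
))
```

A real applicability analysis turns on statutory definitions and case law rather than string labels, but the additive lookup captures why a single platform can face Section 230, COPPA, and state-law obligations simultaneously.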
Tradeoffs and tensions
The most contested area in US online safety law is the relationship between harm prevention and First Amendment protections. The Supreme Court's 2024 decisions in Moody v. NetChoice and NetChoice v. Paxton addressed whether states could compel social media platforms to carry, or prohibit the removal of, certain categories of speech. The Court's remand for further proceedings left unresolved the precise constitutional scope of state platform regulation (Supreme Court, July 2024).
Age verification mandates generate a parallel tension: enforcing age gates requires collecting identity data, which creates new privacy risks — particularly for minors — that can undercut the safety goals the mandate was designed to achieve. The Electronic Frontier Foundation (EFF) has documented how age verification architectures can create data breach exposure and surveillance infrastructure.
Section 230 reform proposals split along a structural fault line: narrowing immunity could deter harmful content hosting, but it could equally deter platforms from moderating content at all (due to increased liability risk from active moderation decisions), producing a chilling effect on the safety-oriented interventions reformers seek to encourage.
Common misconceptions
Misconception: Section 230 protects platforms from all liability.
Correction: Section 230(e) explicitly preserves federal criminal liability, intellectual property claims, and post-FOSTA liability for sex trafficking facilitation. Immunity is civil and has statutory carve-outs.
Misconception: COPPA applies to all minors under 18.
Correction: COPPA applies specifically to children under 13. The FTC's rule at 16 C.F.R. § 312.2 defines "child" as an individual under 13. Separate state laws (California's Age-Appropriate Design Code (AADC), now enjoined) and proposed federal statutes address the 13–17 range.
Misconception: State online safety laws are automatically preempted by federal law.
Correction: Federal preemption applies only where Congress has expressly legislated or where conflict makes compliance with both standards impossible. In the absence of a comprehensive federal online safety statute, state laws operate independently in most domains.
Misconception: Mandatory CSAM reporting applies only to large platforms.
Correction: 18 U.S.C. § 2258A applies to any "electronic service provider" — including small hosting services — that obtains actual knowledge of apparent CSAM. Failure to report carries criminal penalties.
Checklist or steps (non-advisory)
The following checklists describe how regulatory compliance assessment is typically structured for entities operating in the online safety space. This is a structural description of common compliance workflow, not legal or professional advice. A schematic code restatement follows the Federal Baseline Assessment list below.
Federal Baseline Assessment
- [ ] Identify whether the entity qualifies as an "interactive computer service" under 47 U.S.C. § 230(f)(2)
- [ ] Determine if the platform is "directed to children" or has "actual knowledge" of users under 13 per 16 C.F.R. § 312.2 (COPPA threshold)
- [ ] Confirm whether the entity is an "electronic service provider" with CSAM mandatory reporting obligations under 18 U.S.C. § 2258A
- [ ] Assess whether HIPAA or GLBA sector-specific overlays apply based on data categories processed
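Stated schematically, the federal baseline reduces to a handful of threshold tests, each keyed to a statutory definition. The sketch below restates the checklist as boolean gates; the field names and simplified logic are assumptions made for illustration and encode the checklist's structure, not legal conclusions.

```python
# Schematic restatement of the federal baseline checklist above.
# Field names and threshold logic are illustrative assumptions, not legal tests.
from dataclasses import dataclass, field

@dataclass
class Entity:
    hosts_third_party_content: bool      # proxy for 47 U.S.C. § 230(f)(2) status
    directed_to_children: bool           # 16 C.F.R. § 312.2 (COPPA threshold)
    actual_knowledge_under_13: bool
    provides_electronic_service: bool    # proxy for 18 U.S.C. § 2258A coverage
    data_categories: set[str] = field(default_factory=set)

def federal_baseline(e: Entity) -> dict[str, bool]:
    """Map each checklist item to a boolean flag for a given entity."""
    return {
        "section_230_interactive_computer_service": e.hosts_third_party_content,
        "coppa_threshold": e.directed_to_children or e.actual_knowledge_under_13,
        "csam_reporting_2258A": e.provides_electronic_service,
        "hipaa_overlay": "PHI" in e.data_categories,
        "glba_overlay": "financial" in e.data_categories,
    }

# Example: a platform with known under-13 users that processes financial data.
print(federal_baseline(Entity(
    hosts_third_party_content=True,
    directed_to_children=False,
    actual_knowledge_under_13=True,
    provides_electronic_service=True,
    data_categories={"financial"},
)))
```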
State Law Mapping
- [ ] Identify all states in which users are located or services are directed
- [ ] Map applicable state data privacy laws (California CCPA/CPRA, Virginia CDPA, Colorado CPA, among others)
- [ ] Check enacted or pending social media minor-safety statutes in states with active user bases
- [ ] Review nonconsensual intimate imagery statutes where platform hosts user content
Incident Response Framework
- [ ] Confirm NCMEC CyberTipline reporting pipeline for CSAM under 18 U.S.C. § 2258A
- [ ] Map state data breach notification timelines (72-hour requirements apply in some states; a deadline-arithmetic sketch follows this list)
- [ ] Identify FTC contact protocols for consumer protection incidents involving minors
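The notification-timeline item is, mechanically, clock arithmetic once an incident discovery time is fixed. A minimal sketch follows, assuming hypothetical per-state windows; actual statutes differ in trigger conditions, covered data, and duration.

```python
# Minimal deadline-arithmetic sketch. The hour values below are placeholders;
# actual state notification windows and their trigger conditions vary.
from datetime import datetime, timedelta

NOTIFICATION_WINDOW_HOURS = {  # hypothetical example values
    "state_a": 72,
    "state_b": 720,  # e.g., a 30-day statute expressed in hours
}

def notification_deadline(discovered_at: datetime, state: str) -> datetime:
    """Discovery timestamp plus the state's notification window."""
    return discovered_at + timedelta(hours=NOTIFICATION_WINDOW_HOURS[state])

print(notification_deadline(datetime(2024, 1, 5, 9, 0), "state_a"))  # 2024-01-08 09:00
```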
Reference table or matrix
| Law / Framework | Governing Body | Scope | Key Obligation | Penalty Structure |
|---|---|---|---|---|
| COPPA (15 U.S.C. §§ 6501–6506) | FTC | Platforms collecting data from under-13s | Verifiable parental consent | Up to $51,744 per violation (FTC, 2024) |
| Section 230 (47 U.S.C. § 230) | No single enforcer | Interactive computer services | Civil immunity for third-party content | Liability carve-outs for federal crimes, IP, FOSTA |
| FOSTA-SESTA (Pub. L. 115-164) | DOJ / FTC | Platforms hosting sex trafficking facilitation | Remove, report, or face civil/criminal liability | Criminal penalties; civil suits by victims |
| 18 U.S.C. § 2258A | DOJ / NCMEC | All electronic service providers | Report CSAM to NCMEC CyberTipline | Criminal penalties for failure to report |
| HIPAA (45 C.F.R. Parts 160, 164) | HHS / OCR | Covered entities and business associates | Safeguard PHI in online systems | Up to $1.9 million per violation category per year (HHS OCR) |
| CCPA/CPRA (Cal. Civil Code § 1798.100) | California AG / CPPA | Businesses processing CA resident data | Opt-out rights, data minimization | $7,500 per intentional violation |
| Kids Online Safety Act (S. 1409, 118th Cong.) | FTC (proposed) | Platforms used by minors under 17 | Duty of care, default safe settings | Civil penalties (amounts pending enactment) |
| Utah SB 152 (2023) | Utah AG | Social media platforms / minors | Age verification, parental consent | State civil enforcement |