
The Evolving Landscape of Content Moderation: A Recap on the Influence of Online Safety Legislation

Moderating digital content, particularly to shield children from potentially harmful material, has become a complex and ongoing challenge for state lawmakers. Over the past few years, the admirable goal of protecting young internet users has sparked a surge in proposals focused on enhancing online safety for children. Digital platforms and service providers are equally committed to this goal and are constantly developing more advanced moderation tools to create safer spaces online. Today, a range of tools exists at various levels—application, device, and internet service provider (ISP)—all designed to block inappropriate content and provide a more secure browsing environment for young users.

However, while the intentions behind these legislative efforts are commendable, many of these proposals introduce new privacy concerns and unintended barriers to accessing crucial online communities and information. Laws like the Age-Appropriate Design Code (AADC) and mandatory age verification requirements have gained momentum across the country. Because of constitutional and federal preemption concerns, these measures have encountered significant legal challenges in states including Arkansas, California, Florida, Mississippi, Ohio, Tennessee, Texas, and Utah. This wave of litigation reflects widespread doubt about whether these laws can be reconciled with the U.S. Constitution and federal law, and it raises concerns about the taxpayer cost of defending potentially unenforceable legislation.

For example, the AADC model aims to tailor content for younger users in an age-appropriate manner. This approach, however, stifles access to information, a right the First Amendment protects. Protecting children from possible online harms does not justify sweeping authority to restrict the ideas they may encounter, as the court emphasized in NetChoice v. Bonta. Speech that is not obscene for minors and does not violate other valid laws cannot be restricted simply to shield young users from content that a legislature deems inappropriate for them. An amended version of the AADC was passed in Maryland in 2024, with similar proposals emerging in Hawaii, Minnesota, Illinois, and Vermont, states that generally lean toward Democratic majorities. Despite legal challenges, this model continues to gain traction, reflecting lawmakers’ determination to pursue such approaches even when they may not fully align with federal law.

Simultaneously, age verification and parental consent mandates are being introduced, particularly in states with Republican majorities. These laws require users to confirm their age or obtain parental consent through digital or even physical means, posing significant privacy and operational challenges. States like Tennessee, Florida, and Iowa have advanced these proposals despite pending lawsuits in places like Arkansas and Ohio, underscoring the legal risk and compliance burden involved. By enforcing such laws prematurely, these states risk shifting costly legal burdens onto businesses and, as noted above, onto taxpayers.

Other states, including Alabama and Arizona, have introduced even stricter measures, requiring devices like smartphones and tablets to include default “content filters.” While these measures are intended to block access to harmful content, they also raise concerns about privacy, access to information, and potential government overreach. Often marketed as a way to prevent children from accessing explicit material, these filters would place broad restrictions on devices purchased by adults. Moreover, readily available parental controls on most devices offer similar protections without infringing on individual freedoms, empowering parents to choose the content they want their children to access and engage with.

These state-level proposals diverge sharply from the federal Children’s Online Privacy Protection Act (COPPA), which governs online privacy for children under 13. COPPA relies on self-attestation, where users simply declare their age when creating accounts. However, recent legislation indicates that state legislators may no longer view self-attestation as a reliable method for confirming user age and will likely continue to propose stricter measures.
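To make the contrast concrete, the sketch below is a hypothetical illustration (not any statute’s requirements or any platform’s actual implementation) of the self-attestation model in its simplest form: the service accepts whatever birthdate the user supplies and merely routes apparent under-13 users to a parental consent flow, rather than demanding the document checks or identity verification that stricter state mandates contemplate.

```python
from datetime import date

# Hypothetical self-attestation age gate (illustrative only): the service
# trusts the birthdate the user enters at sign-up, in the spirit of
# COPPA-style compliance, rather than verifying it against documents or ID.
COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def age_from_birthdate(birthdate: date, today: date) -> int:
    """Compute age in whole years from a self-attested birthdate."""
    had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
    return today.year - birthdate.year - (0 if had_birthday else 1)

def requires_parental_consent(birthdate: date) -> bool:
    """Return True if the self-attested birthdate indicates a user under 13."""
    return age_from_birthdate(birthdate, date.today()) < COPPA_AGE_THRESHOLD

# Example: a user self-attests a 2015 birthdate, so the sign-up flow would
# route them to parental consent instead of blocking them or demanding ID.
if requires_parental_consent(date(2015, 6, 1)):
    print("Route to parental consent flow")
else:
    print("Proceed with standard sign-up")
```

The point of the illustration is that nothing in this flow collects identity documents or biometric data; that gap is precisely what the newer state mandates aim to close, and it is the source of the privacy concerns discussed above.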

As lawmakers work to balance online child safety with privacy and information access rights, it is crucial that they carefully consider the broad range of potential consequences these measures may entail. Without that care, well-intended efforts to safeguard children could infringe on digital freedoms, privacy, and open access for all.
