Constitutional Barriers to Social Media Regulation
A landmark ruling has reshaped lawmakers’ ability to regulate social media. On March 31, in NetChoice v. Griffin, US District Judge Timothy Brooks of the Western District of Arkansas issued a final order invalidating Arkansas Act 689, a law requiring social media companies to verify all Arkansas users’ ages and to allow minors to use their services only with parental consent. The opinion, which stated that the law “takes a hatchet to adults’ and minors’ protected speech,” lays out key guidelines for lawmakers crafting such bills and demonstrates the need to carefully consider constitutional protections when attempting to regulate social media.
The first task when attempting to regulate “social media” is defining it. Act 689 defined a “social media platform” as one that provides an online or app-based service “for the primary purpose of interacting socially with other profiles.” However, Arkansas excluded “News, sports, entertainment, or other content that is preselected by the provider and not user generated” from this definition. The court found that regulating companies and platforms based on their “substantial function,” “predominant function,” or “primary purpose” was “unconstitutionally vague in violation of the Fourteenth Amendment,” as the law “does not explain how platforms are to determine which function is ‘predominant,’ leaving those services to guess whether they are regulated.” The court concluded that such definitions simply did not give companies the information needed to determine whether the law applied to them.
The court found the above formulation to violate the First Amendment as well. It held that “by any commonsense understanding of the term, the ban in this case is content based,” as it regulated news, sports, entertainment, and preselected content differently than other content posted on social media. The court noted that “[a] website operating in Arkansas, an enforcement official, a court, or a jury applying the Act, cannot determine whether the website is regulated without looking to the content posted on that website.” Similarly, because the law regulated news, sports, and entertainment outlets differently than social media users writing about the same topics, the court held that the law constituted a speaker-based restriction on speech, writing that the law “privileges institutional content creators—movie and TV studios, mainstream media outlets, and traditional journalists—over the Soundcloud artist, the TikTok chef, and the citizen journalist.” Such a law, the court held, could only meet the First Amendment’s requirements if it were narrowly tailored to serve a compelling state interest, and Act 689 was not. While the court accepted that Arkansas “has a compelling interest in protecting minor children,” it found that the law was not narrowly tailored to that interest, as “[r]ather than targeting content that is harmful to minors, Act 689 simply impedes access to content writ large.”
If lawmakers are barred from differentiating social media companies from other media companies based on their primary purpose, function, or the content they produce, how should they protect social media users? To comply with the First and Fourteenth Amendments, lawmakers must use well-defined criteria and treat social media and traditional media equally when determining a law’s scope. One way to do this is to regulate when and how companies may collect and process data, with heightened protections for younger users and for sensitive types of data. States can (and do) require companies to minimize the amount of data collected from minors and to limit how long such data can be retained. They can also require companies whose services are directed at minors to submit risk assessments documenting how minors’ data is used and how they plan to mitigate any harms that might arise. These measures avoid creating content- and speaker-based distinctions among media companies, and they avoid regulating companies based on vague criteria such as their “primary purpose” or “substantial function.”
A second task for lawmakers regulating minors’ use of social media is determining how to ascertain users’ ages. Act 689 required all Arkansas social media users to verify their ages using “commercially reasonable age verification methods,” including government-issued IDs, and required users under 18 to obtain “the express consent of a parent or legal guardian” before accessing social media. Similar measures have appeared in several recent state bills. However, the court held that “[r]equiring adult users to produce state-approved documentation to prove their age and/or submit to biometric age-verification testing imposes significant burdens on adult access to constitutionally protected speech” and that, in the process, “website visitors forgo the anonymity otherwise available on the internet,” thus violating the First Amendment.
If legislators cannot require age verification methods that reveal the user’s identity, what options remain to protect minors? Age estimation software is available, but it is costly enough to significantly deter entrepreneurial start-ups (note that, per the holding, a state could not require age verification for social media companies without requiring it for other online media outlets as well). Additionally, even the best algorithms currently carry an average error of 3.1 years when estimating user ages. Moreover, such software requires social media sites to collect even more data about both adult and minor users.
Again, a comprehensive privacy bill solves these problems. If legislators shift their focus from restricting access to social media to protecting users’ data online, they will preserve the First Amendment right to speak anonymously online, give companies clear notice of when laws apply to them, and limit, rather than increase, the amount of user data collected.
Following this ruling, legislators will face new constraints when defining social media companies or restricting access to their services. Yet lawmakers still have options for protecting users online. Requiring companies to minimize the data collected from minors and to protect users’ sensitive information avoids the constitutional pitfalls that led to Act 689’s invalidation. Companies would know when the law applies to them, and the First Amendment rights of both media outlets and internet users would be preserved. NetChoice v. Griffin underscores the importance of protecting consumers through comprehensive privacy legislation rather than by restricting their ability to receive and share information online.