Case Note: NetChoice v. Bonta
On September 9, 2025, a unanimous three-judge panel of the Court of Appeals for the Ninth Circuit affirmed in part, reversed in part, and remanded the Northern District of California's decision granting in part and denying in part NetChoice's motion for a preliminary injunction in NetChoice v. Bonta, its challenge to California's Protecting Our Kids from Social Media Addiction Act (the "Act"). The Act remains partially enjoined, and the case returns to the district court for further proceedings.
The Act, which was set to go into effect in early 2025, regulates digital services that generate content based on information provided by or associated with the user, which the Act calls "addictive platforms."
As of now, covered entities must adhere to the Act’s restrictions on minors’ social media accounts if they have “actual knowledge” that a user is a minor. By 2027, the Act’s restrictions will apply to all users, unless, pursuant to Attorney General Bonta’s forthcoming regulations, the digital service determines a user is an adult.
The Act prohibits online digital services from recommending, selecting, or prioritizing data for display based on information provided by the minor, and from showing minors the number of likes on social media posts. Online digital services must also apply privacy-protective settings to minors' accounts. Minors can waive these features by providing the service with parental consent.
NetChoice, an internet trade association, challenged the Act before it took effect, arguing that the Act's vagueness and its application to NetChoice's members violate the First Amendment. Specifically, NetChoice argued that the Act's requirements unconstitutionally limit its members' communication with minors through algorithms, restrict minors' ability to speak publicly, and deter adults from accessing a digital service's speech by requiring them to prove their age.
After the District Court granted in part and denied in part NetChoice's motion for a preliminary injunction, NetChoice appealed, and the Ninth Circuit issued a stay.
Standing
The Court found that NetChoice does not have standing to challenge the personalized feed delivery requirements and age verification standards. However, it found that NetChoice does have standing to challenge the Act’s “like-count” provision, which would require digital services to bar minors from seeing the number of likes on their posts.
As to the challenge to the personalized-feed requirements, the Court found that although NetChoice's members would have standing to sue in their own right and their claims are germane to NetChoice's purpose as an organization, NetChoice must provide more information about each member's algorithm. Echoing Moody, the Court explained that algorithms do not automatically qualify for First Amendment protection because they may merely monitor users' actions or implement human directions rather than communicate with users. NetChoice therefore needs to show that each member uses an algorithm that expresses the platform's own speech.
The Court also found that NetChoice lacks standing to challenge the age-verification standards. While the Court agreed that verification requirements could cause a cognizable harm to online digital services, that harm is neither imminent nor ripe because the Attorney General has not yet issued the age-verification regulations. At this point, any injuries or costs digital services incur are purely speculative and voluntary. Similarly, the Court determined that NetChoice cannot assert internet users' interests in avoiding age verification because internet users are third parties.
Court of Appeals’ Findings and Conclusions on the Merits
Whether Algorithms Are Content-Based
The Court found that the Act’s exception for commercial websites does not make the entire Act content-based. Relying on City of Austin, the Court stated that statutes singling out solicitation can remain content-neutral even if they require some evaluation of speech. The Act’s exceptions for “commercial transactions” and “commercial reviews” can be understood as a solicitation exception, which is content-neutral.
Similarly, the Court found that the Act's regulation of social media alone is not a content-based restriction, since the regulation applies to all digital services, not just those that facilitate social interaction or produce certain content.
Whether Personalized Feeds Are Protected Speech
The Court declined to review the District Court's conclusion that personalized feeds are "not necessarily a form of social media platform's speech" and that restricting personalized feeds therefore does not restrict digital services' speech. Given the case's preliminary posture, the Court reserved judgment on that conclusion's validity.
In any event, because NetChoice did not show that the Act's unconstitutional applications outweigh its constitutional ones, the Court did not need to decide whether the algorithms the Act regulates are protected speech.
The Act’s Like-Count Provision
The Court found that the Act's like-count provision is content-based because it regulates how digital services may communicate with users based on the content the digital service sends. Digital services can display posts on minors' feeds and tell minors that users have interacted with their posts, but they cannot tell a minor the number of "likes" on a post.
The Court applied strict scrutiny to this provision and found that NetChoice is likely to prevail on its challenge because California could further its interest in less restrictive ways, such as encouraging websites to offer voluntary filters.
Private-Mode Provision Survives Intermediate Scrutiny
The Act's private-mode provision, which requires digital services to place minors' accounts in privacy-protective settings, need only survive intermediate scrutiny because it applies in the same way to all digital services, regardless of their content.
The Court determined that the provision survives intermediate scrutiny because it is narrowly tailored to promote California's goal of protecting minors' mental health, and the Act's exceptions do not cast doubt on whether California is genuinely trying to regulate the addictive nature of online digital services.
Void for Vagueness Claim
The Court of Appeals found that the Act's designation of certain online digital services as "addictive" is not unconstitutionally vague, because the Act defines "addictive" platforms as a whole rather than singling out particular digital services as addictive.
And although it may be difficult for digital services to determine whether they qualify as an "addictive platform," given the exemption for feeds that display content based on the pages a user follows, the Court held that this exemption is not vague.
Severability
According to the Court of Appeals, the unconstitutional like-count provision is severable because the Act contains a severability clause and the provision is grammatically and functionally separate. The like-count provision, which requires operators by default to limit children's "ability to view the number of likes or other forms of feedback to pieces of media within an addictive feed," is its own clause, listed separately from the other challenged provisions in the Act. When the like-count provision is removed, the meaning of the remaining Act stays intact.
Conclusion
On remand, the District Court will consider NetChoice's challenges to the remaining provisions. In the meantime, only the Act's like-count provision remains enjoined.