Lessons Learned from California’s Privacy Rulemaking
California has spent nearly three years finalizing its rules on automated decisionmaking technology (ADMT), which will take effect on January 1, 2026. After multiple rounds of public comments and revisions, the California Privacy Protection Agency (CPPA) adopted rules that differ substantially from those first proposed, incorporating many suggestions from CCIA and others. The evolution of these rules yields key insights into which privacy and technology laws are administratively feasible. The lessons from this process are valuable to all lawmakers seeking to regulate the intersection of AI and privacy law.
First and foremost, the finalized rules signify a shift in CPPA’s understanding of ADMT. The final rules define ADMT as “technology that processes personal information and uses computation” to “substantially replace human decisionmaking.” This language contains a subtle yet important change from the initial ADMT definition, which also included cases where such technology “substantially facilitate[d] human decisionmaking.” The final version reflects the understanding that an “automated” decision is a decision made by technology, not a decision made by a human with technological assistance. The final definition is far more easily administrable than the initial one: it spares both businesses and regulators from having to subjectively assess what degree of technological assistance constitutes facilitation of a decision.
CPPA’s assessment of the scope of its own authority has evolved along with its understanding of ADMT. The proposed rules originally extended many of the restrictions on ADMT usage to AI in general. Not all AI, however, is used to make decisions regarding consumers, and thus not all AI is ADMT. The final rules recognized this distinction and removed the attempts to regulate AI writ large. This revision indicated a shift in CPPA’s understanding of its own regulatory powers: While AI and consumer privacy laws often intersect, AI law is not a subfield of consumer privacy law, and is not used solely within the consumer-business interactions governed by CPPA. The final rules reflect the view that CPPA may only regulate AI insofar as it implicates consumer privacy.
CPPA’s final rules also demonstrate a shift in its understanding of privacy. During the rulemaking process, CPPA removed several provisions that did not align with California’s existing privacy laws, particularly in the restrictions on profiling. The rules initially coined the term “extensive profiling” for certain forms of profiling that would require risk assessments, including “Profiling a consumer through systematic observation of a publicly accessible place” and “Profiling a consumer for behavioral advertising.” Both of these concepts were at odds with the California Consumer Privacy Act (CCPA), the law CPPA was established to administer. The CCPA directly exempts publicly available information, and regulates only a limited subset of behavioral advertising known as “cross-context behavioral advertising,” which targets consumers based on their activity across different websites and online services. Accordingly, CPPA removed the regulation of “extensive profiling,” and of behavioral advertising writ large, from the finalized rules. This shift reflects a key principle of consumer privacy law: Unauthorized access to consumer data, by definition, can only occur when a controller accesses the data without obtaining the needed consent from the consumer. Generally, controllers do not need consent to access publicly available information, and have already obtained consent for any lawfully acquired first-party data. Using these types of data in future interactions with consumers therefore does not typically harm the consumer’s privacy. Such harms are generally limited to cases where a controller obtains nonpublic data from someone other than the consumer. For this reason, most other states (e.g., Colorado, Connecticut, and Texas) focus on these cases.
While some first-party data uses (such as training internal models when developing ADMT) are still regulated in the final rules, CPPA’s revisions demonstrate a clear shift away from regulating first-party data.
CPPA’s rulemaking process underscores three key lessons about consumer privacy law: First, regulations involving privacy and ADMT should take effect when ADMT replaces human decisionmaking, not when it assists human decisionmaking to some indefinite degree. The latter approach makes deciding which ADMT-assisted decisions are regulated nearly impossible. Controllers that used ADMT to assist human decisionmaking could not know whether they were violating the law, and state regulators would be hard-pressed to apply such a law consistently.
Second, consumer privacy laws should focus on actions that can result in privacy harms, i.e. the collection of nonpublic data about a consumer from an entity other than the consumer. Businesses should generally be free to use publicly available data and the data from their own users to suggest products or services to their customer base. The fewer burdens regulators place on the use of data that does not jeopardize consumer privacy, the more incentives businesses will have to rely on this data, rather than third-party data that poses greater risks to consumers when disclosed.
Third, new privacy laws should strive for compatibility with existing ones, both inside and outside the state. Lawmakers should continuously assess whether a given regulation conflicts with existing state law, and whether the regulation would be unique among states. If few or no other states have enacted such a regulation, lawmakers should ask whether it would incentivize businesses to stop serving consumers in the state rather than absorb the costs of compliance.
Consumer privacy law is still in its early stages, even at the state level. Only twenty of the fifty states have a comprehensive consumer privacy law, and most lack substantial administrative guidance. As states fill in their regulatory frameworks, lessons from the CPPA’s rulemaking process can help guide their efforts to best protect consumers.