Draft California Automated Decisionmaking Technologies Regulations to Be Revised Before Formal Rulemaking

Last month, we wrote about draft regulations the California Privacy Protection Agency (“CPPA”) is considering around automated decisionmaking technologies (“ADMT”) and requirements for risk assessments of those technologies. On December 8, 2023, the CPPA Board met to discuss these and other proposals it is considering for formal rulemaking in 2024. The December 8th meeting produced lively discussion and ultimately concluded with a motion (which passed) to give CPPA staff more time to solicit individual feedback from Board members and revise the current draft of the ADMT and risk assessment regulations.

The ADMT and risk assessment draft regulations are likely to be revised by: (1) tightening the definition of ADMT and the applicability of the draft regulations; (2) expanding exceptions for legitimate employee monitoring; and (3) reducing the required risk assessment documentation in a way that is responsive to the First Amendment concerns raised by the NetChoice v. Bonta litigation over California’s Age-Appropriate Design Code Act.

Tightened ADMT Definition

A major concern of some CPPA Board members is the overly broad and unclear definition of ADMT. As we discussed in our summary, the draft regulations, as written, would arguably encompass every business subject to the CCPA.

Board member Alastair Mactaggart raised this concern to staff, pointing out that the definition of ADMT encompasses not only technologies that make decisions, but also (1) technologies and processes that facilitate human decisionmaking and (2) profiling, which would likely sweep in all or nearly all software. Board member Lydia de la Torre echoed the concern that California’s scope is uniquely broad and, as a result, more burdensome for businesses than other U.S. state laws because the CCPA applies to employees and consumers alike.

However, discussion of proposed solutions prompted concerns from staff that the Board’s proposals would require the same line-drawing that produced the overbreadth and lack of clarity members find concerning. For example, de la Torre proposed that the ADMT opt-out right be revised into an opt-out of intrusive ADMT, prompting questions about which technologies would be considered intrusive. Further, if the Age-Appropriate Design Code (“AADC”) litigation is any indication, businesses are likely to object to being compelled to say their data processing is “intrusive.”

Board members largely spoke in agreement favoring opt-out rights for technologies that make decisions with “legal or similarly significant effects” (suggesting potential harmonization with the GDPR standard) and opt-out rights for behavioral advertising (a right in addition to the opt-out for cross-context behavioral advertising, i.e., “sharing”). Notably, the Board did not define behavioral advertising or discuss how it might be defined. The revised draft may be more narrowly tailored to these concerns, or CPPA staff may provide more illustrative examples of which kinds of technologies would trigger the ADMT regulations, but the next draft of the regulations will likely still contain language that could create opt-out rights for first-party advertising and for targeting that uses data from a single session.

Expanded Exceptions for Legitimate Employee Monitoring

Even in the discussion of legal or similarly significant effects, Mactaggart raised concerns that the ADMT opt-out right would permit employees to opt out of typical human resources functions. While employees should be notified of monitoring, Mactaggart argued, they shouldn’t be allowed to opt out entirely. As support, he offered examples of monitoring truck drivers for exhaustion to make sure they are driving safely and tracking the number of calls completed by sales employees as part of performance evaluations. These are legitimate and reasonably expected kinds of monitoring. Although the draft regulations include exceptions for security incidents and physical safety, they do not currently account for businesses’ unique needs to monitor employees and production.

One option the CPPA staff and Board may consider when revising the draft regulations, which Mactaggart appeared to support, would be to clarify the right to opt out of ADMT using established precedent articulating a reasonable expectation of privacy. That is, employees would be permitted to opt out of ADMT only where they have a reasonable expectation of privacy.

Reduced Required Risk Assessment Documentation

In response to the AADC litigation, Mactaggart raised concerns that the draft regulations’ risk assessment provisions would compel speech from businesses subject to the regulations in a manner inconsistent with the First Amendment (a concern he also raised with respect to California’s draft cybersecurity audit regulations). California’s AADC contains similar risk assessment requirements for products likely to be accessed by minors, prompting a federal court to preliminarily enjoin the statute on the ground that such requirements likely unconstitutionally burden commercial speech.

As an example of these kinds of speech considerations, the draft regulations require a business to reveal how it expects the technologies it uses to negatively impact consumers physically, psychologically, and reputationally. Notably, the draft regulations assume these technologies have negative impacts. And there are other requirements that could be characterized as compelled “speech.” Further, the regulations do not address what kind of privilege, if any, may accompany these risk assessments.

Although the Board advanced the draft cybersecurity audit regulations in the face of similar concerns, one important difference between those regulations and the ADMT regulations is that the cybersecurity audit regulations advise businesses that they may make such evaluations in their cybersecurity audits, whereas the ADMT regulations direct businesses that they shall make these evaluations. Multiple Board members expressed support for thorough risk assessments, but the constitutional concerns may prompt staff to revise the draft for consistency with the cybersecurity audit regulations the Board advanced.

In addition to advancing the cybersecurity audit regulations and sending the ADMT and risk assessment regulations back for revisions, the Board adopted its Data Broker Registry Fee regulations in response to passage of the California Delete Act and directed CPPA staff to prepare draft CCPA regulation updates and draft insurance regulations for formal rulemaking.

We will continue to closely monitor the CCPA regulatory process as these drafts make their way toward final regulations.

Charlotte Lunday is a Senior Associate at Hintze Law with expertise in COPPA, FERPA, and online safety.

Hintze Law PLLC is a Chambers-ranked, boutique privacy firm that provides counseling exclusively on global data protection. Its attorneys and privacy analysts support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy and data security.