What California’s New Age-Appropriate Design Code Means for Your Business

By Charlotte Lunday

On September 15, 2022, Governor Gavin Newsom signed into law the California Age-Appropriate Design Code Act (CAADC). The law, which received bipartisan support in the Legislature, aims to protect the wellbeing, data, and privacy of children, including teens, who use online platforms.


In his press release, Governor Newsom stated, “We’re taking aggressive action in California to protect the health and wellbeing of our kids.”

Businesses will be required to comply with significant new documentation, privacy-by-design, and privacy-by-default obligations by July 1, 2024. These obligations are largely adopted from the United Kingdom’s Age-Appropriate Design Code, and the statute’s preamble points to that code and the UK Information Commissioner’s Office (ICO) guidance as aids for interpreting the CAADC.

Who is impacted?

The CAADC applies to businesses subject to the CCPA, as amended by the CPRA, that provide “online products, services, or features” that are “likely to be accessed by children.” For purposes of the CAADC, children are consumers under the age of 18.

As a threshold question, businesses must determine whether their online products, services, or features (which do not include certain telecommunications services, broadband internet access services, or physical products) are likely to be accessed by children. In making this determination, businesses should consider:

(1) whether the product, service, or feature would be considered directed to children under COPPA[1] or has elements that would interest children, such as games, cartoons, music, or celebrities appealing to children (similar to the factors the Federal Trade Commission considers when determining whether a site or service is directed to children, although the assessment of these factors may differ given that the threshold age for COPPA is 13 whereas it is 18 for the CAADC),

(2) whether the service, product, or feature (or one substantially similar to or the same as such a service, product, or feature) is routinely accessed by a significant number of children, as determined by “competent and reliable evidence regarding audience composition,” and

(3) whether the business’s internal research shows that a significant portion of the audience is children.

My products are likely to be accessed by children under 18. What new obligations do I have?

Businesses have new documentation and product design obligations, as well as data use restrictions, for products or services that are subject to the CAADC. Although most of the CAADC’s provisions go into effect July 1, 2024, businesses will benefit from proactive compliance efforts: some documentation requirements must be completed by July 1, 2024, and many of the CAADC’s requirements will likely prolong product development and launch timelines. The primary new obligations are[2]:

Estimate the age of child users. Businesses have an affirmative obligation to estimate the age of child users “to a reasonable degree of certainty.” Where a business performs age assurance for its products, it should do so[3] in a manner proportionate to the risks of the product, service, or feature. Notably, the UK’s Age Appropriate Design Code requires that businesses establish the age of users to a reasonable degree of certainty, whereas the CAADC imposes guardrails on age assurance but only requires that businesses estimate age. In either case, businesses must not use the personal information they collect to estimate age or age range for any other purpose or retain that information longer than necessary to estimate age. To support the design of online products, businesses should consider the following age ranges and developmental stages: 0 to 5 years (preliterate and early literacy); 6 to 9 years (core primary school years); 10 to 12 years (transition years); 13 to 15 years (early teens); and 16 to 17 years (approaching adulthood).
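The statute does not prescribe any particular implementation, but product teams often reason about age assurance in code. Below is a minimal Python sketch, assuming a hypothetical age-assurance flow, that maps an estimated age to the developmental stages listed above and retains only the resulting stage rather than the underlying signals; the function and field names are illustrative and not drawn from the CAADC.

```python
# Minimal sketch (hypothetical names): bucket an estimated age into the
# developmental stages the CAADC lists, and keep only the stage, not the
# signals used to produce the estimate.

from dataclasses import dataclass
from typing import Optional

# Age ranges and developmental stages listed above.
DEVELOPMENTAL_STAGES = [
    (0, 5, "preliterate and early literacy"),
    (6, 9, "core primary school years"),
    (10, 12, "transition years"),
    (13, 15, "early teens"),
    (16, 17, "approaching adulthood"),
]


def stage_for_estimated_age(estimated_age: int) -> Optional[str]:
    """Return the developmental stage for an estimated age; None means 18+."""
    for low, high, stage in DEVELOPMENTAL_STAGES:
        if low <= estimated_age <= high:
            return stage
    return None


@dataclass
class AgeAssuranceResult:
    # Keep only what is needed to configure the experience; the inputs used
    # to estimate age should not be reused or retained longer than necessary.
    stage: Optional[str]


def run_age_assurance(estimated_age: int) -> AgeAssuranceResult:
    return AgeAssuranceResult(stage=stage_for_estimated_age(estimated_age))
```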

Perform Data Protection Impact Assessments (DPIAs) for all online products, services, and features that are likely to be accessed by children. Once a business determines that its online products, services, or features are likely to be accessed by children, the business must complete DPIAs. For any such product launched before July 1, 2024, the corresponding DPIA must be complete by July 1, 2024. After that, DPIAs must be completed prior to launching any new online product, service, or feature likely to be accessed by children. These DPIAs should be reviewed and updated every two years and must be provided to the Attorney General within 5 days of a written request. A list of all completed DPIAs must be produced within 3 days of a written request. A single DPIA may cover multiple types of processing so long as each product, service, or feature is addressed.

In addition to documenting the data life cycles and the privacy risks and mitigations of the products, the CAADC requires that businesses evaluate whether:

  • the product design exposes children to harmful content or contacts or involves children in harmful conduct,

  • the online product, service, or feature uses targeted advertising that could harm children,

  • the design seeks to extend use of the online service, product, or feature, and

  • the product, service, or feature collects or processes children’s sensitive personal information.

Upon identifying and documenting any material risk of harm to children in the DPIA, a business must create a timed plan to mitigate or eliminate the risk before the product, service, or feature is accessed by children.
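For teams tracking these deadlines, a simple record of each DPIA and its review cadence can help. The Python sketch below is a hypothetical illustration of the timeline described above (completion before launch, or by July 1, 2024 for pre-existing products, with review every two years); the class and field names are our own, not the statute’s.

```python
# Hypothetical sketch of a DPIA registry entry reflecting the timeline above.
from dataclasses import dataclass
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=2 * 365)   # reviewed and updated every two years
COMPLIANCE_DATE = date(2024, 7, 1)          # deadline for pre-existing products


@dataclass
class DpiaRecord:
    feature_name: str
    completed_on: date

    def next_review_due(self) -> date:
        # DPIAs should be reviewed and updated at least every two years.
        return self.completed_on + REVIEW_INTERVAL

    def satisfies_launch_deadline(self, launch_date: date) -> bool:
        # Features launched before July 1, 2024 need a DPIA by that date;
        # later launches need a completed DPIA before launch.
        deadline = COMPLIANCE_DATE if launch_date < COMPLIANCE_DATE else launch_date
        return self.completed_on <= deadline
```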

Provide privacy information and other policies in language appropriate for children. Businesses must provide privacy information, terms of service, policies, and community standards in language that is accessible to and understandable by children in the developmental stages determined as likely to access the products, services, or features.

Configure all default privacy settings for children to a high level. Unless a business can demonstrate a compelling reason that different settings are in the best interests of children, it should default child users’ settings to “a high level of privacy.”
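As a rough illustration, defaulting to the most protective configuration might look like the Python sketch below; the setting names and the idea of documented “best interests” overrides are hypothetical examples, not requirements spelled out in the statute.

```python
# Hypothetical sketch: start child accounts at high-privacy defaults and
# apply only overrides backed by a documented "best interests" justification.
from typing import Dict, Optional

HIGH_PRIVACY_DEFAULTS: Dict[str, object] = {
    "profile_visibility": "private",
    "targeted_advertising": False,
    "precise_geolocation": False,
    "messages_from_unknown_users": False,
}


def default_settings_for_child(
    justified_overrides: Optional[Dict[str, object]] = None,
) -> Dict[str, object]:
    settings = dict(HIGH_PRIVACY_DEFAULTS)
    if justified_overrides:
        # Each override should rest on a compelling, documented reason that a
        # different default is in the best interests of children.
        settings.update(justified_overrides)
    return settings
```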

Alert children to parental/guardian monitoring. If the service, product, or feature permits a parent, guardian, or other consumer to monitor the child’s online activity or track the child’s location, the product design should include an obvious signal to the child when the child is being monitored or tracked.

Design products to mitigate risks of harm to children. The CAADC prohibits the use of children’s personal information in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child.

Businesses should therefore use DPIAs to determine where products need to be updated to promote child safety and well-being. Businesses should also monitor consumer complaints and press coverage related to how children use their products and the consequences of that use, as a business could be found to have constructive knowledge that its products are materially detrimental to the health and well-being of children.

Businesses that offer products they know or should know are materially detrimental to children’s health and well-being will need to change their product designs to mitigate those risks. Where doing so puts the “best interests of children” in conflict with a business’s commercial interests, the business must prioritize the best interests of children.

For example, if a business documents that a child’s interest in health and wellness content exposes the child to content that promotes eating disorders, the business should revise how it uses data to promote content. Similarly, if an app provides rewards to children to increase time spent online and the company learns its product contributes to high rates of adverse mental and physical health outcomes, it should exclude users under 18 from eligibility for those rewards.

Additionally, the CAADC adopts the CPRA’s prohibition against “dark patterns” that encourage children to provide more information than reasonably expected,[4] to reduce their privacy settings, or to take an action that is materially detrimental to the child’s health and well-being.

Limit the collection, selling, sharing, and retention of personal information, and limit profiling under certain circumstances. Businesses must refrain from profiling a child except under limited circumstances. Businesses should also practice data minimization by limiting their collection, use, and sharing of personal information that is not necessary to provide the product, service, or feature (although exceptions exist where a business can demonstrate that the collection, selling, sharing, or retention is in the best interests of children). Further, precise geolocation information should not be collected, used, sold, or shared by default, and should be collected, used, sold, or shared only when strictly necessary and for a limited time. When precise geolocation is being collected, the product design should include an obvious signal to the child that collection is occurring.
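One hedged way to picture the geolocation rule in product code is a scoped session that is off by default, requires a strict-necessity justification, and surfaces an obvious signal while collection runs. The Python sketch below uses made-up function names purely for illustration.

```python
# Hypothetical sketch: precise geolocation is off by default, collected only
# when strictly necessary, for a limited window, with an obvious child-visible
# signal while collection is active.
from contextlib import contextmanager


def show_location_indicator(visible: bool) -> None:
    # Placeholder for an obvious, child-visible UI signal.
    print("location indicator:", "on" if visible else "off")


@contextmanager
def precise_location_session(strictly_necessary: bool):
    if not strictly_necessary:
        raise PermissionError("Precise geolocation stays off by default.")
    show_location_indicator(True)
    try:
        yield  # collect precise location only inside this limited window
    finally:
        show_location_indicator(False)  # stop collection and clear the signal
```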

What are the penalties for noncompliance?

The California Attorney General may seek injunctions against businesses that fail to comply with the CAADC. The Attorney General may also seek civil penalties of up to $2,500 per affected child for each negligent violation and up to $7,500 per affected child for each intentional violation. However, the Attorney General must provide businesses that substantially comply with the CAADC’s DPIA obligations with written notice of violations before initiating an action and permit the business 90 days to cure the violations.
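For a rough sense of scale, the per-child structure means potential exposure grows with the number of affected children. The back-of-the-envelope Python below uses a hypothetical affected-child count purely to illustrate the arithmetic.

```python
# Illustrative arithmetic only; the affected-child count is hypothetical.
NEGLIGENT_PENALTY_PER_CHILD = 2_500
INTENTIONAL_PENALTY_PER_CHILD = 7_500


def max_exposure(affected_children: int, intentional: bool) -> int:
    per_child = INTENTIONAL_PENALTY_PER_CHILD if intentional else NEGLIGENT_PENALTY_PER_CHILD
    return affected_children * per_child


# e.g., 10,000 affected children in a negligent violation -> $25,000,000
assert max_exposure(10_000, intentional=False) == 25_000_000
```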

Working Group Reports and AG Regulations

The CAADC also establishes a California Children’s Data Protection Working Group to deliver reports and recommendations to the Legislature on best practices relating to the law. Further, the Attorney General has authority to adopt regulations to clarify the requirements of the CAADC.




[1] COPPA defines a “website or online service directed to children” as “a commercial website or online service that is targeted to children; or that portion of a commercial website or online service that is targeted to children. A commercial website or online service, or a portion of a commercial website or online service, shall not be deemed directed to children solely for referring or linking to a commercial website or online service directed to children by using information location tools, including a directory, index, reference, pointer, or hypertext link.” 15 U.S.C. § 6501(10). Notably, COPPA defines children as individuals under 13, 15 U.S.C. § 6501(1), and the Federal Trade Commission uses several factors, including the content and subject matter of a site and service and empirical evidence regarding audience composition, to interpret whether a site or service is directed to children.

[2] Note: This is not a comprehensive summary of obligations but focuses on the major new requirements.

[3] Examples of age assurance provided by the UK ICO are: self-declaration; hard identifiers; account holder confirmation; age verification; and artificial intelligence. Where users may be under 13 years of age, these methods may result in increased COPPA obligations and risks.

[4] The statute doesn’t clarify whose expectations should be the threshold, but arguably, it would be the reasonable expectations of the child, depending on the relevant developmental stage.

Charlotte Lunday is a Senior Associate at Hintze Law with expertise in COPPA, FERPA, and online safety.

Hintze Law PLLC is a Chambers-ranked, boutique privacy firm that provides counseling exclusively on global data protection. Its attorneys and privacy analysts support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy and data security.