Hintze Law Global Privacy Updates

Here’s a snapshot of a few of the privacy developments we followed over the past few weeks. If you missed our last post, you can find it here.

By: Emeka Egwuatu and Destiny Ginn  

US FEDERAL

BIPA: Amazon Can't Pause Battle with Student Over Facial Recognition

In August, a federal judge rejected Amazon’s motion to dismiss claims that it violated the Illinois biometric privacy law by allowing online learning company ProctorU to use Amazon’s Rekognition facial-recognition technology to verify students’ identities. The case is scheduled for trial next year.

In the court’s assessment, the motion did not raise threshold, non-merits questions, such as jurisdiction or a defendant’s immunity, that would warrant a stay. In addition, Judge Chun said that several of the plaintiff’s arguments and AWS’s defenses require fact-based analyses that discovery would inform.

CFPB: Takes Action to Protect the Public from Shoddy Data Security Practices

The Consumer Financial Protection Bureau (CFPB) published a circular on August 11 to guide consumer protection enforcers. The circular confirms that financial companies may violate federal consumer financial protection law when they fail to safeguard consumer data, and it includes examples of when organizations can be held liable for lax data security protocols.

 

CFPB Advisory Opinion Underscores FCRA’s Privacy Protections, Applicable to Consumer Reporting Agencies and Users of Consumer Reports

The CFPB issued an advisory opinion entitled “Fair Credit Reporting: Permissible Purposes for Furnishing, Using, and Obtaining Consumer Reports.”

The advisory opinion clarifies that “permissible purposes” under the Fair Credit Reporting Act (the “FCRA”) are “consumer specific” and highlights that a person (a term the FCRA defines broadly to include companies and other organizations) who uses or obtains a “consumer report” is “strictly prohibit[ed]” from doing so without a permissible purpose under the FCRA.

 

FCRA: Third Circuit Adopts “Reasonable Reader” Standard for Credit Reports in FCRA Claims

The recent opinion in Marissa Bibbs v. Trans Union LLC is a significant victory for creditors and credit reporting agencies defending against claims of inaccurate or misleading credit reports asserted under the FCRA. In Bibbs, borrowers claimed that credit reporting consisting of a negative “Pay Status” notation of “>Account 120 Days Past Due,” coupled with reporting that the accounts had been closed, transferred, and carried a zero balance, was misleading. The district courts dismissed all the borrowers’ cases without discovery, and the Third Circuit affirmed, holding that a “reasonable reader” reviewing the reports in their entirety would not find them misleading.

 

FTC: Investigates Companies Sharing Biometric Data

The FTC is investigating the biometric practices of Clarifai, an AI company, and Match Group, the parent company of OkCupid, which the FTC claims shared biometric data with Clarifai. The companies’ data sharing practices came to light in Stein v. Clarifai, Inc., a lawsuit alleging that Clarifai violated the Illinois Biometric Information Privacy Act.

 

FTC: Launches Rulemaking Process Covering Sweeping Data Practices

On August 11, 2022, the Federal Trade Commission (“FTC”) published an advance notice of proposed rulemaking (“ANPR”), approved in a 3-2 vote along party lines, requesting public comment on questions covering a wide range of “commercial surveillance” and data security practices.  The FTC defines “commercial surveillance” broadly, sweeping in practices that most businesses commonly engage in with their customers and employees.  The data security questions cover expected areas such as data breach response, as well as data management, retention, and data minimization, areas to which the agency has not dedicated significant attention in the past.  The FTC provided additional summaries of these practices in a “fact sheet” released with the ANPR. 

Those who wish to submit comments may do so within 60 days after the publication of the ANPR in the Federal Register.  The FTC will also host a virtual forum on September 8, during which members of the public may speak for two minutes each.

 

HIPAA: Meta sued for violating patient privacy with a data tracking tool

Meta and major US hospitals violated medical privacy laws with a tracking tool that sends health information to Facebook, two proposed class-action lawsuits allege. The lawsuits, filed in the Northern District of California in June and July, focus on the Meta Pixel tracking tool, which can be installed on websites to provide analytics on Facebook and Instagram ads. The tool also collects information about how people click around and enter information into those websites.
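To illustrate the kind of behavior at issue, the sketch below shows, in simplified and hypothetical form, how a third-party script embedded in a website can observe page views, clicks, and form inputs and transmit them off-site. It is not Meta’s actual Pixel code; the endpoint, payload shape, and helper names are invented for illustration only.

```typescript
// Simplified, hypothetical sketch of how an embedded third-party tracking
// script can observe activity on a host website. This is NOT Meta's actual
// Pixel code; the endpoint and payload shape are invented for illustration.
const COLLECT_URL = "https://tracker.example.com/collect"; // hypothetical endpoint

function report(eventType: string, detail: Record<string, string>): void {
  const payload = JSON.stringify({
    eventType,
    page: window.location.href, // which page the visitor is viewing
    timestamp: Date.now(),
    ...detail,
  });
  // sendBeacon queues the request so it completes even if the user navigates away
  navigator.sendBeacon(COLLECT_URL, payload);
}

// Report the initial page view (the basic ad-analytics use case)
report("page_view", {});

// Report where the visitor clicks
document.addEventListener("click", (event) => {
  const target = event.target as HTMLElement;
  report("click", { element: target.tagName, text: target.innerText.slice(0, 50) });
});

// Report values entered into form fields; on, say, an appointment-scheduling
// page, this is how health-related details could reach a third party
document.addEventListener(
  "change",
  (event) => {
    const field = event.target as HTMLInputElement;
    if (field.name) {
      report("form_input", { field: field.name, value: field.value });
    }
  },
  true, // capture phase so the listener sees events from all form fields
);
```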

 

NIST AI Risk Management Framework: Second Draft

Comments on the second draft of the NIST AI Risk Management Framework (AI RMF) are due September 29, and NIST will hold a workshop to discuss the draft on October 18 and 19. NIST also previously released a draft companion Playbook for the AI RMF.

 

U.S. STATES

 

California: Civil Rights Council Proposes Automated Decision-Making Regulations

The California Civil Rights Council (formerly the California Fair Employment & Housing Council) released proposed amendments to its employment regulations regarding automated decision-making systems, applicable to employers and their service providers. The draft regulations are currently in a 45-day comment period, and the council held a hearing on August 10 to discuss the changes.

 

California: Attorney general announces first CCPA enforcement action against Sephora

The California Attorney General’s office announced a $1.2 million settlement with beauty retailer Sephora, the AG’s first monetary penalty for CCPA violations.

Sephora also agreed to a two-year consent decree with ongoing monitoring and reporting obligations. This enforcement action confirms the AG’s interpretation that: (1) for the sharing of personal information with the cookie, pixel, and other tracking technology providers a company uses on its website not to be a “sale” of data under the CCPA, the company must have the specific CCPA-mandated contractual terms in place with each of those providers; and (2) companies that engage in “sales” of personal information on their websites must honor the Global Privacy Control (GPC) signal from consumers who choose to use it, as illustrated in the sketch after this item.

For more information on the enforcement action, please see Sam Castic’s blog post.
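For context on point (2): the Global Privacy Control is transmitted by participating browsers as a Sec-GPC: 1 request header (and exposed to page scripts as navigator.globalPrivacyControl). Below is a minimal, hypothetical sketch of how a site operator might detect the signal server-side; the applyOptOutOfSale helper is a placeholder for whatever opt-out processing a business actually implements.

```typescript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal server-side.
// Participating browsers send a "Sec-GPC: 1" request header. The helper below
// is hypothetical; real handling depends on the business's opt-out workflow.
import * as http from "node:http";

function applyOptOutOfSale(req: http.IncomingMessage): void {
  // Hypothetical placeholder: e.g., suppress third-party ad/analytics tags,
  // mark the session as opted out, and propagate the choice to vendors.
  console.log(`GPC opt-out applied for ${req.socket.remoteAddress}`);
}

const server = http.createServer((req, res) => {
  // Node lowercases incoming header names
  const gpcEnabled = req.headers["sec-gpc"] === "1";
  if (gpcEnabled) {
    applyOptOutOfSale(req); // treat the signal as a valid opt-out of sale/sharing
  }
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(gpcEnabled ? "GPC detected: opted out of sale" : "No GPC signal");
});

server.listen(8080);
```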

 

New York: First State to Require CLE in Cybersecurity, Privacy, and Data Protection

New York became the first state to require attorneys to complete at least one credit of cybersecurity, privacy, and data protection training as part of their continuing legal education (“CLE”) requirements. In a joint order, the judicial departments of the Appellate Division of the New York State Supreme Court formally adopted the recommendation.  The new requirement will take effect on July 1, 2023.

 

EUROPE & UK

 

CJEU: Decision on Inferred Sensitive Data

The Court of Justice of the European Union (“CJEU”) found that the processing of any personal data that are “liable indirectly to reveal sensitive information concerning a natural person,” i.e., any information that may reveal a person’s racial or ethnic origin, religious or philosophical beliefs, political views, trade union membership, health status, or sexual orientation, is subject to the prohibition on processing under Article 9(1) of the GDPR unless an exception under Article 9(2) applies.

 

Czech Republic: New Cybersecurity Regulation

An amendment to the Act on Cybersecurity was published in the Czech Collection of Laws. The National Cyber and Information Security Agency prepared the amendment to adapt the Czech legal system to the European Parliament and Council regulation known by its abbreviated title, the “Cybersecurity Act.” The amendment took effect on the day following its promulgation.

 

France: AdTech Giant Criteo Faces $65 Million Fine for GDPR Consent Breaches

Criteo, a French AdTech company, faces a $65 million fine for GDPR consent breaches. A formal complaint was initially filed against Criteo in 2018 by Privacy International, a UK-based privacy advocacy group, claiming that Criteo operated a “manipulation machine” by using a range of tracking techniques and data processing practices designed to build profiles of internet users to target them with behavioral advertising without any legal basis under the GDPR.

 

Ireland: Instagram fined €405M for violating kids’ privacy

The Irish Data Protection Commission has fined Instagram €405 million for violations of the General Data Protection Regulation. The fine is the second-highest under the GDPR and the third handed down by the regulator to a Meta-owned company. The penalty addresses Instagram’s violations of children’s privacy, including its publication of children’s email addresses and phone numbers.

 

Slovenia: DPA Issues Guidance on DPIAs (Slovenian-language only)

In recently released guidance, the Slovenian Information Commissioner confirmed that data protection impact assessments (DPIAs) are essential for controllers to manage the risks of personal data protection breaches in an appropriate and timely manner. Although knowledge of and experience with preparing DPIAs are accumulating, the Information Commissioner notes that certain errors recur: in particular, the risk assessment methodology is not clearly described, risks are underestimated, the role of the data protection officer is misunderstood, and stakeholders are not involved.

 

In addition to the guidelines on this subject, the Commissioner has published an infographic that draws attention to key shortcomings, makes recommendations, and reminds controllers that a checklist is available to verify whether an impact assessment is comprehensive.

The infographic is available here. The Commissioner also recommends reviewing the impact assessment guidelines.

 

ASIA-PACIFIC, MIDDLE EAST & AFRICA

 

China: Releases Draft of SCCs for Data Export

China released a draft of its standard contract for personal information export and an accompanying regulation (the “Standard Contract Regulation”) for public consultation. The release of the draft Standard Contract and the accompanying regulation marks a step closer to establishing China’s mechanism for exporting personal information via standard contract. A finalized version is expected to follow the consultation.

 

China: Disclosure of Platform Algorithms to Regulators

New regulations in China aimed at tightening oversight of big tech require, among other things, certain large platform providers to disclose the actual algorithms used on their platforms to help show that the algorithms do not invade user privacy or unduly influence user choice. Chinese regulators have required the disclosure of at least 30 algorithms used by platforms such as Alibaba, WeChat, and Tencent.

 

China: New Mobile App Regulation

China’s new mobile app regulation consolidates and expands existing mobile app privacy obligations and applies to mobile app operators in Mainland China. Requirements include enhanced obligations around notice and consent, unique obligations for minors’ data, default system permissions and configuration settings, and compliance obligations related to third-party SDK offerings. Operators must also follow specific app and data categorization practices in their privacy disclosures. The regulation takes effect on November 1, 2022.

 Hintze Law PLLC is a Chambers-ranked, boutique privacy firm that provides counseling exclusively on global data protection. Its attorneys and privacy analysts support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy and data security.