US states implement data practice assessment obligations

March 2024  |  SPECIAL REPORT: DATA PRIVACY & CYBER SECURITY

Financier Worldwide Magazine



Companies subject to the California (CCPA), Colorado (CPA), Connecticut (CTDPA) and Virginia (VCDPA) consumer privacy regimes (collectively referred to below as the 2023 US Privacy Laws) should be conducting and documenting data practice assessments prior to engaging in certain types of data processing.

Other state laws going into effect in 2024 have similar requirements. Most notably, assessments are required if the processing is deemed ‘high risk’, which specifically includes, without limitation, processing for targeted advertising, profiling and automated decision making (ADM), processing of sensitive personal data and sale of personal data.

Since they are inspired by the assessment requirements of the European Union's (EU's) General Data Protection Regulation (GDPR), guidance from the European Data Protection Board on what else may constitute high-risk processing should be considered.

The 2023 US Privacy Laws do not mandate a particular methodology for analysing risk and impact, but they do call for a balancing of risks and benefits. There is, however, a GDPR-inspired trend to do so by ranking severity and likelihood of risk on a numeric scale against common framework controls. One simple scoring methodology recommended by the UK Information Commissioner's Office ranks severity of impact against likelihood of occurrence, each rated as low, medium or high.
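
A matrix of this kind can be sketched in a few lines of code. The numeric scale and the thresholds used to combine the two ratings below are illustrative assumptions, not a prescribed methodology; the point is simply that severity and likelihood are each rated low, medium or high and then combined into an overall rating.

```python
# Illustrative sketch of a simple severity x likelihood risk matrix.
# The 1-3 scale and the combination thresholds are assumptions for
# illustration only, not a methodology mandated by any regulator.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def risk_rating(severity: str, likelihood: str) -> str:
    """Combine severity of impact and likelihood of occurrence
    into an overall low/medium/high risk rating."""
    score = LEVELS[severity] * LEVELS[likelihood]
    if score >= 6:   # e.g. high x medium, or worse
        return "high"
    if score >= 3:   # e.g. medium x medium, or high x low
        return "medium"
    return "low"     # e.g. low x low, medium x low
```

Whatever scale is chosen, the key is applying it consistently across processing activities so that ratings are comparable.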

Whether or not a scoring system is utilised, there should be a standard against which decisions are made. Such a standard can be articulated in a privacy programme plan and in a responsible artificial intelligence (AI) policy. Such plans and policies can be built around frameworks such as those published by the National Institute of Standards and Technology (NIST) – including the NIST Privacy Framework and the NIST AI Risk Management Framework – and the International Organization for Standardization (ISO).

A simpler framework that has influenced data privacy and information governance laws and best practices worldwide, including the 2023 US Privacy Laws, is the US government’s Fair Information Practice Principles. Tennessee’s new privacy law offers a potential safe harbour to violations where a controller maintains a written privacy programme plan that meets certain adequacy requirements and is consistent with NIST or similar privacy frameworks.

Colorado has promulgated detailed regulations on assessment requirements. In late August 2023, the California Privacy Protection Agency (CPPA) published a discussion draft of potential regulations for assessments (and a set for cyber security audits). The draft was further updated in December 2023 but has not yet been approved for publication for public comment, and is thus not yet part of the formal rulemaking process. Virginia and Connecticut are not likely to detail what will be required for assessments.

Some other currently effective state privacy laws, like the Utah Consumer Privacy Act and the Iowa Consumer Data Protection Act, do not require assessments, but the data minimisation and purpose limitation provisions of those laws make assessments practically necessary. Similarly, the newly enacted Washington My Health My Data Act does not require assessments, but its requirements to establish, implement and maintain administrative, technical and physical data security practices that, at a minimum, satisfy a reasonable standard of care within the industry, suggest a practical need for using assessments. The California Age-Appropriate Design Code Act (AADCA), however, does call for assessments.

When must assessments be conducted?

In sum, applying the high-water mark of what it would take for a single programme to comply with all of these state laws, and applying best practices, assessments should be undertaken if any of the activities outlined below are present with regard to the personal data a company controls, in whole or in part.

First, processing sensitive data.
Second, processing for targeted advertising.
Third, selling personal data.
Fourth, sharing personal data for cross-context behavioural advertising.
Fifth, processing personal data for high-risk profiling, such as where it could impact access to essential goods or services or impact rights.
Sixth, using automated decision-making technology for a decision that produces legal or similarly significant effects concerning a person; profiling a person acting in their capacity as an employee, job applicant, independent contractor or student; profiling a person in a publicly accessible place; behavioural advertising, including first-party advertising; processing the personal data of children or other minors; and other potentially high-risk processing activities, which could include processing data on a large scale.
Seventh, processing personal data to train AI or ADM technology.
Eighth, matching or combining data sets in a way that would exceed the reasonable expectations of the data subjects.
Ninth, innovative use or use of new technology.
Tenth, when processing itself prevents consumers from exercising a right or using a service.
Eleventh, using cookies or other tracking technologies.
Lastly, when a security incident would trigger an obligation to notify data subjects or the government.
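
For companies operationalising this list, the triggers can be encoded as a simple screening checklist run against each planned processing activity. The flag names below are illustrative assumptions, not terms defined by any of the state laws.

```python
# Hypothetical screening helper: the assessment triggers above encoded
# as a checklist, so a planned processing activity can be flagged for a
# formal assessment. Flag names are illustrative assumptions only.

ASSESSMENT_TRIGGERS = {
    "sensitive_data",
    "targeted_advertising",
    "sale_of_personal_data",
    "cross_context_behavioural_advertising",
    "high_risk_profiling",
    "automated_decision_making",
    "minors_personal_data",
    "ai_or_adm_training",
    "unexpected_data_set_matching",
    "innovative_or_new_technology",
    "processing_impedes_rights_or_services",
    "cookies_or_tracking",
    "notifiable_incident_exposure",
}

def assessment_required(activity_flags: set[str]) -> bool:
    """Return True if any declared attribute of the processing
    activity matches a known assessment trigger."""
    return bool(activity_flags & ASSESSMENT_TRIGGERS)
```

A screening step of this kind does not replace the assessment itself; it simply routes in-scope activities into the formal assessment workflow.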

What is required to be in an assessment?

An assessment must be a genuine and thoughtful analysis of each personal data processing activity that presents a heightened risk of harm. It should identify and describe the risks the processing poses to the rights of data subjects and others, document the measures considered and taken to address and offset those risks, weigh the benefits of the processing, and demonstrate that those benefits outweigh the risks as offset by the safeguards in place or to be implemented.

The depth, level of detail and scope of assessments should take into account the scope of risk presented, the size of the company, amount and sensitivity of personal data processed, the nature of the personal data processing activities subject to the assessment and the complexity of the safeguards applied.

Assessments should involve all relevant internal actors from across the company’s organisational structure, as well as relevant external parties, where appropriate, to identify, assess and address data protection risks. California proposes to require presentation of assessment findings to the company’s governing board and certification by an executive officer.

In sum, assessments should include the following information: (i) a summary of the processing activity; (ii) identification of the personal data involved in the processing activity, including identification of sensitive data and the sources of the data; (iii) the context, nature, purposes and operational elements of the processing; (iv) a risk-benefit analysis of the processing activity; (v) identification of potential risks and harms, a description of the measures taken to address those risks, and the potential benefits of the processing activity; (vi) a list of the internal and external actors involved in the processing activity, including all data recipients; (vii) a description of the notices and choices to be given to data subjects, particularly as required by applicable law; and (viii) other specific requirements enumerated in the state laws or the Colorado or California regulations, particularly regarding ADM, profiling and processing of sensitive data.
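
For programmes that maintain assessments as structured records rather than free-form documents, items (i) to (viii) can be mirrored in a simple record template. The field names below are assumptions chosen for illustration, not a schema prescribed by any of the laws or draft regulations.

```python
# Illustrative record template mirroring assessment items (i)-(viii)
# above. Field names are hypothetical and for illustration only.

from dataclasses import dataclass, field

@dataclass
class DataPracticeAssessment:
    summary: str                                   # (i) processing activity summary
    personal_data: list[str] = field(default_factory=list)   # (ii) data elements and sources
    sensitive_data: list[str] = field(default_factory=list)  # (ii) sensitive subset
    context_and_purposes: str = ""                 # (iii) context, nature, purposes, operations
    risk_benefit_analysis: str = ""                # (iv) balancing of risks and benefits
    risks_and_mitigations: dict[str, str] = field(default_factory=dict)  # (v) risk -> measure
    actors_and_recipients: list[str] = field(default_factory=list)       # (vi) internal/external actors
    notices_and_choices: str = ""                  # (vii) notices and choices to data subjects
    jurisdiction_specific: dict[str, str] = field(default_factory=dict)  # (viii) state-specific items
```

Keeping assessments in a consistent structure also simplifies the record-keeping and update obligations discussed below.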

With regard to the AADCA, there are additional specific requirements for assessments before an online service, product or feature likely to be accessed by minors is offered to the public.

To address the risks identified, an assessment should apply a risk and benefit analysis that identifies the transparency-related, technical and organisational remedial measures that may be necessary to bring risk to an acceptable level, protect privacy and security, achieve data minimisation and purpose limitation, and support approval of the proposed processing. Some practices may not be capable of such remediation and will need to be prohibited, while others may be approved with conditions designed to mitigate risks and otherwise comply with legal obligations.

Record-keeping and update requirements

Again applying the high-water mark of what it would take for a single programme to comply with all of these state laws, assessments should be updated periodically (annually for profiling in Colorado, while California is considering mandatory updates every three years regardless of processing type), considering the level of risk and any changes made throughout the processing activity's lifecycle.

Assessments should be stored throughout the lifecycle of the processing activity and for at least three years after its end (California is considering five years). Draft regulations propose to require summaries of assessments to be certified by an executive officer and filed with the CPPA. Under the AADCA, assessments must be biennially reviewed and updated and must be maintained for as long as the online service, product or feature is likely to be accessed by minors.

Government filings or access to assessments

Generally, a company should be prepared to disclose assessments to the respective state’s regulator upon request, which should be subject to confidentiality protections. The CCPA gives the CPPA authority to require assessments be filed with it. Currently, the CPPA is discussing requiring filing of summaries of all assessments. Companies may want to create a non-privileged summary of a completed assessment to show customers, particularly in a business-to-business due diligence context.

Cyber security audits

In addition to assessments, the 2023 US Privacy Laws, along with other recently passed state data protection laws, also include provisions about cyber security audits. A majority of the laws offer controllers a right to conduct reasonable audits of processors, and California and Colorado obligate controllers to review their vendors’ security practices. Draft regulations propose a potential scope of the audit and establish a process to ensure that audits are thorough and independent and that summaries of all audits be filed with the agency.

Conclusion

In conclusion, the assessment and audit requirements of the new generation of state data protection laws will force US companies to move beyond mere window dressing (e.g., privacy policies and consumer rights request mechanisms) to develop comprehensive data protection programmes. If California follows through on requiring assessment and audit summaries to be certified and filed with the state, regulators will be able to easily see which companies are not doing the behind-the-scenes work to ensure that their data protection programme is meaningful.

 

Alan L. Friel is chair of the global data practice at Squire Patton Boggs. He can be contacted on +1 (213) 689 6518 or by email: alan.friel@squirepb.com. The author would like to thank Sasha Kiosse, an associate at Squire Patton Boggs, for her assistance with the preparation of this article.

© Financier Worldwide
