The Colorado Division of Insurance's Final Governance and Risk Management Framework Requirements for Life Insurers' Use of External Consumer Data and Information Sources, Algorithms, and Predictive Models (the "Regulations") have gone into effect. Below is a summary of the Regulations' requirements and our thoughts on how the Regulations fit into the larger privacy and cybersecurity regulatory landscape.
Effective Date: November 14, 2023.
- Creation of the Governance and Risk Management Framework: Life insurers that use external consumer data and information sources ("ECDIS"), or algorithms and predictive models that use ECDIS (collectively, the "AI Tools"), must establish a risk-based governance and risk management framework that "facilitates and supports policies, procedures, systems, and controls designed to determine whether the use of AI Tools potentially results in unfair discrimination with respect to race and remediate unfair discrimination" (the "Framework"). One notable aspect of the Framework is the extent of executive participation required. For example, the Framework must be overseen by the board of directors or a board committee, and senior management must be accountable for setting and monitoring the strategy governing the use of AI Tools. Insurers must also conduct tests to detect unfair discrimination in insurance practices resulting from the use of AI Tools and must describe the steps taken to address any such discrimination. Finally, the Framework must document a governance group composed of key representatives from across the business, including legal, compliance, risk management, and product development, as well as a rubric used to assess and prioritize risks associated with the AI Tools.
- Reporting Requirements: Insurers using AI Tools must submit a report summarizing their progress toward compliance by June 1, 2024. Additionally, such insurers must submit a signed report by December 1, 2024 summarizing their compliance and providing the title and qualifications of each individual responsible for compliance.
The Regulations are part of a larger enterprise risk management ("ERM") trend we have seen this year. Regulations across the privacy and cybersecurity space have overlapped on several key issues, such as board or executive leadership involvement and liability. For example, earlier this year we wrote about the proposed changes to the New York Department of Financial Services ("NYDFS") Cybersecurity Regulation, now finalized, which included requirements for board of directors oversight of cybersecurity management. The recently proposed California Cybersecurity Audit regulations, which we wrote about here, similarly require that auditors report to an organization's "highest ranking executive" or to the organization's board of directors. The cybersecurity audit required under the California regulations must also be signed by a member of the board, a member of the governing body, or the business's highest ranking executive. Additionally, although the California Privacy Protection Agency has not yet released proposed AI regulations, its proposed risk assessment regulations include potential senior management and board oversight elements.
Regulators are focused on ERM, and that focus is bleeding into privacy, cybersecurity, and AI regulations. Detailed policies and procedures that capture these activities in a structured fashion, and that test the associated controls for effectiveness, are becoming a requirement across these areas. Board members and executives need to understand ERM and must thoroughly involve themselves in the policies, processes, and audits they approve in order to comply with these new regulations.
This alert provides general coverage of its subject area. We provide it with the understanding that Frankfurt Kurnit Klein & Selz is not engaged herein in rendering legal advice, and shall not be liable for any damages resulting from any error, inaccuracy, or omission. Our attorneys practice law only in jurisdictions in which they are properly authorized to do so. We do not seek to represent clients in other jurisdictions.