CFPB Warns Users Of Algorithms, AI, And Machine Learning Of Anti-Discrimination Compliance Requirements


Anti-discrimination law applies to companies with black-box credit models using complex algorithms, according to a circular issued by the Consumer Financial Protection Bureau (CFPB) on May 26, 2022. According to the CFPB, anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even when the creditor relies on credit models that use complex algorithms. The circular also highlights creditors' adverse action notice requirements under the Equal Credit Opportunity Act (ECOA). This warning complements another recent CFPB announcement that it intends to use its general UDAAP authority to pursue discrimination in any financial services category, whether banking (including deposits), servicing, collections, credit reporting, payments, or money transfers and remittances.

ECOA Requirements

ECOA protects individuals and businesses against discrimination when they are seeking, applying for, or using credit. ECOA requires that a creditor provide a notice when it takes an adverse action against an applicant, which must contain the specific and accurate reasons for that adverse action. According to the CFPB, creditors cannot lawfully use technologies in their decision-making processes if using them means that they are unable to provide these required explanations.

Takeaways from the Circular

  • Federal consumer financial protection laws and adverse action requirements should be enforced regardless of the technology used by creditors.
  • Creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new.
  • The risks associated with decision-making technologies extend beyond adverse action notices and ECOA. For example, recently, the CFPB began taking a close look at the use of automated valuation models within the home appraisal process to ensure home valuations are accurate and fair.

Call for Whistleblowers

In the press release, the CFPB also highlighted the role that whistleblowers play in uncovering information about companies using technologies, like black-box models, in ways that violate ECOA and other federal consumer financial protection laws.

Emerging Technologies

The CFPB also says it is closely monitoring the work of the National Institute of Standards and Technology, within the U.S. Department of Commerce, and other government bodies around the world, to assess the benefits and risks associated with emerging technologies.

Consumer Financial Protection Circular 2022-03: Adverse action notification requirements in connection with credit decisions based on complex algorithms

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
