
Complex Algorithms Causing Complex Compliance Issues

Complying With Reg B and ECOA on Adverse Action

By TYNA-MINET ANDERSON, Contributor, Mortgage Women Magazine

Algorithms are part of our everyday life. For example, when you Google a question, it is an algorithm that brings up the most popular sites. Using a proprietary computation, Google ranks websites based on the keywords or combinations of words and their relevance to the question being asked, then orders the top results according to the user’s profile. This is a great example of a “complex algorithm” because the results can be different for every user.

Algorithm: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer — Oxford Dictionary

Complex algorithms aid in many mortgage-related decisions, and sometimes they actually make the decision. Software developers have built programs for the mortgage industry based on specific parameters such as credit ratings, creditor guidelines, and appraisal values, as well as other loan dynamics.

Some regulators have recently expressed concern that the inputs might have built-in bias or discrimination. Many creditors rely on the software and its algorithms to make the lending decision for a loan approval or denial. However, the creditor is ultimately responsible for ensuring that fair lending laws are adhered to and that all clients are treated equally. Creditors must be able to clearly demonstrate how the lending decision was derived.

The Consumer Financial Protection Bureau (CFPB) has made clear that it will hold creditors accountable for all lending decisions, regardless of the technology used in the decision-making process.

Some creditors may make credit decisions based on certain complex algorithms, sometimes referred to as uninterpretable or “black-box” models, that make it difficult — if not impossible — to accurately identify the specific reasons for denying credit or taking other adverse actions. The adverse action notice requirements of ECOA and Regulation B, however, apply equally to all credit decisions, regardless of the technology used to make them. Thus, ECOA (Equal Credit Opportunity Act) and Regulation B do not permit creditors to use complex algorithms when doing so means they cannot provide the specific and accurate reasons for adverse actions.

— CFPB Circular 2022-03, May 26, 2022

ECOA and its implementing rule, Regulation B, specify that a loan decision may be based only on the creditworthiness of the applicant. They prohibit discrimination based on race, color, religion, national origin, sex, marital status, age, or receipt of income from public assistance programs.

If the mortgage company cannot identify the actual reason, it becomes nearly impossible to defend that the decision to decline the loan was based solely on the creditworthiness of the applicant.

When a borrower’s personal information, such as race, color, national origin, sex, marital status, age, and any income derived from public assistance, is entered into the system, it is important to know whether a decision made by a complex algorithm was influenced by any of these factors, because that could be considered discrimination.
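
In practice, a lender’s technology team can enforce part of this in code by making sure prohibited-basis fields never reach the scoring model as inputs. The sketch below is illustrative only; the field names, the application record, and the screen_model_inputs helper are hypothetical and are not drawn from any actual loan origination system or regulatory guidance.

# Illustrative sketch only: field names and this guard are hypothetical and are
# not taken from any specific loan origination system or from the CFPB circular.

PROHIBITED_BASIS_FIELDS = {
    "race", "color", "religion", "national_origin", "sex",
    "marital_status", "age", "public_assistance_income",
}

def screen_model_inputs(application: dict) -> dict:
    """Return only the fields the scoring model is allowed to see.

    Raises an error if a prohibited-basis field would otherwise be passed to
    the algorithm, so the exclusion is enforced in code rather than assumed.
    """
    leaked = PROHIBITED_BASIS_FIELDS & application.keys()
    if leaked:
        raise ValueError(f"Prohibited-basis fields must not reach the model: {sorted(leaked)}")
    return dict(application)

# Hypothetical creditworthiness attributes the model is allowed to use.
model_inputs = screen_model_inputs({
    "credit_score": 648,
    "debt_to_income": 0.47,
    "loan_to_value": 0.93,
    "months_since_delinquency": 7,
})

A guard like this only blocks direct use of prohibited-basis fields. It does not detect proxy bias, where seemingly neutral inputs correlate with protected characteristics, which is exactly the concern regulators have raised about complex models.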

ECOA requires a creditor to provide an adverse action notice to every applicant who is declined, with specific details on the reasons for the adverse action. Additionally, the creditor must be able to defend each reason for the denial as listed.

Creditors must be able to provide applicants against whom adverse action is taken with an accurate statement of reasons. The statement of reasons “must be specific and indicate the principal reason(s) for the adverse action.” — CFPB
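
What might a specific, defensible reason look like in practice? The sketch below shows one common approach under simplified assumptions: a points-based scorecard whose per-factor contributions can be ranked, so the factors that pulled a declined application furthest below the cutoff become the principal reasons on the notice. The factor names, weights, cutoff, and reason wording are all made up for illustration; this is not the CFPB’s methodology or any particular vendor’s model.

# Minimal sketch of an interpretable, points-based scorecard. All factors,
# weights, and the approval cutoff below are hypothetical.

APPROVAL_CUTOFF = 200

# Each entry: (factor name, points function, plain-language adverse action reason).
SCORECARD = [
    ("credit_score",   lambda a: (a["credit_score"] - 600) * 0.8,    "Credit score too low"),
    ("debt_to_income", lambda a: (0.43 - a["debt_to_income"]) * 400, "Debt-to-income ratio too high"),
    ("loan_to_value",  lambda a: (0.95 - a["loan_to_value"]) * 300,  "Down payment too small relative to value"),
]

def score_with_reasons(applicant: dict):
    """Score an application and, if it is declined, list the principal reasons."""
    contributions = [(reason, fn(applicant)) for _, fn, reason in SCORECARD]
    total = sum(points for _, points in contributions)
    if total >= APPROVAL_CUTOFF:
        return "approved", []
    # Principal reasons: the factors that dragged the score down the most.
    worst_first = sorted(contributions, key=lambda item: item[1])
    return "declined", [reason for reason, _ in worst_first[:2]]

decision, reasons = score_with_reasons(
    {"credit_score": 612, "debt_to_income": 0.51, "loan_to_value": 0.97}
)
print(decision, reasons)
# declined ['Debt-to-income ratio too high', 'Down payment too small relative to value']

The point of the sketch is the contrast: because each factor’s contribution is visible, the creditor can state and defend the principal reasons for the denial. With an uninterpretable black-box model, that ranking is not reliably available, which is the compliance gap the CFPB circular describes.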

ECOA is intended to give consumers whose request for credit is declined the ability to understand the reasons why. It also gives the consumer the opportunity to correct the issues. If the principal reason is not clearly identified on the adverse action notice, it is difficult for the consumer to take the appropriate steps to address the reasons for denial.

Creditors cannot justify noncompliance with ECOA and Regulation B’s requirements based on the mere fact that the technology it employs to evaluate applications is too complicated or opaque to understand. A creditor’s lack of understanding of its own methods is therefore not a cognizable defense against liability for violating ECOA and Regulation B’s requirements.

— CFPB

Digital redlining using complex algorithms in lending, including advertising and creditworthiness decisions, will be an ongoing compliance issue.

We need a fair housing market that is free from old forms of redlining, as well as new digital and algorithmic redlining … we should never assume that algorithms will be free of bias. If we want to move toward a society where each of us has equal opportunities, we need to investigate whether discriminatory black box models are undermining that goal. I am pleased that the CFPB will continue to contribute to the all-of-government mission to root out all forms of redlining, including algorithmic redlining.

— CFPB Director Rohit Chopra

As a creditor, it is essential that you understand the information your software program’s algorithms use internally to make any lending decision. As a mortgage professional, when it comes to bias and discrimination, you will be held responsible, not the software application or the software developer.
