Are your fair lending practices evolving with the times?
Fair lending requirements have come a long way since the first fair lending law, the Fair Housing Act (FHA), was enacted.
Lending discrimination happens when a lender bases a credit decision on factors other than a borrower’s creditworthiness. Following the FHA, the Equal Credit Opportunity Act (ECOA) explicitly prohibited discrimination in lending. Under the ECOA and FHA, discrimination can occur at any point in the credit process and can involve any credit product or delivery system. The rules apply to marketing, the application process, pricing, underwriting, servicing and collections.
Methods of proof
The courts have recognized three methods of proof of lending discrimination:
• Overt evidence of disparate treatment
• Comparative evidence of disparate treatment
• Evidence of disparate impact
Disparate treatment occurs when applicants are treated differently and there is no explanation for the difference other than their membership in a prohibited basis group. Disparate impact occurs when a neutral policy, applied equally to everyone, disproportionately excludes or burdens persons on a prohibited basis.
Fair lending can touch every area of an institution, and violations can be unintentional. Addressing these questions can help raise your awareness of potential issues:
• Does your marketing material or website contain images that don’t represent all protected classes? Is there something that may discourage someone within a prohibited basis group from seeking a loan from you?
• How does your response time for applicants in majority-minority areas compare with those from other areas? This data is helpful in determining an appearance of disparate treatment or impact (see the sketch after this list).
• For exceptions, are you documenting the mitigating circumstances, analyzing the exceptions and presenting them to the board or a committee of the board for review? This can help you understand how evenly your credit practices are applied.
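For the response-time comparison above, the following is a minimal sketch of one way to run the analysis, assuming your loan origination system can export application records with received and decision dates and a census-tract flag. The file and column names (applications.csv, received_date, decision_date, majority_minority) are hypothetical placeholders.

```python
# Minimal sketch: compare decision turnaround for applicants in
# majority-minority census tracts vs. all other areas.
# File and column names are hypothetical placeholders for whatever
# your loan origination system actually exports.
import pandas as pd

apps = pd.read_csv("applications.csv",
                   parse_dates=["received_date", "decision_date"])
apps["days_to_decision"] = (apps["decision_date"]
                            - apps["received_date"]).dt.days

# Compare application counts and turnaround time by area type.
by_area = (apps.groupby("majority_minority")["days_to_decision"]
               .agg(["count", "mean", "median"]))
print(by_area)
```

A persistently longer turnaround in majority-minority areas does not by itself prove disparate treatment, but it is exactly the kind of pattern worth documenting and explaining before an examiner asks about it.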
Redlining persists
Digital redlining is a current focus of the Consumer Financial Protection Bureau (CFPB), and academic studies are raising questions about algorithmic bias. CFPB Director Rohit Chopra stated in a recent speech that a statistical analysis of two million mortgage applications found that Black families were 80% more likely to be denied by an algorithm than white families with similar financial and credit backgrounds. These “black box” algorithms can result in redlining by excluding borrowers who live in specified areas.
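The 80% figure reads as a ratio of denial rates between otherwise similar applicant groups. As an illustration only, and not the cited study’s actual methodology, the following minimal sketch shows how such a disparity is computed from application-level data; the file and column names (decisions.csv, race, denied) are hypothetical.

```python
# Illustration only -- not the cited study's methodology. Shows how an
# "80% more likely to be denied" finding reads as a ratio of denial
# rates. Assumes one row per application and a 0/1 "denied" column;
# all names here are hypothetical.
import pandas as pd

apps = pd.read_csv("decisions.csv")

denial_rates = apps.groupby("race")["denied"].mean()
ratio = denial_rates["Black"] / denial_rates["White"]
print(f"Denial-rate ratio: {ratio:.2f}")  # 1.80 means 80% more likely

# A real fair lending analysis controls for financial and credit
# characteristics first (e.g., matched pairs or regression) so the
# comparison is between similarly situated applicants.
```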
Chopra noted that mortgage companies have conceded that even their own researchers lack all of the data feeding the algorithms and full knowledge of how they work. He compared the algorithms to black boxes behind a brick wall.
When consumers do not know how algorithms are making credit decisions, they cannot actively participate in a fair and competitive market free from bias. Do you understand how the system you use is making decisions, or is it a black box behind a brick wall?
Scrutiny of appraisals
A study from the Federal Home Loan Mortgage Corporation (Freddie Mac) found that the appraisal industry was consistently undervaluing Black- and Latino-owned homes compared with white-owned homes. On average, homes in Black and Latino neighborhoods are undervalued by as much as 23% relative to similar homes in white neighborhoods. When reviewing appraisals, the reviewer should look at the selection of comparables, the adjustments made to those comparables and the appraiser’s opinions in the reconciliation of value.
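To make the reviewer’s task concrete, here is a minimal sketch of the sales comparison arithmetic that sits behind an appraisal, with entirely hypothetical figures: each comparable’s sale price is adjusted for differences from the subject property, and the adjusted values are then reconciled into an opinion of value.

```python
# Minimal sketch of sales comparison arithmetic, with hypothetical
# figures. A positive adjustment means the comp is inferior to the
# subject property on that feature; negative means it is superior.
comps = [
    {"sale_price": 310_000, "adjustments": {"gla_sqft": 12_000, "garage": -5_000}},
    {"sale_price": 295_000, "adjustments": {"condition": 8_000}},
    {"sale_price": 325_000, "adjustments": {"lot_size": -10_000, "baths": 4_000}},
]

adjusted = [c["sale_price"] + sum(c["adjustments"].values()) for c in comps]
print("Adjusted comp values:", adjusted)  # [317000, 303000, 319000]
```

Reconciliation is a judgment call, not an average: the reviewer asks whether the chosen comps, the size and direction of each adjustment, and the weight given to each adjusted value are supported, and whether they would look the same if the home sat in a different neighborhood.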
While fair lending requirements are not new, much work remains to ensure compliance. Fair lending is a critical priority for financial institutions, and technology has created both new ways for violations to occur and new areas of enforcement focus.
Kristen L. Eustis, CRCM
Manager, WIPFLI