Financial firms may use algorithms—pre-coded sets of instructions and calculations that are executed automatically—to enhance consumer loan underwriting, the process of evaluating the likelihood that applicants will make timely loan repayments. Lenders may rely upon forms of automated analysis to help decide whether to offer consumers loans and on what terms. Faster computing power, internet-based products, and cheaper data storage at scale have increased the prevalence of algorithms.
This In Focus discusses developments in automated decisionmaking, artificial intelligence (AI), and machine learning (ML) in consumer loan underwriting. It first describes market developments, then discusses the current regulatory framework, and finally highlights selected policy issues.
Market Developments
Since the 1970s, consumer loan underwriting has become more automated, first with the increasing use of credit scores and more recently with new data and technologies. Credit scores are numeric metrics calculated from information in consumer credit reports to help lenders determine the likelihood of loan default. Technological innovations have been used to update automated processes, in some cases going beyond traditional numeric credit scores. For example, some lenders accept applications over the internet and use new data sources to conduct consumer loan underwriting. Alternative data generally refers to information that may be used to determine a consumer's creditworthiness that the national credit reporting agencies—Equifax, Experian, and TransUnion—have not traditionally used when calculating credit scores for consumers. Further, AI and ML technologies have advanced rapidly in recent decades. AI technologies are computerized systems that work and react in ways commonly thought to require intelligence, such as solving complex problems in real-world situations. ML is often described as a subfield of AI, with algorithms designed to automatically improve their performance through experience with little or no human input.
These technological developments potentially allow for greater speed, accuracy, and confidence in loan decisions. ML models and alternative data, in particular, are currently used more frequently in fintech products than in more traditional consumer lending products. Fintech (short for financial technology) refers to advances in technology incorporated into financial products and services. Many companies—both traditional financial firms and new technology-focused entrants to the market—are developing fintech products, making fintech a subject of increased interest for the public and policymakers.
ML Models in Consumer Loan Underwriting
ML models can potentially enhance consumer loan underwriting. They could improve efficiency and performance and reduce costs for financial institutions, potentially expanding credit access or making credit less expensive for some consumers. ML models could also make underwriting decisions more accurate by identifying new patterns, such as changing credit conditions, and by updating automatically to reflect them.
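To make the idea concrete, the sketch below trains a simple ML classifier to estimate default risk from applicant attributes. It is a minimal illustration only: the feature names, synthetic data, and choice of a gradient-boosted tree model are assumptions for demonstration, not a description of any particular lender's underwriting system.

```python
# Minimal, illustrative sketch of an ML underwriting model.
# All features, data, and model choices here are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical applicant features: income, debt-to-income ratio,
# months of credit history, and an alternative-data signal (on-time rent).
X = np.column_stack([
    rng.normal(60_000, 20_000, n),   # annual income (dollars)
    rng.uniform(0.05, 0.60, n),      # debt-to-income ratio
    rng.integers(0, 240, n),         # months of credit history
    rng.integers(0, 2, n),           # on-time rent payments indicator
])
# Synthetic default labels loosely tied to the features, for illustration only.
p_default = 1 / (1 + np.exp(-(3 * X[:, 1] - X[:, 0] / 100_000 - 0.5 * X[:, 3])))
y = rng.binomial(1, p_default)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gradient-boosted trees are one common class of ML model for tabular credit data.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print("holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

A lender might then approve applicants whose predicted default probability falls below some cutoff; in practice, model selection, validation, and cutoff setting are far more involved than this sketch suggests.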
However, ML models can also introduce risks. One risk is a lack of explainability, the inability to explain why a program makes particular decisions. Another risk is dynamic updating, in which models evolve over time without oversight. ML models also raise concerns that they may not perform as intended, possibly resulting in higher loan losses in new market environments or discrimination against protected groups.
Current Federal Regulatory Framework
The Consumer Financial Protection Bureau (CFPB) is the primary consumer protection regulator for consumer financial products and services. One of the CFPB's statutory objectives is to ensure that "markets for consumer financial products and services operate transparently and efficiently to facilitate access and innovation." The CFPB has the authority in consumer financial markets to write regulations and enforce the law for both bank and nonbank financial institutions. However, the CFPB's supervisory authority to examine financial institutions for consumer protection compliance varies based on the charters, activities, and size of institutions. Therefore, financial regulators may monitor some nonbank fintech companies less closely than traditional banks.
Regulatory Uncertainty
Many financial laws and regulations predate recent ML developments, raising questions about whether they remain effective in achieving their intended policy goals as these potentially beneficial technologies evolve. Relevant laws and regulations may need to be reconsidered or updated in response to the future use of ML models in consumer loan underwriting. Doing so often involves balancing efforts to encourage innovation against the need to protect consumers.
Federal financial regulators have been monitoring ML models in consumer lending. In March 2021, the bank and credit union regulators, along with the CFPB, requested information on financial institutions' use of AI, including ML models. In April 2023, the CFPB, along with other federal agencies, published a joint statement emphasizing that existing legal authorities still apply to automated systems and AI.
Consequently, due to regulatory uncertainty and compliance risks, some financial institutions, particularly many banks and other highly regulated parts of the financial system, may choose not to use ML models to approve or reject applicants for consumer credit, even if those models are more accurate or efficient.
Policy Issues
The use of ML algorithms in credit underwriting has raised a number of policy issues of interest for financial regulators and Congress.
ML Models and Explainability Concerns
The ability of regulators or other outside parties to understand what an ML program did, and why, may be limited or nonexistent. This poses a significant challenge for companies using ML programs, which must ensure that the programs produce outcomes that comply with applicable laws and regulations.
When a lender denies a loan application, the lender must send an adverse action notice to the applicant explaining the reasons for the denial. Some question how well lenders will understand and be able to explain the reasons for adverse actions resulting from ML algorithms. To address this issue, some observers assert that regulators should set standards for how ML programs are developed, tested, and monitored, although debate exists about what these standards should include. These concerns involve model fairness, the ability to provide greater algorithm transparency, and the development of processes to assess ML models for attributes such as fairness, reliability, privacy, and security. In May 2022, the CFPB issued guidance clarifying that lenders using "complex algorithms" still need to comply with adverse action notice requirements.
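One way lenders attempt to explain ML-driven decisions is to derive candidate reason codes from per-feature contributions to a model's score. The sketch below shows one simple, illustrative convention using a logistic regression model; the feature names and data are hypothetical, and this is not the method prescribed by the CFPB guidance or used by any particular lender.

```python
# Hypothetical sketch: deriving candidate adverse action reasons by ranking
# per-feature contributions in a logistic regression credit model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

feature_names = ["income", "debt_to_income", "months_history", "on_time_rent"]

rng = np.random.default_rng(1)
X = rng.normal(size=(1_000, 4))                               # synthetic applicants
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 1] - X[:, 0]))))   # synthetic defaults

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

def top_reasons(applicant, k=2):
    """Rank features by how strongly they push this applicant's score toward
    denial relative to the average applicant (one illustrative convention)."""
    z = scaler.transform(applicant.reshape(1, -1))[0]
    contributions = model.coef_[0] * z        # per-feature push toward default
    order = np.argsort(contributions)[::-1]   # largest adverse push first
    return [feature_names[i] for i in order[:k]]

print(top_reasons(X[0]))
```

More complex models generally require more elaborate explanation techniques, which is part of the explainability debate described above.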
Algorithmic Bias and Fair Lending
Consumer loan underwriting models using ML can introduce fair lending risks due to biases in data or model development. ML models may exhibit training data bias, in which a model inherits biases from the limited or flawed dataset on which it was developed. Historical data can reflect historical biases, potentially creating models that discriminate against protected classes. In addition, alternative data may include proxies for protected classes that might also bias ML models.
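As an illustration of the kind of check a lender or regulator might run, the sketch below compares approval rates across two groups using a simple adverse impact ratio. The group labels, data, and any threshold for concern are hypothetical assumptions; actual fair lending analysis is considerably more involved.

```python
# Illustrative fair lending check: compare approval rates across two groups
# and compute a simple adverse impact ratio. All data here are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
group = rng.choice(["A", "B"], size=2_000)                      # hypothetical groups
approved = rng.binomial(1, np.where(group == "A", 0.55, 0.45))  # hypothetical outcomes

rate_a = approved[group == "A"].mean()
rate_b = approved[group == "B"].mean()
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

# A ratio well below 1.0 can flag a disparity warranting further review.
print(f"approval rates: A={rate_a:.2f}, B={rate_b:.2f}, ratio={impact_ratio:.2f}")
```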
The Equal Credit Opportunity Act (ECOA; 15 U.S.C. §§1691-1691f) generally prohibits discrimination in credit transactions based upon certain protected classes, including sex, race, color, national origin, religion, marital status, age, and "because all or part of the applicant's income derives from any public assistance program." Questions exist about how lenders can comply with ECOA and other fair lending laws when using ML models for loan underwriting.
Underserved Consumers and Access to Credit
In the United States, robust consumer credit markets allow most consumers to access financial services and credit products from traditional providers to meet their needs. However, consumers who have difficulty entering the traditional credit reporting system face challenges accessing many consumer credit products, because lenders are unable to assess their creditworthiness. Limited credit history is correlated with age, income, race, and ethnicity, and many of these consumers are young.
Automated underwriting and ML models may expand credit access or make credit less expensive for some consumers. In particular, these technologies may increase financial inclusion for younger consumers, who may be more likely to have limited credit histories in the credit reporting system.
Data Privacy, Security, and Transparency
In credit underwriting, ML models often access sensitive consumer financial data, and the increase in digital data collection heightens privacy and cybersecurity concerns. These data practices raise questions about what consumer information is appropriate to collect and use for loan underwriting.
Laws such as the Fair Credit Reporting Act (FCRA; 15 U.S.C. §1681) and the Gramm-Leach-Bliley Act (GLBA; P.L. 106-102) impose requirements on firms that use consumer data for credit underwriting. As data use in consumer financial services has grown, some have debated whether the scope of these laws should be expanded.
CRS Resources
CRS Report R47475, Consumer Finance and Financial Technology (Fintech), coordinated by Cheryl R. Cooper.
CRS Report R46795, Artificial Intelligence: Background, Selected Issues, and Policy Considerations, by Laurie A. Harris.
CRS In Focus IF11630, Alternative Data in Financial Services, by Cheryl R. Cooper.
CRS Report R44125, Consumer Credit Reporting, Credit Bureaus, Credit Scoring, and Related Policy Issues, by Cheryl R. Cooper and Darryl E. Getter.
CRS In Focus IF10031, Introduction to Financial Services: The Consumer Financial Protection Bureau (CFPB), by Cheryl R. Cooper and David H. Carpenter.
CRS In Focus IF11195, Financial Innovation: Reducing Fintech Regulatory Uncertainty, by David W. Perkins, Cheryl R. Cooper, and Eva Su.