
Legal Considerations When Using Big Data And Artificial Intelligence To Make Credit Decisions

Companies across all sectors increasingly use big data as part of business decision-making. “Big data” has many definitions, but generally refers to “a collection of data from traditional and digital sources inside and outside [a] company that represents a source for ongoing discovery and analysis.”1 The credit and mortgage lending businesses are no exception to this growing trend. Companies use big data, algorithms, and artificial intelligence to make decisions about the extension of credit. While this emerging practice has the potential to accurately identify more people who are good credit risks and expand access to credit to traditionally under-served communities, companies should be aware of the legal risks that may arise under familiar laws such as the Fair Housing Act (FHA), Equal Credit Opportunity Act (ECOA), and the Fair Credit Reporting Act (FCRA).

Like companies in other sectors, banks and fintech companies have access to an increasing amount of data about their consumers. For example, a company might know what websites its consumers visit, the people they are connected to on social media, the clothes they buy online, the college they went to, the bills they pay, the places they used to live, and more.2 A company might collect this information itself, or it may purchase it from a “data broker” that compiles information about consumers from across the internet.3 Lending companies can use this data to make decisions about who should get credit by developing and applying algorithms that analyze the various data points and drawing conclusions from them about a consumer’s credit risk.4

For example, one company that has embraced the use of big data in evaluating consumers’ credit risk is Upstart. Upstart provides personal loans to consumers based on an assessment of their creditworthiness that draws in part on “non-traditional sources of information such as education and employment history.”5 The company uses artificial intelligence and machine learning to make its determinations. Another company in this space is underwrite.ai, which creates machine-learning algorithms that lenders can customize to help them make credit decisions.6 Its algorithms “analyze[] thousands of data points from credit bureau sources” in order to “accurately model credit risk for any consumer.”7

These companies and others are using technology to change the way that lending decisions are made, and more companies will likely enter this space in the near future. Yet even as technology and decision-making methods change, companies that choose to use big data, algorithms, and artificial intelligence to make lending decisions should not forget that traditional fair lending risks remain. Specifically, lenders should be attentive to avoiding discriminatory impact in lending decisions and consider how to comply with the obligation to disclose why a consumer was denied credit.

First, lenders must ensure that their lending decisions do not have a discriminatory impact in terms of race, gender, or other protected classes. The FHA prohibits discrimination in securing financing for housing on the basis of race, sex, and familial status, among other protected characteristics.8 ECOA plays a similar role in the context of credit transactions.9 The discrimination need not be intentional; a violation of these statutes occurs if there is a disparate impact on members of a protected class.10

The use of big data raises particular concerns related to disparate impact. For example, the inputs—that is, the data itself—can lead to inadvertent disparate impact on protected classes. If, say, a lending decision is made in part by screening people in certain zip codes, the racial distribution of loans may be uneven because de facto residential segregation persists in the United States. The same could be true for decisions based on connections on social media sites such as Facebook. While the data is not explicitly based on race, it may still have a racially disparate impact. Relatedly, the algorithms a lender uses may contribute to a disparate impact. An algorithm might rely on correlations between certain data points that end up affecting members of certain groups differently. And machine learning could exacerbate the problem: if a model learns from patterns in the data that credit risk correlates with zip code, it may weight zip code more heavily and, because zip code itself correlates with race, effectively sort consumers by race even though the algorithm is facially neutral.
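To make the proxy problem concrete, the sketch below shows one way a lender might screen candidate model inputs for strong statistical association with a protected attribute before using them. It is a minimal illustration only: the column names, the use of Cramér’s V as the association measure, and the 0.4 cutoff are assumptions for the example, not a prescribed compliance test.

```python
# Illustrative sketch only: screening facially neutral inputs (e.g., zip code)
# for strong association with a protected attribute. Column names and the
# 0.4 cutoff are hypothetical; this is not a substitute for legal review.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def cramers_v(x: pd.Series, y: pd.Series) -> float:
    """Cramer's V association between two categorical variables (0 to 1)."""
    table = pd.crosstab(x, y)
    chi2 = chi2_contingency(table)[0]
    n = table.to_numpy().sum()
    min_dim = min(table.shape) - 1
    return float(np.sqrt((chi2 / n) / min_dim)) if min_dim else 0.0

def flag_possible_proxies(df: pd.DataFrame, protected_col: str,
                          candidate_cols: list[str],
                          threshold: float = 0.4) -> list[str]:
    """Return candidate inputs whose association with the protected
    attribute exceeds the threshold and so warrant closer review."""
    return [col for col in candidate_cols
            if cramers_v(df[col], df[protected_col]) >= threshold]
```

A flagged input is not automatically impermissible, but it signals that the feature may be doing the work a prohibited characteristic would otherwise do and therefore deserves closer scrutiny.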

To avoid lending decisions with an unintended disparate impact, companies should be vigilant about testing the outcomes of their algorithms and adjusting them as necessary. Companies should also give careful thought to the business justifications for using a particular data set or algorithm.
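One common form of such testing is to compare approval rates across groups whenever the model or its input data changes. The sketch below assumes hypothetical “group” and “approved” columns and computes each group’s approval rate relative to the most favored group; the four-fifths benchmark in the example is a conventional screening heuristic borrowed from employment discrimination analysis, not a legal threshold.

```python
# Illustrative sketch: comparing approval rates across groups after each
# model or data change. Column names are hypothetical assumptions.
import pandas as pd

def adverse_impact_ratios(outcomes: pd.DataFrame,
                          group_col: str = "group",
                          approved_col: str = "approved") -> pd.Series:
    """Approval rate of each group divided by the highest group's rate.
    Ratios well below 1.0 (commonly below ~0.8) warrant investigation."""
    rates = outcomes.groupby(group_col)[approved_col].mean()
    return rates / rates.max()

# Example usage: re-run after every model update and flag low-ratio groups.
# ratios = adverse_impact_ratios(decisions_df)
# flagged = ratios[ratios < 0.8]
```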

Second, lenders that use big data, algorithms, and artificial intelligence to make lending decisions should be attentive to the requirements of the Fair Credit Reporting Act. Specifically, the FCRA requires lenders to disclose to consumers if they deny credit based on a consumer report and to disclose to consumers if they charge more for credit based on a consumer report.11 This allows consumers to check for inaccuracies in the consumer report, the document that determines their access to credit.12 This is a fairly simple requirement to satisfy when a credit decision is based on a FICO score. But when a credit decision is based on a complicated algorithm drawing from a wide data set that incorporates a multitude of data points compiled by a third party, it is much more difficult to explain the credit decision to the consumer. Lenders ought to think carefully about choosing data sets and designing algorithms that can lead to clear explanations of lending decisions.
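As a simplified illustration of how a lender might generate the “principal reasons” behind a credit denial, the sketch below assumes a plain linear scoring model with hypothetical feature names, weights, and baseline values; lenders using more complex models typically turn to attribution techniques such as SHAP values for the same purpose.

```python
# Illustrative sketch: ranking the features that most depressed an applicant's
# score, as raw material for an adverse action notice. Assumes a simple
# linear scoring model; feature names, weights, and baselines are hypothetical.
def principal_reasons(weights: dict[str, float],
                      applicant: dict[str, float],
                      baseline: dict[str, float],
                      top_n: int = 4) -> list[str]:
    """Return the top_n features that pulled this applicant's score furthest
    below a baseline profile (e.g., the average approved applicant)."""
    contributions = {
        name: weight * (applicant[name] - baseline[name])
        for name, weight in weights.items()
    }
    # The most negative contributions hurt the score the most.
    return sorted(contributions, key=contributions.get)[:top_n]
```

Whatever method is used, the ranked features still need to be translated into plain-language reasons that the consumer can understand and act on.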

Though this is still an emerging area, federal regulators have already demonstrated an interest in the fair lending implications of the use of big data. For example, in September 2017, the Consumer Financial Protection Bureau (CFPB) issued a No Action Letter to Upstart. In the letter, the CFPB indicated that it did not intend to take an enforcement or supervisory action against Upstart.13 But it clarified in an accompanying press release that it was paying attention to the emerging field of big-data-based lending. It noted that the CFPB had “launched an inquiry into the use of alternative data sources in order to evaluate creditworthiness and potentially expand access to credit for consumers with limited credit history.”14 Other government agencies have also begun thinking about the fair lending ramifications of big data, algorithms, and machine learning. For example, the Federal Trade Commission issued a report in January 2016 raising a number of issues that may become relevant in this space.15

It is difficult to predict how the current administration will address these emerging issues.16 Yet companies continue to recognize the real risks that fair lending laws can pose to their business models. For example, underwrite.ai has specifically noted its intent to incorporate federal fair lending and credit laws into its algorithmic model: “Our system was designed to be fully compliant with all FCRA regulations from the ground up. Additionally, we specifically exclude from analysis any data that might proxy for a protected class. In our model, we don’t know or care about the gender, age, race, religion, zip code, sexual preference, or ethnicity of applicants. We strongly believe that these attributes are fundamentally NOT predictive of credit worthiness.”17 Other companies would be prudent to follow a similar path.

This is particularly true because, even if federal enforcement of fair lending laws wanes, private plaintiffs may still take action. The FHA and ECOA both provide for private rights of action.18 Available remedies include punitive damages and equitable and declaratory relief, as well as attorneys’ fees. State attorneys general, too, may seek to enforce state statutes that protect fair lending. Many states have fair lending laws, and, in fact, many of those statutes are more expansive in terms of protected classes than the federal laws are.19 No matter the role the federal government takes in the near future, private plaintiffs and state attorneys general still have the power and opportunity to seek relief.


1 Lisa Arthur, What Is Big Data?, Forbes.com (Aug. 15, 2013).
2 See, e.g., Federal Trade Commission, Big Data: A Tool for Inclusion or Exclusion?: Understanding the Issues (Jan. 2016), at 3-5 (“FTC Report”); Charles Lane, Will Using Artificial Intelligence To Make Loans Trade One Kind Of Bias For Another?, WBEZ (Mar. 31, 2017); Executive Office of the President, Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights (May 2016), at 11-12.
3 FTC Report at 4.
4 Id. at 4-5; Lane, Will Using Artificial Intelligence To Make Loans Trade One Kind Of Bias For Another?.
5 Press Release, Consumer Financial Protection Bureau, CFPB Announces First No-Action Letter to Upstart Network (Sept. 14, 2017) (“Upstart Press Release”).
6 http://www.underwrite.ai/services.
7 http://www.underwrite.ai/.
8 42 U.S.C. § 3605.
9 15 U.S.C. § 1691.
10 Tex. Dep’t of Hous. & Cmty. Affairs v. The Inclusive Cmtys. Project, Inc., 135 S. Ct. 2507 (2015).
11 FCRA, 15 U.S.C. § 1681m(a), (h).
12 FTC Report at 14.
13 Letter from Christopher M. D’Angelo, Associate Director for Supervision, Enforcement & Fair Lending, Consumer Financial Protection Bureau, to Thomas P. Brown, Paul Hastings, LLP (Sept. 14, 2017).
14 Upstart Press Release.
15 See generally FTC Report.
16 Mick Mulvaney, the acting Director of the CFPB, recently decided to move the Fair Lending Office out of the Supervision, Enforcement, and Fair Lending Division and into the Office of the Director. Many consumer groups viewed this move as a signal that the CFPB will decrease fair lending enforcement. Kate Berry, CFPB’s Mulvaney Strips His Fair-Lending Office of Enforcement Powers, American Banker (Feb. 1, 2018).
17 http://www.underwrite.ai/faq.
18 42 U.S.C. § 3613; 15 U.S.C. § 1691e.

Authors:

Kali Bracey is a litigation partner in Jenner & Block’s Washington, DC office and a member of the firm’s Government Controversies and Public Policy Litigation Practice. She brings 20 years of experience handling complex commercial litigation, investigations and regulatory matters.

Marguerite L. Moeller is an associate in Jenner & Block’s Litigation Department.