The student loan company was accused of violating consumer protection laws

- Earnest Operations LLC agrees to pay $2.5 million and reform lending practices.
- Massachusetts AG alleges AI-driven loan decisions harmed Black, Hispanic, and non-citizen applicants.
- Settlement mandates compliance measures and bans discriminatory algorithmic rules.
Massachusetts Attorney General Andrea Joy Campbell has secured a $2.5 million settlement with Earnest Operations LLC, a Delaware-based student loan company, over allegations that its lending practices, driven by artificial intelligence (AI), discriminated against marginalized borrowers and violated consumer protection and fair lending laws.
The settlement, filed in Suffolk County Superior Court as an assurance of discontinuance, addresses what the AG's office described as systemic failures in Earnest's underwriting process, including the use of AI models that allegedly produced disparate impacts on Black, Hispanic, and non-citizen borrowers.
AI models under scrutiny
According to the Attorney General's investigation, Earnest used algorithmic models to make critical decisions about loan eligibility, pricing, and terms. However, the company failed to test for discriminatory outcomes and relied on data inputs and training methods that introduced bias, amplifying existing inequities in the lending process.
"Earnest's failure to comply with consumer protection and fair lending laws, including through its AI models, unfairly put historically marginalized student borrowers at risk of being denied loans or receiving unfavorable loan terms," AG Campbell said in a statement.
One key point of contention was the company's use of the federal Cohort Default Rate (CDR), a statistic reflecting average loan defaults at individual schools, as an input variable in its algorithms. The AG's office said this disproportionately penalized applicants who attended minority-serving institutions, including historically Black colleges and universities.
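A central theme of the complaint is the absence of disparate-impact testing on the models' outputs. For readers unfamiliar with what such testing looks like in practice, the sketch below shows one common screening approach, the adverse impact ratio (the "four-fifths rule" borrowed from employment law). This is a minimal, hypothetical illustration with made-up data; it is not Earnest's code or the AG's methodology.

```python
# Illustrative sketch only: screen an underwriting model's decisions for
# disparate impact using the adverse impact ratio. Group labels and the
# decision log below are hypothetical.

import pandas as pd

def adverse_impact_ratio(df: pd.DataFrame, group_col: str, approved_col: str,
                         reference_group: str) -> pd.Series:
    """Each group's approval rate divided by the reference group's approval rate."""
    rates = df.groupby(group_col)[approved_col].mean()
    return rates / rates[reference_group]

# Hypothetical decision log: one row per applicant with the model's outcome (1 = approved).
decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

ratios = adverse_impact_ratio(decisions, "group", "approved", reference_group="A")
print(ratios)  # ratios below ~0.8 are a conventional red flag warranting further review
```

A ratio below roughly 0.8 does not by itself prove unlawful discrimination, but it is the kind of warning sign that routine fair lending monitoring is meant to surface.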
Other alleged violations
In addition to algorithmic bias, the AG alleged other unfair practices:
- Use of a "Knockout Rule" to automatically deny loans based on immigration status.
- Arbitrary human assessments that led to inconsistent and opaque decisions.
- Inaccurate adverse action notices that misinformed applicants about credit decisions.
- A lack of internal compliance infrastructure to oversee fair lending risks.
Earnest denied all allegations and maintained that it did not violate state or federal law. The company said it agreed to the settlement solely to resolve the matter without prolonged litigation.
Reforms mandated in settlement
Under the terms of the agreement, Earnest must:
- Pay $2.5 million to the state of Massachusetts.
- Cease use of the Cohort Default Rate and the immigration-based "Knockout Rule" in its loan decision models.
- Establish a robust corporate governance structure to monitor AI use.
- Develop written policies for responsible, legally compliant AI deployment.
- Regularly report compliance metrics to the AG's office.
The settlement marks one of the first state-level enforcement actions targeting AI-related bias in financial services, setting a precedent for how regulators may respond to emerging technologies that impact consumer rights.
"This case sends a strong message," Campbell said, "that technology, no matter how advanced, cannot be used as an excuse to sidestep civil rights and consumer protections."
Posted: 2025-07-18 19:53:37