
AI Tenant Screening Tool to Stop Scoring Housing Voucher Applicants Following Discrimination Lawsuit

AP Photo / Nam Y. Huh
Published November 22, 2024

SafeRent, an artificial intelligence-powered tenant screening service, will stop using AI-generated “scores” to evaluate applicants who use housing vouchers, as part of a settlement resolving allegations of discrimination.

The settlement, approved by US District Judge Angel Kelley, resolves a class action lawsuit filed in Massachusetts in 2022 that accused the company of unfairly disadvantaging low-income tenants, particularly Black and Hispanic applicants.

Under the terms of the five-year settlement, SafeRent will stop displaying tenant scores for applicants using housing vouchers nationwide. The company will also be barred from recommending that landlords accept or deny a voucher holder’s rental application based solely on the AI-powered SafeRent Score; landlords will instead be required to consider a tenant’s complete record when making housing decisions.

The lawsuit claimed that SafeRent’s algorithm, which draws on factors such as credit history and debts unrelated to rental payments to generate scores, disproportionately harmed applicants using housing vouchers. It also argued that the scoring system was opaque, offering no transparency about how scores were determined. The plaintiffs, led by Mary Louis, a Massachusetts resident whose rental application was denied, asserted that the algorithm unfairly assigned lower scores to Black and Hispanic applicants, leading to discriminatory denials of housing.

The settlement also includes a $2.3 million payout for Massachusetts residents who were denied housing due to SafeRent’s scoring system. SafeRent, while maintaining that its scoring system complied with applicable laws, expressed a desire to resolve the matter without further legal proceedings, citing the cost and time involved in defending the case.

The case highlights the growing scrutiny of AI-driven systems used in housing and other sectors. Critics argue that these algorithms, even if not intentionally biased, can perpetuate discrimination if they are trained on data that reflects historical inequalities. The SafeRent case is one of several legal challenges faced by algorithmic-driven services in the real estate industry, as regulators and activists push for more accountability and transparency in AI decision-making processes.

While some states have proposed regulations to address the use of AI in housing decisions, such measures have faced limited support. This case could set a significant precedent for future legal actions aimed at holding AI tools accountable for discriminatory outcomes.

For now, SafeRent has agreed that if it develops new screening tools, it will work with third-party experts to validate any future algorithms for fairness.

With input from The Verge and the Associated Press.

Written By
Joe Yans