How to Avoid Discrimination When Using AI in Tenant Screening

HUD warns against artificial intelligence-based housing discrimination.

The use of artificial intelligence (AI) technologies in the multifamily industry has been raising red flags. As an increasing number of tenant screening companies claim to use AI to detect “higher-risk renters,” housing and privacy advocates are sounding the alarm on opaque algorithms that they say are enabling high-tech discrimination and amplifying existing biases in an already unequal housing market.

Last October, President Biden signed an executive order calling on HUD to issue guidance to combat unlawful discrimination and biases against protected groups enabled by automated or algorithmic tools used to make decisions about access to housing. HUD recently followed through on that directive, as well as on a recent pledge to enforce civil rights laws as new technologies such as AI become more common, by releasing guidance addressing the misuse of AI capabilities in the tenant screening process. HUD also released guidance on the misuse of AI in advertising housing availability through platforms that deliver targeted ads, which we’ll cover in a future issue.

For now, HUD is monitoring how AI applications can violate provisions of the Fair Housing Act (FHA). HUD’s tenant screening guidance makes clear that owners who use third-party screening companies, including companies that use AI or other advanced technologies, must comply with the FHA and ensure that all housing applicants are given an equal opportunity to be evaluated on their own merit. We’ll discuss the potential FHA pitfalls in the tenant screening process and best practices for avoiding them when using automation such as machine learning and other forms of AI to screen tenants.

How Using AI Can Lead to Legal Liability

HUD’s tenant screening guidance highlights the fact that in recent years owners have increasingly relied on tenant screening companies to drive tenant selection decisions. And an increasing number of these tenant screening companies are claiming that they use advanced technologies such as machine learning and other forms of AI. Regardless of what technology is used, the FHA applies, and both owners and tenant screening companies are responsible for ensuring they don’t use these technologies in a way that can result in discrimination.

In fact, in its guidance, HUD reminds owners that under principles of direct liability, owners are responsible for ensuring their rental decisions comply with the FHA even if they’ve largely outsourced the task of screening applicants to a tenant screening company. While the screening company may be responsible too, owners retain authority over screening practices and decisions at their sites.

According to HUD, the use of machine learning and other forms of AI can increase these companies’ ability to analyze information about applicants, but the technology hasn’t been widely used for rental decisions until recently, and the additional information it analyzes may have little bearing on whether someone will comply with their lease. The technology can also make the process less transparent by hiding the precise reasons for a denial.

HUD notes the technology may draw on inaccurate or incomplete credit, eviction history, and criminal record data. The problem with eviction history is that court records tend to include eviction filings regardless of case disposition, meaning evictions appear in public records even if the case was decided in favor of the tenant. The same goes for criminal record data, which can include administrative citations, bench warrants, and traffic tickets, along with misdemeanors and felonies. Some companies also offer screening reports with additional details, such as medical debt, foreclosures, student loans, and even the number of phone numbers an applicant has had.

Generally, automated software is used to generate these screening reports, and companies tend not to disclose how this software works, including the extent to which it uses advanced technologies, such as AI. Some screening reports include a recommendation to accept or deny the applicant or provide a numerical score or grade. Some reports detail the records found, while others simply state if the applicant “passed” or “failed” in various areas.

6 BEST PRACTICES FOR NONDISCRIMINATORY TENANT SCREENING POLICIES

1. Choose Relevant Screening Criteria

Screen applicants only for information relevant to the likelihood that they will comply with their tenancy obligations. If a screening policy disproportionately excludes applicants of a certain race or other protected class, conducting the screening in a more precise way may be a less discriminatory alternative.

HUD says past actions unrelated to tenancy and past incidents unlikely to recur, such as an eviction due to a job loss or a family or medical emergency, shouldn’t be the basis for denials.

And some records are more relevant than others. For example, more weight should be given to recent records than to older ones. In addition, records without a negative outcome are not relevant: the record of an eviction proceeding has no relevance if the tenant prevailed. If a court record doesn’t provide enough information to determine who prevailed, it should be disregarded unless additional information about the disposition is obtained.
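To make these relevance rules concrete, here’s a minimal sketch in Python of how an owner or screening company might filter records by recency and disposition before they influence any decision. The record format, field names, disposition labels, and seven-year lookback are illustrative assumptions, not values prescribed by HUD.

```python
from datetime import date
from typing import Optional

def is_relevant(record: dict, lookback_years: int = 7,
                today: Optional[date] = None) -> bool:
    """Keep only records that are recent and reflect a real negative outcome."""
    today = today or date.today()
    age_years = (today - record["date"]).days / 365.25

    # Recent records carry more weight; past the lookback window, disregard entirely.
    if age_years > lookback_years:
        return False

    # Records without a negative outcome aren't relevant: if the tenant
    # prevailed or the case was dismissed, the record should be ignored.
    if record["disposition"] in ("tenant_prevailed", "dismissed"):
        return False

    # An ambiguous disposition is disregarded unless more information about
    # how the case was resolved can be obtained.
    if record["disposition"] == "unknown":
        return False

    return True

records = [
    {"date": date(2023, 5, 1), "disposition": "judgment_for_landlord"},
    {"date": date(2012, 3, 9), "disposition": "judgment_for_landlord"},  # too old
    {"date": date(2022, 8, 2), "disposition": "tenant_prevailed"},       # tenant won
]
usable = [r for r in records if is_relevant(r)]  # keeps only the first record
```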

Credit history considerations. Tenant screening companies often incorporate consumer credit information into complex tenant screening models. HUD highlights the fact that the use of these models or algorithms obscures how the information is used.

This is a problem because racial disparities exist among those who are “credit invisible,” that is, those with little or no credit history. These disparities exist in credit scoring, in part, because Black and Brown persons lack access to equitable credit and homeownership opportunities, and credit scores generally don’t capture timely rent, utility, or other bill payments.

Another example of a credit-invisible applicant is someone who recently immigrated to the United States. This person may not have any history that suggests they’re a credit risk, let alone a rental risk. Their credit record simply lacks information regardless of their financial history in their country of origin.

In addition, survivors of domestic violence, who are disproportionately women, are also more likely to have had experiences resulting in no or low credit scores. A survivor might not have credit records because their partner prohibited them from opening their own accounts, or the survivor might have negative credit history from being forced to obtain credit for the perpetrator’s benefit or assume the perpetrator’s unpaid bills. Because of these disparities, overbroad screenings for credit history may have an unjustified discriminatory effect based on race or other protected characteristics.

It’s important to note that no HUD program requires screening rental applicants for their credit score, and previous HUD guidance has noted that owners may forgo credit checks as long as they don’t discriminate because of a protected characteristic. Furthermore, owners and tenant screening companies should avoid denials and denial recommendations based on an applicant’s credit score in circumstances when the applicant’s financial background has especially little relevance. Limiting the use of credit scores when more relevant financial information is available may be a less discriminatory alternative to using credit scores in all instances.

2. Use Only Accurate Records

According to HUD, datasets used for tenant screenings are often incomplete, missing key personal identifiers, or updated infrequently. Automated systems might mis-categorize records with missing or unclear information if they aren’t programmed to account for those scenarios.

Ensuring records are accurate and using specific information in queries can help avoid discriminatory screenings. For example, records of people bearing the same or similar names are often attributed to the wrong person, a problem that’s more common for last names prevalent among Latino, Asian, or Black individuals.

This usually occurs when using records that list only partial information, such as an initial instead of a first name, or records that lack corroborating details, such as a name without a date of birth. This can also happen when an owner or screener puts less specific information into a search than is available to them, such as inputting a state instead of a full address into queries. Records should match multiple pieces of identifying information, and “wildcard” or “name-only” matching shouldn’t be used.
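Here’s a hypothetical Python sketch of the strict multi-field matching this implies: a record is attributed to an applicant only when several independent identifiers all agree, so a bare name match (or a partial “wildcard” match) is never enough on its own. All field names here are assumptions made for illustration.

```python
def record_matches_applicant(record: dict, applicant: dict) -> bool:
    """Attribute a record to an applicant only if full name, date of birth,
    and a known address all agree; name-only matches are rejected."""
    full_name_match = (
        record.get("first_name", "").lower() == applicant["first_name"].lower()
        and record.get("last_name", "").lower() == applicant["last_name"].lower()
    )
    dob_match = record.get("dob") is not None and record["dob"] == applicant["dob"]
    address_match = record.get("address") in applicant["known_addresses"]

    # A record listing only an initial, or one lacking a date of birth,
    # fails these checks and is excluded rather than guessed at.
    return full_name_match and dob_match and address_match
```

A real system would also normalize addresses and handle name variants, but the principle holds: match on multiple identifiers or don’t match at all.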

Eviction history considerations. HUD says court records of evictions are notably unreliable, citing a large study in which 22 percent of the eviction records evaluated contained ambiguous information on how the case was resolved or falsely represented a tenant’s eviction history. And tenant screening companies have built private databases from court records of eviction cases.

While eviction records are one of the most commonly marketed tenant screening components, you should be aware that the quality of the records in these databases varies, and overbroad screenings for eviction history may have an unjustified discriminatory effect. Therefore, tenant screening companies and owners shouldn’t rely on eviction records that are old, incomplete, irrelevant, or where a better measure of an applicant’s behavior is available.

If an eviction record is known to be inaccurate, incomplete, or irrelevant before a screening, the record shouldn’t be used. Otherwise, applicants should get the chance afterwards to have the record disregarded and corrections made. Also, owners and tenant screening companies shouldn’t deny or recommend the denial of housing to applicants based on eviction proceedings where the tenant prevailed, a settlement was reached, or the matter was dropped.

In certain circumstances, you shouldn’t hold a past eviction against an applicant at all. For example, an eviction filed in retaliation for a tenant asserting their rights, such as the right to have conditions violations remedied, shouldn’t count against them. Likewise, an eviction that stems from an underlying experience of domestic violence, dating violence, sexual assault, or stalking, such as an eviction for related noise violations, shouldn’t be held against the survivor. In addition, reasonable accommodations to the screening policy may be necessary if the eviction was related to the applicant’s disability, such as an eviction for late payment of rent caused by the timing of an SSI or SSDI payment or by a medical emergency.

3. Follow the Applicable Screening Policy

You should strictly follow your site’s written screening policy, and records outside the scope of the policy shouldn’t be considered. If you use a tenant screening company, you should check whether the company is using the standards cited in the site’s screening policy and properly limiting its screening to conform to the policy. 

For example, misdemeanors or civil violations shouldn’t be considered under a policy of screening for felony convictions. In prior guidance, HUD has cautioned against overbroad criminal records screening policies because they’re likely to have an unjustified discriminatory effect: persons who have been involved with the criminal justice system are disproportionately individuals with disabilities and Black and Brown persons.

Overbroad criminal records screenings include those that don’t differentiate between offenses based on their nature, severity, or how long ago they occurred; those that consider records, such as arrest records, that didn’t result in a conviction; and those that don’t provide an opportunity for the applicant to provide evidence of rehabilitation or other mitigating factors.

As a best practice, tenant screening companies should ensure that screening reports, recommendations, grades, and algorithmic models differentiate between criminal records on these bases, such as by excluding records that are old or for offenses not directly relevant to tenancy, and owners should account for these considerations when formulating screening policies and reviewing applications.
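As one way to picture this, here’s an illustrative Python filter that stays within a written policy’s scope and differentiates convictions by offense type and age. The offense categories, lookback windows, and field names are assumptions made up for the sketch; they aren’t drawn from HUD guidance or any actual policy.

```python
from datetime import date
from typing import Optional

# Hypothetical written policy: only these offense types are ever considered,
# and each has its own lookback window in years.
POLICY_LOOKBACK_YEARS = {"violent_felony": 7, "property_felony": 5}

def is_disqualifying(record: dict, today: Optional[date] = None) -> bool:
    today = today or date.today()

    # Arrests and other records that didn't result in a conviction don't count.
    if record["disposition"] != "conviction":
        return False

    # Misdemeanors, civil violations, and anything else outside the written
    # policy's scope are never considered.
    lookback = POLICY_LOOKBACK_YEARS.get(record["offense_type"])
    if lookback is None:
        return False

    # Differentiate by how long ago the offense occurred.
    age_years = (today - record["date"]).days / 365.25
    return age_years <= lookback
```

Even when a filter like this flags a record, the applicant should still get the individualized chance to present rehabilitation or other mitigating evidence described under Best Practice 5.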

4. Be Transparent with Applicants

Transparency is important throughout the application process so that applicants know beforehand how they’ll be screened and afterwards why they were denied. Make sure your tenant screening policies are in writing, made public, and readily available to potential applicants. Before applying, potential applicants should be given a copy of the screening policies or the link to a website where they can find the policies.

Screening policies should contain enough detail for an applicant to tell whether they’re likely to qualify. Policies should specify what records will be considered, what incidents will be disqualifying, and how far back the screening will look. Providing this information early can reduce the number of unqualified applicants, saving owners and applicants time and expense.

Applicants should also be given information about how evidence of mitigating circumstances can be submitted and will be treated, how to request a reasonable accommodation for a disability, and how to contest an inaccurate, incomplete, or irrelevant record.

Denial letters should contain as much detail as possible about all reasons for the denial, including the specific standard(s) that the applicant didn’t meet and how they fell short. If an applicant fails multiple screening criteria, all of those criteria should be included in the denial letter. All records relied on should be attached, including any screening reports. Denial notifications should also instruct applicants on how to submit an appeal if a record is inaccurate, incomplete, or irrelevant; mitigating circumstances exist; or a reasonable accommodation for a disability is needed.

5. Allow Applicants to Challenge Negative Information

HUD says applicants should be given the opportunity to challenge any potentially disqualifying information. The applicant should be sent any information reviewed, such as their screening report, along with the precise standards at issue. This information should be provided in a way that makes it easy for the applicant to understand why they were denied.

The applicant should then be given the opportunity to dispute the accuracy or completeness of any negative information. For example, the tenant may be able to show that a record belongs to another person with a similar name or that the report omits a court decision in their favor.

The applicant should also get the chance to show that, even if a negative record is accurate, they will comply with their tenancy obligations. Applicants may demonstrate that any negative behavior is unlikely to recur by providing evidence of mitigating circumstances, such as participation in a rehabilitation or financial literacy program, a change in education or employment status, or a positive reference from a social service provider or another person who knows the applicant.

6. Test Models for Fair Housing Compliance

The FHA applies broadly, including to housing decisions made using machine learning and other forms of AI. So, ask whether your tenant screening company uses a complex model and investigate how “interpretable” the model is; you may find it difficult to be sure you’re complying with the FHA when the reasoning behind automated decisions isn’t transparent. If a highly complex model has a discriminatory effect, the model’s lack of transparency could make it hard to prove that a legally sufficient justification exists for the criteria used for a denial decision.

Screening companies that use machine learning or other forms of AI should program complex models following best practices for nondiscriminatory model design and with attention to aspects especially likely to pose fair housing concerns. And training the model on demographically representative data can help ensure that the model doesn’t erroneously learn that certain protected classes should be screened out at higher rates. Check whether your screening company performs ongoing monitoring for these issues, which is important to ensure that changes over time, such as demographic shifts, don’t cause a dataset to become unrepresentative or incomplete.
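One simple monitoring check an owner can ask a screening company about is the adverse impact ratio, sketched below in Python. It assumes aggregate approval outcomes by demographic group are available (for example, from fair housing testing), and the 80 percent “four-fifths” threshold is borrowed from employment law as a rule of thumb; a low ratio flags a policy for review, it doesn’t by itself prove an FHA violation.

```python
def selection_rates(outcomes: dict) -> dict:
    """outcomes maps group -> (approved, total_applicants)."""
    return {group: approved / total for group, (approved, total) in outcomes.items()}

def adverse_impact_ratios(outcomes: dict) -> dict:
    """Each group's approval rate relative to the highest-rate group."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

# Hypothetical aggregate outcomes: (applicants approved, total applicants).
outcomes = {"group_a": (80, 100), "group_b": (52, 100)}

for group, ratio in adverse_impact_ratios(outcomes).items():
    if ratio < 0.8:  # four-fifths rule of thumb: flag for investigation
        print(f"{group}: approval ratio {ratio:.2f} warrants review")
```

Run against these hypothetical numbers, the check flags group_b (ratio 0.65), which would prompt a closer look at the model’s criteria and data rather than an automatic conclusion of discrimination.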

 
