Tenant screening practices are being put under a microscope

Tenant screening software is no panacea. It has invited a rash of lawsuits alleging inaccurate information or discriminatory practices. For all of the marvels of technology, there is no substitute for old-fashioned personal sleuthing.

Whoever said that we live in a litigious society must have had rental housing providers in mind.

In recent memory, Bornstein Law has seen a spate of lawsuits targeting landlords. It began with tenants and their attorneys claiming wrongful eviction, with six-figure payouts not uncommon when successful.

We saw enterprising attorneys attempting to "shake down" rental housing providers who summarily state that Section 8 vouchers are not accepted, a refusal that flies in the face of fair housing laws, as we explained in an earlier video.


Tenants' attorneys became more inventive, suing real estate companies whose websites did not accommodate disabled persons who require additional tools to access the content.

We've seen massive lawsuits arise when a tenant pays rent to live in an unwarranted unit, the rental relationship sours, and the disgruntled renter seeks retribution for having been placed in a unit that does not have the city's blessing, even if the unit is in pristine condition and the tenant lived there for years on end.

And, of course, there are always errant instances when the property owner effectuates an owner move-in eviction (OMI) or relative move-in eviction (RMI) without staying in the unit for the required period of time or otherwise fails to comply with the law. This has always been grounds for litigation, but we are starting to see much more vitriol directed at these types of evictions when they are done in bad faith and with an ulterior motive.

The list could go on, but we want to single out another type of lawsuit that is emerging: claims arising from reliance on automated tenant screening software that may feed landlords faulty information about rental applicants and, most importantly, from the inartful communication many landlords and property managers use to deny a tenancy.

As NBC reported, a Navy man with top-secret clearance returned home after being deployed in South Korea, but an algorithm-based screening process falsely identified him as a Mexican drug trafficker. He sued RentGrow and CoreLogic for this stain on his record.

With the decline of homeownership and a surge in apartment vacancies, tenant screening has become a big business. The industry has been largely unregulated up until now, but it is starting to feel some pressure. One case that has been on our radar is Connecticut Fair Housing Center et al. v. CoreLogic Rental Property Solutions, LLC.

In a landmark civil rights decision, the Connecticut federal District Court has clamped down on tenant screening practices, ruling that the standards of the Fair Housing Act apply to consumer reporting agencies.

This lawsuit arose when a property management company disallowed a disabled Latino man from moving in with his mother, who had been appointed his conservator. The property manager relied on data from CoreLogic's "CrimSAFE" background check, which flagged a shoplifting charge that had been dropped, resulting in a recommendation to reject the application.

When the mother probed into why she could not welcome her son into the home, she was unable to access the underlying background report that was used to deny the tenancy. As a result, her son had to remain in a nursing home for more than a year because of a minor infraction that was alleged long ago and never prosecuted.

The following statement comes from legal representatives for plaintiffs Carmen Arroyo and the Connecticut Fair Housing Center: Christine E. Webber, Partner at Cohen Milstein Sellers & Toll; Shamus Roller, Executive Director of the National Housing Law Project; and Erin Kemple, Executive Director at the Connecticut Fair Housing Center:

Tenant-screening technologies often lack a sufficient review process to ensure fair housing standards, yet are increasingly common as landlords look to third-party companies to evaluate and streamline their rental application process. CoreLogic RPS’s analysis and ultimate denial of Ms. Arroyo’s son’s rental application illustrates how the algorithms these technologies use are discriminatory. We look forward to proving our case in court.

Amid the systemic racism that exists in our criminal justice system, where people of color are disproportionately burdened by the law, tenant-screening algorithms that deny applicants based on past criminal records without any limitation or individualized consideration are unjust in deciding a person’s right to housing. This is a significant limitation of algorithmic technologies that frequently discriminate against Black and Latino applicants.

We hope the housing industry is paying attention to this trial, as it will set an important precedent for tenant-screening technologies moving forward.

Lessons to be learned

Rental housing providers already have a target on their backs. As we emerge on the other side of COVID, our community does not need another problem that is easily avoidable. Computer-generated notations that a rental application is "disqualified" with no explanation are not acceptable.

We have to take into account the individual circumstances, the severity of the offense, and other factors without painting with a broad brush and imposing a blanket ban on all rental applicants with a checkered past. CoreLogic, which has since divested its tenant screening business, argued that someone who has been arrested once is more likely than others to be arrested again.

This argument struck a nerve with the court, which pointed to the long-lasting impact of stigmatizing anyone who has been arrested and noted that exclusionary policies of this kind, which turn down rental applicants with a criminal history, have had a disproportionate and arbitrary effect on racial minorities.

Less communication is more. We advise those renting residential property not to make categorical statements and not to express any preference for a certain type of tenant or disfavor for another group. The denial of a tenancy can reasonably be explained by saying that you found a more suitable renter. Period. Without any bias.

We've said many times and in many ways that technology should not be used as a crutch, and that asking for references and doing your own due diligence, rather than relying solely on algorithms, is key when evaluating a tenancy.

Bornstein Law has not been a Johnny-come-lately on this subject, and we invite you to read on in the links below. As a parting thought, for those of you who are property managers, we strongly recommend that you train all employees in fair housing laws and make it a core objective so that no one in your organization slips up and exposes you or your clients to liability.

RELATED FROM OUR BLOG

‘Second Chance’ laws raise weighty moral and legal issues

Criminal Background Checks in Tenant Screening

Tenant Screening Hampered by Ease of Concealment

Law Poised to Invalidate Tenant Screening Reports & Technology

Tenants Concealing a Checkered Past