When It Comes to Housing, Your Data Can Be Used Against You

The algorithmic processes cities routinely use can be deeply biased, and they could prevent you from accessing affordable housing.

Graphic from the cover of Weapons of Math Destruction (Crown, 2016), by Cathy O’Neil

Big data can have a big impact, especially when it comes to fair housing. Housing application procedures, through which cities allot affordable housing, use many different metrics to rank applicants, but sometimes these criteria effectively exclude applicants of certain ethnicities, a pattern known in fair-housing law as disparate impact. The reliance on credit reports and ranking systems, along with the lack of laws regulating advertising, can not only affect applicants’ fair-housing eligibility but also skew their awareness of their options.

When it comes to housing applications, the components I worry about most are akin to credit reports or credit scores. Big companies like Equifax collect profiles and information on all Americans, then apply the FICO scoring system to build credit scores that function like moral badges of honor or shame in the eyes of companies and government offices. These scores don’t reflect work ethic or character; they reflect only whether you have paid your bills on time, regardless of the circumstances.

A good example of how this can go wrong is the way credit reports have been used in job applications. These reports have created a destructive feedback loop, whereby people who are perfectly capable of holding a job are deemed unfit because of something like a medical bill they didn’t have insurance for (perhaps because they didn’t have a job). This leads to a downward spiral: a bad credit score keeps them from getting the job, joblessness keeps them from paying their bills, and the score drops further, shutting them out of still more jobs.

The same thing can happen with housing. If housing providers use FICO scores to decide who qualifies, the scores can unfairly shut people out of the system, making affordable housing inaccessible and creating a spiral of its own. So even if planners and architects focus their efforts on increasing the affordable housing stock, the people who need housing most could be prevented from getting it.
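To make that dynamic concrete, here is a minimal sketch in Python of the spiral described above. The cutoff and penalty values are invented for illustration; they are not drawn from any real screening system.

    # A minimal sketch of the credit-score feedback loop described above.
    # FICO_CUTOFF and REJECTION_PENALTY are hypothetical values chosen
    # for illustration, not parameters of any real screening system.

    FICO_CUTOFF = 620        # assumed score threshold for approval
    REJECTION_PENALTY = 15   # assumed score erosion after each rejection

    def screen_applicant(score: int) -> bool:
        """A hard-cutoff screen: approve only if the score clears the bar."""
        return score >= FICO_CUTOFF

    def simulate_spiral(score: int, cycles: int = 5) -> None:
        """Show how each rejection makes the next one more likely."""
        for cycle in range(1, cycles + 1):
            if screen_applicant(score):
                print(f"cycle {cycle}: score {score} -> approved")
                return
            # No housing (or job) means bills go unpaid, so the score
            # erodes further before the next application.
            print(f"cycle {cycle}: score {score} -> rejected")
            score -= REJECTION_PENALTY

    simulate_spiral(600)
    # cycle 1: score 600 -> rejected
    # cycle 2: score 585 -> rejected
    # ... the gap to the cutoff only widens.

The point of the toy model is that the threshold itself generates the exclusion: an applicant who starts just below the bar never gets back above it.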

What makes it worse is that there are all sorts of other scoring systems, lists, and blacklists that have no appeal process whatsoever, aren’t made apparent to the consumer, and are subject to no regulation. If you are scored by any of these other systems, any data or attribute can be used against you, there’s no way for you to correct mistakes, and often you can’t even find out what the data is until it’s used against you. A lot of these are marketing-silo scoring systems. If you go to a website, the algorithm on the site will put you in a marketing silo. As you connect to the home page it’ll say, “Oh, let’s assess this person. Is this person a high-value customer or a low-value customer?” Then once it’s put you into a box, it will decide what version of the website you are going to see, which in turn affects the kinds of ads you will see.
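As a rough sketch of what such a silo assignment can look like, consider the following Python fragment. Every feature name, weight, and threshold here is hypothetical; real systems are proprietary and far more elaborate.

    # A hedged sketch of a marketing-silo scorer. All features,
    # weights, and thresholds are invented for illustration.

    def value_score(visitor: dict) -> float:
        """Crude 'customer value' estimate built from tracked attributes."""
        score = 0.0
        score += visitor.get("median_income_of_zip", 0) / 10_000
        score += 2.0 if visitor.get("premium_device") else 0.0
        score += 0.5 * visitor.get("past_purchases", 0)
        return score

    def assign_silo(visitor: dict) -> str:
        # The visitor never sees this decision, the data behind it,
        # or any appeal process; the silo simply picks the page variant.
        return "high_value" if value_score(visitor) > 8.0 else "low_value"

    PAGE_VARIANTS = {
        "high_value": "listings_premium.html",  # ads for better housing
        "low_value": "listings_budget.html",    # ads for worse housing
    }

    visitor = {"median_income_of_zip": 42_000, "premium_device": False,
               "past_purchases": 1}
    print(PAGE_VARIANTS[assign_silo(visitor)])  # -> listings_budget.html

Note that nothing in this logic names a protected class, yet a feature like ZIP-code income can correlate strongly with one, which is exactly how disparate impact can arise from a seemingly neutral score.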

If you go to a housing website, it could decide what kind of housing you are going to be eligible for. The way the companies always frame it is “Oh, we are going to connect people to the things they want.” But of course it runs in both directions: they connect you to what they want. What’s dangerous is that these types of website analytics are unregulated; there are no laws in place to prevent this from happening, and it’s not illegal to advertise whatever you want. However, if you send someone a bunch of advertisements for really good housing, and you send someone else a bunch of advertisements for really terrible housing, you’re not forcing their choice, but you are definitely informing their options.

Algorithms are almost entirely built in the context of efficiency, to maximize profit. That’s why we desperately need new regulation. We need antidiscrimination laws to be expanded. I would like to see people allowed to view their data, the way they get to see their credit reports, and to check whether that data is accurate. I would like there to be rules about fairness, such as whether you are allowed to take into account someone’s gender or race. (In the case of FICO scores, you are not.) In general, I would like people to be made aware of the moments when they are being scored, the way they are being scored, and how that data is being used.
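One modest form such a fairness rule could take is an audit that refuses to deploy any model whose inputs include protected attributes, the restriction the FICO model already operates under. The sketch below, with hypothetical field names, shows the idea and its limit.

    # A minimal sketch of a protected-attribute audit. Field names are
    # hypothetical; a real review would need to go much further.

    PROTECTED_ATTRIBUTES = {"race", "gender", "religion", "national_origin"}

    def audit_features(model_features: set) -> None:
        """Refuse to deploy a model whose inputs include protected attributes."""
        violations = PROTECTED_ATTRIBUTES & model_features
        if violations:
            raise ValueError(f"model uses protected attributes: {sorted(violations)}")

    # A FICO-style model passes this check by construction...
    audit_features({"payment_history", "credit_utilization", "account_age"})

    # ...but an unregulated marketing score might not.
    try:
        audit_features({"zip_code", "gender", "browsing_history"})
    except ValueError as err:
        print(err)  # model uses protected attributes: ['gender']

Even this check is only a floor: proxies like ZIP code can stand in for race, so excluding protected attributes by name does not by itself prevent disparate impact.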

Cathy O’Neil is the former director of the Lede Program for Data Journalism at the Columbia University Graduate School of Journalism’s Tow Center, and the author of the popular blog mathbabe.org. Her book Weapons of Math Destruction was nominated for the 2016 National Book Award for Nonfiction.
