How the Supreme Court rules on a Dallas-based case it’s now considering — Texas Department of Housing and Community Affairs v. Inclusive Communities Project — will serve as a guidepost for how the U.S. interprets civil rights-era legislation moving forward. The case challenges the interpretation of the Fair Housing Act as a tool meant to broadly protect those who are often targets of housing discrimination at a time when, as the New York Times points out in this op-ed, “persistent housing segregation remains a fact of life for many blacks.”
Big data will be watching. As I wrote in “Big Data Has Potential to Both Hurt and Help Disadvantaged Communities,” information gathered by data brokers increasingly influences the composition of low-income neighborhoods through algorithmically driven practices like predatory lending, targeted marketing and real estate development.
“In my mind, [these practices are not] the type of rampant racism in housing markets during the time when redlining was part of the assessment of property values in the United States,” says Seeta Gangadharan, a research fellow at the New America Foundation’s Open Technology Institute. “In that time, you might have had a very explicit conversation about how to keep neighborhoods white. It’s really unlikely that someone is saying, ‘Let’s program this algorithm to be racist.’ It’s much more complicated than that.”
Data brokers intending only to “find out how to best serve their customers” — to use industry parlance — might sell geotagged information that could inadvertently reinforce intransigent residential patterns historically determined by segregation. As this map of Cleveland shows, there is an unfortunate, but unsurprising, correlation between the neighborhoods where African-Americans were steered in the early 20th century and Cleveland’s 21st-century hotspots for health problems and predatory lending.
While Gangadharan says the Federal Trade Commission has sparked a helpful conversation about big data and discrimination in the financial industry through its enforcement of the Equal Credit Opportunity Act and the Fair Credit Reporting Act, strides remain to be made in housing, civil liberties and privacy circles.
“[In] the United States, place is race,” says Gangadharan. “We’ve known this forever, since the founding of this country.” Certain populations have always been associated with particular blocks, neighborhoods or building types. But in 2015, the neighborhoods where people end up renting or buying, their upward social mobility, and their consumption patterns are partially determined by behaviors like how they use their phones or interact on apps and social media.
“You might be targeted in a particular area based on aggregate data about your neighborhood, even though you, yourself, might be different than the aggregate. That might be economically consequential,” Gangadharan explains.
While these types of discussions are opening up in law, tech, policy and academic circles, Gangadharan points to grassroots activism having a similar and perhaps more profound effect on revealing housing discrimination: “A lot of the black and youth movements that came out of social justice concerns over the events in Ferguson and New York are also connected to these conversations about residential segregation and the broad reassessment of civil rights.”
The big data industry isn’t all doom. There are examples of organizations and companies using data mining to have a social impact. Gangadharan points to the example of black-owned businesses being able to target their online ads on Facebook. “Big Data Could Help Some of the 200,000 NYC Households That Get Eviction Notices This Year” tells the story of one organization using number-crunching to find families at risk of becoming homeless.
“We’re at such a nascent stage with respect to designing and implementing fairer algorithmically driven systems,” says Gangadharan. “It is my hope that some companies will realize very early on, well before public outrage foments, that it is interesting and feasible to design products and services such that we have a process of integrity with regards to decisions and outcomes. Some companies will take some leadership in thinking about some of the technical solutions that might amplify civil rights rather than hinder them.”
The Court is expected to rule on the Dallas case this summer.
The Equity Factor is made possible with the support of the Surdna Foundation.
Alexis Stephens was Next City’s 2014-2015 equitable cities fellow. She’s written about housing, pop culture, global music subcultures, and more for publications like Shelterforce, Rolling Stone, SPIN, and MTV Iggy. She has a B.A. in urban studies from Barnard College and an M.S. in historic preservation from the University of Pennsylvania.