The Equity Factor

Big Data Has Potential to Both Hurt and Help Disadvantaged Communities

Two new reports examine big data’s disparate impact on low-income communities.



In the future, nearly every aspect of daily urban life might be tracked and translated into data points. Local governments and companies that collect this kind of information are already testing potential uses. According to some watchdogs, without a holistic look at how data collection and algorithms affect different communities, these systems could reinforce already rigid structural barriers to economic and physical mobility.

The Chicago Police Department’s experiments with predictive policing have raised worries about unfair profiling in black communities. Boston has tested situational awareness software, which uses mass surveillance and facial recognition technology as a safety measure for large-scale assemblies, with search queries capturing skin tone on a scale of 1 to 100. Data might also be used in a sort of 21st-century redlining, with banks and health care companies using informational leverage to deny service to people living in low-income communities.

“A big part of what’s happening across society today is that major institutions are increasingly using computers to make the decisions that shape people’s lives. These processes are often opaque,” says David Robinson of Robinson + Yu, a firm that lends technical expertise to social justice advocates working on big data issues. The firm recently released a report, “Civil Rights, Big Data and Our Algorithmic Future,” that points to the possible upsides and pitfalls of this information-based future. “People need to feel a sense of empowerment around these algorithmic processes. I think there’s a real cultural tendency to defer to decisions that come from a computer and to feel like if an algorithm has rendered some decision then it must be fair or we can’t understand it or it shouldn’t be scrutinized.”

Intentional discriminatory practices are one thing, but uncovering unintentional discrimination falls into murky, uncharted legal territory, according to Solon Barocas of Princeton’s Center for Information Technology Policy. A recent report that he co-authored studied the disparate impact of big data on vulnerable communities. “We need to be extremely sensitive to the very subtle way that things can produce a disparate impact,” says Barocas, “and having that sensitivity means knowing about the data that you’re working with.”

Barocas’ report cites Boston’s Street Bump as an example. The widely acclaimed app detects when smartphone users drive over potholes and reports their locations to the city. Inventive as it is, the app might unintentionally underserve the infrastructural needs of poorer communities, because smartphone ownership varies across Boston’s populations.
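To see how that skew arises, here is a back-of-the-envelope sketch. The neighborhoods, pothole counts and ownership rates are invented for illustration; none of the figures come from Street Bump or the city of Boston. If repair crews are dispatched by report volume, equally damaged streets send unequal signals:

    # Hypothetical figures: two neighborhoods with identical road damage but
    # different rates of smartphone ownership among drivers.
    NEIGHBORHOODS = {
        "higher-income": {"potholes": 100, "smartphone_rate": 0.80},
        "lower-income": {"potholes": 100, "smartphone_rate": 0.35},
    }

    TRIPS_PER_POTHOLE = 20  # assume each pothole is driven over 20 times

    for name, s in NEIGHBORHOODS.items():
        # A bump becomes a data point only when the passing driver's phone
        # is running the app, so expected reports scale with ownership.
        reports = s["potholes"] * TRIPS_PER_POTHOLE * s["smartphone_rate"]
        print(f"{name}: {s['potholes']} potholes -> ~{reports:.0f} reports")

Same damage, less than half the reports: a city that ranked streets by raw report counts would systematically deprioritize the neighborhood where fewer drivers carry smartphones.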

“Historically disadvantaged communities tend to be simultaneously over-surveilled — if you are a part of the welfare system, you have a lot of information being collected by the state at all times — and severely underrepresented, because you might not be an attractive consumer,” says Barocas. Credit scores are a popular example of how people outside of the formal economy have a hard time registering enough information to qualify for loans, but new alternative metrics come with their own dangers.

The questions that data miners ask, and the way the results are categorized, matter enormously. Barocas brings up an anecdote about Evolv, a San Francisco startup that develops hiring models. In searching for predictors of employee retention, the company found that employees who live farther from call centers were more likely to quit. But because that finding could also be unintentionally linked to race, Evolv declined to use the information, as a precaution against violating equal opportunity laws.
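One way analysts catch such a proxy is the four-fifths (80 percent) rule, the rough screen for disparate impact used in U.S. equal employment guidance. The sketch below applies it to invented data; the commute-based screening rule, the commute times and the group labels are hypothetical, not Evolv’s actual model:

    # Four-fifths rule sketch on made-up applicant data. Suppose a model
    # screens out applicants whose commutes exceed 30 minutes, and commute
    # length happens to track a protected attribute because of residential
    # housing patterns.
    candidates = [
        # (in_protected_group, commute_minutes)
        (True, 45), (True, 50), (True, 25), (True, 40),
        (False, 15), (False, 20), (False, 35), (False, 10),
    ]

    def selection_rate(group):
        # The hypothetical model "selects" anyone with a short commute.
        return len([c for c in group if c[1] <= 30]) / len(group)

    protected = [c for c in candidates if c[0]]
    others = [c for c in candidates if not c[0]]

    ratio = selection_rate(protected) / selection_rate(others)
    print(f"selection-rate ratio: {ratio:.2f}")
    if ratio < 0.8:
        print("Below 0.8: the commute rule likely has a disparate impact.")

On these numbers the ratio is 0.33, well under the 0.8 threshold, even though the model never looked at the protected attribute directly. That is exactly the subtlety Barocas describes: the discrimination lives in the data, not in any stated intent.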

“You can use data mining to do something completely different,” Barocas points out. “You could ask ‘If I adjust workplace policies or workplace conditions, might I be able to recruit or retain different people?’” Rather than blindly using data that might unintentionally discriminate, employers can intentionally reverse prior hiring practices that have adversely affected job candidates based on their race, ethnicity, gender and income.

“I think that the lesson of history is that when powerful institutions are designing processes, or when markets are creating new practices in terms of decision-making,” comments Robinson, “we need to check and make sure that they are done in a way that is consistent with human rights and that we shouldn’t be just assuming that they are.”

The future doesn’t have to feel like “Minority Report.” Armed with expertise on technical issues, civil rights groups and social justice organizations can play an advisory role to companies and government institutions wielding these large data sets. Barocas imagines a future in which big data is one of the best tools for exposing persistent discrimination.

“What might be interesting is that this really technical thing might be a way of showcasing how it’s actually impossible to avoid having a frank discussion about the acceptability of inequality in society,” he says. “A lot of these techniques will expose the extent of inequality and actually exacerbate it. In a way, it’s an opportunity to have a conversation that many civil rights organizations have always wanted to have, which is that this is not just a matter of conscious prejudice, but actually structural inequality and structural prejudice.”

The Equity Factor is made possible with the support of the Surdna Foundation.


Alexis Stephens was Next City’s 2014-2015 equitable cities fellow. She’s written about housing, pop culture, global music subcultures, and more for publications like Shelterforce, Rolling Stone, SPIN, and MTV Iggy. She has a B.A. in urban studies from Barnard College and an M.S. in historic preservation from the University of Pennsylvania.


Tags: income inequality, big data
