
Fighting Bias, Block by Block

As U.S. cities grapple with the repercussions of decades of redlining and other discriminatory housing policies, MacArthur fellow Dr. Jennifer Eberhardt offers a road map for how online communities can spark meaningful dialogue between neighbors.

Story by Dr. Jennifer Eberhardt

EDITOR’S NOTE: The following is an excerpt from “BIASED: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do,” by Jennifer L. Eberhardt, Ph.D., published by Viking. In the book, Dr. Eberhardt, a 2014 MacArthur fellow, digs into the science of how racial bias embeds itself within our brains and our social institutions, and how those presumptions can be confronted and mitigated. In this section, she covers how the legacy of bias circumscribes where we live and how we interact with our neighbors. If you make a donation to Next City of $60 or more to support our urban affairs internship, we will send you a copy of “BIASED” as a thank-you gift.

The patterns of residential racial segregation that we live with today are a legacy of our nation’s not-so-distant past, when public institutions and private social forces conspired to keep white neighborhoods white by restricting where black people could live.

The federal government played a direct and deliberate role in creating segregated spaces: refusing to back mortgage loans in racially mixed neighborhoods, subsidizing private development of all-white suburbs, and restricting GI Bill housing benefits so that black military veterans could buy homes only in minority communities.

Those government practices were supported and reinforced by local laws and customs that allowed segregation’s reach to extend to schools, hospitals, hotels, restaurants, and parks.

The residue of those discriminatory practices lingers today, fueling stereotypes that seed the stigma attached to black people and black places. Research has shown the power of those stereotypes to shape one of the most fundamental decisions of our lives: where we make our homes. African Americans are more likely than any other group to live in segregated neighborhoods. That residential isolation persists across the social and economic spectrums, in cities large and small. And it is reinforced by prejudicial associations that are shocking to document.

More than half of whites say they would not move to an area that is more than 30 percent black, because they believe that the housing stock would not be well maintained and crime would be high. In fact, according to studies by sociologists Lincoln Quillian and Devah Pager, the more blacks there are in a community, the higher people imagine the crime rate to be — regardless of whether statistics bear that out. That correlates with fear and with bias. The “blacker” they believe their neighborhood to be, the more likely whites are to expect that they will be victimized by crime. The sheer presence or absence of blacks in a space is taken as a true indicator of danger, distorting safety perceptions and biasing people’s sense of risk.

Race affects people’s judgment of physical conditions as well. Sociologists Robert Sampson and Stephen Raudenbush have found that the more blacks there are in a neighborhood, the more disorder people see, even when their perceptions don’t correspond to measurable signs like graffiti, boarded-up houses, or garbage in the street. And black people are just as likely as whites to expect signs of disorder in heavily black neighborhoods.

That suggests a sort of implicit bias that has more to do with associations we’ve absorbed through history and culture than with explicit racial animus. But the outcome is tragic nonetheless. Decades of studies have shown that white avoidance of neighborhoods with more than a handful of blacks is a key driver of the kind of racial segregation that fosters inequality in every area of life, from school to employment to health care.

Does Surveillance Make Us Safer?

Today, in the 21st century, bias has found a new accessory in technology that magnifies what we see but leaves us ill-equipped to decipher its meaning.

Surveillance cameras have gone mainstream, guarding our front doors. Online social networks connect us to our neighbors. But the same tools that promise security and promote camaraderie can foster a sort of tunnel vision that distorts our sense of danger, heightens suspicion, and even puts the safety of others at risk. Residents can circulate photographs of “suspicious” strangers to neighbors and police with the touch of a button and without any evidence the person is doing anything wrong — a practice that can amplify and justify bias in the minds of locals primed by worries about the safety of their homes.

And the same technology that stigmatizes innocent targets is also being used to document and share disturbing displays of bias that might have been invisible before: black teens accosted and ejected by a white resident from a North Carolina community pool, Latino restaurant employees yelled at and threatened for speaking Spanish in New York City, an Asian-American woman whose Airbnb reservation was canceled because of her ethnicity. As the host explained in a text message, “One word says it all. Asian.”

We are building more advanced systems to screen some people out and welcome others in, but who gets put in which category? That’s the ultimate choice, one we make over and over every day. And while we expect technology to make us less fearful, it also encourages us to act more quickly on the fears — acknowledged or not — that we harbor. Sometimes its mere existence seems to prime us to confront the bogeyman we think we need protection from.

A white homeowner in Michigan who took off with his shotgun after a black teenager rang his doorbell hadn’t even bothered to check his security camera until after police were called. The boy had stopped to ask for directions to the local high school, but the white woman who opened the door was so shocked to find him there that she hollered for help. Her husband ran down the stairs, took one look at the boy, grabbed his gun, and fired a shot.

The boy wasn’t hit, but the homeowner was arrested. And the doorbell camera he’d installed for protection instead provided police with evidence against him, by documenting the encounter and backing the boy’s account that he’d done nothing more threatening than ask for directions. The couple saw the boy through the lens of a scary stereotype, and a camera in a doorbell can’t remedy that.

Research shows that fear can be a driver of bias, unleashing whatever primal stereotypes make a nervous fourteen-year-old morph into a dangerous man — if that is what your attitudes and experiences have primed you to see. The same fear response that’s supposed to keep us safe can activate bias in ways that stigmatize and threaten others. The technology we enlist to shut down threats from outside our homes may also shutter our humanity. In this case, the surveillance equipment captured the thinking that spurred that couple to overreact. “Why did these people choose my house?” the woman was reportedly heard saying. There were no “these people” on her porch that morning, just one lone lost boy.

After law enforcement officials examined the video, county sheriff Michael Bouchard publicly called out the homeowner for putting the boy’s life at risk. “The guy stepped out and fired a shotgun because somebody knocked on his door,” he told reporters. “If someone is running from your house and you chase them outside and shoot at them, you’re going to have criminal charges coming from us.”

In October 2018, six months after the homeowner was charged with “assault with intent to murder,” he was convicted by a jury of the lesser charge of felony assault and of using a firearm while committing a felony, which has a mandatory two-year prison sentence. It took the jury just three hours to reach a verdict.

From the outset, the sheriff had set the sort of boundaries that allowed justice to be served. He marshaled truth as a weapon against bias, offering clear direction on what is appropriate to do in such circumstances and what is not. He looked at the whole picture, not just at a snapshot gilded by fear and framed by preconceived notions.

That’s a fundamentally sound way of dealing with the interference of bias. But it doesn’t happen often enough. And some of the measures that are supposed to help us build bridges and blunt bias are turning out to aid and abet the unconscious ways that we discriminate.

Who’s That Next Door?

I made the forty-minute drive from Stanford’s lush green campus to the grimy congested core of San Francisco to meet with Sarah Leary, one of the founders (with CEO Nirav Tolia) of Nextdoor, an online social networking service that serves as a sort of giant chat room for individual neighborhoods. Tens of millions of people use it, across the country and around the world. Its mission statement conveys its high-minded goal: to provide a trusted platform where neighbors work together to build stronger, safer, happier communities. It sponsors a network of locally based neighborhood online bulletin boards so that neighbors can communicate more easily.

The business is headquartered in a tall, imposing structure, just behind Twitter’s main building on Market Street. When I reached the seventh floor and stepped off the elevator and into their office space, it looked just as I would imagine a tech start-up to look: cool and casual. There were no fancy offices with fancy desks. It felt transparent, as if I could see the entire operation from where I stood: a sea of white Formica desks laid out in a wide-open space. Even the founder’s desk was mixed in with the masses; there were no dividing walls. We met in a glass-enclosed conference room and sat in rolling plastic chairs. The egalitarian vibe of that physical space is what Nextdoor aims to offer online: a space where people can feel comfortable connecting with neighbors they’ve never met, whether they’re looking for a lost dog or a reliable babysitter, off-loading old furniture or sharing a garden’s bounty, warning neighbors about a coyote roaming the block or a stranger who seems out of sync with the prevailing demographic. It’s that last option that caused the trouble that brought me to Nextdoor’s conference room.

At that time, Nextdoor was working well in more than 185,000 neighborhoods in the United States and another 25,000 around the world. But its “crime and safety” category had become the problem child. There were too many posts with racist overtones, messages that labeled blacks and Latinos “suspicious” for walking down a street, sitting in a car, talking on a cell phone, knocking on a door. When an Oakland-area news outlet wrote about the problem, Sarah and her business partners were horrified by the stories that emerged. And they began hearing similar stories from their own users. Instead of bringing neighbors closer together, the platform exposed raw racial dynamics that generated hurt feelings, sparked hostilities, and fueled fierce online arguments.

The Nextdoor team began scouring the site for signs of racial profiling and digging through the research on how to deal with bias. The number of troubling posts they found was “minuscule” for a site that channels millions of messages every day, Sarah said. “But we were of the mind-set that even one of these is bad. There was a real kind of gut check and soul-searching experience for us.”

Her team pored over dense empirical articles, trying to break through the esoteric language and search for techniques that would preserve users’ freedom to flag danger when they see it but protect people from being unfairly targeted. “Most people weren’t consciously racial profiling,” Sarah said. “They couldn’t even agree on what it was. They just knew when they’d seen something that made them uncomfortable and compelled them, for safety’s sake, to share it. So they’re in this heightened state, they put out a message and they think they’re doing good by doing that.”

To Curb Profiling, Build Speed Bumps

Nextdoor needed to find a way to dial back the hair-trigger impulse that makes skin color alone arouse suspicion. Sarah’s team wanted to educate, not shame or alienate, users who’d stumbled into trouble with awkward or insensitive postings. She found possible solutions in studies showing that bias is most likely to surface in situations where we’re fearful and moving fast. I visited her office to share my expertise on the subject.

Speed is the holy grail of technology. Most tech products are created with the aim of reducing friction and guiding us through a process rapidly and intuitively. But the very thing that makes technology so convenient also makes it perilous where race and safety are concerned. The goal is to create an online experience for users that’s easy, quick, and fluid, allowing them to express themselves instantly. Yet these are exactly the kinds of conditions that lead us to rely on subconscious bias.

To curb racial profiling on the platform, they had to contemplate slowing people down. That meant adding steps to the process of posting about “suspicious people” but not making things so cumbersome that users dropped out. They needed something that would force people to look past the broad category of race and think about specific characteristics. So they developed a checklist of reminders that people have to click through before they can post under the banner of “suspicious person”:

  • Focus on behavior. What was the person doing that concerned you, and how does it relate to a possible crime?
  • Give a full description, including clothing, to distinguish between similar people. Consider unintended consequences if the description is so vague that an innocent person could be targeted.
  • Don’t assume criminality based on someone’s race or ethnicity. Racial profiling is expressly prohibited.

Research supports the notion that raising the issue of race and discrimination explicitly can lead people to be more open-minded and act more fairly, particularly when they have time to reflect on their choices.

The posting process was changed to require users to home in on behavior, pushing them past the “If you see something, say something” mindset and forcing them to think more critically: if you see something suspicious, say something specific.

Adding friction to the process slowed things down a bit, but it did not lead to the huge drop-off in users that industry observers had predicted. What it did do was reduce the incidence of racial profiling: Nextdoor’s tracking suggests it is down by more than 75 percent. They’ve even adapted the process for international use, with customized filters for European countries, based on their mix of ethnic, racial, and religious tensions.

Create a Space for Connection and Conversation

The approach offers benefits beyond reducing neighborhood animosity. That friction and the awareness it generates may make people more willing and better equipped to think and talk frankly about race. Conversations about racial issues in interracial spaces can be uncomfortable. It’s no wonder people tend to avoid them. Integration is hard work, and threat looms over the process. White people don’t want to have to worry that something they say will come out wrong and they’ll be accused of being racist. And minorities, on the other side of the divide, don’t want to have to wonder if they’re going to be insulted by some tone-deaf remark. The interaction required to move past stereotypes takes energy, commitment, and a willingness to let big uncomfortable issues intrude on intimate spaces — your home and your neighborhood.

Research shows that talking about racial issues with people of other races is particularly stressful for whites, who may feel they have to work harder to sidestep the minefields. Their physical signs of distress are measurable: heart rates go up, blood vessels constrict, and their bodies respond as if they were preparing for a threat. They demonstrate signs of cognitive depletion, struggling with simple things like word-recognition tasks.

Even thinking about talking about race can be emotionally demanding. In a study of how white people arranged the physical space when they knew they’d be in conversation with blacks, the arrangements varied based on the subject of those chats. When the study participants were told they’d be talking in small groups about love and relationships, they set the chairs close to one another. When they were told the topic was racial profiling, they put the chairs much farther apart.

Nextdoor can’t make the angst go away. But benefits accrue from nudging people to talk about race and consider the harm a thoughtless judgment can do. “What I have found is that this can be a personal journey,” Sarah said. “When you raise the issue with people, at first there might be a little bit of ‘Oh, come on.’ And then you explain, and you get ‘Oh yeah, that makes sense.’ I think right now most people believe ‘I can only screw this up, so maybe I shouldn’t have that conversation.’ But if people believed that having the conversations actually led to better understanding, they’d be more willing.”

She saw that happen in Oakland, when people came together to talk about their distress over racially biased posts. “I think people just get closed off, and they try to simplify the world with simple assumptions to get through their day,” she said. “But there’s a whole canopy of examples of people’s lives that are maybe more similar to yours than you assume. When you have direct connections with people who are different from you, then you develop an ability to recognize that.” So the scary black teenager in the hoodie in the dark turns out to be Jake from down the block, walking home from swim team practice.

The beauty of Nextdoor’s template is that it catches people before they’ve done anything wrong. “We try and be very mindful of going through the process of assuming good intent,” Sarah explained. “I think where it actually gets embarrassing for people is when they had good intentions and they put something out there, and they thought they were helping the neighborhood, and someone comes back and is like, ‘You’re a racist.’”

The tool gets users to stop and think before they post something that will land them in heated arguments with neighbors. Because once the comment is out there, it’s hard to dial things back. It’s not a productive conversation if one person is outraged over being labeled a racist and the other is feeling aggrieved about always having to be the person waving the flag and saying, “Do you realize what you just did?” When there’s more thoughtfulness and less defensiveness, honest conversations about race are possible.

Ultimately, we see our neighborhoods as an extension of our homes. And home is the place where you let your guard down; where you expect to feel loved, safe, and comfortable. But living with diversity means getting comfortable with people who might not always think like you, people who don’t have the same experience or perspectives. That process can be challenging. But it might also be an opportunity to expand your horizons and examine your own buried bias.

From “BIASED,” by Jennifer L. Eberhardt, Ph.D., published by Viking, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2019 by Jennifer L. Eberhardt.

Dr. Jennifer Eberhardt is a professor of psychology at Stanford and a recipient of a 2014 MacArthur “genius” grant. She has been elected to the National Academy of Sciences and the American Academy of Arts and Sciences, and was named one of Foreign Policy’s 100 Leading Global Thinkers. She is co-founder and co-director of SPARQ (Social Psychological Answers to Real-World Questions), a Stanford center that brings together researchers and practitioners to address significant social problems.
