Nextdoor Is Cracking Down on Racist Posts—But Can It Curb Unconscious Racial Bias?

A demo neighborhood map on Nextdoor's website. Nextdoor/Courtesy

Updated | Last March, several residents of Adams Point, a neighborhood in central Oakland, took to Nextdoor, a local social networking platform, to complain about a “very nice African American young boy” who would not pick up his pit bull’s waste. The replies soon devolved into neighbors urging the posters to report the boy to the police and to his school.

One commenter uploaded a photo of the boy and said even if the city fined his family, they wouldn’t be able to pay because “they don't have that kind of money.”

The racial profiling by the Adams Point residents was not the exception but the norm on Oakland’s Nextdoor, according to a cover story in Oakland’s alt-weekly East Bay Express last October. The story detailed multiple accounts of black Oaklanders being profiled, harassed and reported to police by paranoid residents, who found refuge in building a quasi-neighborhood watch on Nextdoor—a platform initially created for neighbors to trade tips, barter goods and build community.

Since the publication of the article seven months ago, Nextdoor says it has been meeting with Oakland city officials, police officers and neighborhood activists to stop racial profiling on its forums. On Tuesday, its CEO, Nirav Tolia, appeared before the Oakland City Council to present the company’s new tools, including algorithms that flag racially charged words, to crack down on its users’ racism.

Tolia tells Newsweek the three tools—the algorithms, an awareness education program and a new streamlined flagging form—went into effect last month and are so far available only in Bay Area communities. He says activists and city officials in Oakland reached out to Nextdoor regarding racial profiling, and that the company has also received feedback from many other communities. Nextdoor is used across 99,000 neighborhoods nationwide.

The algorithms work with a simple word-detecting program that prohibits slurs and asks users to elaborate with more detail if a generic racial descriptor—like black or Asian—is used. “If you post something that says there is a Latino guy breaking into a house, that does not help anybody catch that criminal,” Tolia says.

While Tolia concedes that he and Nextdoor can’t eliminate users’ racial bias for good, he notes that there is no safe space for them to spew it publicly on his platform. “We are not a platform for free speech,” Tolia tells Newsweek. “We are a platform to make a happier neighborhood.”

Unconscious racial bias has been an uncomfortable pressure point for tech companies in the diverse and politically liberal Bay Area. In April, Uber rejected its drivers’ demands for an in-app tipping option, citing academic studies finding that unconscious racial bias leads customers to favor white service providers over providers of color.

Tolia, who is Indian-American, says that while other social networks are eliminating “friction” within their apps to make it easier for users to say whatever they want to whomever they want, Nextdoor aims to educate by putting up virtual yard signs—“decision points,” as Tolia calls them—to remind users to be mindful of what they say.

“Unconscious bias is not a Nextdoor-created problem,” Tolia says. “It’s a part of society today. We don’t live in a post-racial world. There’s a sensitivity that is required.”

So far, the effort has won the approval of Oakland city officials and of neighbors affected by the racial profiling on Nextdoor. “It will allow people to ride bikes through a neighborhood without feeling like someone is thinking they’re up to something,” Shikira Porter of Neighbors for Racial Justice, an Oakland community group organized to fight racial profiling, tells the San Francisco Chronicle.

A previous version of the article incorrectly said Oakland was the only city to give Nextdoor feedback on racial profiling. Nextdoor in fact collaborated with many other cities.