Lost cats, petty crimes and controversial developments.
Those are the most common topics of discussion for residents of one Los Angeles community on Nextdoor, a social network for neighborhoods that was recently valued at $1.1 billion.
The next most popular talking points for the L.A. neighborhood are a bit thornier: race and sexual orientation.
Racial profiling is apparently a fairly common phenomenon on Nextdoor, and recent media reports have raised questions about whether the social network is doing enough to discourage discriminatory behavior in its neighborhood forums.
“Rather than bridging gaps between neighbors, Nextdoor can become a forum for paranoid racialism,” writes Fusion’s Pendarvis Harshaw.
But by facilitating discussions on discrimination, Nextdoor can also foster constructive dialogue that might actually chip away at stereotypes.
Harshaw cited a case where one Nextdoor member posted a warning about a man in a “white hoodie” and “a thin, youngish African-American guy wearing a black beanie, white t-shirt with dark, opened, button-down shirt over it, dark pants, tan shoes, gold chain.”
As it turned out, the “sketchy men” were just another resident’s friends. That resident had given the men the wrong directions to a party she was hosting.
Nextdoor takes a number of measures to discourage discriminatory remarks on its network.
The social network’s member guidelines ask users to “refrain from using profanity or posting messages that will be perceived as discriminatory.”
Users can flag inappropriate comments for review by a moderator. If a comment is deemed to have violated Nextdoor’s member guidelines, then the person who posted it may have their account suspended.
The requirement that members register under their real name and address serves as another safeguard against bigoted conduct, Nextdoor told Harshaw: most people don’t want to be perceived as racist by their neighbors.
What Nextdoor does not do to discourage bigotry, however, is actively search out and remove discriminatory comments.
That stands in stark contrast to how StreetAdvisor, another neighborhood website, handles the issue. The platform’s automated keyword filter doesn’t allow users to publish posts containing sensitive words such as “African American” and “homosexual,” among others.
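In broad strokes, a filter like the one StreetAdvisor is described as using amounts to a case-insensitive keyword blocklist checked before a post is published. The sketch below is purely illustrative — the term list, function name, and matching logic are assumptions, not StreetAdvisor’s actual code:

```python
import re

# Hypothetical blocklist; the article names "African American" and
# "homosexual" as examples of terms the filter catches.
BLOCKED_TERMS = {"african american", "homosexual"}

def is_publishable(post: str) -> bool:
    """Return False if the post contains any blocked term,
    matched case-insensitively on word boundaries."""
    text = post.lower()
    return not any(
        re.search(r"\b" + re.escape(term) + r"\b", text)
        for term in BLOCKED_TERMS
    )

print(is_publishable("Lost cat near the park"))       # True: publishable
print(is_publishable("an African American man"))      # False: blocked
```

A real system would likely be more nuanced — handling word variants, context, and appeals — but the basic trade-off is visible even here: a blunt blocklist suppresses offensive posts and innocuous ones alike.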
Nextdoor stops short of proactively censoring discussion to avoid diluting the authenticity of its communities, the social network told PandoDaily’s Michael Carney.
Whether that hands-off approach ultimately benefits society is an open question.
If racially charged statements on Nextdoor spark debate, that might cause more bigots to question their beliefs, potentially fostering tolerance. But if racist comments go largely unquestioned, that could reinforce racism in a community.
“Let’s hope these semipublic, semiprivate conversations lead to diverse communities better understanding each other rather than Nextdoor, and similar services, simply becoming yet another place to safely air long-held racial assumptions,” writes Harshaw.
Nextdoor is hardly the only real estate technology company to be accused of inadvertently facilitating discriminatory behavior.
Critics say websites that bake data such as racial composition or crime ratings into the real estate search experience can magnify buyers’ tendency to sort themselves into communities along demographic lines.
The National Fair Housing Alliance has been investigating whether real estate search websites’ and apps’ use of demographic data may violate fair housing laws.