It's 'Our Fault': Nextdoor CEO Takes Blame For Deletion Of Black Lives Matter Posts

Jul 1, 2020
Originally published on July 2, 2020 9:43 pm

As protests swept the nation following the police killing of George Floyd, there was a surge of reports that Nextdoor, the hyperlocal social media app, was censoring posts about Black Lives Matter and racial injustice.

In an interview with NPR, Nextdoor CEO Sarah Friar said the company should have moved more quickly to protect posts related to Black Lives Matter by providing clearer guidance.

It "was really our fault" that moderators on forums across the country were deleting those posts, she said.

People of color have long accused Nextdoor, which serves as a community bulletin board in more than 265,000 neighborhoods across the U.S., of doing nothing about users' racist comments and complaints. But Nextdoor came under especially heavy criticism in May after the company voiced public support for the Black Lives Matter movement.

Unpaid volunteers, known as leads, moderate posts on Nextdoor. Friar said they were deleting posts about Black Lives Matter because they were following outdated rules stating that national conversations have no place in neighborhood forums. Those guidelines have now been revised to state that conversations about racial inequality and Black Lives Matter are allowed on Nextdoor.

"We did not move quickly enough to tell our leads that topics like Black Lives Matter were local in terms of their relevance," Friar said. "A lot of our leads viewed Black Lives Matter as a national issue that was happening. And so, they removed that content, thinking it was consistent with our guidelines."

She added that the new rules make one thing clear: "Black Lives Matter is a local topic."

Friar said that Nextdoor is taking several more steps to improve the moderation of comments. It will soon offer unconscious bias training to all moderators. It will also launch a campaign to enlist more Black moderators. And it is ramping up efforts to detect and remove instances of racial profiling.

Apologizing, then asking for help from Black users

Neighbors take to Nextdoor to search for a local plumber, find a babysitter or sell a piece of furniture. But the app also has gained notoriety for spreading panicked messages that carry racist overtones.

In recent weeks, as the national conversation has centered on racial injustice, Black users have shared their stories of abandoning Nextdoor. One person wrote on Twitter that they stopped using it after reading repeated complaints about "large groups of black teens walking in their neighborhood." Another tweeted that their neighbors would write messages such as "Saw a black youth hanging out next door. Calling the cops."

Mayisha Fruge, 42, a Black mother of two in San Diego, Calif., who is active on Nextdoor, said those kinds of posts sound familiar.

About 90% of her neighbors come across as good, decent people on the app, she said.

"That other 10 percent? They must be hiding behind the computer. I never would have thought that my neighborhood had those types of people, racist people in it," she told NPR.

In one post, a neighbor was suspicious about a Black person who was simply taking a stroll. Another asked: Do the Black Lives Matter protesters have jobs?

"I said, what does this have to do with equality and justice?" Fruge said.

Friar has apologized to Black users who have said they do not feel welcomed or respected on the app, vowing that racism has no place on Nextdoor.

She also announced that Nextdoor was severing a tie to law enforcement by ending a "forward to police" feature that allowed users to report observed activity to authorities.

But Friar told NPR that Nextdoor's efforts to combat racism on the app will go even further.

Nextdoor has enlisted Stanford University psychology professor Jennifer Eberhardt to help slow down the pace of commenting to tamp down on racial profiling, and it's working with her to make unconscious bias training available to its hundreds of thousands of moderators.

It is a change that some Nextdoor users have demanded. In an online petition, they criticized the app's "murky" guidelines for content moderation, which users said led to abuse and the silencing of Black voices.

In response to Nextdoor's commitments, the Atlanta-based group Neighbors for More Neighbors, which helped organize the petition, applauded the news but remained cautious.

"This is a positive step towards creating a true community forum where all people in our neighborhoods feel safe to participate," said activist Andrea Cervone with the group. "We will be keeping an eye on the company to make sure they continue forward and fulfill these public commitments."

In northwest Indiana, Jennifer Jackson-Outlaw had a lukewarm reception to the company's announcements. Jackson-Outlaw, a Black woman who became fed up with Nextdoor and deleted the app, said Nextdoor's mostly white executive suite needs a shake-up in order to effect real cultural change at the company.

"It's important to not only have representation as far as those who are the moderator, but also those who are in the leadership of the company who may be more be well-versed on some of the issues," she said.

At Nextdoor, Friar has kicked off an effort to recruit more Black leads. This includes inviting especially active Black users to become moderators and starting outreach campaigns to encourage Black users to join the app.

"We recognize that is an underrepresented group on Nextdoor," Friar said of Black users. "There are others of course, but we want to start there because we really feel that the Black Lives Matter movement is so critical and important right now just to the health of our country."

Friar described Nextdoor's content moderation as "a layered cake," saying it involves local moderators, artificial intelligence tools and the company's human reviewers.

She said that the app's AI programs are being fine-tuned to better detect both explicit racism and posts that engage in racial profiling, or what she called "coded racist content." Nextdoor is also dedicating more staff to ferreting out racist content on the app.

"We're really working hard to make sure racist statements don't end up in the main news feed, making sure that users that don't act out the guidelines aren't on the platform anymore," Friar said. "It is our No. 1 priority at the company to make sure Nextdoor is not a platform where racism survives."

Confronting the 'Karen problem'

Though anecdotal evidence suggests Nextdoor's user base is largely white, Friar said the company has no internal metrics about the race of its users.

The app does not ask about race when users sign up, a decision that Friar said may soon change as the company examines how best to hold itself accountable in its push to diversify the platform.

"We are debating that," she said. "Because if we want to measure our success of being a diverse platform, perhaps that's something we do need to ask."

Critics of Nextdoor, including U.S. Rep. Alexandria Ocasio-Cortez, D-N.Y., have drawn attention to the app's so-called Karen problem. It's a term that has come to describe a middle-aged, privileged white woman with racist habits, whether overt or subtle.

When asked if Nextdoor has a Karen problem, Friar deflected by saying any intolerance or racism on the app is a snapshot of issues plaguing the entire country, not problems confined to the neighborhood platform.

"Does the U.S. have that problem? Yes, it's out there," Friar said. "But I think we're working as hard as we can to make sure neighbors are doing right by each other, that they're being civil, being respectful and that they're not falling back to calling each other names but rather trying to deeply understand."

Copyright 2020 NPR. To see more, visit https://www.npr.org.

MARY LOUISE KELLY, HOST:

Nextdoor - it's the popular social media app where you report your missing dog or look for a plumber or sell a piece of furniture to your neighbor. But it also has a troubling side. For years, Black users have complained the app is used for racial profiling. More recently, the app has been criticized for censoring posts related to Black Lives Matter. NPR's Bobby Allyn takes a look at the company's plan to make the platform more welcoming to Black users.

BOBBY ALLYN, BYLINE: Ask Myesha Fruzier (ph) how often she sees racist comments on Nextdoor, and she'll tell you...

MYESHA FRUZIER: You know, it was so many I can't even count.

ALLYN: Fruzier's a Black mother of two in what she described as a mostly white, conservative-leaning community in San Diego. Like many, she's been on Nextdoor a lot during the pandemic, and she says about 90% of her neighbors come across as good, decent people.

FRUZIER: That other 10% - they must be hiding behind the computer. But I never would've thought that my neighborhood had those kind of people - racist people in it.

ALLYN: In one post, a neighbor was suspicious about a Black person who was going on a walk. Another asked, do Black Lives Matter protesters have jobs?

FRUZIER: I said, what does this have to do with equality and justice?

ALLYN: It was against this backdrop that corporate Nextdoor publicly pledged support for Black Lives Matter. At the same time, many neighborhood discussions about the movement, like Fruzier's, were being deleted. Volunteer moderators were taking them down. Some said they were just following Nextdoor's community guidelines on keeping national topics out of local threads. Nextdoor CEO Sarah Friar told NPR she's the one who's responsible for those actions by moderators. Nextdoor calls them leads.

SARAH FRIAR: Really our fault. We did not move quickly enough to tell our leads that topics like Black Lives Matter were local in terms of their relevance.

ALLYN: Nextdoor has since updated its guidance to its quarter of a million volunteer moderators and now says, yes, Black Lives Matter content is relevant in a hyperlocal neighborhood app. Friar is also announcing other steps. All moderators will soon be provided unconscious bias training, and Nextdoor is launching a campaign to recruit more Black users to be moderators.

FRIAR: That is an underrepresented group on Nextdoor. And there are others, of course, but we want to start there because we really feel that the Black Lives Matter movement is so critical and important right now to just the health of our country.

ALLYN: In northwest Indiana, Jennifer Jackson Outlaw (ph) had a lukewarm reception to these changes. She's a Black woman who became fed up with Nextdoor and deleted the app. She thinks Nextdoor's mostly white executive office needs a shake-up.

JENNIFER JACKSON OUTLAW: It's important to not only have representation as far as those who are the moderator but also those who are in the leadership of the company who may be more well-versed on some of the issues.

ALLYN: Back in San Diego, Fruzier says after complaining enough, the thread on racial justice reappeared. Scrolling through, she saw a neighbor who said Black Lives Matter was unsettling. Fruzier jumped in the discussion.

FRUZIER: The lady said she was afraid of Black Lives Matter. I said, well, if that's the way you feel, well, maybe we should meet up and talk about it.

ALLYN: And they did meet up. They met for a stroll in a local park. They swapped stories about being mothers and living in the same community. But they also talked about how different it is to be Black in America.

FRUZIER: If my son go out, I have to have a talk with him. I have to say, hey, you know, you can't wear that hoodie. Always look over your shoulder. Don't say this. You don't wear that shirt. Always be polite. And she was like, wow, really? Yeah. Yeah. That's what we have to do.

ALLYN: It's the kind of neighborly exchange on race that Nextdoor would like to see happen more often in an ideal world.

Bobby Allyn, NPR News, San Francisco.

(SOUNDBITE OF THE POSTAL SERVICE SONG, "TURN AROUND") Transcript provided by NPR, Copyright NPR.