David LeBlond, a computer programmer based in Durham, North Carolina, decided in December to post on his local Nextdoor feed about participating in a Pfizer Covid-19 vaccine trial. He was almost positive he didn’t get the placebo, and he thought talking about his experience on the hyper-local neighborhood social media platform could reassure people in his community about the new vaccine.
“So you can read this knowing that you know at least one person who got it and is fine,” concluded his post on his local community feed. “I hope that eases your mind a little bit! Yay science!”
Most of his neighbors responded politely and thanked him for sharing. But the comments section of the post quickly devolved into an aggressive argument. Someone accused him of trying to force people to get vaccinated. One user suggested the vaccine was a tool for “population control,” and another user said that “a new vaccine in less than 1 year” frightened her. Some of the more “out there” comments on his post have been removed, he says, and one neighbor apologized. But other posts stayed up: As of February, LeBlond says there are still posts expressing skepticism about the vaccine on his thread and others.
LeBlond is one of eight Nextdoor users from across the country who told Recode about similar frustrations with the platform. Many said they went to Nextdoor to meet neighbors and get updates about local events that they couldn’t find on sites like Facebook. During the Covid-19 pandemic, they thought Nextdoor could help keep their communities healthy and safe by being a reliable source of local information about vital topics like quarantining and vaccinations at a time when local media is shrinking or shutting down entirely.
But they say Nextdoor is letting them down. Their local sites, they say, can be taken over by a loud minority that pushes misinformation about Covid-19, vaccines, and masks. Multiple Nextdoor users told Recode that the platform’s reporting tools and community moderators aren’t effectively doing their jobs — and that some even exacerbate the problems — making Nextdoor fall far short of its promise to be a neighborly social network.
At the beginning of the pandemic, Nextdoor users — like everyone else — were figuring out how to deal with Covid-19. Much of that discussion was productive and generally positive: Neighborhoods used the platform to coordinate what songs to sing from their windows during lockdown and offered to help each other out, says Jenn Takahashi, who runs the popular Best of Nextdoor account on Twitter that aggregates some of the funniest posts on the platform.
Before Covid-19, Nextdoor was already widely used throughout the United States. But the pandemic had people turning to the site more often, as they spent more time at home and the app became a source of local information and discussion about the pandemic. In April, Nextdoor reported a dramatic increase in daily active users — more than 60 percent since January — across age groups. As of February, more than 222,000 neighborhoods across the country were registered on the site, up from the more than 135,000 that Nextdoor had advertised on its website in the spring of 2017.
As more people began discussing Covid-19 on Nextdoor, the company — similar to platforms like Facebook and Twitter — expanded its moderation policies surrounding the pandemic. Unlike other violations of Nextdoor rules, which are reported to community moderators, Covid-19 misinformation that’s flagged by users goes straight to Nextdoor’s hired support staff. The company also added an interstitial “reminder” that encouraged users to cite accurate sources before posting.
But Nextdoor’s neighbor-oriented moderation — which relies on a combination of unpaid neighbor “leads” who help run local communities and Nextdoor content moderation staff — didn’t seem prepared for the problems that arose as discourse about the pandemic and how to respond to it became increasingly politicized. Nextdoor’s neighborhood moderators are often selected by invitations from other leads, and the first person to “found” their community’s digital presence on the platform can automatically become a lead.
Nextdoor did not provide additional details about the size of its content moderation team or the technology it uses, although it emphasized its work to elevate content from public health agencies like the Centers for Disease Control and Prevention and the World Health Organization.
Nextdoor tried to encourage more neighborly behavior, too. In March, the company expanded the availability of its group feature, which functions much like Facebook groups. It also launched a “help map” inside its app, which allowed users to flag in their profile that they were available to help neighbors who might need assistance amid the pandemic. Despite these changes, misinformation and resistance to public health measures, including vaccination, still show up on Nextdoor.
“It’s just the kind of classic stuff that you probably saw on other social media,” Mark Boslough, a physicist who uses Nextdoor in Albuquerque, New Mexico, and who took over a coronavirus-focused citywide group early on in the pandemic, told Recode in December. “Like, ‘It’s not that bad,’ ‘It’s the flu,’ or ‘It’s intentionally created by the Chinese’ or just calling it the ‘Wuhan flu’ or the ‘China flu.’”
When one of his neighbors posted an image suggesting that Covid-19 was being exaggerated right before an election and that the spread of the coronavirus was subsiding, Boslough told the neighbor to delete the post (and accused the neighbor of lying). He also wrote to Nextdoor’s support staff, demanding that the “dangerous spread of misinformation needs to be stopped NOW.”
The next day, Boslough found that he had been temporarily suspended for violating Nextdoor’s rules about being “helpful, not hurtful.” Nextdoor would not comment on individual users but emphasized the site has rules in place against profanity, over-posting, and personal disputes.
Boslough, who is part of an informal effort focused on getting Nextdoor users to listen to science experts, isn’t alone in his frustration with the platform.
“I really thought it was such a great, great site. I absolutely loved it,” Serena Spencer, an attorney who has used Nextdoor for about three years in Pasadena, California, told Recode. “And then Covid happened.”
She recalls posting a pattern for homemade masks in her community’s feed at the start of the pandemic in March, only to receive a slew of angry comments in response. She says that misinformation on the site has moved in “peaks and valleys”; at the beginning of the pandemic, posts would be removed more often, but now people have taken an “agree to disagree” approach on threads about topics like wearing a mask.
Spencer says the environment on the app got even worse following the police killing of George Floyd and that some neighbors began to post racist comments during the Black Lives Matter protests over the summer, adding that she’s one of the few Black leads in her area. She told Recode that while she thinks Nextdoor can be a useful tool, she’s concerned about the way it handles content that includes misinformation and hateful posts from neighbors. Spencer also says that Nextdoor has wrongly reported or suspended her several times, and that she’s had to appeal to get her account back.
“There is a way for it to be a really, really great resource,” she told Recode, but warned that “the way it’s being moderated right now is dangerous.”
It’s difficult to measure how pervasive misinformation is on Nextdoor: The platform is segmented by design, and what appears in one user’s local feed could be entirely different from what’s in the feed of someone living elsewhere. But some users Recode spoke to say that misinformation is a rampant problem that’s turned their local Nextdoor communities into stressful, contentious spaces.
Joanne Martinez, who uses Nextdoor in Kona, Hawaii, pointed to one user who called Covid-19 vaccination “a large Darwinian I.Q. Test – survival of the intellectually fittest.”
Another member of the same community feed made more alarming comments, pushing the false idea that Covid-19 had a 99.99 percent survival rate and then floating that people should “come back stronger and infect all the old weak people.” (The CDC says that eight of 10 people who have died of Covid-19 in the US are over 65, and contracting the illness can still leave people of all ages with long-term symptoms and side effects.) That wasn’t all: The same person posted that if “I could infect another person and kill them that would be the ultimate power,” later adding: “it’s all your fault boomers maybe you deserve to die.”
Tracy Walker, who is in the same Nextdoor feed as Martinez, told Recode it took about two and a half weeks for that content to be removed from the platform.
How Nextdoor moderates its content also makes things murkier. Nextdoor says that it uses both technology and user-generated flags to identify Covid-19 misinformation. The platform also relies on neighbors who have taken it upon themselves to report other users for spreading misinformation.
“We’re committed to the safety of our members and are taking proactive measures to help our members stay safe and have access to real-time, trusted information from credible sources,” a Nextdoor spokesperson told Recode in December. In 2021, the platform did make one major change: It stopped recommending political groups, following reporting from several outlets, including Recode, that highlighted racism, conspiracy theories, and toxic political fights on the platform.
This approach of combining technology, staff, and unpaid content moderators to identify and report problematic content can mean that Nextdoor sometimes punishes users who say they’re beating back misinformation. “Nextdoor having a rule against misinformation in the wrong hands can backfire on the people who are trying to debunk this information,” Boslough, the physicist in New Mexico, told Recode.
According to emails from Nextdoor’s moderation staff (which he shared with Recode), Boslough was reprimanded in November for violating the platform’s “public shaming” rules after he warned against “anti-science, anti-mask, anti-social trolls” making untrue claims and spreading conspiracy theories. In December, his account was temporarily disabled for violating Nextdoor’s “ranting” and “profanities” rules after he emphatically told nonessential workers to “stay the hell home” and to strictly observe social distancing measures.
Stephen Floor, a professor who uses the platform in California, says reporting coronavirus misinformation on Nextdoor can be a frustrating cycle of being urged to continually report content. Even if individual pieces of misinformation are taken down, accounts that share “chronic” amounts of misinformation remain on the site, and people have also used the platform to oppose lockdown and stay-at-home measures. Nextdoor says that misleading claims about Covid-19 measures like social distancing or wearing masks do violate its rules, but it’s not clear how the site handles organized opposition to those measures.
As a self-identified “hyperlocal” platform, Nextdoor tries to give users a unique view — and news — about what’s happening in their communities. The company has worked to give local agencies, including public health departments, a megaphone to broadcast information to communities, and its CEO, Sarah Friar, has even floated that, with the decline of local media, the Nextdoor feed could be a platform for updates about local politics.
Lauren Tostenson, who works at the public information office for El Paso County in Colorado, says Nextdoor users seem to be “a more dedicated group of people,” and don’t seem to visit the site specifically to start fights. “I think people take Nextdoor more seriously than Facebook,” she told Recode.
But others say the local nature of the app is what makes it challenging.
“They’re trying to walk a fine line of being helpful without putting them in a place where they could be held responsible for any incorrect information,” notes Will Payne, a Rutgers University professor who has studied Nextdoor. “That’s a tricky line to walk.” Payne points out that while Nextdoor can give users some local information, it can create a warped sense of local reality, and there’s no guarantee it’s accurately conveying the severity of the pandemic in local communities.
A small minority of users who question or outright oppose the vaccine threatens to drown out accurate vaccine information being spread through the platform. Public health agencies, for instance, are already using Nextdoor to announce their vaccine distribution plans, keep locals up to date on distribution, connect them with experts, and warn about potential vaccine scams.
Nextdoor would not comment on whether or how it’s changing its approach during the distribution of Covid-19 vaccines. But amid the ongoing inoculation campaign, the platform has also become a resource for many who are trying to get vaccinated. Several users Recode spoke to said that in recent weeks, the platform has seen a surge in people desperate for more information on where and how to get vaccinated, and neighbors trading information about how to find an available dose.
Ultimately, Nextdoor’s local nature means it’s more closed off than a social network like Twitter, which makes it harder to track how misinformation spreads on the platform. But that same quality has made the platform more useful — and given people an incentive to stay.
“It’s one thing when you’re on Facebook and nobody knows who each other is on Facebook,” says LeBlond, of North Carolina. “But Nextdoor, it’s your neighbors. I don’t want to start fighting with my neighbors.”
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.