The extremist watchers: How a network of researchers is searching for the next hate-fueled attack

One evening in early June, a week after 19 students and two teachers were shot and killed at Robb Elementary School in Uvalde, Texas, extremism researcher Kesa White saw media reports of another mass shooting, at a medical center in Tulsa, Oklahoma.

She picked up each of the three phones she uses for research and got to work, scrolling fast. Apps. News sites. Twitter and more obscure social media platforms. In the minutes after a shooting, she needed to learn as much as she could.

The news of another attack, so soon after the racist attack in Buffalo, New York, then the Uvalde shooting, was shocking, as always. For White, it crystallized another emotion: dread. This, she feared, could be the one she was waiting for.

White had spent the previous week watching for what extremism researchers call a “copycat” shooting, in which another disturbed young man tries to outdo the death toll of the 18-year-old shooter in Texas. Her research showed all the signs: increasing rhetoric and violent chatter on encrypted messaging apps; blooming support for the Uvalde shooter in dark spaces online; loose cannons increasingly firing shots across cyberspace.

“It made me think, somebody's going to try to do something similar pretty soon,” White said. “I think there's going to be some sort of copycat attack.”

Before long, details emerged about the Tulsa shooter: His race and name were revealed, and researchers and journalists concluded that the attack was not ideologically driven. Tragic but not a copycat.

Multiple people were killed in a shooting at a hospital in Tulsa, Okla. The shooter committed suicide.

White’s sense of dread never goes away. It is part of the job for people like her: an informal network of researchers, academics and professional intelligence gatherers who have taken it upon themselves to monitor the poisonous melting pot of American extremism.

Connected by Twitter and encrypted messaging apps, these researchers scour the nastiest corners of the internet, watching for trends, tricks and terminology. Some create “sock puppet” accounts to get inside secret chatrooms and eavesdrop on hateful groups. Others pore over data and messages scraped from social media or hacked from extremist groups, searching for clues and the identities of racists and bigots.

They are a diverse collection of personalities, but on the whole, they shun publicity, balking at the suggestion their work is exciting or dramatic and sometimes rejecting the idea that they are “hunting” for extremists. What unites them is the desire to do thankless, often boring, research they hope will shed more light on the country’s dark underbelly.

“It's kind of embarrassing, the whole ‘extremism hunter,’ ‘Antifa’s Secret Weapon’ stuff,” said Megan Squire, a research fellow with the Southern Poverty Law Center, who was bestowed the “Secret Weapon” moniker in a Wired magazine article in 2018. “It feels like people want you to be more amazing or more spy-like or something, when what it actually is is just a lot of deliberative plodding – writing stuff down and being super-organized.”

Through this methodical work, the network of extremism watchers has become an invaluable resource for journalists, law enforcement agencies and the general public. Their sleuthing is responsible for much of what becomes known about mass shooters, extremist groups and other domestic terrorists. Their monitoring occasionally sparks investigations and arrests, and their willingness to expose themselves online to harassment, or worse, from extremists fills a vital gap in the nation’s understanding of a growing threat, said Daryl Johnson, a security consultant and former senior analyst for domestic terrorism at the Department of Homeland Security.

“I'm glad that network exists,” Johnson said. “I'm glad it's been expanded. I'm glad there's more and more analysts and resources being brought to bear on this problem, because when we have more people looking at it, you get answers, and then the picture becomes clearer.”

How it begins: ‘Trying to save the world’

Sara Aniano, a researcher who completed a master’s thesis at Monmouth University on Instagram comments in the lead-up to the insurrection at the Capitol on Jan. 6, 2021, began focusing on extremism during the COVID-19 pandemic.

Like many people, Aniano was furloughed from her job and found herself with a lot of free time.

“One day I was on the beach, and a friend of mine texted our group chat, talking about how Ellen DeGeneres was on house arrest and adrenochrome and ‘save the children,’ and I was like, ‘Hold on, is she joking? Surely she's joking,’” Aniano said.

Aniano’s friend, like millions of Americans, had fallen into the trap of nonsense and disinformation sold by people such as Alex Jones and spread on fake news sites such as Infowars. That someone so close to her was spreading QAnon-related conspiracy theories was a wake-up call, she said.

Almost overnight, Aniano said, she started digging into disinformation networks and conspiracy theories on Instagram and other platforms. Before long, she realized she could apply what she learned to her master’s thesis, and within a year, she was in regular contact with other extremism and disinformation researchers, sharing what she knew, searching for leads and falling down ever-more complicated online rabbit holes.

“I'm basically always online,” Aniano said. “It looks like I’m just on my phone, but I’m actually trying to save the world.”

Several researchers who spoke with USA TODAY described similar personal experiences that led them to full-time roles monitoring extremists. For Squire, it was tracking and reporting a neo-Confederate hate group in her hometown in North Carolina.

For White, who works at the Polarization and Extremism Research Innovation Lab, or PERIL, at American University, it was a confrontation with a racist hate crime on her university campus, where somebody scrawled the initials of a predominantly Black sorority onto bananas and tied them up with string made to look like nooses.

Aniano, who was hired by the Anti-Defamation League to continue her work, said she grew apart from her friend, who fell deeper into disinformation. She and other researchers decided to run straight at a phenomenon unfolding across America: As domestic extremist groups flourished and hate crimes spiked, the informal network of individuals determined to understand, monitor and chronicle that movement grew alongside them.

How they do it: Building a puzzle one piece at a time 

A typical day for an extremism watcher involves hours of scrolling through hateful content online.

As extremists have been pushed off mainstream social media sites such as Facebook and Twitter for violating their terms of service, experts constantly readjust their tracking to new platforms, networks and messaging services.

A lot of their time is spent on the encrypted messaging and social media app Telegram. Often dubbed “Terrorgram” by researchers, the app, founded by a 37-year-old Russian billionaire, takes a laissez-faire attitude toward extremists and has become the go-to communication platform for many extremist groups and conspiracy mongers.

Researchers monitor groups’ public and private “channels” on Telegram, where users cross-post content from different channels into their own, creating a daisy chain of hate that is trackable across the platform.

White describes her often-monotonous work as being like piecing together a never-ending jigsaw puzzle without knowing what image she’s building. She said she essentially watches and learns every day, trying to keep up with the latest hateful language and memes, learning about up-and-coming hate groups and new conspiracy theories.

“I’m just, like, falling into rabbit holes all day, every day,” White said. “Because you're always learning something new, and what I learned today is going to be different from what I learn tomorrow, and sometimes something that I learned yesterday is no longer relevant.”

Along with the day-to-day monitoring, there are periods of frantic action.

In the hours after a mass shooting, for example, researchers scramble to learn as much as possible about a suspect before the person’s online life disappears.

“You're racing against the clock to collect as much information as you can get before social media companies remove it,” White said. “It's going to help paint a story of them, because you always have that one person saying, ‘But they were such a nice kid, we didn't see the warning signs.’ But you go on their social media account, and you see them posing with guns and saying racist things online.”

QAnon demonstrators protest in Los Angeles in 2020.

How they specialize: Working together

Several of the researchers formed sub-groups that focus on a particular topic, group or conspiracy theory. The Q Origins Project, for example, is a small collective of researchers focusing primarily on the early days of a conspiracy theory called QAnon and the community that grew from it.

Another collective, the Accelerationism Research Consortium, focuses on the white supremacist concept of accelerationism – seeking to foment a race war and ensuing dystopia to bring about a race-based new global order.

The researchers try to understand relationships between these groups, said Alex Mendela, a member of the Q Origins Project.

“Our work focuses on how QAnon relates to the broader conspiratorial far-right, and the pathways that individuals could take to more programmatic extremist movements and eventually violence,” Mendela said.

Adding to the complexity is the ever-growing network of websites and social media platforms dedicated to hosting extremists. Researchers who spoke with USA TODAY said they monitor accounts on Gab, Gettr, Parler, DLive, Rumble, Cozy and former President Donald Trump’s social media site, Truth Social, to name a few.

This process of watching and learning is a big part of an extremism researcher’s job. Occasionally, that work surfaces real, actionable leads or what law enforcement agencies consider “credible threats” of violence in the real world and not just online.

That’s when the watchers sometimes become more than observers.

How they respond: To report or not report?

In August 2019, Squire was folding laundry while “flipping through Telegram channels” when she noticed a conversation in one channel that was starker than the typical flow of hate.

“One White man with a gun walked into two mosques and killed 50 invaders, Another walked into a mall and killed 20. Another walked into a church and ended more,” wrote a user called “Anti-Kosmik 2182,” whom Squire had identified as Jarrett William Smith, an ex-soldier formerly stationed in Kansas. “Have you not seen the impact of 3 amateur shootings?”

“I thought, I'm gonna make a copy of this real quick, because this doesn't seem right and also, this guy is using his real photo as his profile picture. I thought, well, that's kind of unusual,” Squire said. “So I made a copy of the chat.”

Researchers such as Squire sometimes face ethical dilemmas when they come across this sort of information: Should they report individuals to law enforcement or keep monitoring them until they announce actual plans of violence?

In this case, Squire didn’t have long to debate. “About 14 days later, the guy was arrested for plotting to bomb some houses,” she said.

Smith was sentenced to 2½ years in prison for distributing information on social media about building a bomb.

The researchers who spoke to USA TODAY were split on the question of whether they should act as a conduit to the police. Some said they are quick to report threats as soon as possible. Others said they see their work more as journalism: watching threats and writing about them but not contacting authorities directly.

Most full-time extremism researchers are aligned with institutions of higher education, which have rules about ethical responsibility and invasion of privacy.

“There are boundaries on what we can and can't do, and I think we follow a very ethical standard,” said Matt Kriner, a senior researcher at the Center on Terrorism, Extremism, and Counterterrorism at the Middlebury Institute of International Studies at Monterey. “If it's available – if anybody can get to it – we'll look at it.”

Kriner said much of the work he and other researchers do isn’t really focused on tracking individuals.

“I think ultimately, everybody wants to say we're trying to stop a shooter from shooting, right?” he said. “It's a romantic notion. It's one that can occur, but it's not typically the one that we're trying to accomplish.”

Instead, Kriner said, extremism researchers focus more on movements and trends. Their job is to help the public understand the connections between the Unite the Right rally in Charlottesville, Virginia, in 2017 and the Jan. 6 insurrection, for example, he said.

“What law enforcement does is look at islands based on the critical thresholds that you need for investigations,” Kriner said. “What we're doing is we're looking at the broader landscape.”

How they survive: Support in a ‘never-ending cycle’

The work this network of researchers does can be physically and emotionally draining. And there’s always more work than they can possibly do.

“It's completely overwhelming,” Squire said. “I could clone myself six, eight times over, and it wouldn't be enough people.”

Rather than compete, extremism researchers tend to collaborate – reaching out to one another to swap ideas and share tips. In the wake of a domestic terrorism incident, or after a big leak of hacked data from an extremist group, the network swings into action to try to learn as much as possible.

“A lot of what we've tried to do is put people in contact with one another, to have really granular discussions about ‘Why does it matter that this person put X symbol on their gun?’ ‘What are we seeing across the table?’” Kriner said. “We share tips and tricks, we talk to each other about what works and what doesn't and why it’s important for us to be shifting to this topic area versus that topic area.”

The collaboration isn’t just academic, Mendela said. “These guys are my rock,” he said. “We often discuss things from our personal life, personal achievements. We hold Dungeons & Dragons games as a good way for us to connect off the clock.”

The extremism watchers know they’re never going to “win.” Their work will never eradicate hate and prejudice. They will never know or understand all the extremists or even all the extremist groups in America. But at least they’re doing something.

“It's a never-ending battle, and we’re pretty much just pawns in their little game,” White said. “In terms of end goals, of course, you want world peace and everything like that, but it’s just a never-ending cycle.”


This article originally appeared on USA TODAY: Extremist watchers: They track online hate to learn what's coming next