
Terrorist groups and extremist organizations have been observed grooming children using artificial intelligence (AI) and other techniques. But how else are they recruiting internet users?
- Terrorist groups and extremist organizations are employing various technological tactics that help bring in new members.
- Extremist groups use the same technology as the rest of us, commonly using social media to bring in new recruits.
- Much of today’s terrorist propaganda is actually created using artificial intelligence.
Europol, the European Union Agency for Law Enforcement Cooperation, recently observed terrorist groups using artificial intelligence to tailor content to young people in an attempt to radicalize them.
Alongside AI, extremist organizations are also targeting young men who may struggle to form meaningful relationships outside of online spaces.
The gamification of terrorism, AI, and targeting involuntary celibates, or “incels,” are some of the main tactics we’re seeing when it comes to recruiting the latest generation of terrorists.
To learn more about the technologies and tactics used by terrorists to recruit vulnerable or perhaps disenfranchised people, I asked experts to weigh in on this disturbing trend.

Radical groups are using the same tech as everyone else
Some people might assume that large-scale terrorist organizations have their hands on the latest, most exclusive tech, but this really isn’t the case.
“Extremist groups are using the same technologies as everyone else,” Jeremy Blackburn, Assistant Professor of Computer Science at Binghamton University, told Cybernews.
This is because they want to broaden their recruitment reach, Blackburn continued.
Typically, extremist groups are “technically competent and count technically talented people among their membership.”
“From AI to social media to cryptocurrency to video games and beyond, extremists are as digital as the rest of us.”

Extremist stomping grounds: social media platforms
Extremist groups often lurk in the depths of social media, where impressionable young people communicate and hang out.
But combating them might not be as simple as big tech companies culling the extremist content featured on their platforms.
Andrew Selepak, a media professor at the University of Florida, told Cybernews that social media algorithms continue to contribute to the proliferation of this type of content.
“While social media companies have gotten better at censoring extremism on their platforms and groups like ISIS no longer have accounts on mainstream platforms, social media algorithms continue to radicalize users as they feed them content based on content consumption.”
Social media platforms form echo chambers where content reverberates through our For You pages, rabbit-holing users down the wrong path.

Blackburn echoed Selepak’s earlier statement, mentioning that there's typically a “sort of funnel that people transition through from less to more extreme” content.
“Social media provides many 'touch points' to further push people down the funnel…and this is often associated with the use of alternative social media platforms with looser moderation policies or less publicly visible interactions.”
Furthermore, Selepak noted that while content moderation has improved in English, extremist content in other languages can be trickier to spot.
Blackburn explained how “social media, across all its modalities and platforms, accounts for a huge amount of engagement time, especially for extremely (or even ‘terminally’) online people.”
This means terrorist groups are capitalizing on that engagement by creating content specifically for this demographic.

Extremist propaganda, made with artificial intelligence
Artificial intelligence (AI) has affected every aspect of our lives since the release of OpenAI’s ChatGPT in 2022. And everyone is using it, even radical extremists.
“AI content is starting to have some impact on radicalizing individuals, as it has become easier to create propaganda and misinformation about events,” Selepak told Cybernews.
Extremist groups are also changing the content and propaganda they create to appeal to modern audiences.
These groups tend to “propagate memes and have vibrant (so to speak) communities with influencers, and operate across multiple platforms,” Blackburn told Cybernews.
Regarding AI, Blackburn agreed that, as with every organization right now, “AI is a force multiplier for extremist organizations.”
Furthermore, Selepak believes that “the internet and AI may make a person interested in being part of an extremist organization and become indoctrinated by extremist ideology.”

Are terrorists capitalizing on the “incel epidemic”?
It’s no surprise that extremist organizations tend to prey on the young and vulnerable, and the so-called incel epidemic may be pushing young men toward radicalization as they struggle to foster a sense of community outside of online spaces.
“Extremist groups recruit from the vulnerable and disenfranchised. They target people who are looking for meaning or who want an easy solution to their problems while providing a target for their anger.”
While Selepak disputes the concept of an “incel epidemic,” he told Cybernews that an epidemic of loneliness is driving people to find community in the extremes.
“Essentially, we don’t have an ‘incel epidemic,’ but we have an epidemic where people are lonelier than ever before and less connected to families, romantic partners, community, institutions, and, depending on the extremist group, to religion,” Selepak told Cybernews.
On the flip side, Blackburn says, “Incels have exhibited violent behavior in the past, and are thus sort of a proven commodity in the violent extremist world.”
Furthermore, Blackburn notes that misogyny is “often a core principle of terrorist ideology,” which makes incels “a particularly good pool to recruit from.”