Julia Ebner: going undercover with extremists
Today’s technology and social media platforms are providing extremist groups with powerful tools for spreading their ideas, recruiting new members, and shaping a new era of terrorism. That’s the argument put forward in the newly published „Radikalisierungsmaschinen: Wie Extremisten die neuen Technologien nutzen und uns manipulieren“ („Radicalization Machines: How Extremists Use New Technologies and Manipulate Us“). Author Julia Ebner, a researcher at the Institute for Strategic Dialogue, went undercover to examine extremist groups from the inside. She spoke to Joanna Bostock about what she found and her conclusions.
Joanna Bostock: Before you began this project to infiltrate various radical groups, where did you get the idea?
Julia Ebner: I’d been working in counter-extremism research for the past four years, and I always had the impression that I only got to see one side of the whole picture. I was really interested in how extremist organizations operated and what they looked like from the inside. At our think tank we can, of course, do data analysis; we can track, for example, how extremist language gets more and more radical over time, and also see what they’re doing. But it’s quite hard to really understand the human dynamics and the social processes within those movements.
Joanna Bostock: Tell me exactly what you did. Did you physically go into some of these groups?
Listen to the whole interview with Julia Ebner in our Interview Podcast
Julia Ebner: Sometimes it was online contact. Over a period of two years, I adopted five different identities across the whole ideological spectrum to go undercover with about 12 different extremist groups, from ISIS hacking groups all the way to neo-Nazi trolling armies. I did that online as well as offline. For some of the groups it was too dangerous to actually go there in person, but for others I did meet up; with some of the white nationalists, for example, and I also went to a neo-Nazi rock festival in Germany.
Joanna Bostock: Give me an example of perhaps one of the biggest surprises, something that you weren’t expecting.
Julia Ebner: I think one of the most surprising but also shocking experiences was seeing how normal the interactions were and how casually people spoke about expanding their extremist movements abroad.
For example, I was recruited by the Identitarian movement when they were founding their new Great Britain and Ireland offshoot in London. I was invited to one of their strategy meetings; we met up in an Airbnb in Brixton, and they talked about it as if it were a kind of startup, about market expansion and how they wanted to expand into the English-speaking world. Of course, the ideologies are highly dangerous. And in recent months we could see some of the connections, where some of these ideologies have also inspired terrorist attacks, like the New Zealand attack.
Joanna Bostock: You were also in Vienna and met members of the Identitarian movement here, what was that like? Was it what you expected?
Julia Ebner: I didn’t really know what to expect, to be honest. I had of course studied their ideology and knew how they try to come across as a counter-culture, and I knew the language they used and the concepts they were trying to propagate. But meeting them in person was quite different, because I could also see how people get drawn into their networks: they create a really strong in-group feeling, and a lot of members immediately adopt the cultural elements of the movement.
The first time I met a member, we met in a Viennese coffee house, and it was quite interesting because it was a very casual conversation. But at the same time, it was also part of the recruiting process to go through several stages of vetting, because they really make sure to recruit new members who are young enough on the one hand, but also educated enough not to overstep any of the boundaries they’ve set for themselves, because they have a very coherent brand attached to the movement.
They are quite selective; I would say one of the more selective extremist movements. They have a very specific profile: the ideal target audience is young, hip and educated, because they want to influence politics by influencing culture, by creating a strong counter-culture that appeals to young university students. They have long-term political goals, and I think they see how important optics and reputation are for achieving them. They even have questionnaires, for example, that they make every new member fill out when joining the movement.
They do interviews with new members to make sure they don’t accept anyone who would end up, for example, harming the reputation or image of the movement. And it’s all very much above board.
Joanna Bostock: So, it’s not secretive?
Julia Ebner: Most of it is quite open. The secretive part of the strategy is the way they plan their media stunts and provocative actions, which are planned and staged behind the scenes. When they put a burqa on the statue of Maria Theresa, for example, everyone was surprised. They can then create a wave both in the online sphere on social media and in offline media reporting, which prompts more coverage of their movement.
Joanna Bostock: What is it about today’s communication technology that is specifically useful to these radical groups?
Julia Ebner: We can see it from Islamist groups to the Identitarian movement and other white nationalist networks. They are really skillful at using these new technologies and social media to their advantage, spreading what used to be fringe ideologies to a much bigger global audience. They’ve managed to take some really niche ideas and put them on a global level, give them a global megaphone, by making their campaigns go viral. But it has also led us into a new era of terrorism, I would say.
We also see, with terrorist attacks like the one in Christchurch, how the perpetrator used some of the newest functions, like live-streaming, but also some of the gamified subculture elements of these fringe forums in the language he used.
Joanna Bostock: Obviously, part of it is bringing together like-minded people, but how much of it is planting these radical ideas in somebody’s mind?
Julia Ebner: Yeah, that’s critical. I think that’s the second stage, which follows recruitment. In my book, I also explored the different stages of radicalization, because we often talk about online radicalization as if it were a short process that could turn an individual into an extremist overnight.
In the second chapter, which focuses on these brainwashing and socialization dynamics, I show how both ISIS groups and far-right or white nationalist groups do their best to create a common vocabulary, common insider references and a whole subculture that allows them to convey their ideologies to new members in a more subtle way.
Joanna Bostock: Did you find a common thread among people who are being drawn into these groups? You looked at a wide spectrum of different groups, was there a common denominator?
Julia Ebner: The profiles varied quite a bit among the extremists. You could find individuals from all age categories and from all educational and socioeconomic backgrounds. But what most of them had in common was some kind of fundamental identity crisis.
That might have to do with changing gender roles or changing notions of belonging, on a national level but also on a smaller group level. And a lot of them were looking for friendship, which they found in some of those extremist groups. We also have to admit that these extremist groups offer something their members can’t currently find in other environments: a sense of belonging, but also a way to get rid of experiences of humiliation by blaming them on some kind of outside enemy. That was definitely a pattern among most of the extremists I talked to.
Joanna Bostock: We’re talking about extremists. You’ve used the word terrorism several times. Obviously you’re doing this kind of research to come up with strategies to combat a threat. Can you give us an idea of the kinds of things that decision makers and society more broadly should be thinking of?
Julia Ebner: One of the biggest problems I currently see, both with policy makers and sometimes on the side of security forces, is that they’re very reactive to what’s happening. We saw this with the latest attacks in New Zealand, but also in the US: there was a very strong political reaction, and the security forces immediately put more money into extremism research, for example, into capacities that had been lacking for monitoring fringe channels like 4chan and some of these niche extremist message boards. But it was always reactive, and we definitely need a more proactive approach to dealing with the problem, because otherwise we’re always just chasing after extremists as they invent new methods.
One of the biggest challenges would be to actually foresee some of the newest technological trends, because if we know which tech elements could be exploited next by radicals or by extremist movements, that also gives us an indicator of the next dangers of potentially violent action.

Joanna Bostock: We often have these debates about material that’s posted on social networking sites: who should be checking it, how it should be checked and so on. Is that part of the bigger debate?

Julia Ebner: Absolutely. I think we’ve only just started this debate, because the way the business models and infrastructure of the big tech firms currently work, especially Facebook, Twitter and YouTube, unfortunately means they prioritize content that is potentially radical, ranging from conspiracy theories to violent content and hate speech.
Unfortunately, these are automatically the kinds of content that capture users’ attention, and ultimately the goal of each of those business models is to keep users on the platform as long as possible. There is something to be done, not just about taking down content that incites violence, but about thinking how political regulation could correct these failures of the big tech firms’ business models, and how we could make sure that someone who creates, for example, a neutral account on YouTube doesn’t end up in an extremist echo chamber almost automatically within 24 hours. That sounds rather extreme, but there have been studies using neutral avatar accounts in which someone clicks on content that might be political in nature, but nowhere close to extremist, and very quickly ends up with conspiracy-theory material, but also racist material, because that’s essentially how the algorithms are trained and how the recommendation mechanisms often work.
Joanna Bostock: It really sounds like a virtual rabbit hole that you go down.
Julia Ebner: It absolutely is. And that is one of the challenges that I think will be very tricky for the policymakers to deal with, because it definitely means not just having a conversation with the big firms, but potentially also finding new legislation that would regulate exactly that.
Published on 17 September 2019