

Last December, the United Nations warned of an overlooked but critical “emerging terrorist threat”: extremists radicalizing members of online gaming communities.
Despite ample interest in saving gamers from such exploitation, experts say that a lack of research funding on the topic has put the gaming industry behind social networks when it comes to counterterrorism efforts. That’s starting to change, though. Within the past week, researchers told Ars that the US Department of Homeland Security has, for the first time, awarded funding—nearly $700,000—to a research group working directly with major gaming companies to develop effective counterterrorism methods and protect vulnerable gamers.
The new project will span two years. It's spearheaded by the Middlebury Institute of International Studies, which hosts the Center on Terrorism, Extremism, and Counterterrorism (CTEC). Vice reported that other partners include Take This, a nonprofit focused on gaming's impacts on mental health, and Logically, a tech company that Vice says works "to solve the problem of bad online behavior at scale."
The researchers have summarized their overarching goals for the DHS project as “the development of a set of best practices and centralized resources for monitoring and evaluation of extremist activities as well as a series of training workshops for the monitoring, detection, and prevention of extremist exploitation in gaming spaces for community managers, multiplayer designers, lore developers, mechanics designers, and trust and safety professionals.”
Take This research director Rachel Kowert told Ars that the primary objective of the project is to develop gaming industry-focused resources. Her group’s ambitious plan is to reach out to big companies first, then engage smaller companies and indie developers for maximum impact.
Alex Newhouse, deputy director of CTEC, told Ars that the project will start by targeting big gaming companies that “essentially act like social platforms,” including Roblox, Activision Blizzard, and Bungie.
Although project funding was just approved, Newhouse said that CTEC’s work has already begun. For six months, the group has been working with Roblox, and Newhouse said it is also in “very preliminary” talks with the Entertainment Software Association about ways to expand the project.
Newhouse said that within DHS and the FBI, there is growing interest in research like CTEC's to combat domestic terrorism, but, to his knowledge, no federal organization has funded such data collection. Although his project is only funded for two years, Newhouse hopes that within five years the gaming industry will implement the same standards for combating extremism that social networking platforms already have.
“I want game developers, especially big ones like Roblox and Microsoft, to have dedicated in-game counterextremism teams,” Newhouse told Ars. “These days, we need to push to be that sophisticated on the games industry side as well.”
Newhouse plans to draw on his experience helping tech giants like Google and Facebook organize counterterrorism teams. He said that CTEC's biggest priority is convincing the gaming industry to invest in proactive moderation of extremist content by “implementing increasingly sophisticated proactive detection and moderation systems” like those social networks already use.
Newhouse said that, historically, gaming companies have relied mostly on players to report extremist content for moderation. That's not a good enough strategy, he said, because radicalization often works by pumping up a gamer's self-esteem, and people manipulated into viewing this sort of online engagement as positive rarely self-report it. By relying strictly on user reports, gaming companies are “not going to actually detect anything on the initial recruitment and radicalization level,” he said.
Daniel Kelley, the associate director for the Anti-Defamation League’s Center for Technology and Society, told Ars that online gaming companies are approximately 10 years behind social media companies in flagging this issue as critical.
Kowert, of Take This, first became interested in the link between online gaming communities and real-world violent extremism after she encountered a 2019 nationally representative survey from ADL. It found that nearly 1 in 4 respondents “were exposed to extremist white supremacist ideology in online games.” Newhouse said that estimate is “probably too conservative at this point.”
Still, ADL said, “the evidence of the widespread extremist recruiting or organizing in online game environments (such as in Fortnite or other popular titles) remains anecdotal at best, and more research is required before any broad-based claims can be made.”
Today, the research base remains limited, but it has become apparent that the issue is not just affecting adults. When ADL expanded its survey in 2021, it included young gamers ages 13–17 for the first time and found that 10 percent of them were “exposed to white supremacist ideologies in the context of online multiplayer games.”
Kowert immediately responded to the 2019 ADL report by pivoting her research and teaming up with Newhouse. She told Ars that the reason there's so little research is that there's so little funding.
Kelley told Ars that while it's good to see this research finally receive funding, ADL recommends that the government invest far more to nip the issue in the bud. “This is not a time to be supporting stuff with drop-in-the-bucket funds,” Kelley said. “There's a lot more that the Department of Justice needs to do to fund these kinds of efforts.”