Addressing the Digital Challenges Posed by Extremism and Exploitation

IEEE Computer Society Team
Published 09/18/2023

Scientists and technology practitioners feel an ethical responsibility to promote the well-being of humanity. It’s what drives us to engage a broader community on any number of societal issues, and it’s the basis behind the 2023 IEEE Computer Society (CS) Tech Forum Digital Platforms and Societal Harms, taking place 2-3 October in Washington, D.C.

One of the central topics of this year’s event is extremism and exploitation and how they can be countered on digital platforms. But to address the digital dimension of these harms, we first must acknowledge that extremism and exploitation have a long-standing history. As a March 2023 paper from the Brookings Institution argues: “The abuse of the internet by terrorists and hate organizations is persistent and longstanding. Indeed, such abuse long predates the modern search, recommendation systems, and ad-based business models often blamed for current digital dysfunction.”

Technical pitfalls and solutions

However, over the years, the evolution of digital platforms has exacerbated the problem, enabling broader engagement by malicious actors and introducing new technologies that can be manipulated for hateful purposes. For instance, the Global Network on Extremism and Technology in February 2023 discussed how extremist groups are using AI to make their propaganda more interactive: “Unlike the extremist propaganda of the past, these new digital media products allow extremist groups to interact with audiences in unprecedented ways… this emerging technology may enable and accelerate the production of a greater quantity and quality of digital propaganda manufactured by non-state extremist actors.”

Addressing extremism and exploitation on digital platforms is an ongoing and evolving challenge, but options exist. Algorithmic interventions and content monitoring can limit the spread of extremist and exploitative content; reporting mechanisms can make it easier for users to flag such material; and legal, regulatory, and data-sharing collaboration can enable coordination and resource sharing across organizations.

But the trick is to ensure that the scientists and engineers working on the technology solutions have the support of the platforms themselves and global governments in introducing new ways to mitigate harm. That’s why events like Digital Platforms and Societal Harms can make a difference: They bring together technology problem-solvers with government organizations and industry stakeholders to come to common solutions.

Worldwide engagement

And given the far-reaching extent of digital extremism and exploitation, a worldwide effort is necessary. Digital platforms know no borders, and without the coordinated efforts of a global community, malicious activities will simply resurface wherever enforcement is weakest.

Take the United States, for example. In January, Time magazine reported that between 2016 and 2022, extremists raised more than $6.2 million for their activities on crowdfunding sites; white supremacists, neo-Nazis, militias, QAnon conspiracists, and far-right groups such as the Oath Keepers and the Proud Boys raised millions of dollars across multiple fundraising websites.

An event to identify solutions

Certainly, digital platforms have the potential to do right by society, but they also create opportunities for the abuse of the power they enable. This Tech Forum will deliver insights from experts developing cutting-edge technical solutions, offer ideas for how governments and civil society can tackle these challenges, and dive into what public policy experts recommend for the future. Participants will hear from leaders, join Q&A sessions with speakers, and participate in small group discussions to identify paths forward in preventing extremist and exploitative content from proliferating across digital channels.

In fact, this year’s extremism and exploitation panel, which can be attended in person or online (including on demand anytime in October via recording), will be moderated by Ye Bin of American University and will feature:

  • Adam Hadley, Founder and Executive Director of Tech Against Terrorism, a public-private partnership established as an initiative of the United Nations Counter-Terrorism Committee Executive Directorate (CTED)
  • Vidhya Ramalingam, CEO of Moonshot, a company working to end online harms by applying evidence, ethics and human rights
  • Nicholas Rasmussen, U.S. Department of Homeland Security (DHS) Counterterrorism Coordinator Taskforce Director, DHS’ most senior official charged with coordinating counterterrorism-related activities
  • Tom Thorley, Director of Technology, Global Internet Forum to Counter Terrorism (GIFCT), where he works to develop cross-platform technology solutions to combat terrorist threats

For a taste of our panels, which bring together experts working on the problem from the perspectives of AI, technology, government, and public policy, here are last year’s remarks on the extremism and exploitation panel by Emily Cashman Kirstein of Google.

Panel on Extremism & Exploitation – Emily Cashman Kirstein

Using digital platforms to counter extremism and exploitation is a complex and multifaceted challenge, but it’s also an important one. Digital platforms play a key role in preventing the spread of harmful content and in promoting constructive narratives.

For more information or to register for this year’s event (either in person or online), visit https://tech-forum.computer.org/societal-harms-2023/.