Middlebury alums talk online extremism

Two Middlebury alumni, Kris McGuffie ’97 and Alex Newhouse ’17, spoke last Thursday about the dangerous role of artificial intelligence in online extremism. They connected the growth of the Internet to the creation of large extremist communities online, challenging the conventional notion that extremists act as lone wolves and arguing instead that such online communities sometimes inspire real-world attacks.

The Center on Terrorism, Extremism, and Counterterrorism (CTEC) at the Middlebury Institute of International Studies at Monterey (MIIS) sponsored the lecture, titled “The Language of Terror: How Online Extremism and Artificial Intelligence Deepfakes Threaten Our Future.” McGuffie and Newhouse, both English majors at Middlebury who went on to study at MIIS, now work at CTEC.

McGuffie began by analyzing the language that extremists use online, explaining that extremists often speak in in-group, “us versus them” terms and create memes to attract followers. They adapt their language to avoid moderation or takedown on major online platforms, often packing obscure, coded references to several ideologies into a single short phrase. This highly compressed language is difficult for outsiders to comprehend but easy to parse for those in extremist networks who are familiar with these ideologies.

“Funny how you’ve been conditioned to react that way,” read one tweet from a right-wing Twitter account, responding to another user with antisemitic criticism of the Israeli state. The tweet accused Mossad, Israel’s national intelligence agency, of controlling large portions of the U.S. government and engaging in conspiracies.

“Too bad the Mossad pedophile blackmail network has infiltrated every aspect of American government. They control huge portion of U.S Congress, ran Epstein and Maxwell, ran false flag ops on us, they sell US secret intelligence.”

Such phrases suggest that information is being intentionally withheld from the reader. 

“Notice the placement of [the first sentence],” McGuffie said. “That’s at the beginning, right? As a second person, ‘Funny how you’ve been conditioned to react that way.’ So the implication is you don’t have all the information; you’re a pawn of somebody. And then what follows is some of that information you’re missing. Like, let me fill you in, you’re really missing out on — the implication is — the truth.”

One project of the CTEC is to dissect such tweets to understand which ideologies influence different phrases. The organizations behind many Twitter accounts have skilled teams that use bots to amplify their message — one account posted 50,000 times in 12 hours.

McGuffie stressed the urgent need for both public and private sector policy in this area of research, but she also encouraged audience members to take individual action by voting, assembling and petitioning their governments. 

“We all are consumers of technology, we’re all online,” she said. “Policy is important, but setting cultural and social norms is even more important.”

Faculty from multiple disciplines are involved in this project, including professors of linguistics and political science. 

Bea Lee ’20.5, who attended the lecture, said it helped her better understand the risks associated with artificial intelligence.

“[The lecture] was a really important reminder that technological innovation (machine learning specifically) has the potential to be weaponized,” Lee wrote in a message to The Campus. “Just because it has the potential to benefit society doesn’t mean it can go unregulated.” 


Emmanuel Tamrat ’22 is Digital Director.

He began working for The Campus as a photographer and online editor in the fall of 2018, and previously served as senior online editor.

An Environmental Policy major, Tamrat hails from London, GB, but calls Alexandria, VA home. At Middlebury, he is involved in Rethinking Economics and works as a Democracy Initiatives Intern with the CCE.

