These ISIS news anchors are AI fakes. Their propaganda is real.
Four days after the Islamic State attacked a Russian concert venue in March, a video started circulating on a private platform affiliated with the terrorist group. The 92-second broadcast showed a news anchor in a helmet and fatigues saying the attack was not a terrorist operation, but part of “the normal context of the raging war between the Islamic State and countries fighting Islam.”
The “anchor” was fake, an artificial intelligence-generated clone created by Islamic State supporters as part of a new AI-generated media program called News Harvest, according to videos and chat messages shared with The Washington Post by SITE Intelligence Group, which tracks terrorist and extremist movements online.
Since March, the program has offered near-weekly video dispatches about Islamic State operations around the globe. Made to resemble an Al Jazeera news broadcast, the program - which has not previously been reported - marks the emergence of AI as a powerful propaganda tool as Islamic State affiliates gain steam and rebuild the group’s media operations, said Rita Katz, co-founder of SITE Intelligence Group.
“For ISIS, AI means a game changer,” Katz said. “It’s going to be a quick way for them to spread and disseminate their … bloody attacks [to] reach almost every corner of the world.”
The shows open with a graphic logo and title sequence. AI-generated news anchors appear on screen, dressed in battle gear or formal tunics, as news tickers scroll and b-roll footage shows Islamic State members carrying out terrorist missions. The anchors read dispatches from official Islamic State media outlets, including al-Naba and Amaq.
The videos offer some of the earliest signs of AI helping terrorist groups quickly disseminate propaganda and recruit members, terrorism media experts said - and have even sparked an internal debate over the use of the technology under Islamic law.
The Islamic State is known for its splashy media operation, which creates Hollywood-level recruitment videos that appeal to youths. AI video generators now allow the group and its supporters to create such content for a fraction of the cost. Several Islamist militant organizations, including al-Qaeda, have signaled interest in using AI chatbots, image generators and voice cloners to quickly create disinformation, messages shared with The Post show.
Artificial intelligence gives Islamic State supporters the ability to create propaganda disguised in an innocuous format, such as a news broadcast, making it more difficult for tech companies to moderate, Katz said. Cheap and easy-to-use AI tools allow the videos to be made quickly and on a shoestring budget, benefiting groups such as the Islamic State and al-Qaeda, which have been weakened by military attacks.
- - -
‘Mujahideen of media’
Islamic State supporters have been eager to use AI for propaganda. On March 15, a supporter, “Al Kurdi 500,” made a case on a private messaging server for why fellow supporters should adopt AI.
“It would be great if the brothers produced videos regarding daily news, as an alternative to reading the news in text and looking at images,” Al Kurdi 500 wrote, “like how news channels like Al Jazeera” talk about events.
“Technology has evolved a lot and this would be easier to do nowadays, especially with the use of AI,” Al Kurdi 500 added. “Media is as important as physical warfare, or even more. Because it has a very big influence over the people.”
Seven days after that message, Islamic State members killed at least 133 people in an attack on a Moscow-area music hall. A supporter of the extremist group called Hisad quickly mobilized to create an AI-generated video. On March 26, Hisad posted the first episode of News Harvest on a private messaging platform.
Since the Russian attack, six News Harvest broadcasts have been posted, providing an overview of recent operations by Islamic State affiliates around the globe, including in Niger, Cameroon, Iraq, Syria and Nigeria. It’s unclear what AI technology is being used, though in posts some supporters have encouraged the use of “good” AI software, such as the voice cloning tool from Eleven Labs, an AI audio generation company. (Eleven Labs did not return a request for comment.)
Steven Stalinsky, executive director of the Middle East Media Research Institute (MEMRI), said the episodes are not made by the official arm of the Islamic State, so it’s unclear whether they align with the group’s overarching strategy. But he said he wouldn’t be surprised if the group created an AI media division, since it has always been quick to adopt cutting-edge technology.
Pro-Islamic State media outlets are already seeking people with AI skills, he said. A message from April 23 obtained by MEMRI shows one soliciting experts in poster making, article writing and video editing who are proficient in Adobe Photoshop, Premiere and AI. “O mujahideen of media,” the post said. “The media is waiting for your attack.”
Other Islamist militant organizations, such as al-Qaeda, are also discussing the use of AI. On Feb. 9, an al-Qaeda-affiliated group announced it would start hosting online AI workshops. The next day, it partnered with another al-Qaeda affiliate to release a 50-page guide titled “Incredible Ways to Use Artificial Intelligence Chat Bots,” which provided a basic overview of OpenAI’s ChatGPT - including reporting about AI from outlets such as Wired Magazine, translated into Arabic - according to a document that SITE Intelligence Group shared with The Post.
“ChatGPT is a powerful tool,” the guide’s introduction said. “So, let’s see what you can do with this tool, shall we?”
Stalinsky said technology companies whose platforms Islamic State supporters use to spread content - such as the secure messaging platform Rocket.Chat, Telegram and X - do a poor job of taking down Islamist militant materials. Now that AI allows more of this type of media to surface, he said, it will be a challenge for regulators and counterterrorism efforts.
Laís Machado Rocha, a public relations representative for Rocket.Chat, said the company is an open-source platform that does not host messages on its servers, and therefore cannot take them down. “The dissemination of messages is limited to the users of the specific illegal workspace,” Rocha said.
Telegram and X did not return requests for comment.
- - -
Faces forbidden?
Over a decade ago, the Islamic State’s media arm, al-Hayat, allowed the group to churn out high-quality videos, a key tactic in recruiting people from across the world. While Western military attacks have stamped out the division, terrorism experts said, AI allows Islamic State supporters to quickly create propaganda in the days after a terrorist attack, when the opportunity to radicalize new members is high, and that propaganda can remain online longer than gory content.
“If you carry out the best attack in the world, and you don’t propagate it, you lost your achievement,” Katz said. “What AI is doing now is enabling them to create as much as they want in no time.”
The videos, meanwhile, have also sparked an internal controversy among Islamic State supporters on whether artificially generated depictions of humans are forbidden under Islamic law, referred to as “haram.”
“Do not show face of drawings or face of … animations, IT IS FORBIDDEN HARAM,” an Islamic State supporter called Hamed 123 wrote to the creator of the News Harvest series on a private platform, according to messages SITE Intelligence Group shared with The Post.
Hamed 123 wrote that people who made images “by drawing or engraving” would be punished on Islam’s Day of Resurrection. “These prohibited images are images of animate beings, such as humans, animals and birds,” Hamed 123 added.
Hisad responded in early May. “I went and talked to the sheikh about this and he said it’s haram to have animation or AI drawing being displayed,” Hisad wrote. “So he told me to blur the face of the model and that’s why you see it like this now.” Hisad was referring to more recent episodes of the News Harvest broadcast in which the AI-generated anchor is made to look more cartoonish, with an obscured face.
Aaron Zelin, a senior fellow at the Washington Institute for Near East Policy, said AI could aid in the growth of homegrown extremism by offering people anywhere the ability to imbibe propaganda, create AI-generated media and become involved with the extremist group.
“If you throw as many pieces of spaghetti at the wall, eventually one of them is going to stick,” he said. “It could provide more avenues for the virality of this type of content.”