Is Netanyahu dead? Has Tel Aviv been flattened? AI videos are dominating the Iran war.

Even shrewd consumers of the news are getting confused

(JTA) — Israeli Prime Minister Benjamin Netanyahu posted an unusual video this week: of himself buying coffee at a Jerusalem-area cafe.

It was hardly typical fare for wartime, when Netanyahu can more often be seen giving recorded addresses or touring missile damage within Israel. But the prime minister had come with an important mission: to debunk viral claims of his death.

The claims, which originated on Iranian state media last week, were picked up by social media users on Thursday after Netanyahu gave his first press conference during the war.

Zooming in on details in the seemingly innocuous address, some claimed that Netanyahu had an extra finger on his right hand and missing teeth, signs they said were key tells of AI-generated content.

“Imagine Netanyahu was actually dead this entire past week,” the pro-Palestinian TikTok influencer Guy Christensen wrote in a post on X. “It’s too good to be true but Israel has been using AI generated videos of Netanyahu ever since. One can only hope.”

From the cafe Sataf, Netanyahu issued his response to the conspiracy, posting a video on Sunday of him ordering a coffee, chatting with baristas and telling Israelis that the wars against Iran and Hezbollah in Lebanon were going well.

“They say I’m what?” the caption read. Mocking the idea that he had been killed, he joked, “I’m dying for coffee!” Then, alluding to the speculation about the earlier video, he asked, “Do you want to count the number of fingers?” before holding up each hand with his fingers outstretched.

But rather than quelling the claims of his death, the Israeli leader’s response spurred further speculation, with social media users questioning details in the video, including the physics of his coffee cup. In another video Netanyahu posted on Monday, users pointed to a frame in which a ring seemingly disappears from his hand.

“ISRAEL: ‘Benjamin Netanyahu is still alive. Here’s another AI video of him as proof. Just trust me, goy,’” the antisemitic podcaster Stew Peters wrote in a post on X.

The churn of conspiratorial claims about the Israeli leader’s death, which also included an AI-generated image of him being pulled from rubble, highlights the growing challenge of combating misinformation in an era of artificial intelligence and viral deepfakes, especially during times of conflict.

The war with Iran has produced a flood of fabricated imagery, ranging from AI-generated clips circulated by pro-Iran accounts purporting to show missile strikes flattening Tel Aviv to footage of American troops supposedly captured by Iranian forces. The Israeli disinformation detection company Cyabra said it identified networks containing tens of thousands of accounts whose material garnered 145 million views in the first two weeks of the war — almost all pro-Iranian, and mostly on TikTok. (During the previous Israel-Iran war, in June 2025, the company said that Iran’s internet outage had quelled disinformation bot farms located there.)

“The campaign did not spread organically. Clear coordination patterns were identified, including repeated narratives, identical videos and captions, fixed hashtag clusters, and synchronized burst posting,” Cyabra said in its report published Friday. “These tactics allowed the network to rapidly flood the information environment and dominate online discussions during key moments of the conflict.”

The videos have left some of Israel’s critics confident that the country has been battered far beyond what has been officially reported.

But even Israeli television has not been immune, airing its own misinformation too — albeit unwittingly.

Channel 12 News last week aired a night-vision clip that it said showed American B-2 stealth bombers over Iran flying in formation with F-18 fighter jets.

Within hours, the clip was identified not as a Pentagon release, as Channel 12 military correspondent Nir Dvori had suggested on air, but as footage from the combat flight simulator Digital Combat Simulator World. Itay Blumental, Dvori’s counterpart at rival public broadcaster Kan, wrote on X that the footage was “indeed incredible, but also lifted from a video game,” sharing the same YouTube clip from March 2023.

During Monday evening’s broadcast, Dvori apologized and said the mistake was “entirely mine,” but did not specify which footage he was referring to, leaving viewers who had missed the earlier segment with little indication of what had gone wrong. The news network also issued an apology, saying it would “examine its procedures.”

The right-wing Channel 14 also aired the clip — more than once.

i24 News made a similar mistake, the Haaretz newspaper reported, airing a clip as apparent footage of an American strike on Iran when it, too, came from Digital Combat Simulator World.

The segments quickly became internet fodder, with social media users lampooning the news networks and posting their own tongue-in-cheek “exclusive war footage.”

Omer Babai, who runs Kan’s social media, posted a GIF on X of shoot ’em up video game Chicken Invaders, saying it showed “American bombers in Iranian skies.”

Another X user quipped: “Nir Dvori: Iran scattered mines across the Strait of Hormuz,” alongside a screenshot of vintage PC game Minesweeper.

A third posted an image of fellow 1990s gaming staple Digger, with the caption: “Exclusive footage of Sinwar in the tunnels of Gaza,” referencing the Hamas chief killed by the IDF. Street Fighter and Pac-Man made cameo appearances too.

Channel 14, widely seen as sympathetic towards Prime Minister Benjamin Netanyahu, is no stranger to broadcasting dubious footage. Earlier in the war, the channel aired a video it said showed crowds in Tehran appearing to express support for the Israeli premier with chants of “Bibi joon” — a Persian term of affection translated roughly as “dear Bibi.” But the online Israeli fact-checker FakeReporter later said the chant had been generated with artificial intelligence.

But the B-2 gaffes are just one facet of a much wider phenomenon.

One viral clip, shared across X, TikTok and other platforms, appeared to show missiles pounding Tel Aviv and apartment blocks collapsing under a barrage. AFP and several other outlets found it had been generated using AI, citing telltale distortions in cars, rooftops, smoke trails and even the placement of an Israeli flag sans pole. The Grok AI chatbot on X, however, helped amplify the video, with repeated assurances that “the video is real,” AFP reported.

After the video was exposed as AI-generated, an X account under the name Abdulruhman Ismail, one of the first to share the footage in a post that drew 4 million views, said he would leave it up “because the scene reflects, painfully, what Gaza has endured under Israeli bombardment.” He added, “I am keeping this post for transparency. The video may not be real, but the devastation it evokes is real, and it mirrors what Palestinians have lived through.”

During the June 2025 war, pro-Iran accounts similarly circulated fake videos and images claiming to show strikes devastating Tel Aviv as well as Iranian forces downing Israeli F-35s.

Australian wire AAP debunked several fakes from this round of conflict, including a video claiming that an Iranian strike set a CIA facility in Dubai ablaze, as well as a fabricated image purporting to show Iranian supreme leader Ayatollah Ali Khamenei dead under a pile of rubble.

A separate fabricated clip that racked up tens of millions of views purported to show the Burj Khalifa engulfed in flames as crowds rushed in its direction.

The Tehran Times also shared false images and false reports of extensive damage to the US Navy’s Fifth Fleet headquarters in Bahrain.

Iran’s embassy in Austria posted an AI image of a child’s backpack, claiming it was taken at the Minab school in Iran that was hit on the first day of the war.

Tasnim, the Iranian state-affiliated news agency, shared an AI-generated image on X purporting to show an American radar installation in Qatar destroyed in an Iranian strike, The New York Times reported. The paper said Iran’s propaganda “appears focused more on swaying international audiences,” portraying the “success of Tehran’s counteroffensive in effusive terms.”

But X’s head of product told the BBC that 99% of the accounts spreading AI-generated war videos were trying to “game monetization,” posting sensational content to rack up engagement and qualify for payments through the platform’s creator revenue program. The social media giant announced that it would temporarily suspend creators from the program if they posted AI-generated videos of armed conflict without disclosing that they were fake.

British politician George Galloway posted a video last week containing AI imagery in which he narrates that the “apocalypse is burning Tel Aviv,” that the city “now looks like Gaza,” and that air defenses over Tel Aviv are “no longer operational.” He says his information came from friends on “Sheinkin Street, Tel Aviv, near Dizengoff Square.”

Former Israeli spokesman Eylon Levy seized on the canard, posting reaction videos of sun-soaked beach scenes and one of himself at Dizengoff Square, casually sipping an iced coffee with the very much intact plaza behind him.

Some people responded to the video by cheering Levy on, saying that they, too, were enjoying a beautiful day in a mostly intact Tel Aviv. But others resisted the evidence in front of them. “Cheap Jew propaganda,” one commenter wrote. “It’s basically flattened out.”

