The Engagement Is the Operation

Manipulation Breakdowns · 10 min read · By D0

The Videos

In March 2026, a series of polished animated videos began flooding English-language social media. LEGO-style figures of Trump and Netanyahu. An Iranian commander rapping about turning US military bases to rubble. Trump falling into a bullseye built of Epstein files. The president depicted as a Teletubby in the Oval Office. Casino imagery. Missiles. Catchphrases.

The production quality was not amateur. The cultural references were precise — Trump’s bruised hand, MAGA infighting, Pete Hegseth’s confirmation hearing, the specific credibility vulnerabilities circulating in American political discourse at that exact moment. The rap was catchy enough that the question of whether it was human or AI-generated was genuinely unclear on first listen.

The accounts posting them operated under names like “Akhbar Enfejari” — “Explosive News” — and “Explosive News Team.” On X, they introduced themselves as “that Iranian Lego animation guys.” Grassroots creators. Independent operators. Just some people making content.

Within days, millions of views.

Analysts at Cambridge, the Atlantic Council’s Digital Forensic Research Lab, and media research organizations reached roughly the same conclusion: the content demonstrated levels of production sophistication, cultural fluency, and internet access that indicated ties to the Iranian government. Iran’s domestic internet restrictions would typically deny ordinary users the bandwidth required for AI video generation at this scale. “If you’re able to have the bandwidth needed to generate content like that and upload it,” concluded Mahsa Alimardani at WITNESS, “you are officially or unofficially cooperating with the regime.”

The Iranian government denied involvement. The videos kept spreading.

The Wrong Lens

The first mistake in analyzing content like this is treating it as disinformation that needs debunking.

Standard fact-checking protocol: identify the false claim, find the counter-evidence, publish the correction. This works well for disinformation that makes factual assertions — that a politician said something they didn’t, that an image is altered, that a document is authentic. The debunking apparatus can address claims.

LEGO satire does not make claims. A comedic animation of Trump as a stumbling minifigure does not assert anything that can be proven false. The “message” — that American leadership is feckless, that the war is a failure, that the US is internationally isolated — is delivered through tone, aesthetic, and format. Through the choice of the Epstein reference as a visual motif. Through the confidence of the rap. Through the implication encoded in the LEGO format itself: these leaders are toys.

There are no declarative sentences to rebut.

This is a design choice, not an incidental property of the format. Satire has structural immunity from the standard counter-messaging toolkit. The “correction” to a rap video about US military failure is what, exactly? An official statement about military performance? A counter-video? The genre mismatch is insurmountable in any timeframe relevant to the content’s spread.

When Iran produces a LEGO video about attacking US bases, it is not trying to convince you that attacking US bases is correct. It is trying to make the image of the US being attacked entertaining and shareable. Those are different operations, and they require different responses.

Who the Audience Isn’t

Traditional state propaganda targets populations it wants to influence directly: domestic audiences to sustain internal support, enemy populations to undermine morale, allied populations to build coalitions. The medium is broadcasting. Success is measured in reach and belief-change.

The LEGO videos represent a different model. Their target audience is not Iranian state TV viewers. It is American internet users — and the intended response is not conversion. The intended response is engagement: a share, a screenshot, a reply dunking on the video, a quote-post calling it Iranian propaganda.

Every engagement is a distribution event. The algorithm does not distinguish between a share from someone who finds the video convincing and a share from someone outraged that it exists. Both extend the content’s reach. Both deliver the video to new audiences. Both register as signal that the content is worth promoting.
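
To make the point concrete, here is a minimal sketch of that mechanism in Python. The weights and event names are invented for illustration; no platform’s actual ranking formula is implied. The structural point is the shape: every interaction enters as a positive term, and nothing in the logged event records why the user interacted.

```python
# Minimal sketch of a valence-blind engagement tally. Weights are made up;
# no real platform's ranking formula is implied. Note that the event log
# carries no field for whether a share was endorsement or outrage.

WEIGHTS = {"view": 1.0, "share": 20.0, "reply": 10.0, "watch_second": 0.5}

def engagement_score(events: list[str]) -> float:
    """Sum weighted interactions; emotional valence is not an input."""
    return sum(WEIGHTS[event] for event in events)

# An outraged quote-post and an approving repost both log as "share".
# The signal they feed back to the recommender is identical.
dunk = engagement_score(["view", "share", "reply"])
endorsement = engagement_score(["view", "share", "reply"])
assert dunk == endorsement  # same events, same promotion
```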

Atlantic Council researcher Emerson Brooking named what this is: “It’s like this commodification of war — becoming part of the attention economy — which is a very strange and discomfiting experience.”

The operation does not need you to believe it. It needs you to engage with it. Your outrage is free distribution.

This is the operational shift worth marking. Virality is the goal. Persuasion is a side effect. If the videos spread among people already skeptical of US war policy, they reinforce existing views. If they spread among people outraged by Iranian propaganda, they still spread. The operation succeeds regardless of how you feel about the content.

Using Your Language Against You

What distinguishes this campaign from prior Iranian propaganda is fluency.

State-sponsored content from adversarial governments has historically looked like state-sponsored content — heavy-handed, culturally misaligned, obvious in origin. Partly a resource problem, partly a cultural gap: producing content that reads as native on American social media requires understanding American internet culture at a granular level. What references land. What formats feel organic. What aesthetic registers signal legitimate creator versus foreign government account.

The Explosive News Team videos demonstrate how far that gap has narrowed.

The LEGO format is not randomly chosen. It references the LEGO Movie franchise: culturally recognizable, visually distinctive, and subtly infantilizing to the figures depicted. Presenting world leaders as LEGO minifigures simultaneously signals “internet-literate and irreverent” to one audience and “these figures are toys, not threats” to another. The format is the frame.

The Epstein references, the bruised-hand details, the confirmation hearing specifics — these are not artifacts of proximity to American politics. They represent monitoring of American political discourse at a granularity sufficient to identify specific credibility vulnerabilities in real time and deploy them. Propaganda scholar Nancy Snow named the principle directly: “They’re using popular culture against the number one pop culture country, the United States.”

University of Oregon media ethics professor Whitney Phillips took it further: “This is the language in which Trump speaks — and this is the language in which world leaders are now speaking to him.”

The propaganda runs on American internet fluency as infrastructure. The cultural competence required to produce content this legible to American audiences is itself a capability that Iran has developed and deployed.

The Grassroots Shell

The “Explosive News Team” presents explicitly as a grassroots group. “That Iranian Lego animation guys” is a self-description designed to frame the account as independent creators, not state actors.

The functional purpose of this framing is not to deceive researchers who will investigate. It is to create ambiguity during the window of first encounter — the moment that happens before any analysis, when most people encounter the video through their timeline without knowing anything about the account that posted it.

The grassroots presentation is designed for that encounter. It buys operating time. By the time forensic analysis establishes state links, the video has millions of views. The gap between when content spreads — immediately — and when analysis establishes attribution — days to weeks, in a report, for a much smaller audience — is the operational window the attribution shell is engineered to preserve.

There is also a second function: platforms have existing policies against accounts operated by state actors and state media entities. “Explosive News Team” does not trigger the same automated review flags as an explicitly government-identified account. The grassroots presentation provides cover for continued platform operation.
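
A deliberately simplified illustration of why: imagine a review trigger keyed to self-declared account metadata. The field names and labels below are hypothetical, not drawn from any real platform’s moderation systems.

```python
# Hypothetical, simplified state-media review trigger. The metadata field
# and labels are invented for illustration only.

KNOWN_STATE_LABELS = {"state-controlled media", "government organization"}

def needs_state_media_review(account: dict) -> bool:
    """Flag accounts whose self-declared affiliation matches a state label."""
    return account.get("affiliation") in KNOWN_STATE_LABELS

# An explicitly labeled state outlet trips the flag immediately.
print(needs_state_media_review(
    {"handle": "official_state_tv", "affiliation": "state-controlled media"}))  # True

# A grassroots-presenting shell passes: the check keys on what the account
# declares about itself, not on who actually operates it.
print(needs_state_media_review(
    {"handle": "explosive_news_team", "affiliation": None}))  # False
```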

The attribution shell does not need to hold indefinitely. It only needs to hold long enough for the content to achieve viral distribution. After that, enforcement is retrospective, and retrospective enforcement has no reach.

The Platform as Infrastructure

A video that accumulates millions of views on social media does so partly because the platform’s recommendation system treats it as content worth promoting.

This is not a platform accusation. It is a structural description. Engagement signals (views, shares, replies, watch time) are positive inputs to recommendation systems. Content that generates strong engagement, regardless of emotional valence, tends to receive additional algorithmic promotion.

The Explosive News Team content was produced with this architecture in mind. Visually distinctive. Tonally provocative. Culturally legible enough to generate strong reactions. Short enough to complete without commitment. These are not incidentally appealing properties — they are engineering decisions made by producers who understood that platform recommendation infrastructure would be the distribution mechanism.

This represents a significant structural shift in how state propaganda operates.

The prior model required distribution control: own the broadcast channel, capture the newspaper, operate the radio station. The new model requires only engineering content that existing platform infrastructure will distribute. You do not need to control the distribution network. You need to build content that the distribution network’s engagement-optimization function will amplify.

The platform is the broadcast system. The algorithm’s engagement incentives are the amplification mechanism. The operation just needs content engineered to exploit them.

What Counter-Messaging Cannot Do

Standard institutional responses to propaganda of this type run on two tracks: debunking and platform enforcement.

Debunking addresses false claims. There are no false claims here to address.

Platform enforcement (identifying coordinated inauthentic behavior, reporting it to the platform, triggering account suspension) sometimes works, eventually. The operational timeline for platform-level action typically runs days to weeks. Content that goes viral does so in hours. Removing a video after the fact does nothing for the millions of people who have already watched and shared it.

What neither track addresses: the algorithm amplified the content before counter-messaging could deploy. The engagement-optimization function does not wait for fact-checks. It distributes now, based on current engagement signals.
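
The shape of this timing gap survives rough arithmetic. A toy model, with every parameter assumed purely for illustration (a 12-hour viral doubling time, a five-day enforcement delay), puts a number on the window:

```python
# Toy back-of-envelope model of the enforcement window. All parameters are
# assumptions for illustration, not measurements of this campaign.

SEED_VIEWS = 1_000                 # initial seeded audience
DOUBLING_HOURS = 12.0              # assumed viral doubling time
ENFORCEMENT_DELAY_HOURS = 5 * 24   # assumed days-scale platform response

views_at_takedown = SEED_VIEWS * 2 ** (ENFORCEMENT_DELAY_HOURS / DOUBLING_HOURS)
print(f"views before removal: {views_at_takedown:,.0f}")  # ~1,024,000
```

Halving the assumed delay still leaves five doublings inside the window. The exponential term dominates any plausible response time.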

Counter-messaging strategies developed for a world where propaganda required controlled distribution channels — where you attacked the bottleneck in the distribution infrastructure — have not adapted to a world where the platform’s algorithm is the distribution infrastructure, and any sufficiently engaging content can access it instantly.

The asymmetry is structural. Content spreads faster than analysis. Satire spreads faster than correction. Entertaining disinformation spreads faster than accurate but less entertaining refutation. None of this is new as a problem. What is new is its scale and automation: AI generation means the content supply is no longer bottlenecked by human production capacity.

The Principle

The LEGO propaganda campaign is notable as a case study less because of what it claims and more because of what it demonstrates about how state-sponsored influence operations have adapted to the algorithmic media environment.

The old propaganda problem was distribution: how do you get your message to the target audience? The new propaganda problem is attention: how do you build content that the platform’s own infrastructure will distribute for you?

Iran has answered the second question with cultural fluency, satirical format, and engineering for algorithmic performance. The result is propaganda that spreads through American social media partly because American social media’s recommendation systems are optimized to spread it.

The operation doesn’t need a broadcast tower. It has the algorithm.


This article is part of Decipon’s Manipulation Breakdowns series, which dissects real influence tactics using the NCI Protocol framework.

