My Substack addresses countering disinformation, which has been the core “infowar” concept since the 1980s, in contrast to the 1950s, 1960s, and 1970s, when the focus was on Soviet propaganda instead.
This post addresses the relationship between propaganda and disinformation and the main differences between the two – a “compare and contrast” type of analytical exercise.
An aside and a preview of my next post(s)
The March 1 Russian “hack and leak” operation revealing a discussion among German defense officials about Germany’s Taurus missile reminds me of a largely unknown but very important concept in Soviet/Russian information influence operations: directed information. You can’t fully understand Soviet/Russian information influence operations without understanding it. Disinformation is very important and attention-grabbing, but it is not a broad enough concept. In the next post, I’ll explain “directed information,” which has become increasingly important in recent years because of dramatic changes in the way we store information: digitally rather than in hard-to-purloin hard copy.
If you’re curious about directed information and prefer not to wait, I recommend the article “The transformation of propaganda: The continuities and discontinuities of information operations, from Soviet to Russian active measures” by Roman Horbyk (Örebro University and Södertörn University, Sweden), Yana Prymachenko (National Academy of Sciences of Ukraine and Princeton University, USA), and Dariya (Kyiv-Mohyla Academy, Ukraine, and the University of Pennsylvania, USA). It notes the concept, gives a good working definition, and relies on solid documentary sources, in particular a 1989 KGB training manual on “Political intelligence from the Soviet Union’s territory,” which has a chapter on “active measures.” The manual was made available online by the Free Russia Foundation in its “Lubyanka Files” feature and partially translated in their document “Notes on Political Espionage from USSR Territory.” We all owe Michael Weiss, the Free Russia Foundation’s Special Correspondent for Special Investigations, a debt of gratitude for posting 29 KGB training manuals online, as well as translations of some of them.
The Free Russia Foundation article uses the translation “targeted information” rather than “directed information,” but I believe “directed information” is the better translation, and it is the one I became familiar with when reading about these issues in the early 1990s.
I will likely write more than one post on “directed information,” as it is a very important Soviet/Russian information influence technique that has previously received scant attention. End of aside.
Propaganda: Definition or Examples?
There are many different definitions of propaganda, but, as a counter-disinformation and counter-propaganda practitioner, I prefer not to reference them. The main reason is that definitions tend to be denotative while propaganda’s connotative, overwhelmingly negative associations are uppermost in most people’s minds. There is no way around this problem when engaging in public discourse. Academics can draw careful distinctions in the cloistered environment of the classroom, but people involved in the practical work of counter-propaganda and counter-disinformation do not operate in this environment.
Rather than trying to define propaganda, I find it more useful to provide examples of what it is. In my post on “Countering Soviet and Russian Disinformation,” the last section, “A Detailed Look at Russian Propaganda Themes in Lithuania,” listed the main propaganda themes that the Kremlin was spreading to try to influence Lithuanian audiences, as well as the themes it spread about Lithuania to influence other audiences, as explained to me by the then-head of the Strategic Communication Department of the Lithuanian Armed Forces in 2015. That is one very good example.
Russian Propaganda about Ukraine
Vienna-based political scientist Anton Shekhovtsov, who is director of the Centre for Democratic Integrity and visiting professor at the Central European University (Austria), provides a similarly comprehensive analysis of Russian propaganda regarding Ukraine in his 2023 Euromaidan Press article, “Four towers of Kremlin propaganda: Russia, Ukraine, South, West,” which is worth reading in its entirety.
The main point I wish to make is that Russian propaganda takes the form of seemingly logical arguments, even though many of those arguments are false or misleading. They can also be described as themes or narratives, as Shekhovtsov refers to them. For example, for the domestic Russian audience, he says there are three main Russian strategic narratives:
“The Ukrainian nation does not exist, while so-called ‘Ukrainians’ are simply confused or manipulated Russians”
“As a country, Ukraine was granted existence by Russia”
“Sovereign Ukraine is a project of ‘anti-Russia’”
He says additional “tactical narratives” aimed at the domestic Russian audience are:
“Russia’s war against Ukraine is not a war but a ‘special military operation’” (SMO)
“Those Ukrainians who oppose Russian rule are Nazis” (“de-Nazification” of Ukraine as an objective of the SMO)
“NATO uses Ukraine to attack Russia,” or “Russia fights NATO not Ukraine”
“Russia never lost wars: ‘we will necessarily win’”
“Ukraine commits genocide of ethnic Russians in the Donbas and elsewhere in Ukraine”
“Ukraine’s territory belongs to Russia: ‘we do not occupy Ukrainian lands – we return to Russia what is rightfully ours’”
“Ukraine’s leaders are Satanists.”
Shekhovtsov then describes what he sees as the main strategic and tactical narratives aimed at audiences in Ukraine, the West, and the Global South, each of which is tailored to the outlook of the respective audience. For example, he says the three main strategic narratives for the Global South are:
“Russia is the leader of the global anti-imperialist and anti-colonial front”
“The West is using the Ukraine war to reclaim global domination”
“Ukraine is part of the Russian legitimate sphere of influence”
For Ukrainian audiences, he sees Russia’s three main strategic narratives as:
“Russians and Ukrainians are brotherly nations”
“Ukraine is part of the Russian civilization”
“Ukraine can only be successful together with Russia”
For the West, he describes the three main Russian strategic narratives as:
“Russia is a global power that has a right to have its own sphere of influence, and Ukraine belongs there”
“Ukraine as part of the West poses an existential threat to Russia”
“The West is using NATO to encircle Russia”
He also lists “tactical” narratives aimed at the West, which include:
“Ukraine is run by Nazis or, at the very least, has an immense Nazi problem”
“Western sanctions are damaging for European businesses and households”
“Ukraine is one of the most corrupt countries in the world – it cannot be part of the West” (My boldface, as I explain below how this propaganda theme relates to Russian disinformation claims)
“Russians and Ukrainians are one people”
“Russia is interested in peace negotiations but Ukraine and the West are not interested in peace”
“European support for Ukraine will result in the geopolitical decline of Europe”
“The US uses the Ukraine war to cement its position as the dominant power”
“Western weapons given to Ukraine will end up with international terrorists”
“Nuclear threat: the West should not oppose Russia because it has nuclear weapons; or Ukraine is making a ‘dirty bomb’”
Shekhovtsov’s Euromaidan Press article is based on a video lecture he presented on “Russia's Strategic and Tactical Narratives in Its War against Ukraine,” in which you can see him present his ideas in person.
The propaganda themes chosen for each audience are designed to appeal to their central concerns and fears. Propaganda themes take the form of seemingly logical (though debatable) arguments. When the arguments become increasingly absurd and specific, they turn into disinformation.
Disinformation Seeking to Bolster the Propaganda Theme of Corruption in Ukraine
In contrast to propaganda’s broad arguments, disinformation purports to convey a specific fact. Typically, phony disinformation claims support the main propaganda themes. Thus, with regard to the propaganda theme that “Ukraine is [supposedly] one of the most corrupt countries in the world,” when Ukrainian president Volodymyr Zelenskyy visited New York to speak at the United Nations, disinformation circulating in early October 2023 falsely claimed that his wife had “spent $1.1 million at a Cartier jeweler” with money that was “likely siphoned from Western aid to Ukraine,” according to Clemson University researchers Darren Linvill and Patrick Warren. Their December 15, 2023 report, “Infektion’s Evolution: Digital Technologies and Narrative Laundering,” stated:
On October 4, 2023, a story circulated on social media that reported Olena Zelenska, wife of the president of Ukraine, spent $1.1 million at a Cartier jeweler while visiting New York City with her husband to speak at the United Nations. Many spreading the story suggested that this money was likely siphoned from Western aid to Ukraine. For several days the story circulated across platforms and languages, gathering thousands of reposts and millions of views. The X (formerly Twitter) account @Megatron_ron gained ten thousand reposts and a million views when it shared the story. On TikTok, the Russian language @dymkxjyft66g received fifteen thousand likes and over 1.5 million views for their post spreading the story.
The narrative was improbable on its face; the idea that Olena Zelenska could slip away unseen to Cartier while on her extremely public visit to New York without note from press or public seems unlikely. Many reasonably questioned that narrative, but it was a story that was easy to believe for those who were already inclined to do so, and it was in these so-inclined online communities that the story predominately spread. It was another example, users claimed, of Ukrainian corruption and the wasting of Western aid. …
… Initial placement of this narrative occurred using an online video of a Black woman with a West African accent recounting her purported experience as a former Cartier employee. In the video the individual claimed to have been an intern who assisted Zelenska with her purchase. She describes how Zelenska became angry with her service and insisted that she be fired. The day after Zelenska’s visit the supposed intern claimed she was dismissed, but as she left, she took with her a copy of Zelenska’s receipt for $1.1 million. …
On October 7, the Italian news site Open (Puente, 2023) published proof that the improbable story was a complete fabrication. The protagonist of the video did not, in fact, live in New York and had never worked for Cartier. Careful online investigation revealed her to be a student and salon manager living in Saint Petersburg, Russia—home of the former Russian Internet Research Agency and birthplace of many of the Kremlin’s influence operations.
… The Zelenska-Cartier video was first reported on in an October 2 story on NetAfrique.net, a French language news page based in Burkina Faso. Within hours, variations of the story had appeared on at least four additional news sites based in Africa, including English language pages in both Ghana and Nigeria. All versions of the story included a link to the purported intern’s video and, in every case, this video was the only evidence offered for the validity of the story. The first English-language version of the story was posted on the Nigerian site The Nation. …
… Shortly after appearing in African sources, the story was also circulated by Russian-language media. Between October 4 and 5, the story appeared in over three dozen Russian news sources. These sites include relatively small outlets like Rossiyskaya Gazeta [the official newspaper of the Russian government], Argumenty i Fakty, and Ren TV on October 4 and then comparatively prominent sources such as RT and Lenta.ru on October 5. …
… On October 4 a link to a version of the story posted on the Nigerian website Naija Loaded was shared on X by pro-Russian influencer and RT contributor Tara Reade who received over 800 reposts. On October 5 the Russian Embassy to the United Kingdom’s X account posted a link to The Nation’s version of the story and received over 400 reposts. Together, these efforts paid dividends. The Zelenska-Cartier story began to be shared by large numbers of real users starting on October 4. On X, alone, the narrative was shared in English well over 20,000 times with several thousand additional posts spreading particularly in French, Russian, and Polish.
… The one news source with an apparent Western, specifically U.S., target audience we were able to identify in which the Zelenska-Cartier story appeared was a website called DCWeekly.org. On its face this source appears credible. It has a professional layout and a regularly updated flow of stories, and presents itself in every way as a right leaning D.C. news outlet with a full staff of contributing journalists. DCWeekly’s ethos is wholly fabricated, however, and close inspection shows it to be a purpose-built tool for narrative laundering, with likely links to the Russian government.
… The “About Us” page claims, as the name suggests, that DC Weekly began its life over 20 years ago as a weekly paper. No such paper existed …. The entire “About Us” section is a work of fiction, extremely likely to have been written by AI.
… the domain was inactive in late 2018 and did not reappear until April 2021, when the current website began operating …. At its relaunch in 2021, the dcweekly.org domain pointed to an IP address that was shared with many other unusual domains, all of which are affiliated with John Mark Dougan, a former police officer and conspiracy theorist who fled to Russia in 2016 and has since reinvented himself as an independent pro-Russian journalist in Donbass covering the Russian invasion of Ukraine.
A Similar Disinformation Claim Hits its Targets
Shayan Sardarizadeh of BBC Verify says the phony DCWeekly website was also used to spread the false claim that Ukrainian President Zelenskyy “bought two luxury yachts with US aid money, later repeated by some members of Congress,” specifically Senator J.D. Vance and Representative Marjorie Taylor Greene.
A December 20, 2023 BBC Verify/BBC News story, reported in conjunction with the Clemson researchers, stated:
A website founded by a former US Marine who now lives in Russia has fueled a rumour that Volodymyr Zelensky purchased two luxury yachts with American aid money.
Despite the false claim, the disinformation plot was successful. It took off online and was echoed by members of the US Congress making crucial decisions about military spending.
It was an incredible assertion - using two advisers as proxies, Mr Zelensky paid $75m (£59m) for two yachts.
But not only has the Ukrainian government flatly denied the story, the two ships in question have not even been sold.
Despite being false, the story reached members of the US Congress, where leaders say any decision on further aid to Ukraine will be delayed until next year.
Some are vehemently opposed to further support.
On X, formerly Twitter, Republican Congresswoman Marjorie Taylor Greene said: "Anyone who votes to fund Ukraine is funding the most corrupt money scheme of any foreign war in our country's history."
She linked to a story containing the yacht rumour.
Thom Tillis, a Republican Senator and a supporter of military aid to Ukraine, spoke to CNN shortly after senators held a closed-door meeting with Mr Zelensky last week.
"I think the notion of corruption came up because some have said we can't do it, because people will buy yachts with the money," Mr Tillis said. "[Mr Zelensky] disabused people of those notions."
Mr Tillis has butted heads with another Republican Senator, J D Vance, who has also mentioned Mr Zelensky and ships in the same breath.
While discussing budget priorities on a podcast hosted by former Donald Trump adviser Steve Bannon, Mr Vance said: "There are people who would cut Social Security, throw our grandparents into poverty, why? So that one of Zelensky's ministers can buy a bigger yacht?"
The BBC story also notes:
BBC Verify also found that part of the DC Weekly website is hosted on a server in Moscow.
Earlier this year Mr Dougan was identified as being a DC Weekly commentator when he gave several talks at an academy affiliated with the Russian Foreign Ministry.
… Mr Dougan said via text message that he "emphatically denies these assertions", and that he sold DC Weekly for $3,000 several years ago. He said he does not recall the person he sold it to and has lost the paperwork due to being kicked off payment platforms and losing access to email accounts because of financial sanctions against Russia. He says he has nothing to do with the site's current operations.
The [Clemson] researchers say the site is part of a much larger pro-Russia propaganda machine.
"Whether this one particular guy is behind it doesn't really matter much," Mr Warren [of Clemson] said. "The key point is that it is an important element in a very substantial and effective pro-Russian influence operation that needs to be exposed and understood."
BBC Verify says it and the Clemson researchers found several other disinformation stories in DC Weekly articles in late 2023, which:
falsely alleged that Prince Andrew made a secret visit to Ukraine, that Ukraine provided weapons to Hamas, that an American non-profit organisation harvested organs in Ukraine and that Zelensky's administration allowed Western companies to use Ukrainian farmland for disposal of toxic waste.
… Some of the stories were picked up by other outlets and accounts. But with the story about the yachts, the people behind DC Weekly appear to have achieved a level of success that had previously eluded them - their allegations being repeated by some of the most powerful people in the US Congress.
The Institute for Strategic Dialogue in London, which works against extremism, hate, and disinformation, highlighted very similar false Russian disinformation claims about supposed corruption by the Ukrainian president in its February 2024 report “Two Years On: An Analysis of Russian State and Pro-Kremlin Information Warfare in the Context of the Invasion of Ukraine.” It states:
A tactic that seemed to be used more frequently over the second half of 2023 is to cultivate fake online personas, such as “whistleblowers”, “doctors”, “activists” or “journalists” to spread disinformation. Often these accounts are used to plant fake content or unverified claims relating to corruption by high-ranking Ukrainian officials.
For example, a YouTube video in August 2023 originally posted by an account calling himself Mohammed Al-Alawi claimed he had proof that President Zelenskyy’s mother-in-law bought a villa in El Gouna, Egypt for 5 million dollars of humanitarian aid money sent to Ukraine. The video was the only piece of content posted by Mohammed Al-Alawi. This claim has been picked up by numerous international disinformation sources including Natural News (which has 74K followers on Telegram), a US based commercial enterprise which acquired notoriety for promoting conspiracy theories and disinformation during the COVID-19 pandemic. Natural News claimed that Zelenskyy himself bought the villa through his mother-in-law using Western aid money.
A similar case involved the alleged “journalist” Shahzad Nasir, who in November 2023 spread the idea that there was a corruption scandal involving the Ukrainian president Volodymyr Zelenskyy and two close associates, Boris and Serhiy Shefir, who [supposedly] purchased two luxury yachts with a combined value of 75 million dollars. Despite being quickly debunked by Community Notes on X, the claim about “Zelenskyy’s yachts” was spread by pro-Kremlin outlets in various languages including the Strategic Culture Foundation (which has been banned on Facebook, Twitter and YouTube but has 12K followers on Telegram), reportedly linked to Russian intelligence service SVR and several pro-Kremlin Telegram channels including the influencer Alina Lipp. Nasir’s YouTube channel was created in November 2022 and previously only published one video. “Nasir” appears only briefly in the video. His account on X was recently repurposed and only started its activity as Shahzad Nasir in November 2023. ISD could not find any evidence that a person with the same name is doing professional journalistic work.
In another case in November 2023, an alleged “French eco activist and journalist” claimed that Alexander Soros, son of George Soros, made a deal with the Ukrainian government to use 400 square km of Ukrainian land for free to deposit toxic waste. The X account which initially published this claim appears to only have been active since mid-November 2023. The claim was picked up by numerous Russian state and pro-Kremlin websites such as Sputnik, Tass or Pravda.ru and spread in German by Alina Lipp and the Swiss website Uncut News. Similarly, a “whistleblower doctor” from Africa who allegedly worked for the US medical company Global Surgical and Medical Support Group (GSMSG) claimed in November 2023 that wounded Ukrainian soldiers were being brought to Germany where their organs are harvested for NATO officers and high command who need organ transplantations. … These claims were laundered through African news websites such as The Nation, NewsGhana and NetAfrique, including as sponsored posts, and later picked up by Russian state and pro-Kremlin media outlets and accounts. (p. 12-13)
Déjà vu all over again
Reading these various allegations reminds me of my past years countering misinformation and disinformation. From 1987 to 1996, I spent an enormous amount of time countering false rumors of supposed (but non-existent) child organ trafficking. Organ trafficking is the kind of horrifying crime that makes for a perfect smear if you are spreading disinformation. For example, in October 2023, the Vice Speaker of the Russian State Duma, Anna Kuznetsova, made the absurd claim, “Our soldiers found documents on the sale of children and human organs from which Ukraine derives 7% of its national budget, with the support of private British military companies and Coca Cola.”
Somehow, I find it less than believable that Ukraine generates fully 7% of its national budget from the sale of children (in actuality, it is Russia that is stealing thousands of Ukrainian children) and from organ trafficking. Nor is it believable that Coca-Cola, one of the largest companies in the world, with the world’s sixth most valuable brand and more than 1.8 billion servings of its beverages sold each day, would risk its extraordinarily valuable business by engaging in organ trafficking to supplement its massive, worldwide, multi-billion-dollar beverage trade. But disinformers love to include widely recognizable supposed “bad guys” to try to bolster their phony claims.
Similarly, Russian Foreign Ministry Spokeswoman Maria Zakharova falsely claimed in June 2023 that “Ukraine is ready to trade the organs of its people for Western military assistance” and “is rapidly turning the country into a global hub of human organ trafficking,” the Russian news service TASS reported.
These false organ trafficking claims were not new, although their endorsement by senior Russian officials was. In April 2022, the Canadian Communications Security Establishment said there had been “a coordinated effort by Russia to create and spread false reports that Ukraine has been harvesting the organs of fallen soldiers, women and children, and using mobile cremators to dispose of the evidence.”
When disinformation practitioners want a false story to go viral, organ trafficking claims, outrageous corruption allegations, and other appeals to sensationalism and fear (biological weapons and alleged biolabs being a favorite) are a sure bet. See the GEC Counter-Disinformation Dispatch #13, “Exploiting Primal Fears,” published on January 13, 2022, for more on how misinformation and disinformation often revolve around such primal fears: shocking claims exploit horror to bypass rational analysis.
The Symbiosis between Propaganda and Disinformation
In the examples above, the propaganda themes are like a conductor’s instructions, providing overall guidance and direction. It is up to disinformation practitioners to take these broad talking points and transform them into emotionally compelling false stories, because it seems quite likely that no child has ever said, “Mommy, could you tell me some bedtime talking points?”
Children – and everyone else – find emotionally compelling stories memorable and use them to make sense of the world. Disinformation specialists have the demanding scut work of inventing such stories and somehow bringing them to the attention of those whom they wish to influence – in this case, U.S. lawmakers – in a way that will seem credible to them, despite the fact that the claims are baseless. This is difficult, creative work that takes imagination, attention to detail, knowledge of which “hot buttons” to press in the minds of the target audiences, and a total lack of conscience. It is work perfectly suited to meticulous, highly motivated sociopaths and other similarly morally compromised people.