If the first round of Ukraine’s presidential election went smoothly, the road to the second is getting bumpy.
As the country prepares for the April 21 runoff vote, President Petro Poroshenko — who came in second in the March 31 first round — is fighting for his political career. Meanwhile, front-runner Volodymyr Zelenskiy, who lacks political experience, is coming under greater scrutiny.
Amid increasingly fierce competition, the big guns are coming out: negative campaign ads, so-called “black PR,” and online disinformation.
Most seriously, a website with ties to Ukraine’s security agencies has accused the Zelenskiy campaign of receiving financing from Russia’s security service, allegedly funneled through a Russia-backed militant who fought in Ukraine’s occupied Donetsk Oblast.
With voters facing a choice between five more years of Poroshenko and a new, largely untested leader, the politicized media campaign is likely only making their decision more difficult.
“I think there’s a lot of playing hard and fast with the rules of the information space,” says Nina Jankowicz, a Global Fellow at the Kennan Institute and an expert on disinformation. “And I think both candidates would be better served…by taking a pledge not to engage in disinformation.”
Continuing the tradition
In Ukraine’s tempestuous political climate, dirty campaigns are nothing new.
In 1999, one of the presidential candidates, Nataliya Vitrenko, was injured in a grenade attack. Later, a former presidential bodyguard would implicate President Leonid Kuchma in organizing the attack and pinning the blame on a rival.
In 2004, presidential contender Viktor Yushchenko was poisoned with dioxin during an election that also featured significant electoral fraud. Ultimately, Yushchenko would be elected president in a re-vote held after the Orange Revolution.
By those measures, 2019 is quite tame. But ads and social media posts intended to mislead and scare voters have come to play a central role in the presidential race.
On April 9, the Poroshenko campaign made waves by publishing billboards and newspaper ads showing the president staring down Russian President Vladimir Putin rather than his actual opponent, Zelenskiy. “A decisive decision,” the billboards declared.
Facebook groups like “Sluga uroda” — or Servant of the Freak, a play on the name of Zelenskiy’s television show and political party, Servant of the People — publish a near-constant stream of anti-Zelenskiy and pro-Poroshenko memes. The Facebook group AntiPor offers the rough equivalent against Poroshenko.
And social media users have long complained of facing harassment from porokhobots — i.e., Poroshenko bots — who vocally defend the president. Some, they allege, are not just ordinary citizens expressing their honest opinions, but paid “trolls.”
Drawing the line between simple negative campaign ads and disinformation can be a challenge. But Jankowicz says negative campaigning becomes disinformation when it is deliberately misleading.
She cites a video clip shared by a pro-Poroshenko Twitter account on April 8 as a clear example. The clip showed Natalia Poklonskaya, a well-known Russian lawmaker and former prosecutor of Russia-occupied Crimea, appearing to “endorse” Zelenskiy. And someone had emblazoned the clip with the Zelenskiy campaign’s logo.
“It is deliberately misleading. An uneducated or simply busy viewer might think it is a legitimate Zelenskiy product,” Jankowicz wrote in a tweet at the time. Currently, that video has been retweeted over 600 times.
There have also been other glaring examples of disinformation during the presidential campaign. In late March, the 1+1 television channel broadcast a program which accused Poroshenko of corruption and implied he had killed his own brother.
The television channel is owned by Ihor Kolomoisky, a Ukrainian oligarch with a longstanding grudge against Poroshenko, and the latter claim was outlandish even by the standards of Ukraine’s biased television media.
On April 10, an organization associated with Poroshenko sent subscribers of its messenger app accounts a video that showed Zelenskiy being hit by a garbage truck and strongly implied he was a drug addict. The next day, the Zelenskiy campaign said it would encourage the candidate to strengthen his security detail.
Disinformation can also mix facts with distortion. A video shared on April 8 by a Facebook page called Zhovta Strichka (“Yellow Tape”) claims that, as part of a drug and alcohol test that both candidates underwent, Zelenskiy only gave a blood sample, not a hair and nail sample. According to the documents published by the Zelenskiy campaign, that appears to be true.
The video also notes that the doctor who administered the test had been on reality television and points out that the Zelenskiy campaign initially published results from the test that listed the wrong date (the clinic took the blame for this mistake).
However, while all these details have raised questions about the trustworthiness of Zelenskiy’s drug test, the video uses this to draw three extreme — and unsubstantiated — conclusions: the doctor was fake, the test results were falsified, and Zelenskiy is a drug addict.
“He’s not just a drug addict, he’s also a con artist,” the video declares.
Finally, there is another defining feature of disinformation online: when the accounts sharing the images, videos, or links are not authentic users.
Vitalii Moroz, an expert on disinformation who worked for Internews Ukraine for over a decade, offers an example: a few days ago, he saw a photo of a Poroshenko campaign billboard displayed over a heavily potholed street.
What made the image disinformation? Moroz says that the image was actually made up of two photos spliced together. He discovered that the photo of the road was fairly old and from Russia. Additionally, there were signs that not all the users sharing the image were actual people.
“In two days, this photo got 18,000 shares,” he told the Kyiv Post, a pace of roughly six shares per minute around the clock. “I’m not (wholly) confident, but it might be inauthentic behavior.”
Blurred lines
If informed voters can fairly easily spot social media posts that distort the facts, other information is more difficult to parse.
In the last week, at least two entities have published information suggesting that the Zelenskiy campaign is tied to or receiving financing from Russia.
On April 2, Christo Grozev, an entrepreneur and researcher who works with the Bellingcat online investigations website, wrote on Twitter that a 2014 mass leak of files from Russia’s right-wing Liberal Democratic Party included a document outlining a plan to insert a “comedy candidate” into Ukrainian elections. The plan aimed to “splinter the mainstream vote,” alienate youth from traditional politicians, and undermine Ukraine’s global image, Grozev wrote, sharing an image of the document.
While Grozev openly stated that he was not implying that Zelenskiy was “a Russian political technology project,” some appear to have taken the document that way.
The EuroMaidan Press blog subsequently characterized the plan as “eerily similar” to the Zelenskiy campaign. The newcomer politician’s “vague statements on Russia and total inexperience in politics… (make) one suppose that (Zelenskiy) is the preferred candidate for Ukraine’s huge, aggressive neighbor,” EuroMaidan Press editor-in-chief Alya Shandra wrote.
Then, on April 3, the Myrotvorets Center, a website that leaks the personal data of people it claims have committed crimes against Ukraine, published an article alleging that the Zelenskiy campaign had received funding from Russia’s Federal Security Service, or FSB.
The publication included screenshots of three emails. The first two appeared to show an FSB agent named Andrei Pinchuk agreeing with a former Russia-backed separatist fighter named Dmitry Khavchenko to send “200,000” in cryptocurrency to Khavchenko’s account, and Khavchenko confirming that he had received the money and would convert it into cash.
Finally, the third email appeared to show Khavchenko sending a short, rather cryptic message to the Mykolaiv office of Zelenskiy’s campaign: “4 200.”
On April 7, Myrotvorets followed up its publication with a letter to the heads of the SBU security agency, the National Security and Defense Council, and the Verkhovna Rada’s national security committee calling on them to take action on the matter.
The Zelenskiy campaign has not commented directly on the Myrotvorets accusations, but did release a video explaining “black political technology” strategies and decrying their usage.
For all the seriousness of Myrotvorets’ claims, the publication has met with skepticism, due both to the limited evidence it provided and to the site’s checkered reputation.
Myrotvorets is widely recognized as tied to the SBU, and its database effectively serves as a list of people banned from entering Ukraine.
In May 2016, it provoked outrage by publishing the names of 4,500 international and Ukrainian journalists who had received press accreditation from the Russia-backed fighters controlling parts of Ukraine’s eastern Donetsk Oblast, accusing them of “cooperating with terrorists.” That move brought the organization harsh criticism from the international press, the Organization for Security and Co-operation in Europe, and Reporters Without Borders, which argued it put the journalists’ lives at risk.
At times, the site has also waded directly into domestic politics. In 2017, it added former Prime Minister Yulia Tymoshenko to its database. She had accompanied former Georgian President Mikheil Saakashvili when he broke through the Ukrainian border on Sept. 10, 2017.
The Kyiv Post asked multiple political analysts to comment on Myrotvorets’ publication. Several declined, fearing harassment or the possibility of being added to the site’s database.
Mark Galeotti, a senior associate fellow at the Royal United Services Institute and an expert on post-Soviet security agencies, says he finds Myrotvorets’ claims about Zelenskiy doubtful.
“I think one thing the Russians are deeply aware of is the extent to which they are toxic in terms of Ukrainian politics,” Galeotti told the Kyiv Post. “If they were in any way going to funnel assets to the Zelenskiy campaign, they would have come up with better routes than Donbas fighters and FSB operators.”
Given the amount of Russian money sloshing around Europe, Galeotti says, it would have likely come from the west, not the east.
Galeotti is also unimpressed by suggestions that there is any connection between Zelenskiy and the plan outlined in the Liberal Democratic Party leaks. He suggests that the “comedy candidate” document likely represents Russian politicians “spinning their ‘bright ideas,’ hoping that someone in the presidential administration will like it.”
Other analysts take a more restrained view of Myrotvorets’ claims of ties between Zelenskiy and Russia.
Sean Townsend, a Ukrainian hacktivist who operates exclusively under a pseudonym, told the Kyiv Post in a Facebook message that he will wait for the results of the official investigation. “For me personally, the evidence provided does not seem adequate,” he said.
“For me, it isn’t true until it is proven by at least one source,” says Illia Kusa, an international affairs expert at the Ukrainian Institute of the Future. “They cite their own sources, but this is not an official state institution or security service. To me, the Myrotvorets website is not a primary source of information.”
Kusa says people must be very cautious about information from the site when the subject is particularly sensitive, like elections.
“Information from Myrotvorets should be double-checked because they are a pro-government website,” disinformation expert Moroz says. “On the other hand, it should be considered while researching other sources.”
Fighting back
Since the 2016 United States presidential election — when Russian trolls posed as ordinary Americans online in an attempt to tilt the vote toward Donald Trump and to exacerbate divisions in society — the subject of Russian influence operations has gained global importance.
Unlike the U.S., which was largely caught off guard by the Russian trolls, Ukraine has long-standing experience facing disinformation from Russia. Naturally, Russian interference was a serious concern in the run-up to the 2019 election.
This was among the reasons why, on March 18, Facebook rolled out new rules requiring advertisers in Ukraine to identify their ads as political, confirm their identities, and include a “paid for by” disclaimer. The social media network will also archive all the ads for seven years, making it easier to trace and study them.
Representatives of Facebook also held a series of meetings with Ukrainian civil society organizations dealing with technology and political issues.
“They rely on local expertise and they want to hear what is the context of Ukrainian elections, what are the main threats, what are the challenges for the Facebook community in Ukraine,” says Moroz, who took part in the meetings.
So far, however, domestic disinformation has largely overshadowed the foreign variety.
“I think most of the disinformation that we can confirm was actually distributed by the campaigns themselves and by domestic Ukrainian actors for political purposes,” Jankowicz says.
And despite Facebook’s outreach to civil society, social media sites appear to still have a lot of work left to do.
On April 11, a group of Western journalists on Twitter posted several screenshots of the Servant of the Freak group violating Facebook’s new rules and publishing political ads without a disclaimer.
They were quickly criticized by a veritable porokhobot — an anonymous pro-Poroshenko account with only eight followers. Beyond attacking the journalists, it was also actively sharing and retweeting other political content: the Myrotvorets Center, the Natalia Poklonskaya “endorsement,” and claims that Zelenskiy has ties to Russia.