With Ukraine under quarantine, social media has proven to be an extremely useful tool for the health ministry and the government to keep the public up to date on the latest developments.
But social media also rapidly spreads false and harmful information.
For most Ukrainians, sites like Facebook and Twitter have already become key sources of information, according to a survey by the nonprofit Internews and the U.S. Agency for International Development.
But the risk of falling for fake news there is high, especially during the COVID-19 pandemic, when people are staying home, spending more time online and worrying about their health.
Sometimes, the fake news comes from sites and accounts pushing groundless conspiracy theories to discourage people from listening to scientists. Other times, publications twist the facts or use manipulative images. And some attempt to turn a profit by selling ineffective or even harmful remedies for COVID-19.
To fight these fakes, the Ukrainian department at Facebook launched a fact-checking campaign on March 27, supported by local analytical platforms VoxCheck and StopFake.
VoxCheck has been detecting 20 to 40 coronavirus-related fakes daily, according to its manager Maksym Skubenko. StopFake has long been debunking fake news about Ukraine coming from Russian mainstream media, and it was among the first organizations in Ukraine to adopt established fact-checking methods. Now, Facebook is joining them with its 13 million users in Ukraine.
According to Facebook’s public policy manager in Ukraine, Kateryna Kruk, the platform will analyze posts on the site, mark “manipulative” posts, and move them to the bottom of people’s news feeds.
“It helps to reduce and eventually to stop the spread of the unreliable content,” Kruk said.
But Facebook won’t be deleting fakes.
According to Facebook’s product manager, Tessa Lyons, people share fake news for different reasons: to garner clicks, yield profits or, in a broader sense, to achieve political gains and find supporters. Coronavirus disinformation is no different.
“In most cases, we deal with ‘light fakes’ created by ordinary users,” Skubenko from VoxCheck told the Kyiv Post.
But there are also deliberate fakes published by Ukrainian media and then shared via social networks. They are harder to detect and remove, he said.
For instance, a woman saying that garlic can cure COVID-19 is a light fake, because it causes no immediate harm. A few days ago, however, VoxCheck detected a video of a tour guide in Italy who claimed that the coronavirus was a conspiracy and there was no need to follow restrictive quarantine measures. This video got nearly 100,000 shares before it was exposed by Facebook.
Internet bot farms
On April 1, Ukraine’s Security Service (SBU) reported that it had uncovered and broken up a group of internet agitators in Kyiv who spread fake information about the coronavirus on Russia’s dime.
They used a network of bot accounts to share disinformation about COVID-19 with the Ukrainian Facebook audience. The authorities believe the fraudsters received payment directly from Russia.
Since the beginning of the quarantine on March 12, the SBU has identified 81 people who spread false information about the coronavirus. The Ukrainian cyber police have identified 170 COVID-19-related fakes.
According to Skubenko, the pandemic and the growing number of fakes were among the reasons Facebook launched its anti-disinformation campaign now, even though it had initially planned it for summer.
Now, Facebook’s fake-detecting tools are adjusting to the Ukrainian language and market. The social media platform promises to expose many more fakes in the future.
How it works in Ukraine
As a platform that claims to “make the world more open,” Facebook has long been pushing its fact-checking initiatives.
Facebook departments in as many as 55 countries have already launched the content-verification program in an attempt to fulfill three of Facebook’s principles for fighting fake news: to remove, to reduce, to inform.
Detecting fakes at Facebook Ukraine consists of several steps. First, the algorithm collects data about content that can be considered “false information.” To distinguish the presumably manipulative posts, it relies on machine learning, user complaints and even comments that suggest the content is unreliable.
Next, the posts are verified by two local fact-checking organizations, StopFake and VoxCheck, each certified by the International Fact-Checking Network.
They will then mark the news that does not meet Facebook’s criteria as “false information” and explain to users what is wrong with this content.
If users keep sharing false information even after the warning, Facebook will display their content at the bottom of the newsfeed, making their paid promotions – when organizations pay Facebook to make their publications more prominent – pointless.
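In rough terms, the process described above works like a flag-review-demote pipeline. The sketch below is purely illustrative: Facebook has not published its code, and every name, signal and threshold here (Post, needs_review, the 0.1 rank multiplier) is a hypothetical stand-in for the machine-learning signals, user complaints and fact-checker verdicts mentioned in this article.

```python
# Illustrative sketch only; all names and thresholds are hypothetical,
# not Facebook's actual implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    text: str
    user_reports: int = 0             # complaints from users
    skeptical_comments: int = 0       # comments suggesting the content is unreliable
    label: Optional[str] = None       # verdict set by fact-checkers
    feed_rank_multiplier: float = 1.0 # 1.0 = normal reach, lower = demoted

def needs_review(post: Post, report_threshold: int = 10) -> bool:
    """Step 1: collect signals to decide whether a post is queued for
    third-party fact-checking."""
    return post.user_reports + post.skeptical_comments >= report_threshold

def apply_fact_check(post: Post, verdict_is_false: bool) -> None:
    """Steps 2-3: fact-checkers label the post; false posts are not
    deleted but pushed toward the bottom of the news feed."""
    if verdict_is_false:
        post.label = "false information"
        post.feed_rank_multiplier = 0.1

# Usage: a post accumulates reports, gets reviewed, and is demoted.
post = Post(text="Garlic cures COVID-19", user_reports=8, skeptical_comments=5)
if needs_review(post):
    apply_fact_check(post, verdict_is_false=True)
print(post.label, post.feed_rank_multiplier)  # -> false information 0.1
```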
The same algorithm is used on Instagram, another popular Facebook-owned social network with about 11 million users in Ukraine.
But there is one major exception to Facebook’s rule: The company will not verify claims made by politicians, unless they share content already marked as manipulative. The idea is that leaving such posts visible shows the real face of public figures who spread false information.
“Facebook thinks that people should be able to judge politicians by themselves,” StopFake’s founder Ruslan Deynychenko told the Kyiv Post.
Anxiety-driven response to crisis
During a natural disaster, people look for more information to reduce their anxiety and “make sense” of what is going on, according to a 2016 study on crisis informatics, the use of information and technology to respond to emergencies.
As users lose their trust in official sources, which often share contradictory and inconsistent information, social media plays a huge role in the process of “collective sensemaking,” the study says.
Even government agencies responsible for managing the crisis use search engines and social media feeds to promote reliable content and communicate their messages.
Before rolling out the COVID-19 fact-checking tool, Facebook was already cooperating with the Ukrainian Ministry of Health. On March 15, Facebook started prioritizing health ministry news in people’s feeds to direct users to official coronavirus information.
But even before Facebook got involved, the ministry had already tried to fight fake news on its own by launching official accounts across the most popular social media platforms in Ukraine.
For example, it launched a coronavirus-dedicated channel on the most popular messenger in the country, Viber. It has already gained 3 million subscribers, while almost 800,000 have joined a similar channel by the ministry on Telegram.
The government has also created accounts on the photo- and video-sharing platforms Instagram and TikTok, which largely target young audiences. A challenge hashtagged #washyourhands has also been launched on these platforms, gaining support from Ukrainian celebrities; among politicians, only Kyiv Mayor Vitali Klitschko has taken part so far.
The health ministry says that 15% of COVID-19 patients in Ukraine are people aged 11 to 30 and, according to the U-Report survey by the United Nations Children’s Fund, almost a quarter of young people do not know how to behave when the first symptoms of the respiratory disease appear.
But despite the government’s attempts to reach the public with official coronavirus information, half of Ukrainians still depend on unreliable sources, according to the latest survey by the Center for Content Analysis.
And although Facebook launched its fact-checking tools in Ukraine, there’s still a high chance that many people will continue reading fake news.
“It is not a panacea,” Facebook Ukraine’s Kruk said, “but it is a substantial step forward.”