Imagine a nation with deep political divides, where groups distrust one another and violence seems likely. Imagine social media flooded with political images, mocking videos and hateful memes from both domestic and foreign sources. What happens next?
Social media’s widespread use during political turmoil and violence makes building peace and preventing conflict harder. Social media has also changed: new technologies and tactics are now available to shape people’s thinking during political crises. There are new ways to gain support, promote beliefs and goals, dehumanize opponents, justify violence, sow doubt and dismiss inconvenient information.
Propaganda is also becoming more sophisticated. Social media campaigns increasingly rely on visual media such as videos, memes and photos, whether edited or not. These have a greater impact than text.
Images are harder for AI systems to interpret than text. It is much easier to track down posts saying “Ukrainians are Nazis” than to detect fake images of Ukrainian soldiers wearing Nazi symbols, yet such images are increasingly common. As the saying goes, a picture is worth a thousand words, and so is a meme.
Our team of computer scientists and subject-matter experts has taken on the challenge of interpreting images by combining artificial intelligence methods with human expertise. We studied how visual social media posts change as high-risk situations unfold. Our research shows that these changes, particularly in images shared on social media, are indicators of coming mass violence.
The memes are exploding
In our recent analysis, we found that in the two weeks before Russia’s 2022 invasion of Ukraine, the number of posts by Russian milbloggers rose by nearly 9,000%, and the number of manipulated images rose by more than 5,000%. Milbloggers are bloggers who focus on military affairs and current conflicts.
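Percentage increases of this size are easy to misread: a 9,000% increase means the volume multiplied roughly 91-fold, not 90-fold. A minimal sketch of the arithmetic, using hypothetical counts (the article reports only the percentages, not the underlying numbers):

```python
# Hypothetical counts, chosen only to illustrate how a percentage
# increase is computed; the article does not report raw totals.

def percent_increase(before: float, after: float) -> float:
    """Percentage increase going from `before` to `after`."""
    return (after - before) / before * 100

baseline_posts = 100              # hypothetical pre-surge daily post count
surged_posts = baseline_posts * 91  # a 91x multiplication of volume...
print(percent_increase(baseline_posts, surged_posts))  # ...is a 9000.0% increase
```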
These massive increases show how Russia used social media to manipulate public opinion and justify the invasion.
They also show that visual content on social media needs better monitoring and analysis. To conduct our analysis, we collected every post and image from the Telegram accounts of 989 Russian milbloggers: nearly 6 million posts and more than 3 million images. Each post and image was timestamped, categorized and sorted to allow detailed analysis.
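The article doesn’t describe the team’s data pipeline, but the timestamp-and-categorize step it mentions can be sketched minimally. The `Post` structure and category labels below are assumptions for illustration, not the authors’ actual schema:

```python
# Sketch of timestamping, categorizing and sorting posts for analysis.
# The schema and category names are assumed, not taken from the study.
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class Post:
    account: str
    timestamp: datetime
    category: str  # e.g. "photo", "video", "meme" (assumed labels)

def organize(posts):
    """Sort posts chronologically, then bucket them by category."""
    buckets = defaultdict(list)
    for post in sorted(posts, key=lambda p: p.timestamp):
        buckets[post.category].append(post)
    return dict(buckets)

posts = [
    Post("milblog_a", datetime(2022, 2, 20), "meme"),
    Post("milblog_b", datetime(2022, 2, 10), "photo"),
    Post("milblog_a", datetime(2022, 2, 15), "meme"),
]
by_category = organize(posts)
print([p.timestamp.day for p in by_category["meme"]])  # [15, 20]
```

Bucketing by category after a chronological sort makes it straightforward to count, say, memes per day and spot a surge in the weeks before an event.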
Media forensics
We developed a set of AI tools that detect image manipulations and alterations. One detected image, for example, is a pro-Russian meme mocking Arkady Babchenko, an anti-Putin journalist and former Russian soldier whose assassination was faked by Ukrainian security services to expose a plot against his life.
The meme uses the phrase “gamers don’t die, they respawn,” a reference to video game characters coming back to life when they die. It makes light of Babchenko’s staged death and shows how manipulated images can convey political messages and influence public opinion.
This is just one of millions of images that were strategically manipulated to promote particular narratives. Our statistical analysis revealed that both the number of manipulated images and the extent of their manipulation increased dramatically before the invasion.
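The article doesn’t specify which forensic methods the tools use. One classic manipulation cue in media forensics is copy-move forgery, where a region of an image is duplicated elsewhere in the same image. A toy sketch of the idea, hashing small pixel blocks and flagging identical blocks at different positions (real detectors work on robust features, not exact pixel matches):

```python
# Toy copy-move detector: hash every block x block pixel patch and
# report pairs of positions whose patches are byte-identical.
# Illustrative only; production forensics tools use robust features.
import hashlib

def find_duplicate_blocks(image, block=2):
    """Return pairs of (row, col) positions whose block x block
    patches are identical -- a possible copy-move artifact."""
    seen, matches = {}, []
    rows, cols = len(image), len(image[0])
    for r in range(rows - block + 1):
        for c in range(cols - block + 1):
            patch = tuple(image[r + i][c + j]
                          for i in range(block) for j in range(block))
            key = hashlib.sha256(repr(patch).encode()).hexdigest()
            if key in seen:
                matches.append((seen[key], (r, c)))
            else:
                seen[key] = (r, c)
    return matches

# A 4x4 grayscale "image" whose top-left 2x2 patch was pasted bottom-right:
img = [
    [10, 20, 1, 2],
    [30, 40, 3, 4],
    [5,  6, 10, 20],
    [7,  8, 30, 40],
]
print(find_duplicate_blocks(img))  # [((0, 0), (2, 2))]
```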
The political context is crucial
These AI systems, while excellent at detecting fakes, cannot understand the political context of images. Interpreting the findings properly requires AI scientists to collaborate closely with social scientists.
Our AI systems also classified images by similarity. This allowed subject-matter experts to further analyze clusters of images based on their narrative content and their culturally and politically specific meanings. That would be impossible to do at scale without AI.
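The article doesn’t say how the similarity grouping was done. A minimal sketch of the general idea, assuming images are first turned into numeric embedding vectors (the toy vectors and the greedy threshold clustering below are illustrative assumptions, not the team’s method):

```python
# Greedy threshold clustering over assumed image embeddings.
# Images whose vectors point in nearly the same direction
# (high cosine similarity) are grouped for expert review.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def cluster(vectors, threshold=0.9):
    """Assign each vector to the first cluster whose representative
    (first member) is similar enough; otherwise start a new cluster."""
    clusters = []
    for idx, vec in enumerate(vectors):
        for members in clusters:
            if cosine(vectors[members[0]], vec) >= threshold:
                members.append(idx)
                break
        else:
            clusters.append([idx])
    return clusters

# Toy "embeddings": images 0 and 2 are near-duplicates, image 1 is distinct.
embeddings = [
    [1.0, 0.0, 0.1],
    [0.0, 1.0, 0.0],
    [0.9, 0.0, 0.1],
]
print(cluster(embeddings))  # [[0, 2], [1]]
```

Grouping millions of images this way means experts need only inspect one representative per cluster instead of every image, which is what makes human interpretation feasible at scale.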
A fake image of French President Emmanuel Macron and Ukrainian Governor Vitalii Kim, for example, may mean little to an AI scientist. To political scientists, the image appears to praise Ukrainians for their courage in contrast to leaders who seem afraid of Russian nuclear threats. The aim was to make Ukrainians doubt their European allies.
Meme warfare
The shift from text to visual media in recent years has produced a new kind of data that researchers have not yet studied in depth.
Images can show researchers how framing adversaries can spark political conflict. By studying visual content, researchers can see how ideas and narratives spread, helping them understand the psychological and social factors at play.
It is important to understand how people are being influenced in ever subtler and more sophisticated ways. Projects like this can also improve early warning efforts and reduce the risks of violence and instability.
Tim Weninger receives funding from the US Department of Defense and the US Agency for International Development.