
First-Ever Study of Wartime Deepfakes Reveals Their Impact on News Media

The first-ever study of wartime deepfakes reveals their impact on news media and outlines implications for social media companies, media organisations and governments.

Deepfakes are artificially manipulated audio-visual material. Most deepfake videos involve a fake ‘face’, constructed by artificial intelligence, that is merged with an authentic video to create footage of an event that never really took place. Although fake, they can look convincing and are often produced to imitate or mimic a specific individual.

Researchers at University College Cork (UCC) examined tweets during the current Russian-Ukrainian war, in what is the first analysis of the use of deepfakes in wartime misinformation and propaganda. The study is published in PLOS ONE.

Close to 5,000 tweets on X (formerly Twitter) in the first seven months of 2022 were analysed in the UCC study to explore how people react to deepfake content online, and to uncover evidence of previously theorised harms of deepfakes on trust. As deepfake technology becomes increasingly accessible, it is important to understand how such threats emerge over social media.

The Russo-Ukrainian War presented the first real-life example of deepfakes being used in warfare. The researchers highlight examples of deepfake videos during this war, including the use of video game footage as supposed evidence of the urban-myth fighter pilot “The Ghost of Kyiv”; a deepfake of Russian President Vladimir Putin announcing peace with Ukraine; and the hacking of a Ukrainian news website to display a deepfaked message of Ukrainian President Volodymyr Zelensky surrendering.

The study found that fears of deepfakes often undermined users’ trust in footage they were receiving from the conflict, to the point where some lost trust in any footage coming from it. The study is also the first of its kind to find evidence of online conspiracy theories that incorporate deepfakes.

The researchers found that much real media was labelled as deepfakes. A lack of deepfake literacy led to significant misunderstandings of what constitutes a deepfake, showing the need to encourage literacy in these new forms of media. At the same time, the study demonstrates that efforts to raise awareness around deepfakes may undermine trust in legitimate videos. News media and governmental agencies need to weigh the benefits of educational deepfakes and pre-bunking against the risks of undermining truth, the study asserts. Similarly, news companies should be careful in how they label suspected deepfakes, in case they cast suspicion on real media.

The study was led by UCC School of Applied Psychology researcher John Twomey and co-written with fellow researcher Didier Ching, along with Supervisors Dr Conor Linehan and Dr Gillian Murphy of UCC, Dr Matthew Aylett of CereProc Ltd. and Heriot-Watt University, and Prof. Michael Quayle of the University of Limerick.

John Twomey, UCC researcher, said, “Much of the misinformation the team analysed in the X (formerly Twitter) discourse dataset surprisingly came from the labelling of real media as deepfakes. Novel findings about deepfake scepticism also emerged, including a connection between deepfakes fuelling conspiratorial beliefs and unhealthy scepticism. The evidence in this study shows that efforts to raise awareness around deepfakes may undermine our trust in legitimate videos. With the prevalence of deepfakes online, this will cause increasing challenges for news media companies, who should be careful in how they label suspected deepfakes in case they cause suspicion around real media.”

Dr Conor Linehan, of UCC’s School of Applied Psychology, said, “Researchers and commentators have long feared that deepfakes have the potential to undermine truth, spread misinformation, and undermine trust in the accuracy of news media. Deepfake videos could undermine what we know to be true when fake videos are believed to be authentic, and vice versa.”

Source: University College Cork
