LIVE JOURNALISM: Fake News and Disinformation in Election Campaigns
7 to 11 September 2024, Potsdam
Disinformation is used to influence election results and is an active attack on our democracy. Journalists must be aware of the issues of fake news and disinformation in order to help protect our democracy. This is an especially important task in the 2024 super-election year, in which more than 80 elections will take place, not least the US election on 5 November. The super-election year is also of particular importance in Germany due to the elections in three eastern German states (Thuringia, Saxony and Brandenburg).
In 2024, the M100 Young European Journalists Workshop (M100YEJ) was held for the 20th time. For five days, 19 young journalists gathered at the Potsdam Truman Villa, the headquarters of the Friedrich Naumann Foundation, to discuss ‘Live Journalism: Fake News and Disinformation in Election Campaigns’. On Wednesday, they were hosted by the Media Innovation Centre Babelsberg (MIZ).
In its anniversary year, the participants not only took part in workshops deepening their journalistic knowledge through practical and theoretical exercises, but also had the opportunity to present a journalistic story on stage in the form of a reporter slam. Especially in times when trust in journalism is dwindling, formats that rebuild trust through personal encounters between journalists and their audience are particularly important. On the evening of 11 September, the participants performed live in Potsdam on the topic of ‘Disinformation in election campaigns’ and told exciting stories from their everyday work as reporters.
The young journalists from 17 European countries shared their experiences and used the time to exchange ideas and network. They learned about the different conditions and challenges for journalism that arise from the different political and social systems in their home countries. The participants came from Armenia, Belarus (Poland), Bulgaria, Czech Republic, Denmark, France, Georgia, Greece, Italy, Latvia, Malta, Moldova, Romania, Serbia, Sweden, Turkey and Ukraine.
SATURDAY, 7 SEPTEMBER
Belma Bagdat, programme coordinator at the Friedrich Naumann Foundation, welcomed the participants at the start of the workshop at the Truman Villa on Griebnitzsee. M100 Head of Programme, Sabine Sasse, motivated the workshop participants to take the opportunity to exchange ideas with young journalists from across Europe and to apply the skills they had learned in relation to ‘live journalism’ and their knowledge of ‘fake news’ and ‘disinformation’ in their home countries.
In the first workshop, Caroline Lindekamp, project manager of ‘noFAKE’ from CORRECTIV, presented the work of the non-profit and independent editorial team CORRECTIV. She shared insights into the medium’s research and approaches to fact-checking. She drew the participants’ attention to the fact that disinformation with the narrative that elections have been stolen or that there has been electoral fraud can be found in almost all countries.
The participants provided examples of how far-right parties also use this narrative in their own countries, such as in Romania. Ms Lindekamp then presented various examples of fake news and discussed with the participants whether a fact check could or should be carried out. She believes that journalists have a particular responsibility to fact-check when it comes to topics that directly affect democracy. In her view, fact-checks are particularly important when it comes to personal details about candidates, especially before elections.
‘Fake news’ can take on the most diverse forms: false statements, fake images or even completely fake articles with false source information. The latter are known as ‘duplicate operations’. These involve creating almost identical-looking websites that give the impression of being articles from quality media outlets. The only way to tell the difference is by looking at the URL. Lindekamp shared tools and approaches for uncovering different types of ‘fake news’:
1) Images can be taken out of context: they may not belong to the specified event or may show a location other than the one claimed. An initial way to check whether an image has already been published is a ‘reverse image search’, using Google or TinEye, for example. Google Maps can help verify whether an image really shows the specified location, since imagery is available for many locations.
2) Wolfram Alpha can also be used to check the weather on a particular day, providing a further clue as to whether an image matches an event.
3) Map-based crowd counting can be used to check whether the attendance figures reported for demonstrations are plausible.
4) One indicator of whether an image has been generated by AI is the level of detail. It is recommended to look at the images closely and consider whether anything looks strange.
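The URL check for ‘duplicate operations’ mentioned above can also be partially automated. The following is a minimal, hypothetical Python sketch (not part of the workshop material) that flags domains suspiciously similar to, but not identical with, known quality-media domains; the trusted-domain list and similarity threshold are illustrative assumptions.

```python
from difflib import SequenceMatcher

# Illustrative whitelist -- in practice this would be a maintained
# list of genuine quality-media domains.
TRUSTED_DOMAINS = ["spiegel.de", "faz.net", "theguardian.com"]

def looks_like_duplicate(domain: str, threshold: float = 0.85) -> bool:
    """Flag a domain that closely resembles, but does not exactly match,
    a trusted domain -- a typical sign of a 'duplicate operation'."""
    domain = domain.lower()
    for trusted in TRUSTED_DOMAINS:
        if domain == trusted:
            return False  # exact match: the genuine site
        if SequenceMatcher(None, domain, trusted).ratio() >= threshold:
            return True   # near match: likely a lookalike site
    return False

print(looks_like_duplicate("spiegel.de"))   # genuine -> False
print(looks_like_duplicate("spiegell.de"))  # lookalike -> True
print(looks_like_duplicate("example.org"))  # unrelated -> False
```

Real fact-checking teams combine such string-similarity checks with WHOIS lookups and content comparison; this sketch only illustrates the basic idea.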
Lindekamp then presented the categorisation ‘Information Disorder’ by Claire Wardle. The following types can be distinguished:
• Misinformation: unintentional dissemination of false information
• Disinformation: the deliberate production and dissemination of false information with the aim of deceiving
• Malinformation: information that is based on reality but is disseminated with the intention of causing harm.
It is important to understand this theoretical background in order to be able to take the right action against misinformation, disinformation or malinformation. Building on this, Lindekamp provided further insights into the work of CORRECTIV. The medium is involved in both ‘prebunking’ and ‘debunking’. ‘Prebunking’ is about raising awareness of the existence of ‘fake news’ and showing people how to deal with it. ‘Debunking’ is about finding and correcting false reports that have already been published.
CORRECTIV differentiates in detail between different types of ‘fake news’, because not all ‘fake news’ is the same. The medium uses various ‘labels’ such as ‘true’, ‘completely fictitious’ or ‘wrong headline’ to differentiate exactly what has been changed. Especially when an article also contains truths and refers to reputable sources, it is a particular challenge to identify the false information.
The M100YEJ is so special and enriching for the participants because, among other things, they get insights into journalism from different countries across Europe. In the afternoon, the participants presented initiatives against disinformation from their home countries in elevator pitches. The initiatives presented were:
• CRTA – Truth O’Meter – Database of Statements made by political figures, Serbia
• Journalism Trust Initiative, France
• Fake Off, France
• Psychological Defence Agency, Sweden
• Prismag Magazine, Italy
• Open The Box, Italy
• The Media Education Programme, Romania
• VoxCheck project, Ukraine
• Czech Elves, Czech Republic
• factchecking chatbot, Georgia
• Explainer, Bulgaria
• WATCHDOG.MD, Moldova
• Safeguarding Children’s democratic agency, Denmark
• Belarusian Hajun Project, Belarus
• Fact Investigation Platform, Armenia
• “European elections: what and why we will be doing”, Latvia
• Fact check, Malta
• Ecology and Climate Journalism School, Turkey
• VouliWatch, Greece
Afterwards, the young journalists debated two topics in an ‘Oxford-style debate’: ‘How do we solve the information deficit? Prebunking instead of debunking?’ and ‘AI – danger or cure-all for public discourse?’ In the discussions, the various arguments and ideas presented during the day were reflected upon and exchanged. Both debates led to the same realisation: there is no universal solution to disinformation. Different approaches must be combined to combat it and thus protect our democracy.
SUNDAY, 8 SEPTEMBER
How does artificial intelligence influence elections? This question was the starting point for the workshop by Dr Katja Muñoz, Research Fellow at the DGAP Center for Geopolitics, Geoeconomics and Technology. From ‘cheap fakes’ to ‘deep fakes’ and ‘robocalls’, there are various ways in which artificial intelligence can be used to influence elections. The strategies vary. In addition to the goal of deceiving people and spreading untruths, it can also be about reinforcing existing beliefs. Examples of this can be found in the 2024 US election campaign, such as the portrayal of Kamala Harris as a communist.
Muñoz showed the participants examples of how AI has been used in a variety of countries in a political context, in election campaigns or for self-staging by politicians. Some examples: the depiction of the Argentine President Javier Milei as a lion in memes; the depiction of former Indonesian general Prabowo Subianto as an endearing grandfather through an avatar; the creation of a video of imprisoned former Pakistani President Imran Khan in which he appears to be speaking in a presidential office.
In several cases, deep fakes or robocalls (telephone calls in which a computer-controlled answering machine plays a pre-recorded message) have also been strategically used shortly before elections to influence the outcome. It is difficult to determine the extent of the influence, but the attempt itself is an attack on democracy. One example is a fake audio file that emerged just 48 hours before Slovakia’s 2023 elections. It suggested that Michal Šimečka, leader of the liberal Progressive Slovakia party, and Monika Tódová of the daily newspaper Denník N were discussing how to manipulate the elections, including buying the votes of marginalised Roma.
The goals of using AI differ significantly: in elections it is used to mobilise people, to reinforce existing beliefs, to persuade people to vote, or to polarise them.
AI facilitates cognitive warfare because images, videos or audio files can be created quickly and with little use of resources. The aim is not always to fake the authenticity of a photo, for example. It is about reinforcing existing beliefs and appealing to emotions. Some general observations on the use of AI in elections were presented by Katja Muñoz:
1) AI posts are not created solely to deceive, but often to reinforce existing beliefs. They aim to trigger strong emotional responses such as fear, anger, pride, fun or hype. It is important to be aware that even funny things may be trying to manipulate someone.
2) AI content is often not labelled.
3) There is an exponential increase in the use of hyper-personalisation in political communication.
4) There is a high scalability of political communication for voter mobilisation.
5) It enables the fragmentation and personalisation of political communication in terms of languages or minorities.
6) Robocalls are used to persuade people not to vote.
Muñoz raised the question of why influencing people through social media works so well. She then introduced the ‘four horsemen’, four cognitive biases to which we are all exposed whenever we are on social media:
• Anchoring bias: first impressions count.
• Availability bias: information that is readily available and catchy, emotionally charged or visually appealing is remembered.
• Confirmation bias: We select information that confirms our own beliefs.
• Belief perseverance bias: We reject proven facts that do not match our beliefs (conspiracies).
Everyone has these biases. It is important to be aware of them and to reflect on one’s own social media consumption. The discussion then turned to the platforms on which ‘fake news’ spreads: not only Instagram or TikTok, but also messenger services such as Telegram or WhatsApp. Especially in WhatsApp groups, e.g. family groups, ‘fake news’ can spread unfiltered and uncontrolled. This environment is particularly dangerous because it involves groups or people who are particularly trusted. Special caution is advised here.
In the second part of her workshop, Muñoz discussed with the participants how the social media ecosystem works. Regarding AI, it was noted that various platforms have stipulated in their ‘code of conduct’ that AI-generated images must be labelled, but in reality this does not happen. It is questionable whether the platforms will fulfil their responsibility in this regard in the future.
In social media, there are two ways to generate reach: organic growth or paid content.
In the workshop, the participants analysed how much reach the parties had before the European elections and how much money they invested in it. It was striking that, for example, the Greens or the CDU invested many times more than the AfD, but the AfD still had many times more reach than the Greens and the CDU. Money is not everything here. Through clever data analysis and precise targeting, the Volt Party was able to generate many times more votes than in the last European election. Simply looking at the money invested in political advertising does not provide a complete picture. Muñoz showed how the AfD, for example, operates with multidirectional, cross-platform strategies. This strategy consists of three phases:
Phase 1: Tactical planning / resource allocation (money)
Phase 2: Strengthening the fringes / spreading anti-democratic narratives
Phase 3: Mobilisation / defining a call to action
This strategy can be described as an industry of contract campaigns. At the beginning, there is an ‘influencer broker’ who controls the entire operation and coordinates the content with several ‘mega-influencers’. These in turn coordinate the content with a large number of ‘nano-influencers’. The way disinformation campaigns often work is also shown by the size of the network of pro-Russian websites that are part of the Russian information war in Europe. An overview of these websites can be found here.
People are exposed to disinformation in various ways: through influencers, by chance in their feeds, via bots, or in interactions with trolls.
Platforms can also be used to initiate political change. Two examples of this are the ‘tampon tax’ or the ‘Stay on Board’ movement. In both cases, influencers and a social media campaign ultimately led to a change in the law.
At the end of the workshop, the participants were asked to consider in small groups how a government, journalists or civil society could take action against disinformation. Concrete ideas were developed. A collection of the ideas can be found below:
• Cyber units in ministries that also use influencers to engage in ‘pre-bunking’ in the event of disinformation campaigns.
• Setting up a centre for strategic communication that can advise all ministries in the event of disinformation. This should also enable proactive action.
• A government should be more present on social media, explain its content to citizens and interact with them.
• Creation of a tool that can verify the authenticity of URL addresses in order to quickly identify duplicate websites.
• Sending push messages from the government to warn against disinformation.
• Phone calls, emails and leaflets to warn against disinformation, also reaching the older population
• Provision of Telegram channels with fact checkers
• Development and dissemination of counter-narratives
• Provision of money by the government for NGOs, journalists and activists to do ‘pre-bunking’.
• When it comes to events that are initially difficult to assess, it is important to communicate transparently what is known and what is not.
• If there is not enough information, one option is to explain the background or history of an event instead.
• It is important to work with short videos that can go viral.
• Ideally, content should be conveyed by a single person, because people believe people, and this is how trust can be built.
• Journalists can use social media to distribute their articles and build a community that trusts them.
• Journalism and content creation need not be in contradiction. In future, it will be important to have content creators who want to maintain journalistic standards.
• Traditional media houses should also work with influencers to disseminate quality journalism.
• There needs to be more awareness of disinformation in schools; digital literacy should be made a compulsory subject.
• Grassroots journalism needs to be strengthened. By involving society, different information can be collected and people are included in the process, which can also help rebuild trust.
After the theoretical input on disinformation, fake news and AI in elections, the first Reporter Slam session took place on Sunday afternoon. In this first session, the participants thought about story ideas for the Reporter Slams. Everyone presented their ideas and received feedback from coaches Christine Liehr and Jochen Markett, both managing directors of the Potsdam agency Headliner.
MONDAY, 9 SEPTEMBER
The workshop day started with theoretical input on the topic of ‘live journalism’. Jochen Markett and Christine Liehr worked with the participants to identify the most important elements of good ‘live journalism’:
1) Passion
2) Individuality and personal experience
3) Surprise
4) Transparency
5) Performance on stage
Together with the coaches, the participants analysed various reporter slams from the finals in 2023. This allowed them to see how the elements discussed earlier were implemented.
The three principles of rhetoric from Aristotle are also important for live journalism: ethos, pathos and logos.
The following tips can help you present stories even better on stage:
1) Make a personal connection: Show what personal experience, connection or interest you have in the topic.
2) Show the importance of the story: Why do you think the story needs to be told? What will the talk reveal about the world?
3) Tell the story behind the story: What happens behind the scenes?
4) Put the audience in the place of the action: Consider what materials, original audio, sound effects, visualisations, etc. could support the story.
5) Use dramatic storytelling: Think about the plot and the development of the characters. Use detailed descriptions. Tell the story with a personal narrative voice. Guide the audience through the story with ups and downs.
6) Revise the story: rewrite the story several times.
7) Practice, practice, practice: to perform well, it is important to have rehearsed several times.
8) Performance time: engage with the audience and win them over.
The participants spent the rest of the day working on their texts for the Reporter Slam.
TUESDAY, 10 SEPTEMBER
The participants continued their work and all of them received individual feedback from the coaches during the day. Ideas became stories and stories became reporter slams. The participants used videos, pictures, memes, songs and sound effects to make their stories even more tangible. In the afternoon, the participants received another training session on ‘Speaking on Stage’ from Isabelle Feldwisch. In the workshop, Feldwisch taught the participants various techniques that they can use before a performance to focus, loosen up and relax. She worked on stage presence, appearance and posture. Afterwards, the participants performed their reporter slams in two semi-finals. The top three from each semi-final went through to the final, which took place the next day in front of a large audience at the Potsdam cultural centre FreiLand.
WEDNESDAY, 11 SEPTEMBER
After the semi-finals the night before, the morning was used to evaluate the performances again. The participants received valuable feedback that will help them to present their stories even better outside of the Reporter Slam. The six winners of the semi-final used the rest of the day to further improve their performances and prepare for the grand final. The rest of the group worked on the presentation for the M100 Sanssouci Colloquium to share with the experienced journalists what they had learned over the past week about disinformation, AI, elections and live journalism.
In the evening, the public reporter slam took place at Potsdam’s cultural centre ‘FreiLand’. Moderated by Jochen Markett and accompanied by the live band ‘Bommi & Brummi’, the six finalists – Flora Alfiero (Italy), Tobias Bundolo (Denmark), Tsisia Kirvalidze (Georgia), Pavela Kostova (Bulgaria), Christoph Schwaiger (Malta), and Liza Tkachenko (Ukraine) – were voted for by the audience. Christoph Schwaiger won with his gripping and moving story about the murdered Maltese investigative journalist Daphne Caruana Galizia.
THURSDAY, 12 SEPTEMBER
The young journalists took part in the M100 Sanssouci Colloquium entitled ‘Democracy under Attack. Disinformation Campaigns, AI and the Role of the Media in the 2024 Super Election Year’. This year’s M100 was opened with a speech by Anna Wieslander. After the first plenary discussion, the 19 young journalists presented their new insights into disinformation, AI and ‘live journalism’ to the plenary session, which was made up of around 80 leading representatives from the media, political institutions and academia (click here for the presentation).
Afterwards, the winner of the reporter slam, Christoph Schwaiger from Malta, presented his story about the murdered Maltese journalist Daphne Caruana Galizia to the international participants of the M100 Sanssouci Colloquium.
(Summary: Florentin Siegert)
The M100 Young European Journalists Workshop is an initiative of Potsdam Media International e.V. and the City of Potsdam.
The 20th edition was sponsored by the Friedrich Naumann Foundation for Freedom, the Deutsche Postcode Lotterie and the ZEIT Stiftung Bucerius.