Deepfakes that defame political opponents and calls to vote on platforms where political advertising is officially banned: in the super election year of 2024, several instances showed that major digital platforms are not doing enough to combat misinformation and disinformation, despite having announced relevant policies. This is the conclusion of the second CoP Monitor published by the German-Austrian Digital Media Observatory (GADMO). This year’s report focuses on the European elections as well as the German state elections in Saxony and Thuringia in 2024, examining the effectiveness of measures taken by Meta (Facebook and Instagram) and TikTok to counter politically motivated disinformation in German-speaking regions.
A research team from TU Dortmund University, led by Dr. Susanne Wegner, evaluated the platforms’ commitments under the Code of Practice (CoP) and compared them with actual cases of disinformation identified by the GADMO fact-checking organisations CORRECTIV, dpa, AFP, and APA. In parallel, a seminar project at the Institute of Journalism at TU Dortmund University analysed the social media strategies of the six major parties and their leading candidates to understand their approaches to election campaigning.
The Code of Practice on Disinformation (CoP) is a self-regulatory framework initiated by the EU Commission and signed by major technology companies (Google, Meta, Microsoft, TikTok), among others. It sets out numerous commitments specifying whether and how platforms should act against dis- and misinformation. These include, for example, transparency rules, the enforcement of community guidelines, and the obligation to cooperate with fact-checkers and researchers. The first version of the CoP was published in 2018 and subsequently revised. The current version was published in 2022 and contains 128 concrete measures against dis- and misinformation. You can find the full text of the code here.
The report reveals: Despite clear progress by platforms – including policies on collaborating with fact-checkers and labeling AI-generated content – significant challenges persist. “The effectiveness of these measures is particularly limited during election campaigns, likely because the platforms’ efforts are hindered by underlying structural issues,” explains research leader Dr. Susanne Wegner.
Challenges and problematic patterns in election campaigns
The analysis identified the following problematic patterns in how platforms handle disinformation:
- Exploitation of migration and asylum issues: In both the European and state election campaigns, migration and asylum emerged as key triggers for disinformation. The right-wing populist political party AfD (Alternative for Germany), in particular, strategically used these topics, deploying AI-generated content to create scapegoats and deepen societal divisions.
- Instrumentalisation of violent incidents: Knife attacks in the German cities of Mannheim (ahead of the European elections) and Solingen (before the state elections) were exploited, even across national borders, to stoke fear and fuel anti-migrant sentiment.
- Allegations of election manipulation: Across all three elections examined, allegations of supposed election fraud circulated, primarily driven by AfD supporters. These narratives sought to undermine trust in the electoral process by spreading unfounded claims of vote rigging.
- Improper Political Advertising: Inconsistent guidelines from Meta and TikTok led to a lack of transparency, complicating the detection of political advertising. While TikTok officially prohibits political ads, the platform was still used by politicians and party supporters for campaign purposes.
Recommendations for strengthening measures against disinformation
The GADMO Report concludes with specific recommendations for platforms and supporting institutions to enhance efforts in combating disinformation.
In light of the upcoming early elections in Germany, it is essential to address the emotional impact of disinformation more effectively and to improve mechanisms for detecting misleading patterns. A standardised definition of political advertising across all platforms is a critical requirement. Additionally, the processes for reporting disinformation and the misuse of election campaigning must undergo a fundamental overhaul.
Given the expectation that an increasing number of politicians will use social media platforms for their election campaigns, and studies indicating that young people often struggle to distinguish credible from non-credible content, the fight against disinformation must be intensified at all levels.
You can download the CoP Monitor 2023 here.
If you have any questions about the CoP Monitor or GADMO’s work in the area of platform monitoring, you can reach us at [email protected].