The study found that both platforms significantly promote content that supports the AfD:
- TikTok: 78% of political content recommended to new users supported the AfD — almost four times the party's voter support level, which currently stands at around 20%;
- X: 64% of the political content shown to users also supported the AfD.
The study also found that users who did not express a political affiliation were more than twice as likely to receive right-wing content as left-wing content. Specifically, TikTok promoted right-wing content 74% of the time, while X did so 72% of the time.
Instagram (Meta) was also included in the study; although the platform showed less bias, 59% of its political content was right-leaning.
To analyze potential bias, the researchers created test accounts on TikTok, X, and Instagram. To ensure balance, the accounts followed the pages of the four main German parties: the CDU (conservative), the SPD (centrist), the AfD (far-right), and the Greens (center-left), as well as the accounts of their leaders.
Each account interacted with content from all four parties: watching videos for at least 30 seconds, scrolling through images, and reading posts. In this way, the researchers simulated a politically “neutral” user, in order to observe what the platforms would recommend in the absence of any expressed preference.
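The headline percentages come from tallying how each recommended item leans politically. The study's own coding scheme and code are not public, but a minimal sketch of the tallying step might look like this (the labels `"right"`, `"left"`, and `"neutral"` are illustrative assumptions, not the researchers' categories):

```python
from collections import Counter

def bias_summary(recommendations):
    """Tally the political lean of a feed of classified recommendations.

    `recommendations` is a list of hypothetical labels such as
    "right", "left", or "neutral". Shares are computed over
    political items only, as the reported figures appear to be.
    """
    counts = Counter(recommendations)
    political = counts["right"] + counts["left"]
    if political == 0:
        return {"right_share": 0.0, "left_share": 0.0}
    return {
        "right_share": counts["right"] / political,
        "left_share": counts["left"] / political,
    }

# Example: a feed where 37 of 50 political items lean right
feed = ["right"] * 37 + ["left"] * 13 + ["neutral"] * 50
print(bias_summary(feed))  # right_share = 0.74, left_share = 0.26
```

Dividing by the count of political items only (rather than the whole feed) is one plausible reading of figures like TikTok's 74%; the article does not specify the denominator the researchers used.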
Ellen Judson, Senior Digital Threats Researcher at Global Witness, noted:
“Our biggest concern is that we don’t know why this particular content was recommended to us. While we see bias, the mechanisms behind the algorithms remain opaque.”
Judson suggested that such bias is a side effect of algorithms designed to maximize user engagement.
“When algorithms try to hold users’ attention, they often amplify content that evokes emotion — and politically polarized content works best,” she added.
TikTok responded to the report by saying that the research methodology was flawed because it was based on only a few test accounts, which the company believes is not representative.
X did not comment on the study's results. However, the platform's owner, Elon Musk, openly supports the AfD: he has used his own account to campaign for the party and hosted a livestream with AfD leader Alice Weidel, which significantly increased the party's visibility.
“We don’t know if X’s algorithms were modified to promote AfD content,” Judson said. “But we hope the European Commission investigates these findings.”
In response to such threats, the European Union passed the Digital Services Act (DSA), which obliges major technology platforms, including TikTok, X, and Instagram, to ensure transparency in their algorithms and combat systemic risks such as political bias.
However, full implementation of the DSA is progressing slowly. Article 40 of the regulation, which would allow vetted researchers to access platforms' internal data, has not yet entered into force.
“We are closely monitoring when researchers get access to the data,” Judson said. “Only then will we be able to make a definitive assessment of the platforms’ bias.”
A number of earlier studies have also found political bias on social media. In 2021, an internal Twitter investigation (before Musk acquired the company) found that the platform's algorithms amplified right-leaning content. Similar patterns have been reported in research on YouTube.
These findings are alarming, as they indicate the potential influence of algorithms on shaping public opinion during elections.
“Social media should remain neutral during election campaigns,” Judson says. “But we’re seeing algorithms pushing users to consume certain content — sometimes even unconsciously.”
Global Witness has already submitted the findings to the European Commission, demanding that it launch an investigation. The DSA provides for severe penalties for violators, up to 6% of a company’s annual global revenue. The EU can also temporarily block access to a platform if it fails to comply.
However, despite numerous complaints, no enforcement action has yet been taken against TikTok, X, or Instagram on election-integrity grounds.