A study published Wednesday in the journal Nature finds that TikTok’s algorithm systematically prioritized pro-Republican content in three states leading up to the 2024 US elections.
Researchers created hundreds of dummy accounts and conditioned them to mimic real users’ behavior by having each watch a set of videos aligned with either the US Democratic or Republican party. They then tracked the videos TikTok recommended on these accounts’ For You pages, TikTok’s main feed.
“We found a consistent imbalance,” they wrote in Nature.
About 42% of US social media users say these platforms are important for getting involved with political and social issues, according to Pew Research, but it is often unclear how recommendation algorithms shape what appears in feeds. Professors Talal Rahwan and Yasir Zaki at New York University’s Abu Dhabi campus set out to study how partisan politics shows up on TikTok, a platform that has become a key source of political information, especially for some young adults. Their study notes that this demographic, ages 18 to 29, shifted by 10 percentage points toward Trump between the 2020 and 2024 elections.
TikTok did not offer comment by press time.
Bots conditioned on pro-Republican content viewed about 11.5% more content aligned with their views than their pro-Democratic counterparts did. The imbalance extended to opposing views, too: bots conditioned on pro-Democratic content were about 7.5% more likely to be shown pro-Republican content on their For You pages, the study found. The researchers used 323 dummy accounts, setting their locations to New York, Texas and Georgia. Over 27 weeks of the 2024 presidential campaign, they sifted through more than 280,000 recommended videos using a combination of human and AI review.
“Our finding isn’t just about reinforcement; Democratic accounts were shown significantly more anti-Democratic content than Republican accounts were shown anti-Republican content,” said Rahwan, one of the study’s authors. “The algorithm wasn’t just giving people what they want; it was giving one side more of what the other side says about them.”
The types of issues that surfaced in videos differed, too: pro-Democratic accounts in the study were fed disproportionately more cross-partisan content about immigration and crime, while pro-Republican accounts saw more cross-partisan content about abortion.
“This suggests the algorithm may amplify content designed to attack the opposing side on its weakest ground, which is a more targeted and arguably more concerning pattern than a uniform ideological drift,” said Hazem Ibrahim, a PhD student at NYU Abu Dhabi who worked on the study.
The bots in the study were placed, via mock GPS coordinates and virtual private network (VPN) routing, in strongly Democratic New York, strongly Republican Texas and battleground Georgia. The researchers caution that their findings shouldn’t be generalized beyond these states.
The study authors acknowledge that many users self-select and curate the content they see on various social media platforms. However, TikTok’s For You page gives users less control than the main interfaces of other platforms, being “almost entirely driven by the platform’s algorithm,” the paper notes.
On TikTok, “users don’t need to follow anyone; the system decides based on behavioral signals like watch time. That makes it a uniquely clean setting for studying algorithmic influence, because user self-selection is minimized,” Ibrahim said. “Skews here are harder to attribute to users’ choices.”
The authors note that while their study unpacked the kind of political content users are exposed to, it doesn’t analyze how these videos influence political beliefs and behavior, or why the imbalance exists. They also note that the bots captured only the early stages of a user’s experience on the platform, and that the analysis relied on English-language video transcripts, which wouldn’t capture political cues conveyed through visuals or other languages.
Still, they stress that studying the extent to which political content can be skewed on TikTok feeds is relevant to ongoing debates about platform transparency and algorithmic accountability. The Nature article pointed out: “The European Union Digital Services Act, for example, already requires large online platforms to assess and mitigate risks to electoral processes, whereas U.S. law grants platforms broader editorial discretion.”
Zaki added: “In an environment where margins are thin, systematic differences in the kind of political information recommended to tens of millions of young voters are worth taking seriously.”