TikTok’s survival in the United States was at stake in 2024. Then-president Joe Biden had signed a law requiring TikTok to separate from its Chinese parent company, ByteDance, if it wanted to continue operating in the U.S. After Donald Trump won the election, the U.S. reached an agreement with TikTok. Today, the platform remains in the country, and Trump is capitalizing on it: “I like TikTok. It helped get me elected,” Trump said after winning the election.
Trump’s impression was accurate: TikTok helped him win the election. On Wednesday, the journal Nature published a study finding that TikTok showed more conservative than progressive videos to its users in the U.S. in 2024. Researchers created 323 TikTok accounts in New York, Texas, and Georgia, collected more than 280,000 recommended videos over several months leading up to the 2024 election, and detected a significant bias in favor of the Republican Party: accounts set as Republican received 11.5% more pro-Republican content, while Democratic accounts saw 7.5% more anti-progressive content. TikTok has not responded to this newspaper’s inquiries about the study.
According to the study, a TikTok user interested in politics in 2024 ended up seeing more pro‑Trump content. The hard question to answer from the outside is why. TikTok has an algorithm that decides which video each user sees next — a system ultimately controlled by the company itself. Countless variables come into play: how many people watch a video, what type of content tends to go viral, or what each user prefers. Was this pro‑Trump tilt the result of an internal decision by TikTok, or simply that Americans were more eager to watch pro‑Trump content?
“It’s a reasonable question, and we investigated it,” says Yasir Zaki, co-author and professor at New York University Abu Dhabi. “We tested whether the results could be explained simply by TikTok promoting posts based on likes, shares, and views. To do this, we built 48 different models that simulated this type of system. In every case, the bias we observed in the real data was greater than what these models predicted. In fact, some of them suggested the opposite pattern, with Democratic content tending to have a higher overall engagement level.”
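The logic of such a null model can be sketched in a few lines. The code below is a hypothetical illustration, not the researchers’ actual 48 models: it simulates a feed that recommends videos purely in proportion to their engagement, then measures what partisan share that baseline produces. If the skew observed in real data exceeds what this kind of baseline can generate, engagement alone cannot explain it.

```python
import random

random.seed(0)

# Synthetic video pool (assumed for illustration): equal numbers of
# pro-Republican and pro-Democratic videos, each with a random
# engagement score drawn from the same distribution.
pool = [{"lean": lean, "engagement": random.expovariate(1.0)}
        for lean in ("R", "D") for _ in range(5000)]

def engagement_only_feed(pool, n_recommendations=10000):
    """Sample recommendations with probability proportional to engagement."""
    weights = [v["engagement"] for v in pool]
    picks = random.choices(pool, weights=weights, k=n_recommendations)
    # Fraction of recommended videos that lean Republican.
    return sum(v["lean"] == "R" for v in picks) / n_recommendations

share_r = engagement_only_feed(pool)
print(f"Republican share under engagement-only baseline: {share_r:.3f}")
```

Because both partisan pools have identically distributed engagement here, the baseline share stays close to 0.5 up to sampling noise; a real feed that persistently serves a larger share to matched accounts would point to bias beyond engagement, which is the comparison the study’s authors describe.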
According to the authors, audience interest had little to do with it. Does TikTok, then, have a way to favor a particular individual or trend? Yes, it does. In 2023, Forbes published an investigation based on internal documents revealing that TikTok has an internal control known as “heating” that allows employees to manually boost specific videos outside the algorithm. The goal was to guarantee those videos a set number of views and to help attract brands and influencers, but some employees used it to benefit their own accounts or those of people close to them. “Only a few people, based in the U.S., have the ability to approve content for promotion in the U.S., and that content makes up approximately .002% of videos in For You feeds,” the company told Forbes.
No one outside the company knows whether this method, or a similar one, was used to favor Trump and ensure the new president allowed TikTok to continue operating in the U.S. “Our study can tell us what the algorithm does, but not why,” says Hazem Ibrahim, co-author and professor at New York University Abu Dhabi. “The bias passed all the robustness checks we performed, but biases can arise from optimization goals and automated loops without anyone choosing them,” he adds.
But could TikTok have done it on purpose? “Technically, yes, and it would be very simple,” says Ibrahim. But could it also have been accidental? “It could also be an emergent property of the way the system optimizes interactions. From the user’s point of view, the effect is the same in either case,” says Ibrahim. Whether intentionally or not, TikTok helped Trump win the election.
In the standoff between the Biden administration and TikTok, one of the concerns was precisely TikTok’s ability to influence U.S. public opinion. The argument was simple: if ByteDance obeys the Chinese government, Beijing could order the platform to sway public opinion toward the candidate or party that best suited its interests. This influence would be exerted on a segment of TikTok’s more than 170 million users in the U.S. — a far from trivial number.
It’s also not known how this bias might influence voting intentions: “There is a large body of academic literature showing that social media can influence how people think and act,” says Zaki. “While our study didn’t directly measure those effects, it does show that TikTok offers an uneven mix of information. This matters because recommendation systems determine what people are exposed to over time: which issues seem important, which candidates receive criticism, and which narratives gain traction. They help create the context in which people decide how to vote.”
This case concerns the United States. But it could obviously happen in any other country: “In principle, algorithmic selection overriding user preferences can happen anywhere TikTok operates. Country-specific audits would be needed to understand how this bias manifests in other contexts,” says Talal Rahwan, co-author and professor at New York University Abu Dhabi.
