This article is part of the series "Bots and ballots: How artificial intelligence is reshaping elections around the world," provided by Luminate.
When Hamas attacked Israel on October 7, many people turned to social media, their main source of news, for updates.
But unlike previous global conflicts, where digital discourse was dominated by Facebook and X (formerly Twitter), the ongoing Middle East crisis has seen millions of people flock to TikTok.
Despite the video-sharing app's growing popularity, the inner workings of its complex artificial intelligence-powered algorithms remain a mystery.
Individuals only see a small portion of what is posted on TikTok each day. And what they see is highly curated by the company's automated systems designed to keep people glued to their smartphones. These systems use AI technology known as machine learning and so-called recommender systems to decide within milliseconds what content to show to social media users.
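Conceptually, a recommender system of this kind ranks candidate posts by a predicted engagement score and serves the top-ranked ones. The toy sketch below illustrates the idea only; the field names, weights, and scoring formula are invented for illustration and have nothing to do with TikTok's actual model:

```python
# Toy recommender sketch: rank candidate videos by a predicted
# engagement score. Weights and signals are purely illustrative;
# real systems use large learned models over many more signals.
candidates = [
    {"id": "a", "watch_time_pred": 0.8, "like_pred": 0.3},
    {"id": "b", "watch_time_pred": 0.5, "like_pred": 0.9},
    {"id": "c", "watch_time_pred": 0.2, "like_pred": 0.1},
]

def score(video):
    # Weighted blend of predicted signals (hypothetical weights).
    return 0.7 * video["watch_time_pred"] + 0.3 * video["like_pred"]

# Serve the two highest-scoring videos into the user's feed.
feed = sorted(candidates, key=score, reverse=True)[:2]
print([v["id"] for v in feed])  # highest predicted engagement first
```

The point of the sketch is that small changes to the scoring function, which users never see, can reorder everyone's feed at once.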
POLITICO set out to uncover how TikTok's algorithm works, and to determine whether the Israeli or Palestinian side is winning hearts and minds on the youth-skewing social network.
The issue has become politically charged after pro-Israel groups and some Western lawmakers accused TikTok, owned by Beijing-based ByteDance, of unfairly promoting pro-Palestinian content for potential political influence. TikTok denies the accusations.
The political implications of this conflict are already evident in partisan clashes across Western democracies as people decide which side to choose in the war and how to vote. US President Joe Biden's support for Israel has drawn criticism from Arab Americans and could ultimately cost him the November election. In the UK, populist independent candidate George Galloway used pro-Palestinian sentiment to win a seat in the UK parliament in March. College campus protests are erupting on both sides of the Atlantic.
TikTok's algorithm is critical to how political content of all kinds reaches social media feeds. Examining the company's algorithms offers a clear view of how artificial intelligence plays a key role in determining what we see online.
POLITICO teamed up with researcher Laura Edelson at Northeastern University in Boston to track pro-Palestinian and pro-Israel TikTok content over a four-month period from October 7, 2023 to January 29, 2024.
That included creating a list of 50 popular hashtags that could be directly associated with either side, such as #IStandWithIsrael and #SavePalestine. Broader, non-partisan hashtags, such as #Gaza and #Israel, were used to collect data on posts without a clear leaning.
In total, Edelson analyzed 350,000 TikTok posts from the United States.
To make the data more understandable, she divided the posts into three-day windows centered on specific events, including the initial Hamas attack (October 7-9) and Israel's invasion of Gaza (October 27-29). As a control, she also included November 6-8 in her analysis as a proxy for a period with no major events.
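The windowing Edelson describes can be sketched in a few lines. The data, field names, and view counts below are hypothetical stand-ins, not the study's actual dataset or code; the sketch only shows the shape of the analysis, bucketing posts into three-day event windows and comparing average views per post for each side:

```python
from datetime import date

# Hypothetical posts: (date posted, leaning, view count).
posts = [
    (date(2023, 10, 7), "pro-palestinian", 120_000),
    (date(2023, 10, 8), "pro-israel", 45_000),
    (date(2023, 10, 9), "pro-palestinian", 80_000),
    (date(2023, 11, 7), "pro-israel", 30_000),
]

# Three-day windows centered on events, mirroring the study's design.
windows = {
    "Hamas attack": (date(2023, 10, 7), date(2023, 10, 9)),
    "Quiet period (control)": (date(2023, 11, 6), date(2023, 11, 8)),
}

def views_per_post(window, leaning):
    """Average views per post for one side inside one window."""
    start, end = window
    views = [v for d, side, v in posts if start <= d <= end and side == leaning]
    return sum(views) / len(views) if views else 0.0

for name, window in windows.items():
    pal = views_per_post(window, "pro-palestinian")
    isr = views_per_post(window, "pro-israel")
    print(f"{name}: pro-Palestinian {pal:.0f} vs pro-Israel {isr:.0f} views/post")
```

Comparing views per post, rather than raw post counts, is what lets the analysis separate how much each side produced from how widely the platform actually distributed it.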
“TikTok, like other social media platforms, amplifies some content over others,” Edelson said. “That can have a distorting effect on what people see in their feeds.”
What became clear is that TikTok is living up to its role as one of the main squares of the global digital town, where people come together to express their opinions and, often, to disagree.
Edelson's research found that, based on the hashtags analyzed, approximately 20 times more pro-Palestinian content than pro-Israel content was produced over the four-month period. But that didn't necessarily mean ordinary people saw more pro-Palestinian posts in their TikTok feeds.
Instead, Edelson identified three distinct periods in which the likelihood that people saw pro-Israel or pro-Palestinian content in their TikTok feeds changed significantly, regardless of how much material either side produced overall.
TikTok did not respond to a request for specific comment on the Northeastern University study. The company said in an April blog post that it had removed more than 3.1 million videos and suspended more than 140,000 livestreams in Israel and Palestine for violating its terms of service.
How these social media algorithms work is largely unknown. It's unclear who within a company – engineers, policymakers, executives – decides how the algorithms function. Regulatory efforts by the European Union and the United States seek to bring greater transparency to these practices, but it is also difficult to determine when changes are made.
The data offers an example of how, when you dig deeper into the numbers, much of what you see on social media relies heavily on complex algorithms that have little oversight and are regularly tweaked.
TikTok posts were independently collected via Junkipedia, a social media content repository managed by the nonprofit National Conference on Citizenship. These represent the most viewed partisan posts in each time period.
October 7th to October 27th: Mostly pro-Palestinian content
During the first three and a half weeks of the conflict, views per post (the number of times the actual content was served on people's TikTok feeds) were skewed towards pro-Palestinian content.
During that time, generally non-political content, such as mainstream news, attracted the most views. But between pro-Israel and pro-Palestinian posts, the latter were more likely to appear in someone's feed, regardless of that person's views on the conflict.
October 7th-9th: Hamas launches attack on Israel
As soon as Hamas attacked Israel, TikTok was flooded with pro-Palestinian viewpoints, many of which showed solidarity with the Palestinian cause despite the violent attack.
October 13-15: Israel warns Palestinians to leave northern Gaza
In the early days of the war, social media users posted harrowing videos of deplorable life in Gaza and demonstrations in support of the Palestinian cause.
October 18-20: US President Joe Biden visits the Middle East
During the US president's tour of the region, pro-Palestinian content dominated people's feeds, based on average views per post. This included a rally calling on the wider Islamic world to support Gaza.
October 27th to December 15th: Pro-Israel content takes the lead
In late October, without warning, things started to change on TikTok.
Between October 27 and December 15, pro-Israel content surpassed pro-Palestinian content in views per post.
This means that during those seven weeks, TikTok users were, on average, significantly more likely to see content favorable to Israel. Given that overall pro-Palestinian content still outpaced pro-Israel posts, the most likely explanation is an adjustment to how the company's algorithm populates people's feeds. Edelson, the academic, told POLITICO that more research is needed to replicate her results.
October 27-29: Israel invades the Gaza Strip
On TikTok, influencers pushed back against people who accused them of parroting Israeli government talking points, and attacked high-profile celebrities for perceived pro-Palestinian bias.
November 6th-8th: Quiet period
Pro-Israel groups in the United States produced a viral video depicting pro-Palestinian activists as callously disregarding the plight of the hostages, while other groups defended the country's law enforcement agencies.
November 15-17: Israeli forces enter Al Shifa Hospital in Gaza City
Given the close ties between the United States and Israel, American social media influencers, many of whom are connected to the country's evangelical churches, took to TikTok to take up the cause. Some linked the Middle East conflict to domestic American politics.
November 24-27: Hamas releases first hostage
The most viewed content during this period was about the release of Israeli hostages, including emotional family reunions and pro-Israel TikTok users explaining what had happened.
November 30th – December 2nd: Hamas-Israel ceasefire ends
After hostilities resumed in late November, official social media accounts made their presence felt. Among them was the Israel Defense Forces, whose posts collectively received hundreds of thousands of views.
December 15th – January 29th: Both sides lose viewers
Then, starting December 15th, TikTok's algorithmic approach to these posts changed again.
As the conflict continued with no end in sight, both pro-Israel and pro-Palestinian content, based on views per post, increasingly disappeared from TikTok users' feeds. Part of this is because the world's attention had turned elsewhere, growing indifferent to the war.
But views for content on either side fell faster than could be explained by the decline in the number of TikTok posts about the war, Edelson said. There may be explanations other than the company tweaking its content algorithms, but the changes in viewing patterns did not match the changes in the amount of material produced over the same period.
December 15-17: Israel Defense Forces accidentally kill three hostages
Despite the drop in views, pro-Israel posts still offered vivid first-person accounts of what life is like in the country during the ongoing war.
January 2-4: Israel kills Hamas deputy leader Salih al-Arouri in Beirut
Tel Aviv has been comfortable using TikTok to send political messages to the world, especially after a South African-led push to hold Israel legally responsible for the alleged genocide.
January 20-22: Israeli Prime Minister Benjamin Netanyahu says there is no two-state solution
Four months after the conflict began, social media influencers sought to rally global support for Palestine through so-called TikTok challenges, which were recreated on multiple accounts.
January 26-29: UN Palestine Refugee Agency accuses some staff of ties to Hamas
Part of the crowdsourced pro-Palestinian strategy was to highlight supporters around the world while condemning the alleged hypocrisy of those who support Israel in the conflict.
TikTok effect
Many people, especially those over 30, think of the video-sharing network as frivolous: mostly dance crazes and digital fads with nothing to do with politics.
They are wrong.
Edelson said TikTok's algorithm is similar to those of other social media giants in that it's designed to promote what's popular: give people what they want to see, and keep them on the app as long as possible.
That's fine for viral videos of dogs or cute babies. It's completely different for highly charged political content about geopolitical hotspots where people are dying every day. Events like these put social networks like TikTok and their automated curation models in the unenviable position of deciding what's popular, putting minority voices at risk of being shut out.
“When it comes to politics, like anything else, social media discourse favors the majority,” Edelson added. “We should really think about what that means.”
This article is part of the series "Bots and ballots: How artificial intelligence is reshaping elections around the world," provided by Luminate. This story was produced by POLITICO's reporters and editors with full editorial independence. Learn more about editorial content provided by external advertisers.