The TikTok saga: what it actually teaches us about platform governance

Data Sovereignty, Privacy and Surveillance ▪ May 5, 2026

By Asma Sifaoui, Open Terms Archive team member

The TikTok saga in the United States has become a policy story that never quite resolves. What began as a debate over data practices has evolved into something much broader: legal battles, congressional hearings, executive bargaining, and geopolitical negotiation between Washington and Beijing. And yet, the central question keeps getting lost: has any of this actually made Americans safer? The answer is not obvious, and that is precisely the point.

The problem was never just TikTok

The concern has always been less about the app itself and more about its ownership structure. ByteDance, a Beijing-headquartered company, owns TikTok. That matters because of the legal environment it operates in.

China’s 2017 National Intelligence Law, particularly Article 7, requires organizations to cooperate with state intelligence work. The U.S. interpretation is straightforward: ByteDance could be compelled to provide access to data or systems, potentially without public visibility.

Whether that risk is likely is debated. But U.S. policymakers have not framed this as a question of probability. They treat it as a question of structural possibility, and in national security terms, that is enough.

There have also been concrete incidents that reinforced these concerns. Internal reporting showed that engineers in China had access to U.S. user data. ByteDance later confirmed that employees had accessed journalists’ data during an internal investigation. TikTok framed these as policy violations; critics saw them as evidence of structural dependence.

Project Texas solved the wrong problem

TikTok’s main response was Project Texas, a large-scale effort to localize U.S. data and isolate it from ByteDance. It involved U.S.-based data storage with Oracle, a dedicated entity (USDS), and oversight mechanisms meant to reassure regulators. It was an expensive and technically serious effort, but it did not resolve the core issue.

Data localization changes where data is stored; it does not change who ultimately controls the system producing that data. TikTok’s recommendation algorithm, the core of the platform, remained tied to ByteDance.

This is where the policy conversation lost precision. Lawmakers dismissed Project Texas, often without clearly articulating what would have counted as sufficient mitigation. Hearings generated strong political messaging but little clarity on the actual threat model, and data access, content moderation, and geopolitical concerns were frequently conflated. Put simply, you cannot regulate a risk you have not clearly defined.

The shift to ownership as the problem

The 2024 Protecting Americans from Foreign Adversary Controlled Applications Act marked a clear shift. Instead of trying to regulate behavior, it treated ownership itself as disqualifying. This is a significant move in platform governance because it abandons the idea that compliance can mitigate risk and replaces it with a structural claim: certain ownership configurations are incompatible with operating in the U.S. But this shift introduces its own tension. If the risk stems from legal obligations tied to ByteDance, then partial divestment does not necessarily eliminate it, especially if key assets like the algorithm remain linked to the parent company. That is exactly where the current arrangement remains unclear.

The ban that didn’t quite work

When TikTok briefly went offline in January 2025, users were greeted with a message acknowledging the law and noting that President-elect Trump had indicated he would work on a solution once he took office. Users quickly migrated to other platforms, including RedNote, another Chinese-owned app that had received far less scrutiny. This moment exposed a key weakness in the policy approach. If the concern is foreign access to user data, targeting a single platform does little to address the broader ecosystem.

User behavior also matters. Demand for these platforms does not disappear; it shifts. Platform restrictions don’t mitigate risks; they displace them, driving users to less regulated spaces, fragmenting oversight, and reducing transparency while claiming to increase security. Selective enforcement can thus end up undermining its own goals.

Where this leaves us

By early 2026, TikTok’s U.S. operations had been restructured. ByteDance retained a stake of just under 20 percent in the new entity, the TikTok USDS Joint Venture, with non-Chinese investors controlling the remaining 80 percent. The managing investors (Oracle, Silver Lake, and MGX) each hold 15 percent, or 45 percent collectively. But the core question remains unresolved: has the underlying dependency actually been removed, or simply repackaged?

Much of the answer depends on details that are not fully public. Oracle can audit what the algorithm does; it cannot unilaterally change how it works or prevent updates that originate from ByteDance’s retained intellectual property. The algorithm will be retrained on U.S. user data, but the underlying IP remains in Beijing. Senator Ed Markey criticized the White House for providing “virtually no details about this agreement, including whether TikTok’s algorithm is truly free of Chinese influence.” What role ByteDance continues to play, how the algorithm is governed, and whether meaningful independence exists are still open questions.

What the saga actually reveals

The TikTok case is not just about one platform. It is a preview of how governments might approach foreign-controlled digital infrastructure more broadly. Two things can be true at once. The national security concern is real: large-scale behavioral data controlled by foreign entities raises legitimate risks. At the same time, the policy responses have been shaped by politics, market competition, and negotiation dynamics as much as by a clearly defined threat model. ByteDance will continue to control TikTok Shop and the platform’s advertising operations even under the new structure, a detail that quietly complicates the narrative of clean separation.

The result is a compromise that is politically workable and commercially acceptable, but analytically unresolved. And the core question remains: can a platform ever be meaningfully separated from the legal and political system of its country of origin? So far, the TikTok saga suggests that we do not yet have a clear answer.