When data goes where consent hasn’t

A massive €530 million GDPR fine hit TikTok for illegally transferring EU user data to China without adequate consent or protection, highlighting a critical privacy issue. Weak consent, dark patterns, and vague notices erode trust. The remedies are greater transparency, flexible consent design, and strict legal enforcement, so that data only goes where permission has been explicitly given.

Imagine a world where your personal data drifts away from your control—used in ways you never agreed to, stored on servers in distant lands, or even accessed by unknown parties. This isn’t a scenario from a dystopian novel—it’s happening now. 

The TikTok wake-up call 

In May 2025, Ireland’s Data Protection Commission (DPC) imposed a staggering €530 million fine on TikTok for illegally transferring European user data to China without adequate safeguards or transparency. The DPC found that TikTok failed to demonstrate that user data accessed remotely by staff in China received protection equivalent to EU standards, a core requirement under GDPR Articles 46(1) and 13(1)(f). The argument? Chinese laws, such as the National Intelligence Law, could compel companies to provide data to authorities, raising serious privacy risks for EU citizens.

This fine also followed TikTok’s earlier claim that no data was stored in China—a statement later revealed to be false, eroding trust further. 

This ruling is more than a headline. It is a vivid reminder that data can, and will, travel beyond borders and into hands users never imagined, especially when consent isn’t clear or honored.

When consent falls short 

TikTok isn’t alone. History has shown how data can be misused when consent is weak, vague, or just poorly enforced. 

The Cambridge Analytica scandal exploited the Facebook data of around 87 million users without their explicit permission, using it to sway political behavior. That case underscored how data collected under one guise can be repurposed for entirely different, and potentially harmful, uses.

Consent interfaces often use “dark patterns”, nudging users into agreeing by making opt-outs harder. One study found that only around 12% of consent pop-ups in the UK met minimal legal standards, and that design decisions like hiding the opt-out button could increase consent rates by more than 20 percentage points.

Researchers have also found that many privacy notices fail to help users truly understand how their data will be used. In interviews with Europeans, participants said descriptions of purposes—like “analytics” or “ads”—were too vague to be meaningful. 

These issues aren’t theoretical; they erode trust. In a Pew Research survey, 46% of Americans said they had no trust at all that social media executives would not sell their data without consent, while nearly 90% were worried about how platforms treated children's information.

Avoid the consequences of weak consent

The fallout from weak consent can be severe and includes, among other things:

Privacy and surveillance risks, including unauthorized access by state or corporate entities. 

Erosion of public trust, especially among vulnerable groups like children—potentially reducing willingness to share data even when benefits are real. 

Economic and legal repercussions, as seen in mounting GDPR and other regulatory penalties. 

Disproportionate harms, since marginalized communities may lack the digital literacy to detect or opt out of misuse. 

Bolster consent: Ensure data only flows where consent is given 

To avoid scenarios where data “goes where consent hasn’t,” consent must evolve in key ways: 

Transparency and clarity 
Consent isn’t truly informed when it's hidden in dense legalese or when opt-outs are buried. GDPR requires that information be “concise, transparent, intelligible and easily accessible”. But in practice, platforms often fall short. 

Intentional design, not persuasion 
Removing dark patterns and ensuring users can easily opt out are essential. User interfaces must empower people—not manipulate them. 
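
To illustrate the difference, here is a minimal sketch of a banner that gives “Accept all” and “Reject all” identical prominence, so neither path is harder than the other. The markup, element IDs, and the onChoice callback are hypothetical illustrations, not CookieHub’s actual implementation:

```typescript
// Hypothetical consent banner: both choices use identical markup and styling,
// so rejecting is exactly as easy as accepting. No buried opt-out.
function renderConsentBanner(onChoice: (granted: boolean) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.innerHTML = `
    <p>We use cookies for analytics and advertising. Accept or reject
       them with one click; rejecting does not limit core features.</p>
    <button id="accept-all">Accept all</button>
    <button id="reject-all">Reject all</button>`;
  document.body.appendChild(banner);

  banner.querySelector("#accept-all")!.addEventListener("click", () => {
    onChoice(true);   // explicit opt-in
    banner.remove();
  });
  banner.querySelector("#reject-all")!.addEventListener("click", () => {
    onChoice(false);  // explicit opt-out: one click, same prominence
    banner.remove();
  });
}
```

The key design choice is symmetry: whatever it costs the user to say yes, saying no must cost the same.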

Ongoing, flexible consent: dynamic is key 
Particularly in research and medical contexts, dynamic consent allows people to modify their choices over time and see how their data is used. Rather than capturing a one-time “yes”, this model collects and revisits consent on an ongoing basis. 
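
One way to make that concrete is to store consent as an append-only history of per-purpose decisions rather than a single flag, so users can change any choice later and an auditor can reconstruct what was agreed to and when. A minimal sketch; the type names and purposes are illustrative assumptions, not a standard schema:

```typescript
// Hypothetical dynamic-consent ledger: every decision is appended, never
// overwritten, so the full history of a user's choices stays auditable.
type Purpose = "analytics" | "ads" | "personalization";

interface ConsentEvent {
  purpose: Purpose;
  granted: boolean;
  timestamp: string; // ISO 8601: when the user made this choice
}

class ConsentLedger {
  private events: ConsentEvent[] = [];

  // Record a new decision; earlier events are kept as the audit trail.
  record(purpose: Purpose, granted: boolean): void {
    this.events.push({ purpose, granted, timestamp: new Date().toISOString() });
  }

  // The current state is the most recent event for a purpose;
  // with no event at all, the default is "no consent".
  isGranted(purpose: Purpose): boolean {
    const latest = [...this.events].reverse().find(e => e.purpose === purpose);
    return latest?.granted ?? false;
  }
}

// A user can opt in, then change their mind later; the latest choice wins.
const ledger = new ConsentLedger();
ledger.record("ads", true);
ledger.record("ads", false);
console.log(ledger.isGranted("ads"));       // false
console.log(ledger.isGranted("analytics")); // false: never asked, never granted
```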

Legal accountability and enforcement 
Fines like TikTok’s €530 million are vital to signal that serious breaches come with serious consequences. GDPR empowers regulators to levy penalties of up to 4% of global annual turnover or €20 million, whichever is higher. 
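
For concreteness, that “whichever is higher” ceiling is a one-line formula: for a company with, say, €50 billion in global annual turnover, 4% comes to €2 billion, far above the €20 million floor. A quick illustrative sketch:

```typescript
// GDPR Article 83(5) ceiling: the greater of €20 million or 4% of
// worldwide annual turnover (all figures in euros).
function maxGdprFine(globalAnnualTurnover: number): number {
  return Math.max(20_000_000, 0.04 * globalAnnualTurnover);
}

console.log(maxGdprFine(50_000_000_000)); // 2000000000 -> the 4% branch applies
console.log(maxGdprFine(100_000_000));    // 20000000   -> the €20m floor applies
```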

Informed on informed consent 

“When data goes where consent hasn’t”—that’s not just catchy phrasing. It’s the reality we face today. From TikTok’s massive penalty to consent practices that push users toward “agreeing,” data often slips into unintended places. If we want data to serve—not harm—individuals, we must make consent more than a checkbox: transparent, flexible, and enforceable. 

Protect Your Data (and Your Wallet): Implement CookieHub

The verdict is clear: weak consent leads to fines and broken trust. Ensure your data practices are fully compliant, transparent, and user-focused. Choose CookieHub to easily implement an ethical, GDPR-compliant Consent Management Platform today.

30-day free trial

No credit card required