As the saying goes, when the product is ‘free’, you are the product. Big Social, as we’re gradually learning, treats its product – you, and more especially your children – with even less regard than the meat-packing companies exposed by Upton Sinclair’s The Jungle.
The first crack in Silicon Valley’s great firewall of secrecy came when New Zealander Sarah Wynn-Williams, formerly director of global public policy at Facebook, revealed internal material showing the company knew how harmful its products are, especially to children, but kept aggressively marketing them to children anyway. Then, in late 2024, a court clerk in Kentucky uploaded the lawsuit against TikTok with the confidential sections still visible. Before anyone at the court twigged what had happened, the corporation’s secrets were out in the wild.
What was inside? TikTok’s own engineers, describing in plain terms what their app does to a human brain.
Your brain. Your kids’. If you let your kids anywhere near TikTok, delete it yesterday.
Because the company’s own internal documents lay it out in cold, corporate English. Not activist hysteria. Not a journalist’s spin. TikTok’s own engineers, in their own words.
TikTok ran the math on how long it takes to develop ‘compulsive use’ of the app. The number is 260 videos… roughly 35 minutes. The company’s internal documents call this the compulsive-use threshold.
That’s not accidental. That’s the business model. Like a designer drug, it’s purposely engineered to get you hooked and begging for your next dopamine fix.
TikTok’s own research describes what compulsive use causes: ‘diminished analytical ability, impaired memory, contextual reasoning, conversational depth, empathy, and heightened anxiety.’
Again, their words. A team inside the company called ‘TikTank’ reported compulsive use was ‘rampant’. After 30 minutes straight, users are shoved into algorithmic filter bubbles they didn’t choose and can’t escape.
Doubleplusgood, brothers.
Then comes the great screen-time charade, the one TikTok parades as proof it ‘cares’. Supposedly, a little ping after an hour – twice the time the company knows is enough to get you hooked – is all the digital methadone the addict needs. It’s yet another Big Lie from Big Social.
TikTok ran an experiment on the 60-minute screen-time prompt. Daily teen usage dropped from 108.5 minutes to 107 – a reduction of just 1.5 minutes, or about 1.4 per cent.
Once again, the system was working exactly as planned. Because, as their internal documents admit, reduced screen time was never their actual aim. It was all about fooling their users, and more especially the media. ‘Improving public trust in the TikTok platform via media coverage,’ they called it. Spin, to use a simple word.
Once again, their own words damn them. Project managers openly wrote, ‘our goal is not to reduce the time spent’. Executives signed off only if it didn’t dent ‘core metrics’ by more than 10 per cent. The ‘Are you still scrolling?’ videos are nothing more than cynical PR theatre.
The algorithm is even more cynical than a boy racer with a ‘No Fat Chicks’ sticker on his hot rod. An internal report flagged too many ‘not attractive subjects’ in the For You feed. TikTok fixed that by suppressing them, actively promoting a narrow beauty norm even though it knew it could damage young users. Public moderation stats? ‘Mostly misleading,’ because they only count what they catch.
None of this is rumour. It’s court-admissible evidence from TikTok’s own files. Fourteen state attorneys general – bipartisan, not some fringe outfit – are suing on the back of it.
This is exactly what I’ve been writing about for Good Oil for over a year. Social media makes Big Tobacco look like amateurs.
For those inclined to cry that this is ‘nanny statism’, spare us. This isn’t government bureaucrats banning an app or censoring adults. This is basic tort law, the same principle that has underpinned our legal system since the ancient Germanic concept of weregild. Whoever causes harm is liable to make redress. The principle is the same whether it’s a company dumping heavy metals in the water supply, or a company deliberately hooking kids on its dopamine machine.
TikTok didn’t just stumble into addicting its users: it measured the compulsive-use threshold to the minute, quantified the brain damage, built fake safeguards whose only purpose was better headlines and kept the dopamine machine running, because retention beats mental health every time.
That’s deliberate, documented harm to users, especially kids, for profit. Holding them accountable via civil suits and attorney-general action isn’t statism. It’s the rule of law protecting individual rights from corporate predation.
The documents are TikTok’s. The math is TikTok’s. The harm is TikTok’s.
Time to make them pay the weregild.