
What Meta Know but They Don’t Want Us To

Why you should ditch Facebook.

Don’t let this guy get his hooks on your kids. The Good Oil. Photoshop by Lushington Brady.

As I’ve been reporting, the giant social media companies are the Big Tobacco of the 21st century. Their products are demonstrably harmful: they may not raise cancer risks, but they dramatically exacerbate mental health risks. Mental health is still health, even if a tumour is far more immediately obvious than a deep depression.

Just as Big Tobacco’s internal research in the ’50s and ’60s proved to them how harmful their products were, the internal research of companies like Meta (Facebook, Instagram, WhatsApp) has proved to them how harmful theirs are. And just as Big Tobacco lied about and covered up what it knew for decades, Meta is lying about and covering up what it knows.

Rumours about Facebook’s advertising business – now generating more than $US160bn ($254bn) a year – had long circulated in industry circles, but until 2017 there was no concrete evidence of seriously questionable practices.

Whenever I picked up the phone or met a media agency CEO or advertiser for coffee and asked about these rumours, the mood would change – they’d switch topics or shift uncomfortably in their seats. Eventually, though, I found someone who promised to share confidential internal Facebook documents that might shine a light on what had been whispered for years.

What whistleblowers have exposed is damning.

At least Big Tobacco never hawked cancer treatments to dying smokers: Meta is using its algorithms to deliberately target children most at risk of mental harm. And secretly boasting about it.

The real bombshell was buried at the back of the documents. To my astonishment, Facebook was boasting to media agencies that it could target children at their most vulnerable – including when they felt “worthless” and “insecure”.

A 23-page Facebook document, marked “Confidential: Internal Only” and dated 2017, outlined in chilling detail how the platform could identify “moments when young people need a confidence boost”.

By monitoring posts, pictures, interactions and other online activity in real time, Facebook claimed it could detect when young people felt “stressed”, “defeated”, “overwhelmed”, “anxious”, “nervous”, “stupid”, “silly”, “useless” and like a “failure”.

For instance, if a teenage girl deletes a selfie from Instagram, Meta’s system instantly flags it as a sign of self-image insecurity. The girl’s feed is then flooded with posts extolling beauty standards – and ads promoting beauty products.

British teenager Molly Russell saved Instagram posts – including one from an account called Feeling Worthless – before taking her own life. Instagram and WhatsApp are both owned by Meta.

The word “worthless” was one of the emotional triggers we exposed in our original report on the insidious ways Facebook’s algorithm was being harnessed.

Facebook is not just lying and hiding the evidence of the harm it is inflicting. It’s also targeting whistleblowers and journalists with all the sneaking fury of the Church of Scientology.

[Whistleblower Sarah Wynn-Williams] reveals she was dispatched by Facebook’s top executives to kill my story.

“It’s a reporter for an Australian newspaper who’s gotten his hands on one of the internal documents about how Facebook actually does this, and he reaches out for comment from Facebook before publishing. That’s when I hear about it. I didn’t know anything about this and neither did the policy team in Australia,” she writes.

“It’s an advertising thing. I’m put on a response team – communications specialists, members from the privacy and measurement teams, and safety policy specialists – all trying to figure out what to say publicly.”

‘What to say publicly’ included smearing the journalist reporting what Meta doesn’t want the world to know.

I was inundated with calls from international media requesting TV and radio appearances to discuss the story.

Among those contacting me were two Australian journalists – one from the ABC and another from Fairfax Media – who tipped me off that Facebook’s local communications team was calling reporters to smear me with personal attacks, a tactic aimed at discouraging others from pursuing the story.

The lengths they went to in order to shut it down were staggering. Wynn-Williams writes that a junior researcher in Australia was fired.

Fired, that is, simply for doing what she had been told to do: research the effects of Meta’s products.

The malignant practices at Meta go all the way to the very top.

When Facebook’s attempts to deter us failed and we published follow-up stories, Wynn-Williams reveals that the company – and Sheryl Sandberg, then second in command and a close confidant of [Mark Zuckerberg] – escalated their efforts to contain the growing media interest.

Another senior Facebook executive, “Joel”, asks for a firm statement that ‘We’ve never delivered ads targeted on emotion.’

Only to be told that they couldn’t do that without lying.

Joel responds, ‘We can’t confirm that we don’t target on the basis of insecurity or how someone is feeling?’

“Facebook’s deputy chief privacy officer replies, ‘That’s correct, unfortunately.’ Elliot then asks whether it’s possible to target ads using terms like ‘depressed’, and the deputy confirms that, yes, Facebook could customise that for advertisers. Despite all this, Elliot, Joel, and several of Facebook’s most senior executives conspired on a cover-up. Facebook issued a second statement that was a flat-out lie: ‘Facebook does not offer tools to target people based on their emotional state’.”

The statement was circulated among the senior leadership. Many of them knew it was false. They approved it anyway.

This will, inevitably, lead to renewed calls for government legislation to curb Meta’s access to children. But children will quickly see through the hypocrisy, just as they did when their chain-smoking parents told them not to smoke.

The adults need to lead by example: ditch Facebook. It’s that simple.

