
The Fight to Protect Youth From Social Media

Image: a person holding a cigarette and a round glass ashtray. Photo by Jonathan Kemper / The BFD


Carolyn Moynihan
mercatornet.com

Carolyn Moynihan is the former deputy editor of MercatorNet.


Is social media harmful to kids? Some people are still sceptical about this: a few experts, the American Civil Liberties Union and Big Tech itself, of course. Put a smart little device in the hands of teenagers, hook them up to platforms they find irresistible and what could go wrong?

A lot. Boys become addicted to online games and YouTube, girls to chat and photo-sharing platforms. They spend less time with their friends in person. Girls in particular spend more time in their bedrooms scrolling through their Facebook, Snapchat and Instagram accounts, looking for likes, comparing themselves to other kids, celebrities and influencers, and getting sadder all the time. Occasionally one commits suicide.

It’s four years since Jean Twenge, a professor of psychology at San Diego State University, showed how the arrival of the smartphone (the first iPhone was launched in 2007) coincided with a sharp rise in depression, self-harm, suicide attempts and actual suicide among the generation of Americans born after 1995.

The US public health agency, the CDC, found that by 2019 an alarming 37 per cent of US high school students reported “persistent feelings of sadness or hopelessness” – a 40 per cent increase between 2009 and 2019. This trend has been accentuated by Covid-19, but it was there before the pandemic.

There has been a lot of pushback against Twenge’s work, not least from the big guns of social media themselves. However, a year ago a former Facebook/Meta employee turned whistleblower revealed that the company’s own research in 2020 showed that its Instagram platform causes anxiety about body image among girls.

Unbelievably, Facebook sat on this information and went on planning a new “service” for preteens called Instagram Kids. “They are coming for your kids” seems true in this instance; Big Tech is out to hook youngsters, and Meta, for one, doesn’t want parents to get in the way. It insists on teens granting permission for parental controls to be enabled – and teens can revoke them at any time.

What does Meta care if young girls are miserable about how they look, or are stealing time from their sleep and increasing their risk of depression? Does ByteDance take seriously the possible link between the insanity of its TikTok platform and the outbreak of strange, nervous tics among girls?

This is what parents are up against, and the resources they have are woefully inadequate. Some manage to keep their own kids smartphone-free, but the social pressure on adolescents is intense when most of their peers are networking by phone and following the latest fads.

Strangely, legislators who are keen to protect teens by banning everything from smoking to “conversion therapy” are hanging back when it comes to a far more prevalent threat. It’s high time they acted.

Six policy ideas for protecting teens

And they can, says a group of US experts (see endnote) in a legislative brief published last week that offers six policy ideas for states – including one very bold proposal.

US Supreme Court decisions, the authors point out, have limited the power of Congress to pass effective legislation protecting children even from sexually explicit content, and the law’s historical focus on indecent material misses “the unique disruption to children’s psychological development that social media’s pervasive presence appears to cause”.

(Australia has recently tweaked its laws to clamp down on “cyber-abuse” of adults, and New Zealand has approved a voluntary code by which Meta, Google, TikTok, Amazon and Twitter will take down abusive content, “misinformation and disinformation” – again, focused on adults.)

What’s needed, the authors of the brief say, is for states to take over, using their legislative power to protect children from platforms that promote anxiety, envy, pornography, loneliness, sleeplessness and suicide. Here is a brief summary of their proposals based on their article in the Deseret News:

  1. Enact age-verification laws so that no one under the age of 13 can create a social media account. Although 13 is already the de facto legal age for social media, younger children are gaining access, and they are more vulnerable to the harmful mental health effects.
  2. Require parental consent for minors to open a social media account. When individuals join social media platforms or use most commercial websites, they agree to terms of service, which are binding contracts, so it is reasonable to require parental consent for anyone under 18.
  3. Mandate full parental access to minors’ (ages 13 to 17) social media accounts. Full access would ensure that parents control their minor child’s account settings, so they can restrict its privacy, review friend requests and know exactly what their child is doing online. Parents can currently buy various parental-control apps, but certain platforms, such as TikTok, cannot be covered, or parents cannot fully monitor every aspect of the account. Government intervention is needed to provide full access and to empower all parents, not just those able to afford a private option.
  4. Enact a complete shutdown of social media platforms at night for minors. This would align with usual night-time sleep hours, for example, 10:30 p.m. to 6:30 a.m., and eliminate teens’ temptation to stay up late on social media. This is an important step to take because technologically induced lack of sleep is a primary driver of depression among teens.
  5. Create causes of action for parents to seek legal remedies with presumed damages. Any law that a state passes to protect kids online should include a private cause of action to enable parents to bring lawsuits on behalf of their children for any violation of the law. These companies aim to maximize profit, so there must be a sizable enough threat in order for them to correct their behaviour.
  6. Enact a complete ban on social media for those under age 18. This is the boldest proposal of them all, but not unprecedented. Many states already place age restrictions on numerous behaviours known to be dangerous or inappropriate for children, such as driving, smoking, drinking, getting a tattoo and enlisting in the military. Similarly, a state could recognize social media as a prohibited activity for minors.

Of course, such a ban on social media for kids would be controversial. Big Tech would fight it tooth and nail. Kids would hate it. The liberal establishment would throw up their hands in horror. But parents might welcome a step that would lighten the burden of fighting the new media barons virtually single-handed.

As the authors of the brief point out, the problem of social media is no longer a private one:

“Social media use by even a few children in a school or organization creates a ‘network effect’, so even those who do not use social media are affected by how it changes the entire social environment. A collective solution is needed. An across-the-board age ban would place the burden where it belongs: back on the social media companies that designed their platforms to be addictive, especially to the most vulnerable: children.”

“One day,” they add, “we will look back at social media companies like ByteDance (TikTok) and Meta (Facebook and Instagram) and compare them to tobacco companies like Philip Morris (Marlboro) and RJ Reynolds (Camel).”

Big Tobacco enjoyed immense profits and popularity while obscuring the science about the harm its products caused, even pitching deceptive advertising to children. Eventually the truth came out, and the companies were held accountable.

Now it is the turn of Big Tech to face an accounting for its “baleful influence on our children,” the report concludes. Before it’s too late.

Protecting Teens from Big Tech: Five Policy Ideas for States is the work of the following authors: Clare Morell, a policy analyst at the Ethics and Public Policy Center, where she works on the Technology and Human Flourishing Project; Adam Candeub, Professor of Law at Michigan State University, where he directs its IP, Information and Communication Law Program, and Senior Fellow at the Center for Renewing America; Jean M. Twenge, Professor of Psychology at San Diego State University and author of iGen; and Brad Wilcox, Future of Freedom fellow at the Institute for Family Studies, visiting scholar at the American Enterprise Institute, and director of the National Marriage Project at the University of Virginia.
