Are the days of a free run for Big Tech in the media sphere coming to an end? Developments in a case before the US Supreme Court seem ominous for the social media giants, most particularly Facebook.
In a statement on the Supreme Court’s denial of certiorari in Jane Doe v. Facebook, Justice Clarence Thomas urged the justices to “address the proper scope of immunity under §230” available to internet companies.
Calls to “repeal Section 230” have become a rallying cry for conservatives who argue, as petitioners did, that §230 “has strayed far from its origins and text,” and that Big Tech needs to be reined in.
Since the mid-90s, internet companies have been exempt from liability for content published on their platforms: in other words, posts, comments and so on. At the time, the exemption seemed sensible, as internet platforms were not “publishers” in the traditional sense. But, as social media companies more and more stringently control the content users publish on their platforms, that argument has worn ever thinner.
The current case concerns a predator who used Facebook as the means of preying on young girls.
The case presented to SCOTUS involved an adult male sexual predator who used Facebook to lure a 15-year-old girl to a meeting. The predator repeatedly raped and beat the girl, then trafficked her for sex. The girl, known in court documents only as “Jane Doe,” escaped and sued Facebook in Texas state court, claiming that Facebook violated the Lone Star State’s anti-sex-trafficking statute and committed various common law offenses.
Doe’s statutory sex-trafficking claim was permitted to go forward, but the Texas Supreme Court dismissed Doe’s common law claims, ruling that they were barred by §230.
Which is where “certiorari” comes in: “a writ of superior court to call up the records of an inferior court or a body acting in a quasi-judicial capacity”. By denying certiorari, in other words, the Supreme Court refused to take up Jane Doe’s Texas lawsuit.
Justice Thomas agreed that SCOTUS’s refusal to consider the case was correct, but only because of a procedural issue. Thomas was quick to clarify that he believes it is time to reconsider the protections granted by §230. Because the Texas Supreme Court allowed Doe’s sex-trafficking claim to proceed, however, its ruling was not sufficiently “final” for SCOTUS to review.
Thomas was clear: In a case without this procedural glitch, he would be more than happy to reconsider the rules on how §230 has “confer[red] sweeping immunity on some of the largest companies in the world.”
If that wasn’t clear enough, Thomas elaborated, arguing that Facebook ought to be held accountable for prioritising profits over safety. Facebook, he said:
“[K]nows its system facilitates human traffickers in identifying and cultivating victims,” but has nonetheless “failed to take any reasonable steps to mitigate the use of Facebook by human traffickers” because doing so would cost the company users—and the advertising revenue those users generate.
Thomas called it “hard to see” why §230 should give Big Tech protection from liability for companies’ “own ‘acts and omissions.’” In a case with “such serious charges,” said the justice, the Court “should be certain” that the law truly demands such protection for internet companies […]
Thomas even suggested that §230 could violate the First Amendment by providing immunity to social media platforms.
Law and Crime
The implications of all this are enormous. For instance, in an echo of the Big Tobacco lawsuits, leaked internal documents show that Facebook’s own research teams found that platforms such as Instagram are “toxic” to teenage girls in particular. Yet Facebook’s marketing strategy continues to target that very demographic.
On the other hand, conservatives’ desire for revenge on the social media companies that have increasingly marginalised them could generate toxic results of its own. For instance, having established that social media companies are publishers, with all the liability that entails, how can conservatives logically argue against proposed laws such as Australia’s, which would hold online media companies legally responsible for defamatory comments published on their platforms, even if the company removed such comments or was unaware of them altogether?
It doesn’t take much imagination to see activists creating sock puppet accounts and making purposely defamatory comments, screen-capping them, and then reporting them. Under such a regime, the only feasible response would be to remove comment sections altogether.
Try to imagine The BFD without its comment section. As on-the-ball as our mods are, even a few seconds between a defamatory comment being posted and the mods removing it would be enough for mendacious activists to gin up a punishing defamation case.
As Robert Bolt warned in A Man for All Seasons: having flattened all the laws in order to catch the devil, what recourse do you have when the devil turns on you?