On Section 230 and Instagram's child pornography problem
The White House and social media companies care more about censoring views they don't like than about the facilitation of child porn - and rape - on their platforms
Instagram has a problem with child sexual abuse.
Instagram and its parent company, Meta Platforms, which also owns Facebook, do not seem to care.
The Wall Street Journal ran a devastating piece today on child pornography and rape networks that Instagram does not merely tolerate but facilitates.
The piece is filled with ugly revelations from top to bottom. Near its end, the reporters note that “Instagram’s [automated] suggestions were helping to rebuild [a pedophile] network that the platform’s own safety staff was in the middle of trying to dismantle.”
Even worse, they reveal that Instagram would allow users to see posts it knew might be harmful or illegal, after a short warning:
(“See results anyway.” Unbelievable but true, this is real:)
—
I’d call the piece an exposé, except that other outlets have run similar reports on Instagram’s enabling of child sexual abuse for years, without any effective response from Instagram, Meta, or Facebook. If anything, the problem appears to have worsened since 2020, when school closures left children prey to abusive adults.
The Journal article makes clear that the problem is not that the users posting this content are sophisticated or technologically savvy. They are not using encryption or even trying very hard to hide the content:
The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography.”
The users don’t try harder to hide what they’re doing because they can’t afford to - they’re chasing new buyers and users. Instagram’s virtue, for them, is that it is wide open.
—
But why doesn’t Instagram try harder?
Assuming the answer is not that Meta and Instagram are run by a pedophile cabal - and let’s all hope that’s not the answer - the reason is that they don’t have to. The previous stories generated a day or two of bad press and then vanished.
Meanwhile, the infamous Section 230 of the Communications Decency Act gives social media companies essentially complete immunity for user-generated content.
Even a 2018 law called the Fight Online Sex Trafficking Act - which, as its name implies, is meant to increase the legal liability companies face - has hardly pierced 230’s legal veil.
Last year, the federal 9th Circuit dismissed a claim from women who said the bulletin-board site Reddit had allowed images of them being abused as minors to circulate. And on May 30, the Supreme Court declined to hear the case - again refusing to set any limits on Section 230 and the protection it gives the companies.
—
(Smile for the camera, kiddo!)
—
This issue incenses me not just because I have three kids but because I know personally that social media platforms can move quickly to ban content when it bothers them. Instagram has repeatedly taken down posts of mine that are nothing more than screenshots of my Substack articles reporting on the mRNAs.
But I am far from alone. During Covid, Instagram and Facebook heavily censored anti-lockdown posts. Facebook even banned posts on the lab leak theory until late May 2021.
Instead of putting the same effort into stopping child pornography - and even the use of their networks to arrange real-world physical sexual abuse of minors - Facebook and Instagram appear to be doing the minimum possible, relying on automated systems that match images against an existing database of known child sexual abuse photos and videos.
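To see why that approach goes only so far, here is a minimal, hypothetical sketch of database matching in principle. This is not Meta's actual pipeline - those internals are not public - and the hash list and function names below are invented for illustration. The point is simple: matching against a database of known images can, by definition, flag only material that has already been catalogued; newly produced abuse imagery matches nothing.

```python
import hashlib

# Hypothetical database of hashes of previously catalogued abuse images.
# Widely used industry tools (PhotoDNA-style perceptual hashing) also
# catch slightly altered copies, but the core limitation is the same:
# only already-known material can match.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # placeholder value
}

def is_known_abuse_image(image_bytes: bytes) -> bool:
    """Return True only if this exact file has been seen and catalogued before."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A brand-new image - or, under plain cryptographic hashing, one altered
# by a single pixel - produces an unfamiliar hash and sails through.
print(is_known_abuse_image(b"never-before-seen image data"))  # False
```

Automated matching of this kind is cheap to run at scale, which is exactly why relying on it alone amounts to the minimum possible effort.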
Facebook may have concluded that using human moderators to examine images and hashtags would expose it to legal liability for pornography. Worse, it may have decided that setting strict automated limits would risk making it harder for the bikini models who have some of Instagram’s largest audiences to post new glamour shots.
(21,559 likes. Willow Hand is 24, but you get the point. So does panda_wants_gummibears.)
—
The great irony here is that Section 230 explicitly allows social media companies to move against sexually abusive content - its subsection (c)(2) permits “good faith” removal of “obscene” material.
But the companies would rather rely on the broader protections that the federal 9th Circuit and other courts have said the law’s subsection (c)(1) gives them. Courts interpret 230 as allowing the companies both to censor content and users whenever they like and to avoid any liability for the content they do allow.
They have the best of both worlds, and they use it. Unless the Supreme Court restricts Section 230, it now seems that only boycotts - and possibly legal and Congressional investigations of top executives - will cause Meta and Instagram to tighten their rules against pedophiles.
Legal immunity is a hell of a drug.