Study: Explicit Photos of Children Being Used to Train AI


The Facts

  • A study from the Stanford Internet Observatory (SIO) has found more than 3,200 images of suspected child sexual abuse in LAION, an artificial intelligence (AI) dataset used to train the leading AI image generator, Stable Diffusion. The images have reportedly been used to create realistic pictures of fake children and even to transform photos of fully clothed real teens into nude images.

  • The study's findings counter previous claims that AI software creates child sex abuse imagery only by merging adult pornography with photos of real children. According to the study's authors, "having possession of a LAION-5B data set populated even in late 2023 implies the possession of thousands of illegal images."


The Spin

Narrative A

While there is a long way to go, AI companies, including Stability AI, are working with governments and law enforcement across the world to rid the internet of harmful child abuse imagery. As AI grows in use and capability, the companies behind the technology have the tools and the ambition to keep their products safe while still making their benefits available for the public to use appropriately.

Narrative B

While private tech giants like Meta, OpenAI, and Google claim they've steered clear of Stability AI and its child abuse-plagued datasets, the fact is that LAION is an open-source dataset, and that very openness is what allowed the public to catch the flaws in its system. If the amount of child abuse material found in LAION disturbs you, just imagine what's behind the closed-door datasets of these private companies.

Narrative C

This is just one example of the danger posed by the current AI race. AI products like the LAION dataset are being rushed to market in an attempt to fend off the competition, and the result in this case is that an internet-wide scrape of images was open-sourced without due diligence. This is just the tip of the iceberg unless more is done to regulate AI and cool down the race.

