Study: Explicit Photos of Children Being Used to Train AI

  • #Child sexual abuse
  • #Computers & internet
  • #Artificial Intelligence
  • #Children and youth
  • #Child abuse & neglect
  • #Pornography & obscenity
  • #Canada
  • #Crime & justice
Image copyright: Unsplash
Story last updated: DEC 2023

The Spin

Narrative A

While there is a long way to go, AI companies, including Stability AI, are working with governments and law enforcement around the world to rid the internet of harmful child abuse imagery. As AI grows in use and capability, the companies behind the technology have both the tools and the ambition to keep their products safe while still making their benefits available for the public to use appropriately.

Cointelegraph

Narrative B

While private tech giants like Meta, OpenAI, and Google claim they've steered clear of Stability AI and its abuse-tainted training data, the fact is that LAION is an open dataset, and that openness is precisely what allowed the public to catch the flaws in it. If the amount of child abuse material found in LAION disturbs you, just imagine what lies behind the closed-door datasets of these private companies.

Vice

Narrative C

This is just one example of the danger posed by the current AI race. Products like the LAION dataset are being rushed to market to fend off the competition, and the result in this case is that an internet-wide scrape of images was open-sourced without due diligence. Unless more is done to regulate AI and cool down the race, this is only the tip of the iceberg.

BNN

Go Deeper

Artificial Intelligence

Articles on this story

AI image-generators are being trained on explicit photos of children, a study shows
Associated Press, DEC 2023
Exploitive, illegal photos of children found in the data that trains some AI
Washington Post, DEC 2023
Stable Diffusion Was Trained On Illegal Child Sexual Abuse Material, Stanford Study Says
Forbes, DEC 2023
AI image training dataset found to include child sexual abuse imagery
The Verge, DEC 2023