Report: Illegal Trade in AI Child Sex Abuse Images Exposed


The Facts

  • A BBC report published Wednesday found that pedophiles are using artificial intelligence (AI) tools to create and sell photorealistic images of child sex abuse, with buyers accessing the material through paid subscription services such as the US-based platform Patreon.

  • The creators of the material reportedly use Stable Diffusion, an AI tool popular among artists that generates images from simple text prompts and was trained on pictures found online. A separate report found that the software Midjourney was also being used.


The Spin

Narrative A

The rapidly growing industry of AI-generated child porn is a serious threat to society that must be tackled. Not only do these pedophiles generate AI images using imagery of real victims, but the sheer volume of novel content makes it harder for police to distinguish real abuse from fake. Companies understandably want to keep their code open source to promote artistic creativity, but that openness may not be worth the cost of fueling the child porn industry.

Narrative B

AI depictions of minors do raise serious concerns on many fronts, but the technology can also help law enforcement catch pedophiles before any real children are hurt. For example, police created a fake young girl to combat child webcam sex trafficking; thousands of men around the world were subsequently exposed for paying for her content. Hopefully, as is already underway in Australia, individual law enforcement agencies can develop their own AI tools to find and arrest more predators.

