While there is a long way to go, AI companies, including Stability AI, are working with governments and law enforcement around the world to rid the internet of child abuse imagery. As AI grows in use and capability, the companies behind the technology have both the tools and the incentive to keep their products safe while still making their benefits available for the public to use responsibly.
While private tech giants like Meta, OpenAI, and Google claim they've steered clear of Stability AI and its abuse-tainted training data, the fact is that LAION is an open dataset, and it was precisely that openness that allowed the public to catch the flaws in it. If the amount of child abuse material found in LAION disturbs you, just imagine what's behind the closed-door datasets of these private companies.
This is just one example of the danger posed by the current AI race. Products like the LAION dataset are being rushed to market in an attempt to fend off the competition, and the result in this case was that an internet-wide scrape of images was released to the public without due diligence. This is just the tip of the iceberg unless more is done to regulate AI and cool down the race.