Dark Web AI Being Used To Generate Illegal Material

An internet watchdog has brought to light a disturbing trend: pedophiles are collaborating online, using open-source artificial intelligence to create child sexual abuse material.

Dan Sexton, chief technology officer of the Internet Watch Foundation (IWF), says there is a technical community within dark web forums where this technology is being discussed: members share images and [AI] models, and pass around guides and advice.

The chair of the U.K. government’s AI task force, Ian Hogarth, expressed concern about child sex abuse material and how open-source models have been used to make “some of the most heinous things out there.”

Because open-source software is freely available, anyone can obtain it and modify it to their liking. DALL-E and Imagen, from OpenAI and Google respectively, are not in this category: they are closed systems that the public cannot access or modify.

Sexton says pedophiles are using the dark web to generate and spread realistic-looking child sexual abuse imagery.

The IWF believes the material it has seen was generated with open-source software that offenders download, install, and run locally on their own computers, where it can be customized. Sexton said such models had been trained to recognize and produce child sexual abuse content.

Dark web discussions, he said, often involve photographs of minors found in the public domain or associated with famous people. In some instances, imagery of past child sexual abuse victims is exploited as the basis for new material.

One primary concern is that AI-generated imagery could expose many more people to graphic child sexual abuse material. At the same time, Christopher Alexander, chief analytics officer at Pioneer Development Group, noted that AI could also be used to find trafficked minors by analyzing “age progressions and other characteristics.”

There have been calls for the government to put the brakes on the technology’s development before it gets out of hand.

The IWF actively searches for and helps remove child sexual abuse material from the web, but it may soon be overwhelmed with reports of AI-generated imagery. According to Sexton, such content is already widely disseminated online.

The widespread sexual exploitation of children on the internet is already a significant problem, according to Sexton. AI generation will not solve that problem; potentially, it will only make things worse.