Do AI images count as indecent images?


It’s common knowledge that indecent images – photos or videos of children under 18 years old in sexual contexts – are against the law in the UK.

However, the rapid development of artificial intelligence tools poses new problems, allowing offenders to generate AI images of child abuse.

Organisations working to prevent child sexual abuse in the UK, like the Lucy Faithfull Foundation, are reporting an increasing number of people using AI to generate indecent imagery without realising that this material is still illegal.

AI images can be just as harmful

Because the people in entirely AI-generated images are ‘not real’, this material blurs the boundaries of acceptable behaviour and can become a gateway to harming real children.

In some cases, AI can be used to generate sexual images of real people by replicating their likeness, with some offenders even creating new imagery of existing victims.

This has also led to a rise in teenagers using AI to create indecent images of their peers without realising that this is a criminal offence, even when the person creating the images is also a child.

The Internet Watch Foundation (IWF) has been raising the alarm about how increasingly realistic AI indecent images pose a very real risk to the public, and how using AI for this purpose normalises sexual violence against children.

It’s essential to reinforce the message that AI-generated indecent images are not ‘less harmful’ or any less illegal, because creating, distributing, and viewing them still reinforces an interest in abusive and illegal behaviour.

The law and indecent AI images

AI-generated indecent images fall under the same laws as real, non-AI child sexual abuse material (CSAM). Just as producing, possessing, viewing, or distributing real indecent images is illegal, the same applies to AI-generated CSAM.

The Protection of Children Act 1978 and the Criminal Justice Act 1988 make it a criminal offence in the UK to take or permit the taking of, possess, access, or distribute an indecent image of a child, whether it’s a picture, video, or electronic file.

‘Making’ an indecent image is interpreted broadly by the courts, covering everything from downloading images to opening email attachments, so accessing CSAM generated by AI falls within the legal definition. Deliberately generating these images may also be considered production, which is an even more serious offence with severe penalties.

The Coroners and Justice Act 2009 also criminalises the creation of prohibited images, which are obscene or offensive images such as drawings or animations, so even non-photographic AI-generated indecent images are still against the law.

Additionally, the Online Safety Act 2023 will regulate online platforms more closely, requiring them to identify and remove abusive and harmful material and holding perpetrators accountable – including those who share non-consensual AI-generated sexual images.

Penalties for AI indecent images

In the last year, analysis by the IWF has found that the majority of AI-generated CSAM is now convincing enough to pass as real imagery, making it difficult even for trained law enforcement officers to distinguish AI images from real indecent images.

As the technology continues to improve, it will take authorities even more time and effort to distinguish real images from AI-generated ones, but in either case indecent images are categorised using the same system and attract the same penalties.

This means that anyone prosecuted for an AI indecent images offence will be charged – and if convicted, sentenced – according to the severity of the images as if they were real. The penalties can range from community orders to several years in prison.

Additionally, depending on the sentence, convicted offenders may also have to register as sex offenders for 10 years or more after they are released from prison.

The courts may also impose further requirements, such as banning contact with children or use of the internet – or specifically banning the use of any AI software.

With the law treating AI indecent images just as harshly as their real counterparts, it’s important to speak to legal experts on indecent images if you are ever accused of an indecent image offence, whether it involves AI or not.
