Experts on child abuse and technology have issued a warning to teachers of relationships and sex education (RSE) in British schools. They report that children are using artificial intelligence (AI) to create indecent images of other children, which constitutes child sexual abuse material. The images are described as “terrifyingly” realistic, comparable to the professional photos taken of pupils in schools.

Emma Hardy, a director of the UK Safer Internet Centre (UKSIC), stresses the urgent need for better systems to block such material. Fellow UKSIC director David Wright acknowledges that case numbers are currently small, but urges schools to act proactively before the problem escalates. The warning comes as the Internet Watch Foundation reports that AI-generated images of child sexual abuse are becoming indistinguishable from real imagery, posing a significant threat online.
Under UK law, it is illegal to create or distribute indecent images of children, whether or not they are AI-generated. Clear communication with young people about these laws should be a key part of relationships and sex education.
Why not share this article with your students? Use the following questions to foster critical thinking and guide classroom discussion.
- How should legal systems address the creation of AI-generated indecent imagery by school pupils?
- Are existing laws sufficient, or should there be specific regulations?
- What role should parents play in monitoring and guiding their children’s use of technology, especially in the context of creating inappropriate content?