
Recently, the Department for Science, Innovation and Technology of the British government, along with the AI Safety Institute (now called the AI Security Institute), released the first-ever International AI Safety Report 2025 (updated February 18, 2025). It flags the imminent risk of the generation, possession, and dissemination of child sexual abuse material (CSAM) with the help of Artificial Intelligence (AI) tools. CSAM refers to material (audio, video, and images) that depicts a sexually explicit portrayal of a child. The United Kingdom is now making the first legislative attempt to target the threats posed by AI tools that can generate CSAM. In a similar vein, the World Economic Forum, in a 2023 paper, highlighted how generative AI can create life-like images, especially of children, and the Internet Watch Foundation, in its report released in October 2024, underscored the proliferation of CSAM on the open web. The Government of India must amend existing laws to address these emerging threats and ensure long-term effectiveness.
Recent developments
The upcoming U.K. legislation will make it illegal to possess, create, or distribute AI tools that can generate CSAM. Moreover, it will be illegal to possess paedophile manuals that may guide individuals in using AI tools to generate CSAM. This marks a progressive shift from an ‘accused-centric’ and ‘act-centric’ to a ‘tool-centric’ approach in dealing with these abhorrent crimes.
The existing laws focus entirely on ‘who’ has done ‘what’, placing little or no emphasis on the ‘tool/medium’ used to commit the said ‘act’. For instance, the Protection of Children Act 1978 criminalises taking, distributing, and possessing an indecent photograph or pseudo-photograph of a child. Furthermore, the Coroners and Justice Act 2009 criminalises the possession of a prohibited image of a child, including non-photographic material. The proposed law improves on this framework in several ways. First, it outlaws even the possession and use of such AI tools, making it both a deterrent and holistic. Second, it will enable enforcement authorities to apprehend offenders at the preparation stage itself. Third, it can curb the ripple effect that the spread of CSAM has on the mental health of children. Fourth, it addresses the legislative gap concerning purely AI-generated CSAM, as earlier provisions were restricted to images of an ‘actual child’.
On whether India is future-ready
According to the National Crime Records Bureau (NCRB) Report 2022, cybercrimes against children have increased substantially compared to the previous year’s statistics. Moreover, the National Cyber Crime Reporting Portal (NCRP), under the aegis of the Cyber Crime Prevention against Women and Children (CCPWC) scheme, recorded 1.94 lakh child pornography incidents as of April 2024. In 2019, the NCRB signed a memorandum of understanding with the National Center for Missing & Exploited Children (NCMEC), USA, to receive tip-line reports on CSAM. As of March 2024, 69.05 lakh cyber tip-line reports had been shared with the States and Union Territories concerned. These statistics underscore the gravity of CSAM as a serious threat to a child’s right to life and dignity in India.
Presently, Section 67B of the IT Act, 2000 punishes those who publish or transmit material in electronic form depicting children in sexually explicit acts. Furthermore, Sections 13, 14, and 15 of the Protection of Children from Sexual Offences Act, 2012 (POCSO) prohibit using children for pornographic purposes, storing child pornography in any form, and using a child for sexual gratification. Additionally, Section 294 of the Bharatiya Nyaya Sanhita penalises the sale, distribution, or public exhibition of obscene material, while Section 295 makes it illegal to sell, distribute, or exhibit such obscene objects to children. However, this legislative framework lacks adequate safeguards to deal with AI-generated CSAM.
A plan to follow
The existing legislative and policy framework in India needs suitable changes to adapt to these emerging challenges. First, as proposed by the NHRC Advisory in October 2023, the term ‘child pornography’ under the POCSO Act must be replaced with ‘CSAM’ to make the definition expansive. Second, the term ‘sexually explicit’ under Section 67B of the IT Act must be defined to enable the real-time identification and blocking of CSAM. Third, the definition of ‘intermediary’ under the IT Act must expressly include Virtual Private Networks, Virtual Private Servers, and Cloud Services, so as to impose a statutory liability on them to comply with the CSAM-related provisions in Indian law. Fourth, statutory amendments are needed to integrate the risks arising from emerging technological advancements. Fifth, the Government of India must pursue the adoption of the UN Draft Convention on ‘Countering the Use of Information and Communications Technology for Criminal Purposes’ by the UN General Assembly. Notably, the Ministry of Electronics and Information Technology has proposed the Digital India Act, 2023, currently in the pipeline, to replace the two-decade-old IT Act. Lastly, therefore, the proposed Digital India Act must draw inspiration from the U.K.’s upcoming legislation and include provisions specifically targeting AI-generated CSAM.
Shivang Tripathi and Neha Singh are Doctoral Researchers at the Faculty of Law, Banaras Hindu University.
Published – April 03, 2025 12:08 am IST