What is Child Sexual Abuse Material (CSAM)?

Child Sexual Abuse Material (also known as “child pornography”) denotes any visual representation of sexually explicit conduct involving an individual under 18 years of age. It encompasses the many forms of media in which child sexual abuse is recorded. The term CSAM captures both the exploitation depicted in these images and videos and the consequent trauma inflicted upon the child. It is essential to understand that CSAM not only condemns the child depicted to perpetual re-victimisation but also creates demand for this illegal material, feeding a vicious cycle of sexual abuse.

CSAM is a recorded copy of abuse, whether sexual, physical, mental, social, or emotional, made permanently available across the vast internet. The technological boom has made it easier for offenders to photograph, record, or watch live child sexual abuse; store CSAM on their devices; access CSAM stored remotely; connect with victims and other offenders; and distribute and receive CSAM through an endless variety of applications. These networks are sophisticated, encrypting messages and devices and giving offenders a false sense of security.

What materials are typically included in Child Sexual Abuse Material (CSAM)?

  1. Images: Photographs or digital images that depict children in sexually explicit poses or activities. The digital world makes such images easy to gather and transmit, jeopardising the lives of children.
  2. Videos: Recorded footage showing children being subjected to sexual acts or abuse. As gruesome as it sounds, it is true.
  3. Live-streamed content: Real-time video broadcasts capturing the sexual exploitation of children as it happens.
  4. Written narratives: Stories or text-based content describing sexual abuse scenarios involving children in detail. Some offenders seek out such stories and texts.
  5. Drawings and animations: Graphic depictions or animations portraying children in sexual situations. Unfortunately, these expressive forms of communication are often misused to portray children in inappropriate scenarios.
  6. Advertisements and solicitations: Online promotions or solicitations for the exchange or sale of CSAM. These widen the reach of CSAM to its intended audience.
  7. Chat logs and conversations: Communications discussing or arranging the exchange of CSAM. The administrator of such discussions is often the person who instigates the heinous crime.
  8. Websites and online forums: Online platforms dedicated to the distribution, sharing, or discussion of CSAM. They provide a breeding ground for the exchange of illegal material.
  9. Sextortion materials: Content obtained by extorting sexually explicit material from children. Children are often blackmailed into sharing such content, drawing them into a trap of abuse. It is appalling, yet a reality.
  10. Compilation albums: Collections of CSAM categorised by age, gender, or type of abuse.

Possession of Child Sexual Abuse Material is not just a moral failing but a criminal offence. Law enforcement agencies do their best to prevent these crimes, but the crimes cannot be eliminated without the active participation of every citizen. Let us all recognise the depth of the issue and not let children become victims of such inhumanity.

What does CSAM in India look like?

  1. CSAM is readily accessible through a wide range of internet technologies, spanning social networking platforms, file-sharing sites, gaming devices, and mobile apps. This widespread availability has led to an unprecedented surge in reports submitted to the CyberTipline, operated by the National Center for Missing & Exploited Children (NCMEC).
    – From 2013 to 2021, the volume of reports received by NCMEC soared from 500,000 to nearly 30 million.
    – In 2015 alone, CyberTipline reports reached 4.4 million, quadrupling the previous year’s figures.
  2. On the Dark Web, where anonymity and encryption obscure tracing efforts, a single active website dedicated to child sexual abuse had over 2.5 million registered users as of June 2021.
  3. The National Human Rights Commission (NHRC) of India has taken proactive measures following a concerning media report indicating a significant surge in the circulation of Child Sexual Abuse Material (CSAM) on social media platforms in the country. The report highlights that this CSAM is predominantly of foreign origin, with Indian investigation agencies yet to encounter domestically produced CSAM.
  4. Recognising the potential violation of fundamental human rights, particularly concerning the safety and dignity of children, the NHRC has initiated action. Notices have been issued to key stakeholders, including
    – the Commissioner of Police (Delhi)
    – the Director General of Police across all states and union territories
    – the Director of the National Crime Record Bureau (NCRB)
    – the Secretary of the Union Ministry of Electronics and Information Technology.
  5. A media report from May 2023 indicates a staggering 450,207 reported cases of CSAM dissemination in 2023 alone, of which Delhi Police has addressed 3,039 and is investigating a further 447,168. Previous years also witnessed a worrying rise in reported cases: 204,056 in 2022, 163,633 in 2021, and 17,390 in 2020.
  6. The NHRC has long been vigilant about the adverse effects of online CSAM on human rights, particularly the psychological harm inflicted on children. Past initiatives include national seminars and conferences, such as one held on March 2 and 3, 2023, and an online conference on July 21, 2020. The Commission has also issued advisories, including the ‘Human Rights Advisory for the Protection of the Rights of Children in the Context of Covid-19,’ emphasising cybercrime reporting mechanisms and digital education guidelines.
  7. In a concerted effort to address the issue comprehensively, the NHRC has facilitated discussions involving various stakeholders, including
    – international organisations,
    – government bodies,
    – law enforcement agencies, and
    – civil society groups,
    to tackle the multifaceted challenge of CSAM effectively.

Causes of Child Sexual Abuse Material (CSAM)


1. Market Demand:
Individuals with a sexual interest in children drive demand for new and more egregious images. This demand results in the continued abuse and exploitation of existing child victims, and the abuse of new children every day.

2. Grooming of Minors:
Perpetrators increasingly groom minors into sexually explicit conduct online, exploiting their vulnerabilities, including a minor’s fear of getting into trouble with parents or guardians, school, or law enforcement.

3. Lack of Parental Awareness and Technological Exposure:
Children’s comfort with technology leaves them vulnerable, as parents may not understand their children’s online activities or the protection measures available.

4. Extortion and Blackmail:
Offenders take advantage of a child’s fear, extorting or blackmailing them to create additional CSAM or pay a ransom.

5. Fear of Law Enforcement:
Many child victims do not report abuse promptly because offenders manipulate them with threats of police involvement. Even families who are aware of the abuse may fear that the child will get into trouble with law enforcement and therefore not report the crime, preventing investigators from identifying and stopping the offender.

6. Perpetual Victimisation:
Posting and disseminating CSAM online subjects children to lifelong re-victimisation. Victims suffer each time their abuse images are viewed, resulting in profound feelings of guilt, shame, and self-blame.

Learn more about the prevention of CSAM and the laws governing it in our next blog. Stay tuned!