
Spike in creation, circulation of child porn calls for stronger law enforcement

Illustration: Bara Bhaskaran

The photograph was shot on what seems like a hillock. A girl who looks about nine years old squats on the ground. She casually holds an open book, unmindful that it is upside down, and looks into the camera with her eyes empty and her face blank. Next to her stands a thin boy, in his teens, holding the branch of a tree. He looks malnourished and his smile appears fake. Both children are naked.

It did not take Additional Director General of Police Manoj Abraham long to identify the location in the photograph as a rural area near Kochi. His team, which has been at the forefront of Kerala’s special drive against child pornography, traces such children by visiting the locations, but the parents often refuse to acknowledge that such a thing has happened. “We know it exists, and it is more widespread now than ever with cheap and widely available access to smartphones and the internet,” said Abraham. “Child porn is a transnational crime; it is affecting every state, every country, and nobody is immune to it because sexual predators, especially paedophiles, exist everywhere.”

In the past 18 months of the pandemic, these predators have been thriving. In March, the United Nations Human Rights Council noted that the Covid-19 pandemic amplified the risk to vulnerable children from trafficking and sexual exploitation. While presenting a report on the impact of the coronavirus disease on different manifestations of the sale and sexual exploitation of children, the special rapporteur to the UNHRC said that the pandemic had “changed the pattern of sexual exploitation in which perpetrators were operating to produce, disseminate or consume child sexual abuse materials online.”

Cyber police teams in different states acknowledge that the circulation of child sexual abuse materials, or CSAM, has picked up exponentially since the pandemic struck. “The CSAM cases increased during the pandemic months in India to around 150 per cent on the dark web and almost 300 per cent on social media platforms,” said Abraham. “The disturbing trend was the transmission of videos of local boys and girls and also live video sessions with children.”

The Maharashtra Police arrested about a hundred people and registered some 200 cases in connection with the creation and circulation of CSAM during the pandemic. Sanjay Shintre, superintendent of police, Maharashtra Cyber, said the cases were registered based on the “tipline reports” shared by the National Crime Records Bureau (NCRB), which had received them from the US-based National Center for Missing and Exploited Children (NCMEC), which regularly monitors websites, search engines and social media platforms for CSAM.

Between September 2019 and January 2020, more than 25,000 items of CSAM were uploaded across social media platforms in India, according to reports shared by NCMEC with NCRB. The Kerala Police Cyberdome, the nodal agency for investigating cybercrimes in the state, has partnered with Interpol and the International Center for Missing and Exploited Children (ICMEC) to combat child porn. It arrested 21 people in its first raid in April 2019 and many more in the subsequent ones.

NCRB data says cyber pornography cases that involved hosting and publishing obscene sexual material of children numbered close to 50,000 in 2019. “Two years later, the problem has increased manifold because so many new platforms and channels have emerged online,” said Deputy Commissioner of Police Balsing Rajput, who helms Operation Blackface, Maharashtra’s drive against child pornography. “The problem is that if we bust a racket, in no time another one comes up. We arrested a vegetable seller, Hariprasad Patel, who was caught sharing videos of CSAM on Facebook. He was back at it in no time on a different profile,” said Rajput.

Child pornography is a billion-dollar business driven by the heavy demand for it. On the dark web, the dirty underbelly of the internet, these transactions are done through cryptocurrencies and other means that are difficult to trace. And, most of those who create and circulate this content are sophisticated young adults.

Dr Debarati Halder, a cyber victim counsellor practising in Gujarat, gives a clear profile of the typical accused charged with creating, viewing, downloading, circulating or forcing a child to view CSAM: well-educated, outwardly decent, mostly with an IT background, and adept at keeping himself anonymous. He often hails from a middle-class or upper-middle-class background. All 21 people whom Abraham arrested in his first raid were “very well-educated young men in the age group of 25 to 35 working in reputed companies”.

Halder said the accused almost always knows the child victim very well, as a relative, friend or neighbour. Being a parent herself, she has observed how during the pandemic-induced online learning, the daughter of a close friend went into depression after she got bullied by classmates who took screenshots of her attending classes in casual clothes and circulated them on WhatsApp groups with cheesy comments. “This is scary because now there is no alternative to virtual learning and the options for self-protecting mechanisms remain limited,” said Halder.

Experts say children need to be actively protected, by technological means as well as by physical supervision, against the perils of the internet. Activist and social worker Uma Subramanian’s NGO, Aarambh India, runs an internet hotline in the fight against CSAM. It was launched after Rahul Trehan’s case came up six years ago. Trehan, who was 15 then, lived with his parents in suburban Mumbai. One day, he received a friend request on Facebook from a woman. He accepted it, began chatting with her and they became friends. After a few months of online interaction, the woman asked the boy to share his nude pictures. Despite initial hesitation, he obliged. The next day, the woman started blackmailing him, saying that she would put the pictures on pornographic websites unless he had sex with his neighbour, filmed it and sent it to her. The neighbour, an adult male, said he was also trapped by the same woman. The ‘sextortion’ went on for a few months, taking a toll on Trehan’s physical and mental health, until he told his parents about it. Investigations followed, and what was initially assumed to be a case of physical sexual abuse turned into one involving child porn after it was established that the neighbour himself had impersonated a woman to blackmail the child into having sex with him.

“It is an enormous challenge to break into the internet’s underbelly where all sorts of crimes take place,” said Nandakishore Harikumar, director and CEO of Technisanct, a big data cybersecurity startup that carries out research on the dark web and creates tools that can detect CSAM. “Policing it requires special skill sets.”

Most criminals use multiple identities, and at times even stolen identities, to circulate or sell child porn. They constantly identify new platforms and host their own servers to dodge the radar of global surveillance agencies. “It is a huge mafia which is very organic. It is always on the move; payments are in Bitcoin so that nobody can ever trace them. We cannot even identify which country they are operating from. The use of Tor further complicates it. It makes the IP address bounce from one country to another so that you can never pinpoint the exact location,” said Harikumar.

Ironically, the same tools used to protect users’ privacy are being used by these criminals to hide from law enforcers. Messaging app Telegram, for instance, has emerged as one of the most widely used platforms for the circulation and consumption of CSAM. “You can host a Telegram group using a VPN network. This makes it very difficult for the law enforcement agencies because they can never find out the real identity of the culprit,” said Harikumar.

Abraham’s team came across a Telegram group in Kerala created to share child pornography. Cracking it was not easy because the data was encrypted end to end and the users remained anonymous. That was when the police thought about infiltrating the group. “We got in as a member of a Telegram group called Butterfly with 3,000 members. This had scores of pictures of local children, rape videos, morphed pics and videos of child artists and animated CSAM. So it became all the more important to bust this racket and trace the administrator, Sharafuddin,” said Abraham. The police sent him malicious code disguised as child porn to pique his interest. The moment he clicked on it, the malware infected his device and the police got his IP address. He was arrested with technical help from Interpol and ICMEC.

In 2020, Cyber Peace Foundation, a think tank, carried out an investigation into the use of chat groups to circulate CSAM. “Telegram and WhatsApp are principally different in the way that their groups and channels operate,” said Vineet Kumar, the organisation’s founder and president. “The similarity, however, is that one can create group/channel invite links similar to WhatsApp’s to get people to join them without any restriction. An interesting finding was that there were quite a few channels dedicated to pornography with region-specific content from India, with the number of subscribers crossing 50,000 in some cases. The content shared included videos and pictures of child sexual abuse and physical violence against children, and messages offering services like video chat with children for Rs500 for 10 minutes and sexual intercourse for Rs5,000.”

Keeping track: Additional Director General of Police Manoj Abraham and his team have been at the forefront of Kerala’s special drive against child pornography | AFP

Technical challenges, delays in getting reports of CSAM from international agencies and the slow trial system in India are the major hurdles in tackling cases involving child pornography. An IPS officer in Thiruvananthapuram said at times it was difficult to even prove that there was an offence. “What happens is that CSAM is viewed through bots (or robots, which are software programmes that perform automated predefined tasks). When we went to the court with a case involving a man who was into CSAM through a bot, the court dismissed it saying there was no real person involved, hence there was no offence.”

Victim identification is another challenge. “Generally we can’t use photographs of girls for victim identification because that can become a problem for them. Abroad, agencies have victim ID software, but in India, it is not there yet,” said Rajput.

Then there is the larger concern of protecting the victim. “It is not just about the prosecution of the offender,” said Vidya Reddy of Tulir Centre for Prevention and Healing of Child Sexual Abuse in Chennai. “The offender will be arrested and jailed, but what is happening to the child reeling from this is also equally important.”

A study conducted in 2020 by the NGO Child Rights and You on online safety and internet addiction among 630 school-going adolescents found that 80 per cent of the boys and 59 per cent of the girls had social media accounts; 63 per cent said they accepted friendship/connect requests only from people they knew, while the rest said they accepted requests from friends of friends or strangers. The latest data released by the NCMEC CyberTipline says that in 2020 it received more than 21.7 million reports from electronic service providers and social media platforms pertaining to CSAM, online child sex trafficking and online enticement. Most of them (20.3 million) were found on Facebook, about 5.5 lakh incidents on Google, close to 1.5 lakh on Snapchat, 65,000 on Twitter and 22,000 on TikTok.

A spokesperson for Facebook said it had deployed sophisticated technology across all its platforms (Facebook, WhatsApp and Instagram) to proactively find and remove child exploitative content and worked with local and international law enforcement to take action on perpetrators. “We have absolutely zero tolerance for any behaviour or material that exploits young people online,” said the spokesperson.

In order to establish stricter norms and ensure that the purveyors of child pornography do not get a step ahead of the regulators, the Ad Hoc Committee of the Rajya Sabha instituted by Vice President M. Venkaiah Naidu last year made 40 recommendations. These included amendments to the Protection of Children From Sexual Offences Act, 2012, and the Information Technology Act, 2000, besides technological, institutional, social and educational measures and state-level initiatives to address the issue.