We are in a situation where the legislation is decades behind: Glen Pounder

Interview/ Glen Pounder, chief operating officer, Child Rescue Coalition

Committed to care: (From right) Pounder; Carly Yoost, CEO and founder of Child Rescue Coalition; Bill Wiltse, president

The internet has democratised freedom of expression around the world, but it has also spurred a rise in harmful and illegal content, including child sexual abuse material (CSAM), giving offenders the opportunity to access, possess and trade images and videos anonymously. Glen Pounder, chief operating officer of the Child Rescue Coalition, says he has evidence suggesting that consumption of CSAM worldwide increased by more than 50 per cent during the pandemic. Excerpts from an interview:

Q/ Is there any data on how CSAM has increased in India during the pandemic?

A/ We did an analysis of the consumption of CSAM during the pandemic across the world. There was an initial drop in the networks we monitored, but over the following months we saw a significant increase, which varied from country to country. There is no doubt that in some countries the number of suspects rose by 60 per cent. Anecdotally, of course, some bad guys in these networks were discussing that the pandemic was indeed a great time for them.

Q/ What are the different kinds of CSAM offenders on the internet?

A/ Some enjoy grooming children online with a view to meeting those children and sexually abusing them. Other offenders never intend to meet the child. They will pretend to be similar to the child, the same sex and the same age, and they will groom them and exchange images. As soon as the child sends images that could be embarrassing, they take it to the next stage: they coerce the child with threats to give more and more, and in some horrific cases those children are compelled or groomed by the adults to commit offences on younger children to whom they have access.

Q/ Which are the most commonly used social media platforms by predators?

A/ That is hard to say. It is said that Facebook may be the most common platform because it regularly sends CSAM reports to the National Center for Missing & Exploited Children (NCMEC), as it is mandated to do by law once it becomes aware of the material. But this does not mean that Facebook is the most popular platform for bad guys. There may be other platforms that are being used much more, but we do not know about them because technology companies, many of which are in the US, are not compelled by US law to look for this material, and so they report little or nothing.

I think well over 100 companies report to NCMEC, but against that there are millions of online providers and apps that do not. In NCMEC's 2020 data, only a limited number of companies reported and provided cyber tips. So we are in a situation where the legislation is decades behind.

But now that we know the technology exists, we must make sure that companies look for potential predators on their platforms. They are making millions of dollars from the consumers who use their platforms, so why not provide a safety net? Maybe half a per cent or one per cent of a year's profits could be allotted to this.

Q/ You are saying apps and games are creating secret ways into our children’s lives and the tech companies are not doing anything about it.

A/ That is correct. Parents really need to know what makes those apps a secret tunnel into their kids' lives. Does the app allow private chat? Does it share location? Does it require a phone number? Does the child know whom he is speaking to online? If he does not, that is the first red flag. Because the companies are not compelled to keep their apps safe, most of them care only about making money; why would they spend money on protecting children when that is not profitable? Talk to your children and tell them that as soon as they send an image onto the internet, it is gone forever; they will never be able to undo it. It is important to communicate with kids.

Q/ Any startling observations?

A/ There is a case where our system detected just one CSAM file attributed to one man who had been consuming it. Because of that, the police were able to locate the man's house, and that is when the investigation started. In the course of the investigation, they established that the father in that house was raping his own 18-month-old child while his wife was away at work. This man may never have been identified by traditional methods of snooping on Gmail or Facebook.

The statistics are shocking: between 55 and 85 per cent of people who consume CSAM are already committing offences against children in person. Between 80 and 90 per cent of children who are sexually abused are abused by somebody they already know. And it is a myth that child abusers are mostly middle-aged or older; I know of so many cases where the abuser is just 21, exploiting a two-and-a-half-year-old child. Once a person has been arrested for CSAM, he or she should be kept away from children and should never be allowed to care for them.

I know there are predators trading in CSAM in India right now using these apps, and with a population of 1.3 billion, India is worse off than any other country. But this is a global problem. One of the common stunts of these tech companies is to say, 'We don't know what is going on inside the app, and hence it is not our fault.' Telegram is an example of an app where it is absolutely guaranteed that thousands of CSAM files are being traded in India at any given point in time, even right now as we speak.