AI Makes Child Porn Too – eCrimeBytes Nibble #37

With every technological advancement, someone will eventually come along and exploit it. Well, those using AI have been caught deepfaking child porn.

At first I thought this was a case of creating the material from scratch, but it’s much worse than that. Steven Larouche of Canada used AI technology to put real children’s faces, taken from innocent online posts, onto the bodies of victims in previously created child pornography.

So imagine you have a child and they post pictures on their social media accounts. Larouche would research your child in depth and use their public images to make new deepfaked child pornography of your child combined with the bodies of previously created child pornography victims.

This victimizes both your child and the child depicted in the original material.

I thought this snippet was interesting, investigatively:

Many images of child pornography have a digital fingerprint, allowing police to identify them, the judge wrote. By creating the synthetic images, Larouche made it more difficult for police to stop the spread of the illicit material.

https://montreal.ctvnews.ca/quebec-man-sentenced-to-prison-for-creating-ai-generated-synthetic-child-pornography-1.6372624
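The “digital fingerprint” the judge describes is typically a hash of the file’s contents. As a rough illustration (not a description of any specific law-enforcement system), the sketch below uses Python’s standard `hashlib` to show why edited images evade simple hash matching: changing even one byte of a file produces a completely different cryptographic hash, so a lookup against a database of known fingerprints no longer matches. The file bytes here are a made-up stand-in, not real data.

```python
import hashlib

# Stand-in bytes for a file that investigators have previously hashed.
original = b"example image bytes" * 100

# The "digital fingerprint": a SHA-256 hash of the file's contents.
fingerprint = hashlib.sha256(original).hexdigest()

# Alter a single byte, as any edit to the image would.
modified = bytearray(original)
modified[0] ^= 0x01

new_fingerprint = hashlib.sha256(bytes(modified)).hexdigest()

# Even a one-byte change yields a completely different hash.
print(fingerprint == new_fingerprint)  # False
```

In practice, systems for detecting known illicit images often use perceptual hashes (such as Microsoft’s PhotoDNA) that tolerate small edits like resizing or recompression, but a face swapped onto a different body is a substantially different image, which is why the judge noted the synthetic material is harder for police to track.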

Larouche asked for leniency because his victims were not assaulted, but luckily the judge was not having it:

Larouche also admitted to possessing more than 545,000 computer files containing images or videos of child sexual abuse, some of which he made available to others.

Among those images was a series of one girl being abused over a seven-year period, between the ages of seven and 14. On Larouche’s computer, police also found photos of her from her social media accounts and personal information about the child, including her real name, the town where she lives and the name of her school.

In total, Gagnon sentenced Larouche to eight years in prison. With credit for time served, Larouche will be required to serve another five years and 11 months.

https://montreal.ctvnews.ca/quebec-man-sentenced-to-prison-for-creating-ai-generated-synthetic-child-pornography-1.6372624


Transcript:

[00:00:00] Keith: With every technological advancement, someone will eventually come along and exploit it. Well, those using artificial intelligence have been caught deepfaking child porn. At first, I thought this was a case of someone just trying to create child porn from, you know, just regular porn, but it is worse than that.

[00:00:22] There’s an individual in Canada. His name is Steven Larouche. He used AI technology to take real children’s faces from innocent online posts that they share, you know, just normal family posts, and basically stitch them into other previously produced child pornography.

[00:00:46] So imagine you have a child, and they post pictures on their social media accounts. Larouche would research your child in depth and use their public images to make new deepfaked child pornography of your child, combined with bodies of previously created child pornography victims. This victimizes not only your child, but also the original child who was forced into the original child pornography material. So, you know, I did my usual research and I found an interesting investigative snippet in one of the news articles that I thought I would read to you. It says, many images of child pornography have a digital fingerprint allowing police to identify them.

[00:01:34] The judge wrote, By creating the synthetic images, Larouche made it more difficult for police to stop the spread of illicit material. So you can imagine the fingerprint of the image. Typically, with these digital fingerprints of images and other files in computer forensics, once you start changing little bits and pieces of the file, it changes that fingerprint.

[00:01:59] And so if you have this fingerprint of some child pornography from a year ago and Larouche goes along and picks a child off the internet and pastes their face onto the old child pornography, the digital fingerprint is now gonna be new. So there’s not an easy way for detectives to just easily say, is this fingerprint on this drive anymore?

[00:02:22] They have to go and look at everything on the drive with their own eyes to determine if it’s child pornography material or not. Well, in court, Larouche asked for leniency because he said his victims were not assaulted. But luckily it sounded like the judge was not having it. So from another article I found, it says Larouche also admitted to possessing more than 545,000 computer files containing images or videos of child sexual abuse. I had to pause because I can’t even imagine having a thousand images, let alone over half a million, some of which he made, and some of which he just got off the internet. Among those images was a series of one girl being abused over a seven-year period, between the ages of seven and 14. On Larouche’s computer, police also found photos of her from her social media accounts and personal information about the child, including her real name, the town where she lives, and the name of her school. In total, the judge sentenced Larouche to eight years in prison. With credit for time served, Larouche will be required to serve another five years and 11 months.

[00:03:47] I really don’t like taking the sex crime cases, but I chose to take this one because of the AI twist on it, and I hope, I really hope that this does not become a trend. So if you enjoyed this quick eCrimeBytes nibble, I hope you join us on one of our longer eCrimeBytes episodes where we take a look at a crime like this.

[00:04:08] Not necessarily a sex case, but a computer crime case, and we go in depth into it for about 30 to 60 minutes and pick it apart. When I say we, it’s myself and my co-host Seth Eichenholtz. So if you enjoyed this nibble, I hope to see you over there on one of those longer episodes soon. Thanks. Bye.

#ecrimebytes #electronic #truecrime #podcast #ai #porn #cyberstalk
