How we can eliminate child sexual abuse material from the internet | Julie Cordua

[This talk contains mature content] Five years ago, I received a phone call
that would change my life. I remember so vividly that day. It was about this time of year, and I was sitting in my office. I remember the sun
streaming through the window. And my phone rang. And I picked it up, and it was two federal agents,
asking for my help in identifying a little girl featured in hundreds of child
sexual abuse images they had found online. They had just started working the case, but what they knew was that her abuse had been broadcast
to the world for years on dark web sites dedicated
to the sexual abuse of children. And her abuser was incredibly
technologically sophisticated: new images and new videos every few weeks, but very few clues as to who she was or where she was. And so they called us, because they had heard
we were a new nonprofit building technology
to fight child sexual abuse. But we were only two years old, and we had only worked
on child sex trafficking. And I had to tell them we had nothing. We had nothing that could
help them stop this abuse. It took those agents another year to ultimately find that child. And by the time she was rescued, hundreds of images and videos
documenting her rape had gone viral, from the dark web to peer-to-peer networks,
private chat rooms and to the websites you and I use every single day. And today, as she struggles to recover, she lives with the fact
that thousands around the world continue to watch her abuse. I have come to learn
in the last five years that this case is far from unique. How did we get here as a society? In the late 1980s, child pornography — or what it actually is,
child sexual abuse material — was nearly eliminated. New laws and increased prosecutions
made it simply too risky to trade it through the mail. And then came the internet,
and the market exploded. The amount of content in circulation today is massive and growing. This is a truly global problem, but if we just look at the US: in the US alone last year, more than 45 million images and videos
of child sexual abuse material were reported to the National Center
for Missing and Exploited Children, and that is nearly double
the amount the year prior. And the details behind these numbers
are hard to contemplate, with more than 60 percent of the images
featuring children younger than 12, and most of them including
extreme acts of sexual violence. Abusers are cheered on in chat rooms
dedicated to the abuse of children, where they gain rank and notoriety with more abuse and more victims. In this market, the currency has become
the content itself. It’s clear that abusers have been quick
to leverage new technologies, but our response as a society has not. These abusers don’t read
user agreements of websites, and the content doesn’t honor
geographic boundaries. And they win when we look
at one piece of the puzzle at a time, which is exactly how
our response today is designed. Law enforcement works in one jurisdiction. Companies look at just their platform. And whatever data they learn along the way is rarely shared. It is so clear that this
disconnected approach is not working. We have to redesign
our response to this epidemic for the digital age. And that’s exactly
what we’re doing at Thorn. We’re building the technology
to connect these dots, to arm everyone on the front lines — law enforcement, NGOs and companies — with the tools they need
to ultimately eliminate child sexual abuse material
from the internet. Let’s talk for a minute — (Applause) Thank you. (Applause) Let’s talk for a minute
about what those dots are. As you can imagine,
this content is horrific. If you don’t have to look at it,
you don’t want to look at it. And so, most companies
or law enforcement agencies that have this content can translate every file
into a unique string of numbers. This is called a “hash.” It’s essentially a fingerprint for each file or each video. And what this allows them to do
is use the information in investigations or for a company to remove
the content from their platform, without having to relook
at every image and every video each time. The problem today, though, is that there are hundreds
of millions of these hashes sitting in siloed databases
all around the world. In a silo, it might work for the one agency
that has control over it, but not connecting this data means
we don’t know how many are unique. We don’t know which ones represent
children who have already been rescued or need to be identified still. So our first, most basic premise
is that all of this data must be connected. There are two ways where this data,
combined with software on a global scale, can have transformative
impact in this space. The first is with law enforcement: helping them identify new victims faster, stopping abuse and stopping those producing this content. The second is with companies: using it as clues to identify
the hundreds of millions of files in circulation today, pulling it down and then stopping the upload
of new material before it ever goes viral. Four years ago, when that case ended, our team sat there,
and we just felt this … deep sense of failure,
is the way I can put it, because we watched that whole year while they looked for her. And we saw every place
in the investigation where, if the technology
had existed, they would have found her faster. And so we walked away from that and we went and we did
the only thing we knew how to do: we began to build software. So we’ve started with law enforcement. Our dream was an alarm bell on the desks
of officers all around the world so that if anyone dared to post
a new victim online, someone would start
looking for them immediately. I obviously can’t talk about
the details of that software, but today it’s at work in 38 countries, having reduced the time it takes
to get to a child by more than 65 percent. (Applause) And now we’re embarking
on that second horizon: building the software to help companies
identify and remove this content. Let’s talk for a minute
about these companies. So, I told you — 45 million images
and videos in the US alone last year. Those come from just 12 companies. Twelve companies, 45 million files
of child sexual abuse material. These come from those companies
that have the money to build the infrastructure that it takes
to pull this content down. But there are hundreds of other companies, small- to medium-size companies
around the world, that need to do this work, but they either: 1) can’t imagine that
their platform would be used for abuse, or 2) don’t have the money to spend
on something that is not driving revenue. So we went ahead and built it for them, and this system now gets smarter
with the more companies that participate. Let me give you an example. Our first partner, Imgur —
if you haven’t heard of this company, it’s one of the most visited
websites in the US — millions of pieces of user-generated
content uploaded every single day, on a mission to make the internet
a more fun place. They partnered with us first. Within 20 minutes
of going live on our system, someone tried to upload
a known piece of abuse material. They were able to stop it,
they pulled it down, and they reported it to the National Center
for Missing and Exploited Children. But they went a step further, and they went and inspected the account
of the person who had uploaded it. Hundreds more pieces
of child sexual abuse material that we had never seen. And this is where we start
to see exponential impact. We pull that material down, it gets reported to the National Center
for Missing and Exploited Children and then those hashes
go back into the system and benefit every other company on it. And when the millions of hashes we have
lead to millions more and, in real time, companies around the world are identifying
and pulling this content down, we will have dramatically increased
the speed at which we are removing child sexual abuse material
from the internet around the world. (Applause) But this is why it can’t just be
about software and data, it has to be about scale. We have to activate thousands of officers, hundreds of companies around the world if technology will allow us
to outrun the perpetrators and dismantle the communities
that are normalizing child sexual abuse around the world today. And the time to do this is now. We can no longer say we don’t know
the impact this is having on our children. The first generation of children
whose abuse has gone viral are now young adults. The Canadian Centre for Child Protection just did a recent study
of these young adults to understand the unique trauma
they try to recover from, knowing that their abuse lives on. Eighty percent of these young adults
have thought about suicide. More than 60 percent
have attempted suicide. And most of them live
with the fear every single day that as they walk down the street
or they interview for a job or they go to school or they meet someone online, that that person has seen their abuse. And the reality came true
for more than 30 percent of them. They had been recognized
from their abuse material online. This is not going to be easy, but it is not impossible. Now it’s going to take the will, the will of our society to look at something
that is really hard to look at, to take something out of the darkness so these kids have a voice; the will of companies to take action
and make sure that their platforms are not complicit in the abuse of a child; the will of governments to invest
with their law enforcement for the tools they need to investigate
a digital-first crime, even when the victims
cannot speak for themselves. This audacious commitment
is part of that will. It’s a declaration of war
against one of humanity’s darkest evils. But what I hang on to is that it’s actually
an investment in a future where every child can simply be a kid. Thank you. (Applause)
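The hash-and-match workflow the talk describes (fingerprint each known file once, then screen new uploads against a shared database without anyone re-viewing the material) can be sketched in a few lines. This is a minimal illustration with invented function names; it uses a cryptographic SHA-256 digest for simplicity, whereas deployed systems use perceptual hashes such as PhotoDNA so that re-encoded copies still match.

```python
import hashlib

# Shared set of fingerprints of known abuse files, contributed by all
# participants (an illustrative stand-in for a real hash database).
known_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Translate a file into a unique string of numbers: the 'hash'
    described in the talk (SHA-256 here, for illustration only)."""
    return hashlib.sha256(data).hexdigest()

def register_known_file(data: bytes) -> None:
    """Add a confirmed file's fingerprint to the shared database."""
    known_hashes.add(fingerprint(data))

def screen_upload(data: bytes) -> bool:
    """Return True if an upload matches known material and should be
    blocked and reported, with no human re-viewing the file itself."""
    return fingerprint(data) in known_hashes

# Once one participant registers a file, every participant's screening
# benefits immediately; this is the network effect the talk describes.
register_known_file(b"example-known-file-bytes")
print(screen_upload(b"example-known-file-bytes"))  # True
print(screen_upload(b"some-benign-upload"))        # False
```

The key design point is the shared set: connecting siloed hash databases means a file reported once can be blocked everywhere, which is the "connect these dots" premise of the talk.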

100 thoughts on "How we can eliminate child sexual abuse material from the internet | Julie Cordua"

  1. I don't like speeches like these, hate me for saying it, but this whole trembling-voice, tears-in-the-eyes thing has too much of a propaganda flavour to it; we've seen it all already. This woman is a prime example of why this topic is so charged: she is way too emotional to maintain a rational overview of all the risks that systems like these carry. It would be more reputable if she remained calm. If this system, which is created for "fighting evil," gets abused and leads to censorship and monitoring of any other kind, that is far worse than CP spreading on the internet. The software itself should be visible to the public, or at least to members of parliament with the relevant professional expertise in the countries where it's used, because it is in the public interest that those 12 companies use it as it was primarily intended, and do not abuse it in the interest and profit of any other companies or governments.

  2. I love this channel. I have osteoarthritis. It got noticeable when I was 26; I'm 51 now, and all the vertebrae in my whole back are crumbled. I had surgery after surgery, and I have my neck fused. They want to do one more fusion, but they'd have to crack my chest, and I'm not game. I'm tired. What frequency do I use? My C7 vertebra is broken and flipped down over my T1, and that's the one they want to crack my chest for. In the last surgery they went in from my back and removed fragments and such from my spinal cord and the nerves around it, but they couldn't get to the T1. There are like 10 doctors and surgeons who give me orders not to fall; that's what everyone says, "orders, do not fall," as if anyone wants to fall. Thank you for what you do.

  3. You can withhold permission for your camera and microphone to see and record you and still use YouTube just fine. I don't turn my camera on unless I want a picture. Your microphone is the same way; just go to your permissions and turn it off.

  4. There needs to be a frank discussion at schools in a safe protective environment with children from a young age. They need to know what exactly is not OK behaviour. They love their parents, and family, but they are the most likely abusers and children need to have a clear understanding of how they can keep their body safe and how to make bad things stop if they start happening. Parents, extended family, doctors and schools need to be educated on how to keep their children's bodies safe from potential abuse and warning signs to look for. Not talking about it does not make it go away, in fact, it's quite the opposite.

  5. Now this all sounds awesome and wholesome, but the technique used is of little use once the criminals know how it works. They change one pixel value in a video and the hash is drastically different, thus unrecognizable to the system.
    I think more sophisticated systems, like AI, should look at these media and determine whether the content is illegal or not. You do have, as stated, a huge training set available to base the AI on, so why not do that? Yes, it will be far more computationally expensive, but obviously worth the cost.
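The one-pixel objection above is real for cryptographic hashes, which is why deployed matching systems use perceptual hashes instead. The toy average-hash below (pure Python, illustrative only; real perceptual hashes such as pHash or PhotoDNA are considerably more robust) shows the contrast: nudging one pixel changes the SHA-256 digest completely but leaves the perceptual fingerprint identical.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count of differing bits between two bit-vectors; a small
    distance means 'probably the same picture'."""
    return sum(x != y for x, y in zip(a, b))

def sha(image):
    """Cryptographic digest of the raw pixel data."""
    return hashlib.sha256(str(image).encode()).hexdigest()

# A 4x4 grayscale 'image' and a copy with one pixel nudged by one level.
img = [[10, 200, 30, 220],
       [15, 210, 25, 230],
       [12, 205, 35, 215],
       [18, 220, 28, 225]]
tweaked = [row[:] for row in img]
tweaked[0][0] += 1

print(sha(img) == sha(tweaked))                           # False
print(hamming(average_hash(img), average_hash(tweaked)))  # 0
```

Matching then becomes "is the Hamming distance below a threshold?" rather than exact equality, which is what lets real perceptual-hash systems tolerate re-encoding and small edits.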

  6. Glad to hear the work on preventing child sexual abuse is strengthening. Improvements like these are how humanity can advance: improving and carving out a better world one step at a time. Well done to this organization!

  7. And how can you stop something that happened in ancient Rome and ancient Japan? It has always been there; that's the issue. Seventy years ago, 13-year-old boys married 20- to 30-year-old women in Turkey, and not because of Islam; Atatürk had made the country secular. So why did it happen? We came out of a war and there were no men left. You can't justify a 13-year-old boy marrying a 25-year-old woman, but if you ask the women who did this, they reply that there were no men; all of them had gone to the war of independence, and the families wanted to survive, so it happened. If you want to stop a 13-year-old boy from marrying, then don't send him to war. This topic is hard to discuss in the Middle East: there are men who lust after young bodies, and there is life that is happening.

  8. She is dumb. The people sharing these pictures and films are not on the "normal" internet. Everything is done on the dark web, and she knows it. What she proposes is that the government gets access to spy on us. And which 12 companies is she talking about? Please inform me if I am wrong.

  9. Be careful with technology and how it controls us all. There are both good and bad people and I believe in this cause but it's dangerous to be monitored by people without our knowledge. Most of the world is "free" and I personally want to feel safe in a society but also personally. This kind of mass surveillance is a dystopian future written about in books and tragic events in history. Peace and love people.

  10. Hard to do when the president of your country could be involved. What happened to that case with the 13-year-old tied to a bed by Epstein? Her life was threatened and she dropped the case? I want to know.

  11. While I support this idea, I'm afraid what it can do in the wrong hands.

    Sure, this focuses on child abuse, but it can be adapted to different things, such as tracking down whistleblowers and journalists with confidential documents trying to uncover corruption. Sure, things might be fine in your country right now, but what about after a few generations? Things might have changed from democracy to authoritarianism behind the scenes.

    This solution lacks long-term perspective. People are going to go deeper underground, change files to generate different hashes, and so on. And there is a chance of false positives unless a human reviews the content. But how will they review billions and billions of files if even YouTube can't handle the volume of videos being uploaded? And what about privacy? This is just not a good solution overall.

  12. If 30% are being recognized, that means there are a lot of people watching. But glad our society is FINALLY talking about it.

  13. It's weird to me how many conservatives and billionaires are straight-up pedos. Look at what Trump did during all the teen pageants he spent hundreds of millions buying up all over the world.

  14. Pedophiles who are caught should be stripped of the internet, and all public non-password internet should require a free login with your ID. We need to remove internet routers that have open access and force everyone to log in with ID to use the internet. ONLY child porn and murder content should put you under an internet ban. Radical speech, or as they call it, hate speech, should be allowed, because we don't want to live in a politically controlled world that strips your access for disagreeing with or hating an ideology.

    For the pedophiles who produce and publish the content: they should get the death penalty so we don't waste taxes on keeping them alive.

  15. Plain and simple: get this government out of power that preys on children and pedophiles alike; to them it is a means to an end, nothing less, nothing more … these kids are pawns in their sick game of control and influence … it would not be there if they did not want it to be … time to call out the CIA and the deep state for their sick games. Epstein is at the core of this problem, along with many others …

  16. 🦋 People are sick; kids are the most pure and innocent thing in the world. People need to stop being so greedy and should stop turning a blind eye to this.

  17. As far as I know, the problem with hashes is that if you change the file even a little bit, the hash changes as well, or at least has a high probability of changing, so the files may be easy to mask from a hash search. Or the files can be completely encrypted, and then they can't be recognized at all.

    But one could train a neural network on known material to tag content that's not encrypted, and run that neural network against unknown material. That way it would be possible to tag new material with a high level of effectiveness without any human having to check all the material to find it. Facial-recognition neural networks might also help in finding the victims faster.

  18. Even the slightest modification of the picture will create a completely new image with a unique hash, which this multimillion-dollar technology cannot identify. Such an expensive idea, but so easily circumvented. Also, I don't see how this policy of checking an image's hash before uploading can be enforced on darknet sites. They're called dark for a reason, you know.

  19. Stopping the upload works, sure, but the source isn't the upload; it's the perpetrator, the uploader. Surely the more effective way is to let the uploader upload, let the trail lead to him or her, and then disable the upload and the sharing. That way, the root is eliminated.

  20. It's such an important problem, I think. Even if people upload it just out of curiosity, it is going to enable abuse in a bad way, so it should be controlled by every single operator of a website.

  21. 🦋 This is truly appalling to me. I would love to share what I've learned with you and hopefully help you put the smack down on this. Not sure if I can help, but I would love to.

  22. Well, I'll tell you how, as an observer of a chaotic world. Rapists disgust me, but child sexual exploitation will continue to happen as long as society disregards the needs of men. No one is born evil; it's their environment that creates that. In the past it was a guarantee that each man got a wife at a certain age. Men are forbidden from being cared for or loved anymore, so subconsciously they take away the innocence of a child, like revenge or something. Child harassment is a serious issue; society needs to provide health care for mentally ill men.

  23. Yeah, let's make people more afraid of this non-epidemic and lie to them, saying we know how to stop it. TED talks have really gone downhill.

  24. Much love & energy to these guys & their efforts.

    Epstein didn’t kill himself. But he was one of the top pimps for child sexual abuse, serving the power elite of this world.

  25. I wish she would get to the point; emotional pleading doesn’t work on me.
    State what we can do in bullet points; otherwise it’s the police’s problem.

  26. People will spend money on meaningless, shallow stuff even when they can't afford it, but won't donate to things like this. That's sad.

  27. Pseudointellectuals say filming kiddie diddling is bad, but kiddie diddling itself shouldn't be stigmatized.
    Watch previous material from this channel if you don't believe me

  28. The problem is we're too politically correct and too humane. Catching a pedophile raping a 12-year-old girl or boy warrants horrendous acts of violence, broadcast for all the pedophiles to see. Don't just throw him in jail where no one hears about how horrible his life is; broadcast the violence for all to see and make people think twice about being predators of children.

  29. You've completely ignored the single biggest source of this type of material. The dark web is where all of this begins. You admitted that, but didn't talk about any solutions.

  30. People who rape children and post it online are the worst of humanity. I donated $17. I hope you will be successful in your mission.

  31. The honest truth is, it's more common than we think. Sexual abuse is everywhere. From the internet to our schools. This needs to stop.

  32. What the * is wrong with some people. Yes, we should be considering prevention, but that's not this woman's job, and until they find a way to better prevent this, we need more people like her!

  33. If you want to build a system that works, you must think with your brain.

    In that first case, how would this tool have helped the federal agents? If that material was being carried by a large, conscientious carrier, then why didn't the agents just go to that carrier with a warrant and ask for help? And if it wasn't, then how could they have used this tool?

    And just try writing out a description of this tool without specifying the nature of the material of interest. Go ahead, see how it looks. And then spend a minute or two thinking about how you might evade such a system, if you were technically skillful and stood to make a lot of money that way. Then ask yourself what good this system really is, and whether you want it on the platforms you use.

  34. Matthew 18:6 …. But whoever causes one of these little ones who believe in me to sin, it would be better for him to have a great millstone fastened around his neck and to be drowned in the depth of the sea.

  35. Science has found that pedophilia is a disease people have from birth. The people who make this content are ill, as are those who watch it. They need to be restricted from society and given the required treatment. Unfortunately, the internet lets such people stay unnoticed and lets this realm thrive. Ideally, we shouldn't merely condemn pedophilia; as a disease, it calls for proper treatment, and of course punishment for those who opt to abuse children rather than admit the problem and get treatment.

  36. We don't have to wait. We can take care of ourselves NOW. OurName4Freedom .ws / BEST WAY TO FUND YANG / Common sense is WeSovereign .ws / We The People

  37. You'll need to build software more sophisticated than hashes. Adding noise to the same video can cause even sophisticated deep-learning models to fail to tag the material as child abuse – adversarial examples. If a person knows how current SOTA algorithms tag videos or how hashes work, they can get away with it. This is a step in the right direction, but I really hope hashing isn't the solution you guys are proposing. Correct me if I'm wrong.

  38. This should cover both child and animal protection, for sure. Many of these disgusting acts start with animal abuse first. If the system, and the LAW(!) in every country, can focus on animal abuse as well, many sick-minded people could be stopped early and actually taken away from society.

  39. They say you can do whatever you want as long as you are not harming others; phrases like these are the reason behind the dark world we are creating.

  40. Well, if we remove the content from the internet, how would we identify the perpetrators? At least on film you can identify the criminals. Without the web, they'll just do it under the sheets.

  41. I'm sad to say it, but it cannot be done. I can see why the internet is associated with freedom. It is impossible. If we want to stop the ongoing production and distribution of this disturbing content, we'd have more luck trying to stop producing paedophiles.

  42. I was sexually abused as a young child and I still have PTSD dreams based around it at the age of 32. I hope people will eventually be able to stop this stuff from happening.

  43. It's very good for the normal, common web portals, but what about the huge amount that is on the dark/deep web? How can we prevent this kind of data if we can't even access or filter it? The method is certainly very good, but the internet isn't just what you can find with a Google search; it's many times bigger than what we can access normally, and it's in this deep and layered internet space that this content is most used overall.

  44. How could anyone downvote this? I've followed this charity for years, and they're doing incredible things. If you check Charity Navigator, they are also very good at spending their budget appropriately.
