Thank you all for being here today. I want to begin by thanking my colleagues who have worked closely with me on this legislation. And I’d especially like to thank Senator Blumenthal for his partnership in this effort.
The sexual exploitation of children online is a serious problem. The current system is not working.
This past July, I heard from several witnesses that existing laws and current enforcement efforts are falling short.
In the past two decades, reports of child sexual exploitation online have exploded. According to the National Center for Missing & Exploited Children, also known as “NCMEC,” we are witnessing a rapid increase in online child exploitation.
Here are some facts:
I want to be clear about the type of victimization that we’re talking about here: This is child sexual exploitation, most of which involves pornographic images and videos of children. Our bill changes the term “child pornography” to “child sexual abuse material” throughout the federal code, because that’s what it is. Images of children being harmed. Each time an image or video is shared, it’s as if the child is being abused all over again. It is graphic and it is hard to hear. It represents some of the most horrendous criminal acts and takes a toll on us all.
In 2017, a 23-year-old daycare worker pleaded guilty to charges related to child exploitation and child pornography, and was subsequently sentenced to 30 years in prison. She sexually abused a 5-year-old child in her care and filmed the abuse. Then, she shared the abuse material, mainly over a messaging app called "Kik," with men she met on other messaging apps and online forums, including Craigslist.
Another case involved a 16-month-old toddler, who was sexually abused by a 23-year-old man. He took photos and videos of the assaults and distributed them on a social media app. He later admitted to choosing the young child because she did not have the ability to communicate that she was being abused.
These despicable crimes committed against innocent children should shake us to our core.
To the victims, survivors, and their families: I want to extend my heartfelt sympathy to you. No one should ever have to go through what you have.
Today, it's my hope that we can take the first step toward real change to better protect minors online. First, we have to recognize where the system is failing.
The previous testimony before our committee, the reports we have seen, and the stories we have heard all tell us one thing: Section 230 of the Communications Decency Act has failed to incentivize tech companies to sufficiently address this serious issue.
The numbers make this point clear. Of the 16.9 million reports to the NCMEC CyberTipline in 2019, 15.8 million came from Facebook. By comparison, in 2019, Twitter made around 46 thousand reports and Snapchat made just over 82 thousand. It's hard to believe that exploitation and the sharing of abusive material occur at such low rates everywhere other than Facebook.
This is not just a social media problem. What about cloud storage services? There are countless images and videos in the cloud. Amazon made eight reports to the CyberTipline last year. That’s not a typo, just eight reports. Another example is Apple with only 205 reports. And the list goes on and on.
The liability protection afforded to these companies by Section 230 allows Facebook to make 15.8 million reports, and Amazon to make only eight.
While federal prosecutors can still bring criminal cases against these companies, such prosecutions are rare, and that deterrent, without a civil avenue for survivors, offers little comfort.
And so, working with groups and stakeholders, we introduced the bipartisan EARN IT Act with 10 cosponsors.
I’m proud that our bill has the support of more than 70 groups, survivors, law enforcement and other stakeholders. That includes NCMEC, Rights4Girls, the National Center on Sexual Exploitation, the National District Attorneys Association, the National Sheriffs' Association and ECPAT-USA.
It’s my hope that after today we’ll gain even more support.
This bill is a major first step. For the first time, you will have to earn blanket liability protection when it comes to protecting minors.
The EARN IT Act has two primary components. First, it establishes a national commission to develop best practices for preventing online child sexual exploitation. Second, it amends Section 230 so that companies must earn their blanket liability protection for claims involving child sexual abuse material, rather than receiving it automatically.
Our goal is to do this in a balanced way that doesn't overly inhibit innovation but forcefully confronts child exploitation.
Now, I also want to take on some of the concerns that have been raised about this bill.
One of the primary concerns raised by critics, like the ACLU and the Center for Democracy and Technology, is that the commission will ban end-to-end encryption or mandate a backdoor for law enforcement.
This is not a backdoor encryption bill, although encryption is a serious issue in the enforcement of child sexual abuse material laws. The purpose of the EARN IT Act is to encourage companies to proactively address what is happening on their services. Our goal is to bring these companies to the table and give them a choice: prevent the exploitation of children online and stop the proliferation of child sexual abuse material, or face the same liability as every other industry in America. If technology has taught us one thing, it's that these companies do have the ability to prevent and disrupt child sexual exploitation, design product features to stop the solicitation and grooming of minors, and coordinate across platforms.
There is nothing in the legislation requiring the commission to ban end-to-end encryption. If the commission decides to address encryption, there are significant safeguards, including Congressional approval, to ensure the approach is reasonable and considers the impact on data security and privacy.
More importantly, compliance with the best practices is voluntary. The EARN IT Act gives companies options, with different benefits and burdens. If a company disagrees with the commission's possible approach to encryption, it is free to offer encrypted products and services and take its chances in court like every other American business.
It is important to remember that Congress enacted Section 230 to incentivize the filtering of objectionable material. End-to-end encryption without any safeguards effectively eliminates a service provider's ability to perform any type of content moderation, and turns a blind eye to illicit material being traded or shared on its service.
As I said at our hearing in July 2019, things would change tomorrow if tech companies could get sued. The problem with simply removing blanket liability protection altogether is that you'd most likely prevent new companies from getting started and hurt some of the older businesses that already exist. No company will ever catch every piece of child sexual abuse material. But tech can't continue to allow tens of millions of illicit files to be traded.
So we have to strike a balance. I want to protect those companies that earn their immunity by actively working to combat child exploitation and allow the bad actors to be taken down.
As we go through this hearing and continue the debate about this bill, let’s remember who we’re trying to protect. Today we’ll hear more heartbreaking stories of children being abused and exploited. Think of all the stories we don’t know. I’m hopeful that we will see eye to eye on this issue so that we can come together to bring justice to vulnerable children.