Durbin Presses Big Tech CEOs to Protect Kids from Sexual Exploitation Online During Senate Judiciary Committee Hearing
January 31, 2024
Durbin questions CEOs of Discord, Meta, Snap, TikTok, and X; X CEO commits to supporting Durbin's STOP CSAM Act
WASHINGTON – U.S. Senate Majority Whip Dick Durbin (D-IL), Chair of the Senate Judiciary Committee, today pressed the CEOs of Discord, Meta, Snap, TikTok, and X (formerly known as Twitter) during a Senate Judiciary Committee hearing that examined Big Tech's failures to protect kids from sexual exploitation online. During the hearing, Durbin received a commitment from X's CEO, Linda Yaccarino, that X will support Durbin's Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2023 (STOP CSAM Act), legislation that supports victims and increases accountability and transparency for online platforms.
Durbin first questioned Jason Citron, CEO of Discord, about the Senator's STOP CSAM Act and whether online platforms, such as Discord, should be held liable for knowingly distributing child sexual abuse material (CSAM).
"Let me get down to the bottom line here. I am going to focus on my legislation on CSAM. What it says is: civil liability if you intentionally or knowingly host or store child sexual abuse materials or make child sex abuse materials available. Secondly, intentionally or knowingly promote or aid or abet violation of child sexual exploitation laws. Is there anyone here who believes you should not be held civilly liable for that type of conduct?" Durbin asked.
Mr. Citron responded that there are "parts of the STOP CSAM bill that are very encouraging." He also committed to Durbin that he would discuss the STOP CSAM Act with him in the future. Durbin responded that he would like to meet with Mr. Citron about the bill.
Durbin continued, "I would sure like to do that [meet with Mr. Citron] because if you intentionally or knowingly post or store CSAM, I think you ought to at least be civilly liable. I cannot imagine anyone who would disagree with it."
Durbin then questioned Evan Spiegel, Snap's CEO, on the company's failure to address Snapchat's long-known use as a tool for sending sexually explicit content.
"It's never been a secret that Snapchat is used to send sexually explicit images. In fact, in 2013, early in your company's history, you admitted this in an interview. You said that when you were first trying to get people on the app, you would go up to people and be like: 'Hey, you should try this application. You can send disappearing photos.' And they would say: 'Oh, for sexting,'" Durbin said.
Durbin continued to say that as early as 2017, law enforcement had identified Snapchat as pedophiles' go-to sexual exploitation tool. During his questioning, Durbin cited the case of a 12-year-old girl, identified in court as L.W., who was targeted while using Snapchat. Over two-and-a-half years, a predator sexually groomed her by coercing her into sending him sexually explicit images and videos over Snapchat. The man admitted that he used Snapchat with L.W., and not any other platforms, because he "kn[e]w the chats [would] go away."
"Did you and everyone else at Snap really fail to see that the platform was the perfect tool for sexual predators to exploit children? Or did you just ignore this risk?" Durbin asked.
Mr. Spiegel responded that Snap provides in-app reporting tools for those who are being harassed. He stated that Snap typically responds to those reports within 15 minutes.
Durbin continued to press Mr. Spiegel about L.W.'s case, asking whether tech companies should be held to the same standards as other companies, particularly with respect to civil liability.
"When most companies make a dangerous product, they face civil liability through our tort system. But when L.W. sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that, had Snap faced the prospect of civil liability for facilitating sexual exploitation, the company would have implemented better safeguards?" Durbin asked.
Mr. Spiegel responded that he believes Snap does have safeguards in place for children using its platform.
Durbin then asked Mr. Citron about Discord's hands-off approach to safety. According to Discord's website, it takes a "proactive and automated approach to safety" only on servers with more than 200 members. Smaller servers rely on server owners and community moderators to define and enforce norms of behavior.
"How do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, the trading of CSAM, and sextortion?" Durbin asked.
Mr. Citron responded that their goal is to "get all of that content off of our platform and ideally prevent it from showing up in the first place." He went on to say Discord recently launched a program called "Teen Safety Assist," which lets a young user know if they are "in a situation that may be inappropriate so they can report that to us."
Durbin responded, "Mr. Citron, if that were working, we wouldn't be here today."
Mr. Citron stated CSAM is an "on-going challenge for all of us [and] that's why we're here today." He continued to say that he looks forward to working with Durbin to address CSAM and improve his company.
Durbin then asked Mr. Shou Chew, TikTok's CEO, about what TikTok is doing to address CSAM.
Mr. Chew testified that TikTok has committed to invest more than $2 billion in trust and safety, has added 40,000 safety professionals to address this issue, and built a personalized child safety team to help identify CSAM.
The hearing, entitled "Big Tech and the Online Child Sexual Exploitation Crisis," builds on the work Durbin and the Committee have done to extensively examine and investigate the plague of online child sexual exploitation, through hearings, legislation, and oversight efforts. The hearing also highlighted the need for Congress to act on the bipartisan bills reported by the Committee.