BREAKING: Congress Introduces Bill Compelling Big Tech to Address Growing Threat of Online Child Sexual Exploitation
Law Enforcement Reports Historic Levels of Child Sexual Abuse Images and Videos Distributed Online
Washington, D.C. (April 20, 2023) - A bipartisan group of Members of Congress is working to close the legal loophole that gives technology companies immunity from liability for hosting images of child sexual abuse on their platforms.
The Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, introduced today by Sens. Richard Blumenthal and Lindsey Graham and Reps. Ann Wagner and Sylvia Garcia, will incentivize technology companies to proactively search for and remove child sexual abuse materials (CSAM) from their platforms.
The EARN IT Act will:
- Remove immunity for social media and technology companies that knowingly facilitate or profit from the distribution of CSAM on their platforms.
- Require electronic service providers to preserve the contents of reported CSAM for one year, to facilitate law enforcement investigations.
- Exclusively target the prevention and removal of sexually abusive images and videos of children on online platforms.
- Update federal statutes to replace the term "child pornography" with "child sexual abuse material" (CSAM). The term "child pornography" fails to describe the true nature of these videos and images and undermines the seriousness of the abuse.
“Tech companies have the technology to detect, remove, and stop the distribution of child sexual abuse material. However, there is no incentive to do so because they are subject to no consequences for their inaction,” said Erin Earp, RAINN’s interim vice president for public policy.
The distribution of child sexual abuse images and videos is a growing, pervasive, and rampant issue. In 2022, the National Center for Missing and Exploited Children received over 32 million CyberTipline reports containing over 88 million images, videos, and other content related to suspected child sexual exploitation. This is the largest number of reports ever received in a single year, and nearly double the number received in 2019, before the pandemic.
Many of these images and videos depict the actual rape and torture of children, including infants and toddlers. The worst moments of these children's lives are being shared and viewed by millions of offenders around the world right now, and platforms currently bear no liability for their failure to detect and remove them.
###
To learn more about RAINN’s public policy work to support survivors and bring perpetrators to justice, sign up to receive policy news updates.
Contact: Erinn Robinson
Director of Media Relations
media@rainn.org