New AI deepfake porn bill would require Big Tech to police and remove images

  • A new bill in Congress seeks to hold social media platforms accountable for the publishing and distribution of nonconsensual AI images containing real faces with fake bodies.
  • Deepfake porn production increased 464% in 2023 over the prior year.
  • Two dueling bills in the Senate could complicate the legislative process.

WASHINGTON — Lawmakers on Capitol Hill are scrambling to address the boom in deepfake artificial intelligence pornographic images, which have targeted everyone from celebrities to high school students.

Now, a new bill will seek to hold social media companies accountable for policing and removing deepfake porn images published on their sites. The measure would criminalize publishing or threatening to publish deepfake porn.

Sen. Ted Cruz, R-Texas, is the bill's main sponsor. Cruz's office provided CNBC with exclusive details about the bill.

The Take It Down Act would also require social media platform operators to develop a process for removing the images within 48 hours of receiving a valid request from a victim. The sites would additionally have to make a reasonable effort to remove any other copies of the images, including ones shared in private groups.

The task of enforcing these new rules would fall to the Federal Trade Commission, which regulates consumer protection rules.

Cruz's legislation will be formally introduced on Tuesday by a bipartisan group of senators. They will be joined in the Capitol by victims of deepfake porn, including high school students.

The rise of nonconsensual AI-generated images has affected celebrities like Taylor Swift, politicians like Rep. Alexandria Ocasio-Cortez, D-N.Y., and high school students whose classmates have taken images of their faces and, using apps and AI tools, created nude or pornographic photos.

"By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime," Cruz said in a statement to CNBC.

Dueling Senate bills

In 2023, producers of deepfake porn increased their output by 464% year-over-year, according to a report from Home Security Heroes.

Yet while there is wide consensus in Congress about the need to address deepfake AI pornography, there is no agreement on how to do it.

Instead, there are two competing bills in the Senate.

Sen. Dick Durbin, D-Ill., introduced a bipartisan bill early this year that would allow victims of nonconsensual deepfakes to sue people who had held, created, possessed or distributed the images.

Under Cruz's bill, deepfake AI porn is treated like extremely offensive online content, meaning social media companies would be responsible for moderating and removing the images.

When Durbin tried to get a floor vote on his bill last week, Sen. Cynthia Lummis, R-Wyo., blocked it, saying it was "overly broad in scope" and could "stifle American technological innovation."

Durbin defended his bill, saying "there is no liability under this proposed law for tech platforms."

Lummis is one of the original co-sponsors on Cruz's bill, along with Republican Sen. Shelley Moore Capito and Democratic Sens. Amy Klobuchar, Richard Blumenthal and Jacky Rosen.

The new bill also comes as Senate Majority Leader Chuck Schumer, D-N.Y., is pushing his chamber to move on AI legislation. Last month, a task force on artificial intelligence released a "roadmap" on key AI issues which included developing legislation to address the "nonconsensual distribution of intimate images and other harmful deepfakes."
