December 29, 2024

U.S. Senator Ted Cruz (R-Texas) speaks during a news conference on Capitol Hill in Washington, October 6, 2021.

Evelyn Hockstein | Reuters

WASHINGTON — Lawmakers on Capitol Hill are scrambling to address the proliferation of deepfake artificial intelligence pornographic images that target everyone from celebrities to high school students.

Now, a new bill would seek to hold social media companies accountable for policing and removing deepfake porn images published on their sites. The measure would also make it a crime to publish, or threaten to publish, deepfake pornography.

Texas Republican Sen. Ted Cruz is the bill’s lead sponsor. Cruz’s office provided CNBC with exclusive details about the bill.

The bill, called the Take It Down Act, would also require social media platform operators to develop a process for removing the images within 48 hours of receiving a valid request from a victim. The sites would additionally have to make reasonable efforts to remove any other copies of the images, including those shared in private groups.

The task of enforcing these new rules would fall to the Federal Trade Commission, which regulates consumer protection.

A bipartisan group of senators is set to formally introduce Cruz’s legislation on Tuesday. Victims of deepfake porn, including high school students, will join them at the Capitol.

The rise of non-consensual images generated by artificial intelligence has affected celebrities such as Taylor Swift, politicians such as Rep. Alexandria Ocasio-Cortez, D-N.Y., and high school students whose classmates took images of their faces and used apps and AI tools to create nude or pornographic photos.


“By creating a level playing field at the federal level and holding websites accountable for having procedures in place to remove these images, our bill will protect and empower all victims of this heinous crime,” Cruz said in a statement to CNBC.

Dueling Senate bills

The production of deepfake pornographic content increased 464% year over year in 2023, according to a 2023 report from Home Security Heroes.

However, while there is broad consensus in Congress on the need to address the problem of deepfake AI porn, there is no agreement on how to do so.

Instead, the Senate has two competing bills.

Sen. Dick Durbin, D-Ill., introduced a bipartisan bill earlier this year that would allow victims of non-consensual deepfakes to sue those who created, possessed, owned or distributed the images.

Under Cruz’s bill, deepfake AI porn would be treated as highly offensive online content, meaning social media companies would be responsible for moderating and removing the images.

When Durbin tried to get his bill to a floor vote last week, Sen. Cynthia Lummis blocked it, saying it was “overly broad” and could “stifle technology innovation in the United States.”

Durbin defended his bill, saying “tech platforms would have no liability under this proposed law.”

Lummis is one of the original co-sponsors of Cruz’s bill, along with Republican Sen. Shelley Moore Capito and Democratic Sens. Amy Klobuchar, Richard Blumenthal and Jacky Rosen.

The new bill also comes as Senate Majority Leader Chuck Schumer, D-N.Y., pushes his chamber to advance artificial intelligence legislation. Last month, the Senate’s artificial intelligence task force published a “roadmap” on key AI issues, which included legislation to address the “non-consensual distribution of intimate images and other harmful deepfakes.”
