| Jaicee said:
Publicly championed by everyone from Taylor Swift to Melania Trump, the Take It Down Act criminalizes the non-consensual creation and publication of pornographic images and video, that is, content like AI-created deepfakes and so-called "revenge" porn, much of which gets shared with the victim's full name, contact information, and/or home address so as to guarantee stalking and harassment and maximize the possibility of violence against them. In other words, it bars you from becoming what the industry calls a "sex worker" without your knowledge, consent, or compensation. The bill, satisfyingly co-sponsored by the unlikely combination of Amy Klobuchar and Ted Cruz, passed the Senate by unanimous consent and recently passed the House of Representatives by a vote of 409 to 2, and it now heads to the president's desk. He has pledged to sign it. All I can say is that it's about damn time!! |
I knew politicians would finally start doing something about AI once they felt the fear of it themselves. AI deepfakes have no allegiance; someone can just as easily make AI deepfake porn of Ted Cruz as they could of Amy Klobuchar. It's a start, but there needs to be a lot more laws and regulation on all this AI shit. This is the bare minimum, but it's a good start.
Along with a total ban on non-consensual AI deepfakes, we need a total ban on using any AI in political campaigns, especially deepfake voices, backed by massive fines, and those fines should also apply to the companies that host the content if they don't remove it in time. We're heading toward an extremely dangerous future with AI if regulation and restrictions don't come soon.
I've already seen prominent examples of AI deepfakes used in political campaigns and spread on platforms like Twitter.






