Federal lawmakers are concerned about the potential for “deepfakes” produced using artificial intelligence to deceive people, perpetuate fraud, and pose national security risks.
“Deepfakes” can include manipulated audio, images, and videos created by AI to appear authentic.
The House Oversight and Accountability Committee recently delved into the possible negative impacts of the increasing spread of deepfakes and discussed ways to prevent them from causing upheaval in Americans’ lives.
Representative Nancy Mace, a Republican from South Carolina, highlighted real-life instances where deepfakes have been used to spread misinformation, such as AI-generated videos misrepresenting the president of Ukraine.
Mace emphasized the need for laws addressing the blurred line between fact and fiction, saying that while she does not advocate banning all synthetic videos, enforcement becomes difficult once truth and falsehood are indistinguishable.
Both lawmakers and President Biden have expressed a desire to establish new regulations regarding deepfakes. Last month, President Biden signed an executive order aimed at mitigating potential risks associated with AI technology.
Former President Barack Obama has been influential in shaping the new executive order and has expressed concerns about the misuse of deepfake technology. He believes that deepfakes targeting public figures should be governed by different rules, and that specific regulations are needed to prevent cyberbullying, especially of young people.
Although organizations such as the American Association of Political Consultants have condemned the use of AI deepfake technology, concerns remain high, particularly heading into the 2024 election season.
Congress is currently considering various proposals aimed at enacting laws to regulate new AI tools. The Senate Rules Committee is particularly focused on evaluating the potential impact of AI on elections and is poised to play a significant role in forthcoming AI legislation related to next year’s contests.