The White House responded with concern on Friday to the viral spread of explicit, AI-generated images of music icon Taylor Swift, calling on Congress to intervene with legislation.
During a press briefing, White House press secretary Karine Jean-Pierre labeled the dissemination of these “false images” as “alarming” and emphasized the critical role of social media companies in enforcing their content rules to curb the spread of misinformation and non-consensual intimate imagery involving real individuals.
Jean-Pierre underscored that “lax enforcement” of rules against non-consensual explicit content online disproportionately affects women and girls, who often become targets of harassment and abuse. The press secretary highlighted President Biden’s commitment to addressing the issue, citing a recent executive order focused on reducing the risk of generative AI producing such explicit imagery.
In response to a question about potential legislation, Jean-Pierre expressed a clear stance, stating, “There should be legislation, obviously, to deal with this issue.” She affirmed the White House’s determination to tackle the problem at the federal level, emphasizing the need for Congress to take legislative action.
The SAG-AFTRA actors union also condemned the AI-generated images of Taylor Swift, describing them as “upsetting, harmful, and deeply concerning.” The union called for legal measures to combat the development and dissemination of fake explicit images without consent. SAG-AFTRA voiced support for Congressman Joe Morelle’s Preventing Deepfakes of Intimate Images Act, advocating for swift legislative action to prevent further exploitation and protect individuals’ privacy and autonomy.
As the White House and industry organizations align in addressing the alarming rise of AI-generated explicit content, the call for legislative measures is gaining momentum. Stay tuned for updates on this evolving situation and on potential legal actions to safeguard individuals from such unauthorized uses of the technology.