Senator Ben Cardin targeted by deepfake video impersonating Ukrainian official

forbes.com

Senator Ben Cardin was targeted by a deepfake video call in which the caller impersonated Ukraine's former Foreign Minister Dmytro Kuleba. The incident highlights the growing threat of AI-generated images and video in political contexts, and reports describe the spoofing operation as notable for its technical sophistication and believability. Intelligence officials have warned that foreign actors are using AI tools to influence U.S. elections. Despite these concerns, recent assessments suggest that while AI is enhancing foreign influence operations, it has not yet transformed them significantly. A Microsoft vice president noted that AI's impact on election interference has been less severe than anticipated.


With a significance score of 6.1, this news ranks in the top 0.1% of today's 28,787 analyzed articles.
