AI deepfake operation targets US Senator Ben Cardin in sophisticated scheme
An advanced deepfake operation targeted Senator Ben Cardin this month, according to the Office of Senate Security. The scheme involved a fake video call in which someone posed as Dmytro Kuleba, Ukraine's former foreign minister. The impersonator's politically charged questions raised suspicions, and Cardin's office quickly identified the deception and reported it to the Department of State, which confirmed the caller was not Kuleba.

Law enforcement is now investigating the incident, which highlights the growing sophistication of AI technology in political scams. Experts warn that such deepfake schemes may become more common as the technology improves: recent advances have made it easier to create convincing fake video and audio, increasing the risk of similar attacks in the future.