Senator Falls Prey to Deepfake Caller Posing as Top Ukrainian Official

Senator Benjamin Cardin, Democrat of Maryland, was recently targeted in a deepfake scam in which a caller joined a videoconference with him while posing as a senior Ukrainian official.

The incident has renewed fears that lawmakers could be targeted by people or groups seeking to gain influence in U.S. politics or to extract sensitive information from them.

Senate security officials sent an email to lawmakers' offices this week explaining what had happened; The New York Times obtained a copy of that email.

It said that Cardin's office received an email last week that appeared to come from Dmytro Kuleba, who recently stepped down as Ukraine's foreign minister. The email requested a conference call via Zoom.

When Cardin joined the call, the person on the other end both sounded and looked like Kuleba.

Cardin soon grew suspicious, however, when the impostor began behaving out of character.

Senate security officials wrote that the caller asked "politically charged questions in relation to the upcoming election." He also pressed Cardin for his views on sensitive foreign policy matters, including whether he supported Ukraine firing long-range missiles into Russian territory.

Cardin ended the videoconference and immediately reported it to the Department of State, where officials confirmed that the man on the call was an impostor, not Kuleba.

While the email did not name the senator who was targeted, two Senate officials familiar with the incident confirmed to The Times that it was Cardin.

Cardin himself partially confirmed the episode in a statement issued Wednesday night, saying:

“In recent days, a malign actor engaged in a deceptive attempt to have a conversation with me by posing as a known individual.”

Cardin did not identify the impersonated official as Kuleba, nor did his statement mention Ukraine at all.

Artificial intelligence has advanced to the point where it is now relatively easy to generate video of people who look and sound real. The technology can fabricate entirely fictitious people, and it has also been used to impersonate real public figures.

In 2022, for instance, a fake video circulated online that purported to show Volodymyr Zelensky, the president of Ukraine, surrendering to Russia.

Although Cardin is retiring from Congress at the end of this year, the incident has renewed concerns that other members of Congress could be targeted by foreign actors, a worry heightened by the fact that the presidential election is only weeks away.

It is not yet clear who was behind the deepfake attack on Cardin. Intelligence officials warn, however, that foreign actors in countries such as China, Iran and Russia are using AI to bolster their efforts to influence the upcoming U.S. election.