A company lost $25 million after an employee was tricked by deepfakes of his coworkers on a video call

February 5, 2024
  • A company lost $25 million after an employee was tricked by deepfake versions of his colleagues, Hong Kong police say.
  • The employee had joined a video call with deepfakes of the firm's UK-based CFO and other colleagues.
  • Hong Kong police said scammers created the deepfakes based on publicly available video.

Scammers made off with about $25 million after they used deepfake technology to trick an employee at the Hong Kong branch of a multinational company, local media reported on Sunday, citing the city's police.

In January, an employee in the company's finance department received a message from someone claiming to be the company's UK-based chief financial officer, South China Morning Post reported, citing police. The employee then joined a video call with what appeared to be the CFO and other company employees — all of whom turned out to be deepfakes.

Following instructions given during that call, the employee transferred HK$200 million, or about $25.6 million, to various Hong Kong bank accounts across 15 transfers, according to the SCMP.

It wasn't until a week into the scam, when the employee contacted the company's headquarters, that they realized something was wrong.

The Hong Kong police did not name the company or employees involved. They said scammers created deepfakes of meeting participants based on publicly available video and audio footage, per SCMP.

The employee who was scammed did not interact with the deepfakes during the video conference, according to the media outlet.

Investigations are ongoing, but no arrests have been made, according to the outlet.

Deepfake videos are causing concern worldwide. Taylor Swift is among the latest to be targeted, after a wave of sexually explicit deepfakes of the superstar went viral on X and Telegram last month.

Many politicians are calling for a federal law to combat deepfakes.

In May 2023, Democratic Rep. Joseph Morelle introduced the Preventing Deepfakes of Intimate Images Act, which would make it illegal to share non-consensual deepfake pornography. The bill has been referred to the House Judiciary Committee.
