The Hong Kong Police reported that a company lost $26 million due to a fraudulent video created using deepfake technology.

According to the Hong Kong police, scammers used deepfake technology to impersonate senior executives and successfully defrauded a multinational company of approximately $26 million. This is one of the first instances of this type of crime in the city.

Police forces are struggling to keep pace with generative AI, which specialists warn can be misused to spread disinformation and abuse, including deepfake footage that shows people saying things they never actually said.

According to the police, an employee at a company in the Chinese financial hub was contacted via video conference by people posing as the company's senior executives, who asked that money be transferred to designated bank accounts.

Police were alerted to the incident on January 29, by which time 15 transfers totaling HK$200 million ($26 million) had already been made.

Police said the investigation is ongoing and no arrests have been made. They did not identify the company involved.

Hong Kong media outlets reported that the victim worked in the company's finance department and that the scammers posed as its chief financial officer, who is based in the U.K.

According to Acting Senior Superintendent Baron Chan, the video conference call involved several participants, but the victim was the only genuine one; everyone else was an impersonation.

Chan told reporters that the scammers found publicly available video and audio of their impersonation targets on YouTube, then used deepfake technology to mimic their voices and lure the victim into following their instructions.

The deepfake videos were prerecorded and did not involve any dialogue or interaction with the victim, he said.

Source: voanews.com