An executive at the popular crypto exchange Binance has revealed how artificial intelligence tools can be used to bypass its Know Your Customer (KYC) process. According to Jimmy Su, Binance's chief security officer, AI-generated media used to defeat KYC checks has become so advanced that many exchanges find it hard to tell humans and AI apart.
Cybercriminals use deepfake technology to try to bypass the KYC process, which is compulsory on most crypto exchanges. Once past these checks, they typically gain access to the victim's account and often end up stealing users' cryptocurrencies and other digital assets. Su has warned that the issue is getting out of hand and that exchanges must act quickly.
Deepfake technology produces convincing audio, images, or videos that closely mimic a particular person. It typically combines artificial intelligence tools with machine learning models to generate audio and video that impersonate someone else.
These tools have legitimate uses, but within the crypto industry, cybercriminals can turn AI-generated media against exchanges: duplicates of a person's image, video, and voice can be submitted during an exchange's KYC process to pass identity verification.
How far deepfake technology has come can be seen in a video posted in February 2023. A Twitter user shared a clip that appeared to show the CEO of Binance announcing a new YouTube channel for trading.
At first glance, the video looks like the CEO of Binance speaking. A closer look, however, reveals that it was generated by artificial intelligence. Even so, viewers who have never heard the Binance CEO speak could easily believe it is him, and that is one of the dangers AI poses to the crypto industry.
"Deep fake AI poses a serious threat to humankind, and it's no longer just a far-fetched idea. I recently came across a video featuring a deep fake of @cz_binance , and it's scarily convincing," the crypto enthusiast said on Twitter.
Fraudsters on a Rampage with Deepfake Technology
Speaking to Cointelegraph.com, Binance Chief Security Officer Jimmy Su reiterated that fraud is on the rise in the industry. These cybercriminals now use AI-generated images, videos, and voices to bypass the crypto KYC process.
In most cases, exchanges verifying customers cannot tell the difference between AI-generated media and a real human. The Binance chief security officer said fraudsters usually find a photo of their target online, then use deepfake technology to create voices and videos that resemble that person. This is dangerous precisely because exchanges can no longer tell the two apart.
"The hacker will look for a normal picture of the victim online somewhere. Based on that, using deepfake tools, they’re able to produce videos to do the bypass. Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down. The deepfakes are advanced enough today that they can actually execute those commands," Su said.