Changpeng Zhao (CZ), the former head of Binance, has warned the crypto community about the rise of deepfake videos targeting investors. Recently released from prison after pleading guilty to violating U.S. anti-money laundering laws, CZ highlighted in an X post that fake videos of him are circulating on social media.
CZ has been active on his X account since September 27, after serving a four-month prison sentence, and has confirmed he will attend Binance Blockchain Week, scheduled for October 31, in a personal capacity. On Friday, he issued his most serious warning since his release.
He noted that Binance has performed well during his absence. Both CZ and the exchange faced penalties after U.S. authorities found that the company had facilitated money laundering and helped entities evade sanctions. Binance was fined $4.3 billion, CZ personally paid $100 million in penalties, and he stepped down as CEO, expressing regret for his lapses in oversight.
Deepfakes Use AI Technology to Create Fake Footage
Deepfakes, a form of digital deception, have recently become a significant trend. These "synthetic media" videos manipulate existing footage by altering audio and facial movements using artificial intelligence (AI). The creation of these realistic fakes often involves deep neural networks, specifically generative adversarial networks (GANs). The process includes data collection, AI model training, face swapping, voice synthesis, and refinement to achieve increasingly lifelike results.
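At the heart of most deepfake tools is the adversarial training loop described above: a generator learns to produce fake frames while a discriminator learns to tell them apart from real ones. The following minimal sketch, assuming PyTorch and purely illustrative network sizes, shows that loop in its simplest form; real deepfake pipelines use far larger convolutional models plus face alignment and voice synthesis stages.

```python
# Minimal GAN sketch: the adversarial setup behind deepfake generation.
# Network sizes and shapes are illustrative assumptions, not a real pipeline.
import torch
import torch.nn as nn

latent_dim, img_dim = 100, 64 * 64 * 3  # hypothetical noise and image sizes

generator = nn.Sequential(
    nn.Linear(latent_dim, 512), nn.ReLU(),
    nn.Linear(512, img_dim), nn.Tanh(),        # produces a fake image in [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(img_dim, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1), nn.Sigmoid(),           # probability the input is real
)

loss = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor):
    """One adversarial step: the discriminator learns to spot fakes,
    then the generator learns to fool it."""
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # Discriminator update: real images should score 1, generated ones 0.
    noise = torch.randn(batch, latent_dim)
    fakes = generator(noise).detach()
    d_loss = loss(discriminator(real_images), real_labels) + \
             loss(discriminator(fakes), fake_labels)
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator update: try to make the discriminator label fakes as real.
    noise = torch.randn(batch, latent_dim)
    g_loss = loss(discriminator(generator(noise)), real_labels)
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```

Iterating this step over large datasets of a target's face and voice is what makes the resulting fakes progressively harder to distinguish from genuine footage.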
Typically, deepfakes are employed to facilitate scams and fraudulent activities. Statistics indicate that fraudsters are earning substantial sums by creating videos featuring public figures like Elon Musk, Donald Trump, and Taylor Swift to promote cryptocurrencies. Charles Hoskinson, the creator of Cardano and a notable figure in the crypto world, has highlighted the growing sophistication and rapid proliferation of deepfakes. He warns that within a few years, it will become challenging for the public to distinguish between deepfake videos and genuine footage.
Elliptic: CryptoCore Reaps $5 Million per Quarter with Deepfakes
Prominent figures in the cryptocurrency industry have also been targeted by deepfake scams. Those most frequently impersonated include Michael Saylor, founder of MicroStrategy; Brad Garlinghouse, CEO of Ripple; and Vitalik Buterin, co-founder of Ethereum. Videos featuring these figures lend credibility to fraudulent claims, increasing the likelihood of deceiving victims.
Blockchain research firm Elliptic has identified a common pattern in crypto scams using deepfakes: they entice unsuspecting investors to transfer their digital assets to a specified crypto wallet with the promise of substantial rewards. A group known as "CryptoCore" has reportedly defrauded thousands of investors, amassing over $5 million in just three months through such schemes.
The financial impact of these scams is significant and growing. During SpaceX's integrated flight test in June, approximately 50 YouTube accounts were compromised, resulting in 500 unauthorized transactions and $1.4 million in stolen funds. In another case, a Trump-Musk deepfake scam uncovered by Elliptic generated about $24,000 in a short period. These figures likely represent only a small portion of the total losses, as many scams go unreported or undetected.
Crypto and Politics Are the Most Appealing Subjects for Deepfakes
As the US presidential election nears, concerns are rising about deepfakes being used to spread misinformation, steal cryptocurrency, and influence public opinion of candidates. The presence of Donald Trump and Robert F. Kennedy Jr. at the Bitcoin 2024 Conference in Nashville has provided deepfake creators with ample material for their videos. Furthermore, Vice President Kamala Harris is reportedly planning to adopt a more favorable stance on blockchain technologies, potentially making her a target for deepfake scams.
Elliptic has discovered a website featuring the "Make America Great Again" logo and an image of Trump. The site falsely claims to be the former president's largest crypto giveaway, using deepfake content to deceive visitors. Investors are promised a return of double their donation from a $100 million fund.
Beyond politics, Avast's investigation into CryptoCore has shown that AI-powered deepfake scams frequently exploit topics such as SpaceX, MicroStrategy, Ripple, Tesla, BlackRock, and Cardano. These subjects often coincide with current events or popular trends in the cryptocurrency and technology sectors.
How Can Investors Protect Themselves from Deepfakes?
As deepfake scams become more prevalent, it is essential for the crypto community to remain vigilant. Experts suggest several strategies to help investors avoid these scams:
- Verify information from multiple reliable sources before making investment decisions.
- Be skeptical of giveaways or offers promising exaggerated rewards.
- Check official channels to confirm announcements or promotions.
- Use deepfake scanning software to analyze suspicious videos and assess their authenticity.
As AI technology continues to evolve, new tools are being developed to detect deepfake videos. Although not 100% effective, they can offer insight into a video's integrity and the likelihood that it is authentic. Some projects are also exploring the use of blockchain technology to verify videos and images by creating an immutable record of the original content.
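One simple way such a blockchain-backed check can work is by comparing a cryptographic fingerprint of a downloaded clip against a hash the original publisher has recorded. The sketch below assumes a hypothetical published hash and shows only the hashing-and-comparison step, not the on-chain registry itself.

```python
# Minimal sketch of content-hash verification. KNOWN_GOOD is a hypothetical
# fingerprint the clip's creator is assumed to have published (e.g. on-chain).
import hashlib

def fingerprint(path: str) -> str:
    """Hash a video file in chunks so large files are not loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder value for illustration only.
KNOWN_GOOD = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

if fingerprint("downloaded_clip.mp4") != KNOWN_GOOD:
    print("Warning: this file does not match the published original.")
```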
Investors should be cautious of poor lip-syncing, unnatural voice patterns, or inconsistencies in video quality, as these can indicate a deepfake. Using public blockchain explorers to verify transactions and wallet addresses can also help ensure the legitimacy of claims.
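As an illustration of that last point, the snippet below queries a public block explorer for the activity of a Bitcoin address advertised in a supposed giveaway. The endpoint and response fields follow Blockstream's Esplora-style API as an assumption and may differ or change; the address shown is hypothetical.

```python
# Rough sketch of checking a "giveaway" Bitcoin address via a public explorer.
# The blockstream.info endpoint and its response fields are assumptions.
import requests

def address_summary(address: str) -> dict:
    """Fetch basic on-chain statistics for a Bitcoin address."""
    url = f"https://blockstream.info/api/address/{address}"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    stats = resp.json().get("chain_stats", {})
    return {
        "received_btc": stats.get("funded_txo_sum", 0) / 1e8,  # satoshis -> BTC
        "spent_btc": stats.get("spent_txo_sum", 0) / 1e8,
        "tx_count": stats.get("tx_count", 0),
    }

# Many small incoming payments that are quickly swept out is a common pattern
# for scam giveaway addresses.
# print(address_summary("bc1q..."))  # hypothetical address
```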
As AI technology advances, the sophistication of deepfake scams is likely to increase, posing challenges for the cryptocurrency industry, regulators, and law enforcement agencies. AI-powered tools to detect deepfake videos are being refined to identify and remove such content from platforms like YouTube. However, raising public awareness about the threat of manipulated footage is crucial to mitigating the financial harm caused by these scams.
Disclaimer: This article is provided for informational purposes only. It is not offered or intended to be used as legal, tax, investment, financial, or other advice.