A dangerous new scam combines AI and crypto
February 19, 2026
- Scammers are using AI chatbots posing as trusted assistants to sell fake cryptocurrencies.
- A fraudulent "Google Coin" presale site featured a chatbot claiming to be Google's Gemini AI, complete with branding and detailed investment projections.
- Google does not have a cryptocurrency, but the convincing AI-driven pitch is designed to pressure victims into sending irreversible crypto payments.
A new cryptocurrency scam is leveraging artificial intelligence in a way that marks a troubling evolution in online fraud: custom-built chatbots that impersonate legitimate AI assistants to persuade victims to invest in worthless tokens.
Cybersecurity firm Malwarebytes recently uncovered a live "Google Coin" presale website featuring a chatbot that introduced itself as "Gemini, your AI assistant for the Google Coin platform." The bot used Gemini-style branding, including a sparkle icon and a green "Online" status indicator, creating the impression that it was an official Google product.
It wasn't.
AI as the closer
The chatbot didn't just greet visitors; it acted as a full-fledged sales representative.
When asked, "Will I get rich if I buy 100 coins?", the bot provided specific financial projections. A $395 investment at the supposed presale price of $3.95 per token would allegedly be worth $2,755 at a future listing price of $27.55, approximately 7x growth, according to the chatbot. It then invited users to ask how to participate.
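The chatbot's "projection" is simple multiplication dressed up as analysis. A minimal sketch of the math it quoted, using only the scam site's own invented figures (neither price reflects any real asset):

```python
# Reproducing the scam chatbot's projection, using the fraudsters' own
# invented numbers. Neither price corresponds to a real asset.
PRESALE_PRICE = 3.95    # claimed presale price per token
LISTING_PRICE = 27.55   # claimed future listing price (pure fiction)

tokens = 100
cost = tokens * PRESALE_PRICE            # the $395 "investment"
claimed_value = tokens * LISTING_PRICE   # the promised $2,755
multiple = LISTING_PRICE / PRESALE_PRICE # the "approximately 7x" growth

print(f"Cost: ${cost:,.2f}, claimed value: ${claimed_value:,.2f} ({multiple:.1f}x)")
```

The point is that any precise-looking return figure here is an arbitrary ratio chosen by the scammer; no market data enters the calculation at all.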
This kind of personalized, back-and-forth engagement once required human scammers operating through Telegram or WhatsApp chats. Now, AI can automate the entire pitch, responding instantly with tailored answers designed to build trust and overcome skepticism.
A persona that never breaks
What stood out most in Malwarebytes' analysis was how tightly controlled the chatbot's persona appeared to be.
The bot consistently claimed to be the official helper for the "Google Coin" platform. Yet it refused to provide any verifiable company information: no registered entity, regulator, license number, audit firm, or official email address.
When confronted with concerns, it redirected users to vague claims about transparency and security. It would not acknowledge any possibility that the project could be a scam. More difficult questions were reportedly referred to an unnamed "manager," suggesting a human operator might step in when needed.
Unlike some AI systems that can be pushed off-script, this bot repeatedly looped back to the same selling points: a detailed 2026 roadmap, "military-grade encryption," AI integration, and a growing community of investors.
Whoever built it effectively locked the chatbot into a conversion script with a single goal: getting victims to send cryptocurrency.
A polished fake
The chatbot sat atop a professionally designed scam site that mimicked Google's branding, complete with the "G" logo, sleek navigation menus, and a presale dashboard.
The site claimed to be in "Stage 5 of 5," with more than 9.9 million tokens sold and a looming February 18 listing date, classic urgency tactics. It also displayed logos from major companies, including OpenAI, Google, Binance, Coinbase, Squarespace, and SpaceX, under a "Trusted By Industry" banner. None of those companies are connected to the project.
Clicking "Buy" brought users to a wallet dashboard showing balances for a fictional Google token on a made-up "Google-Chain," alongside Bitcoin and Ethereum. Buyers could select any number of tokens, triggering a Bitcoin payment request to a specific wallet address.
A tiered bonus system offered additional tokens for larger purchases, with bonuses ranging from 5% at 100 tokens up to 30% at 100,000 tokens, a classic upsell strategy to encourage bigger, irreversible payments.
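The tier structure works like any volume-discount table. A minimal sketch, assuming hypothetical intermediate tiers (the article states only the 5% and 30% endpoints):

```python
# Tiered "bonus token" upsell as described on the scam site. Only the
# 100-token (5%) and 100,000-token (30%) tiers are stated; the
# intermediate tiers here are illustrative assumptions.
TIERS = [(100_000, 0.30), (10_000, 0.20), (1_000, 0.10), (100, 0.05)]

def bonus_tokens(tokens_bought: int) -> int:
    """Return the 'free' bonus tokens awarded for a given purchase size."""
    for threshold, rate in TIERS:  # checked largest-first
        if tokens_bought >= threshold:
            return round(tokens_bought * rate)
    return 0
```

The bonus costs the scammer nothing, since the tokens are worthless, while nudging the victim toward a larger irreversible Bitcoin payment.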
There is no legitimate exchange listing. The token has no real-world value. And once cryptocurrency is sent, it cannot be recovered.
AI scales the scam
Scammers have long relied on social engineering: build trust, create urgency, overcome doubt, and close the deal. Traditionally, that required teams of human operators, limiting how many potential victims they could handle at once.
AI chatbots eliminate that bottleneck.
A single operation can now deploy a chatbot capable of engaging hundreds of visitors simultaneously, 24 hours a day. The messaging is consistent, polished, and authoritative. The bot can impersonate a trusted AI brand, respond with customized financial projections, and escalate only the most promising leads to human closers.
The broader trend is already visible. According to Chainalysis, roughly 60% of funds flowing into crypto scam wallets are tied to scammers using AI tools. AI-powered infrastructure is quickly becoming standard in crypto fraud operations.
The chatbot is just one component of that toolkit, but it may be the most persuasive, because it creates the illusion of an interactive relationship between the victim and a trusted brand.
Investment scams on the rise
The timing is significant. According to the Federal Trade Commission's Consumer Sentinel data, U.S. consumers reported losing $5.7 billion to investment scams in 2024, more than any other type of fraud and a 24% increase from the previous year.
Cryptocurrency remains the second-most common payment method used in investment scams, largely because transactions are fast and irreversible. When combined with AI capable of delivering a convincing sales pitch at scale, the fraud model becomes even more powerful.
How to spot AI-driven crypto scams
Malwarebytes warns that AI chatbots on scam sites are likely to become more common. Consumers should be wary of:
- Impersonation of known AI brands. A chatbot calling itself "Gemini," "ChatGPT," or "Copilot" on a third-party crypto site is almost certainly not affiliated with those companies.
- Evasion of due diligence questions. Legitimate operations can provide verifiable information about their legal entity, regulatory oversight, and registration. Scam bots tend to avoid or deflect those questions.
- Specific return projections. No legitimate investment product guarantees a future price. Promises that a $395 investment will become $2,755 are red flags.
- High-pressure urgency. Claims about final presale stages, imminent listings, or limited-time bonuses are designed to push quick decisions.
How to protect yourself
Google does not have a cryptocurrency, has not launched a presale, and is not using Gemini as a sales assistant on third-party crypto sites.
Consumer advocates recommend:
- Verifying claims on a company's official website.
- Not relying on chatbot branding as proof of legitimacy.
- Never sending cryptocurrency based on projected returns.
- Searching a project name along with "scam" or "review" before investing.
- Using web protection tools such as Malwarebytes Browser Guard to block known and suspected scam sites.
Anyone who has already sent funds should report the incident to local law enforcement, the FTC at reportfraud.ftc.gov, and the FBI's Internet Crime Complaint Center at ic3.gov.
As AI tools become more accessible, scammers are adapting quickly. The face of online fraud may now look like a friendly chatbot, but the outcome remains the same.