Legislature should act now to stop cryptocurrency kiosk scams


Across Minnesota, families are losing thousands of dollars in mere minutes to a fast-growing type of fraud.

We see the same story play out again and again. A scammer calls or messages someone and creates a sense of panic. The caller may claim there is a warrant for the person’s arrest, that their bank account has been compromised, that a grandchild needs bail money, or that a person they have developed a relationship with online is facing an urgent financial crisis and needs help. The victim is told to withdraw cash from their personal bank or credit union account and deposit it into a cryptocurrency kiosk.

The victim follows the instructions. They are coached to lie to their financial institution. The cash is converted into cryptocurrency and sent. The money is gone forever.

That is why legislation proposed this year to prohibit cryptocurrency kiosks in Minnesota is a vital consumer protection measure.

Cryptocurrency kiosks resemble traditional ATMs and are often found in gas stations, grocery stores, and convenience stores. A user inserts cash or a debit card, pays a service fee, and converts their cash to cryptocurrency, often at prices well above market value. Cash cannot be withdrawn. Minnesota currently has roughly 400 reported kiosk locations operated by a handful of companies.

While some people use these kiosks for legitimate purposes, scammers value them because they allow cash to be converted quickly into digital currency and moved beyond reach. Once the transaction is complete, recovery is rare.

In Minnesota, the Department of Commerce received 134 complaints between 2023 and 2025 involving fraudulent crypto kiosk transactions, totaling nearly $1 million in reported losses. Last year was the worst on record, with 70 cases and more than $540,000 lost. The average reported loss was nearly $6,800.

Importantly, however, those figures only reflect reported cases. Many victims, particularly those caught up in romance scams (where the scammer makes the victim believe they are in a romantic relationship), never come forward because the financial loss is compounded by embarrassment and emotional harm.

Commerce works to recover funds for Minnesotans who fall victim to cryptocurrency kiosk scams. In 2025, Commerce secured more than $110,000 for victims. Even so, only 48 percent of consumers received any refund, and the average recovery amounted to just 16 percent of what was lost.

To be clear, the proposed legislation does not ban or prevent anyone from purchasing or trading cryptocurrency. Minnesotans could still buy and sell digital assets through established online exchanges, often with lower fees and stronger safeguards. Instead, it addresses a specific tool that scammers favor, crypto kiosks, because transactions are fast and often irreversible, and the scale of the problem is growing. The Federal Bureau of Investigation estimates that Americans lost about $240 million to crypto kiosk scams in just the first six months of 2025, double the pace of the previous year.

Minnesota has already tried regulating cryptocurrency kiosks, but the fraud has unfortunately continued. The Legislature passed laws requiring licensing, disclosures and anti-fraud measures for cryptocurrency kiosks. Fraudsters adapted. They stay on the phone and walk victims through each step of the scam. They tell them how to bypass warnings.

They target older adults and others who may be less familiar with the technology. They break transactions into smaller amounts to avoid limits, and even redirect victims to other kiosk locations to evade anti-fraud measures and extract every possible penny.

When someone believes their freedom, finances or loved ones are at risk, or when someone they trust is walking them through the process, a warning screen is unlikely to stop them.

Local communities are seeing the impact. Cities including Forest Lake, Stillwater and St. Paul have enacted or proposed local bans on cryptocurrency kiosks in response to resident complaints. Other states are also considering bans, and Indiana just passed a law banning cryptocurrency kiosks.

This bill removes a physical, cash-based access point that has repeatedly been used to exploit vulnerable people.

As a leader in consumer protection, we see the financial and personal toll these crimes take. Savings are wiped out. Trust is broken. The emotional impact lasts long after the money is gone.

Minnesota has an opportunity to reduce that harm. Lawmakers should pass a cryptocurrency kiosk ban this year and take a clear step to protect Minnesotans from a growing form of financial exploitation.

Sara Payne is assistant commissioner of enforcement at the Minnesota Department of Commerce.



A new class-action lawsuit, filed on Monday by three teenage girls and their guardians, alleges that Elon Musk’s xAI created and distributed child sexual abuse material featuring their faces and likenesses with its Grok AI tech.

“Their lives have been shattered by the devastating loss of privacy, dignity, and personal safety that the production and dissemination of this CSAM have caused,” the filing says. “xAI’s financial gain through the increased use of its image- and video-making product came at their expense and well-being.”

From December to early January, Grok allowed users of the AI chatbot and the X social media platform to create AI-generated nonconsensual intimate images, sometimes known as deepfake porn. Reports estimate that over a period of nine days, Grok users made 4.4 million “undressed” or “nudified” images, 41% of the total number of images created.

X, xAI and its safety and child safety divisions did not immediately respond to a request for comment.

The wave of “undressed” images stirred outrage around the world. The European Commission quickly launched an investigation, while Malaysia and Indonesia banned X within their borders. Some US government representatives called on Apple and Google to remove the app from their app stores for violating their policies, but no federal investigation into X or xAI has been opened. A similar, separate class-action lawsuit was filed (PDF) by a South Carolina woman in late January.

The dehumanizing trend highlighted just how capable modern AI image tools are at creating content that seems realistic. The new complaint compares Grok’s self-proclaimed “spicy AI” generation to the “dark arts” with its ease of subjecting children to “any pose, however sick, however fetishized, however unlawful.”

“To the viewer, the resulting video appears entirely real. For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse,” the complaint reads.

The complaint says xAI is at fault because it did not employ industry-standard guardrails that would prevent abusers from making this content. It says xAI licensed use of its tech to third-party companies abroad, which sold subscriptions that led abusers to make child sexual abuse images featuring the faces and likenesses of the victims. The requests ran through xAI’s servers, which makes the company liable, the complaint argues.

The lawsuit was filed by three Jane Does, pseudonyms given to the teens to protect their identities. Jane Doe 1 was first alerted to the fact that abusive, AI-generated sexual material of her was circulating on the web by an anonymous Instagram message in early December. The filing says she was told about a Discord server by the anonymous Instagram user, where the material was shared. That led Jane Doe 1 and her family, and eventually law enforcement, to find and arrest one perpetrator.

Ongoing investigations led the families of Jane Does 2 and 3 to learn their children’s images had been transformed with xAI tech into abusive material.




