Trump admin temporarily blocks MN nursing home wage floor


Unprecedented minimum wages for thousands of nursing home workers in Minnesota are delayed again.

Leah Solo, executive director of the Nursing Home Workforce Standards Board, on Thursday, Oct. 2, 2025, in St. Paul, Minn. Credit: Ellen Schmidt/MinnPost/CatchLight Local/Report for America

Leah Solo, executive director of the state’s Nursing Home Workforce Standards Board, said at a board meeting Thursday that the Trump administration has reset to day one its 90-day clock to review the wage floor. Under the still-pending state law, nursing facility employees must earn at least $19 an hour this year and $20.50 in 2027, with workers who have nursing licenses netting substantially more.

“I hate to bring bad news,” Solo said to a board that was trying to wrap its head around the Trump administration’s move and asked a series of unanswerable questions about what comes next. 

The Trump administration’s action marks the second major bureaucratic delay of a wage floor that was set to go into effect Jan. 1. It comes as the first-of-its-kind workforce board faces a lawsuit from nursing homes calling for its elimination.

The wage floor must receive approval from the federal Centers for Medicare and Medicaid Services, because it stipulates that CMS will provide $18 million to Minnesota’s Medicaid program to help nursing homes pay for the salary bumps. The state also chips in $18 million.

Under federal Medicaid law, CMS officials have 90 days at most to examine the funding request. However, CMS is allowed to wind back the clock on this review if it asks the state for more information.

According to Solo, on Wednesday, day 89 of the 90-day evaluation period, CMS wrote to the state requesting more information, thus restarting the review.

Because the Minnesota Department of Human Services administers Medicaid, it was that agency, not the workforce board, that sought federal approval.

Related: Minnesota has a plan to turn around nursing homes’ staffing crisis. Nursing home operators say it’s a death knell. 

So, CMS’s email Wednesday was sent not to Solo and the workforce board but to the Department of Human Services, which, in turn, emailed Solo Wednesday night.

It is unclear what information CMS wants from Minnesota officials. The Department of Human Services did not convey this to Solo. 

As of Thursday afternoon, the Department of Human Services said it was working on a response to questions about what information the letter contained. CMS did not respond to messages.

The wage statute was initially delayed because the Department of Human Services was months late in filing its request to CMS. Human Services officials, who apologized for their tardiness, sent the necessary paperwork over in January (triggering said 90-day review). 

Setting a wage floor for a specific industry is a relic of Franklin D. Roosevelt’s New Deal that has been resurrected by labor unions in blue states over the last 10 years.

Minnesota is the first state to specifically focus on the nursing home industry. The state has a long history of social programs that provide generous end-of-life services. 

But the nursing home industry has fought the workforce board tooth and nail, including filing a lawsuit last month stating that the board “inflicts irreparable harm on nursing home providers and business partners across Minnesota.”

A federal court hearing is scheduled next month for the industry’s request for an injunction against the workforce board. The board, which includes members of industry who seek the panel’s demise, spent the first 45 minutes of its meeting in a closed session to discuss the lawsuit.

Solo declined to comment on the lawsuit. 



A new class-action lawsuit, filed on Monday by three teenage girls and their guardians, alleges that Elon Musk’s xAI created and distributed child sexual abuse material featuring their faces and likenesses with its Grok AI tech.

“Their lives have been shattered by the devastating loss of privacy, dignity, and personal safety that the production and dissemination of this CSAM have caused,” the filing says. “xAI’s financial gain through the increased use of its image- and video-making product came at their expense and well-being.”

From December to early January, Grok allowed users of the AI tool and of the X social media platform to create AI-generated nonconsensual intimate images, sometimes known as deepfake porn. Reports estimate that over a period of nine days, Grok users made 4.4 million “undressed” or “nudified” images, 41% of all images created.

X, xAI and their safety and child safety divisions did not immediately respond to a request for comment.

The wave of “undressed” images stirred outrage around the world. The European Commission quickly launched an investigation, while Malaysia and Indonesia banned X within their borders. Some US government representatives called on Apple and Google to remove the app from their app stores for violating their policies, but no federal investigation into X or xAI has been opened. A similar, separate class-action lawsuit was filed (PDF) by a South Carolina woman in late January.

The dehumanizing trend highlighted just how capable modern AI image tools are at creating content that seems realistic. The new complaint compares Grok’s self-proclaimed “spicy AI” generation to the “dark arts” with its ease of subjecting children to “any pose, however sick, however fetishized, however unlawful.”

“To the viewer, the resulting video appears entirely real. For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse,” the complaint reads.


The complaint says xAI is at fault because it did not employ industry-standard guardrails that would prevent abusers from making this content. It says xAI licensed use of its tech to third-party companies abroad, which sold subscriptions that led abusers to make child sexual abuse images featuring the faces and likenesses of the victims. The requests ran through xAI’s servers, which makes the company liable, the complaint argues.

The lawsuit was filed by three Jane Does, pseudonyms given to the teens to protect their identities. Jane Doe 1 was first alerted to the fact that abusive, AI-generated sexual material of her was circulating on the web by an anonymous Instagram message in early December. The filing says she was told about a Discord server by the anonymous Instagram user, where the material was shared. That led Jane Doe 1 and her family, and eventually law enforcement, to find and arrest one perpetrator.

Ongoing investigations led the families of Jane Does 2 and 3 to learn their children’s images had been transformed with xAI tech into abusive material.
