With surge abated, what’s next for Twin Cities’ mutual aid efforts?


The Twin Cities feel different than they did at the height of Operation Metro Surge: The weather is warming, there are fewer federal agents in the streets, and many people, after months of sheltering indoors, are returning to work.

At the same time, the mutual aid efforts that kept thousands of immigrant families afloat are evolving.

Nina Jonson, the director of children and youth ministries at Plymouth Congregational Church in Minneapolis, said the church has hosted multiple mutual aid efforts, including Project Brown Bag, offering meal bags to individuals or organizations willing to distribute them.

At the program’s height, it provided 700-900 meals a week, packed in the church’s lobby by a few dozen volunteers. By the end of March, though, Project Brown Bag will wind down.

It’s not that the need has vanished, Jonson said, but it has changed. Partners are telling the church that some people have returned to jobs, and that others who still need help are finding local food shelves now able to meet their needs.

Volunteer needs have changed too, Jonson said. Instead of short-term projects like packing and delivering groceries, demand has shifted to addressing the knock-on effects of the surge: donating to rental assistance efforts and pushing for policies such as eviction moratoriums and loans for small businesses that saw employees, customers or both disappear.

“The crisis is still here,” Jonson said. “It’s not as noticeable, it’s not as front-page. But that’s what mutual aid is.”

Elsewhere, efforts created during the surge haven’t seen a huge shift in demand, but are grappling with a falloff in donations.

Colin Anderson, food coordinator and community organizer at Zion Community Commons in St. Paul’s Hamline-Midway neighborhood, said that his group received a “massive amount of funding” in January to feed those in need. But with the national attention all but gone, Anderson said they’ve reached a point of struggling to pay for the food they provide.

Individuals helped feed their neighbors for a while, Anderson said, “but I don’t know many people who could triple or quadruple their food bill” to feed multiple households. So “the requests on us just increased and grew,” he said. 

Colin Anderson, food coordinator and community organizer at Zion Community Commons, puts together supply packages during a weekly food distribution at the church on Tuesday, March 17, 2026, in St. Paul, Minn. Credit: Ellen Schmidt/MinnPost/CatchLight Local/Report for America

Ideally, Zion would have $50,000-$60,000 to meet its needs this month, he said, adding that the group is working on getting its nonprofit status.

“We don’t need a million dollars,” Anderson said. “We need a million people with one dollar.”

‘We received multiple years of funding in 8 weeks’

Neighborhood networks have adjusted, as well, said Nora Patterson, who is part of a group chat that has coordinated school dropoffs and grocery pickups for neighbors afraid to leave their homes. In recent weeks, she said, more people have become comfortable taking care of those tasks themselves.

“Now we’re sitting back and going, ‘Okay, how is the community needing my support in this moment?’” Patterson said.

So the group is spending its money at local businesses that struggled as federal agents swarmed the city. She said neighbors recently held a potluck with dishes from restaurants that needed support.

“It’s led by observation of how our community currently is,” Patterson said. “It feels like the organizing has become really organic.”

Her husband, Seth Patterson, is a board member of the Interfaith Coalition on Immigration, which has been supporting immigrant families and hosting monthly prayer vigils outside the Whipple Federal Building for the last eight years.

In previous years, the largely volunteer-run organization provided resources to maybe a dozen or so families at a time as they moved to the region, he said. The group often helped with food and very occasionally with rent.

Operation Metro Surge, and the response from those opposed to it, led to massive changes as both community need and the number of people eager to help meet that need began to outpace the intended scope of organizations like ICOM.

“We received multiple years of funding in 8 weeks,” Patterson said. That brought on its own set of challenges. Hundreds of people wanted to volunteer, but ICOM, which vets volunteers before allowing them to work with vulnerable people, couldn’t vet people fast enough. 

“The extent of what happened was beyond what any of us imagined,” he said.

Now, as needs shift yet again, Patterson said the question of what ICOM should be is still up in the air. To cover rent even for a dozen families isn’t sustainable in the long term, he said. But if that’s where the need is, he asked, should the group do it anyway?

“None of our systems were made to understand what happens when the federal government occupies our city,” he said.

Organizers noted another shift, too: the sheer number of people who not only wanted to do anything to help, but who said they’d never before felt called to step up in quite the same way.

“It really brought hope to the forefront for me,” said Nora Patterson. “I really needed to feel that other people were a part of this cause.”







