Minneapolis police oversight group pushes for stricter limits on less-lethal weapons


MinnPost’s Twin Cities Documenters program trains and pays community members to take notes at local government meetings. Below is Glen Johnson’s summary and observations from the April 13 Minneapolis Community Commission on Police Oversight meeting. You can read the full notes here. The notes include links to the video and agenda, as well as timestamps to help you navigate the recording.

Attendance

Present: Latonya Reeves (Chair), Paul Olsen (Vice-Chair), Chris Baker, Eric Bartz, James Canaday, Michael McElhinney, Melissa Newman, Mara Schanfield, Louis Smith, Bridgette Stewart

Absent: Leah Indrelie, Jennifer Clement, Nichelle Williams-Johnson

Summary: 

  • The commission received a presentation on the Office of Police Conduct Review’s current backlog status of reported police incidents and a general briefing on the process for complaints.
    • The complaint backlog has been reduced from 110 cases to 14 as of March 2026. All complaints are reviewed by the Office of Police Conduct Review (OPCR); average review time is 14 days.
    • Commissioner Mara Schanfield expressed surprise that non-disciplinary action (‘coaching’) could occur during the review phase, before an investigation is done. Staff advised that if a low-level policy violation is identified for an officer with no history of prior issues, coaching would be the appropriate action.
  • The commission unanimously agreed on policy recommendations and requests for information regarding the Minneapolis Police Department’s role in immigration enforcement (policy 9-401).
    • The commission recommended increasing face time with the public during times of crisis, developing a stronger system for tracking calls about federal agents and providing more robust training on the policy.
    • They are also requesting an after-action report on Operation Metro Surge and a legal opinion from the city attorney on the duty to intervene.
  • The commission unanimously approved topics for the CCPO annual public hearing on May 11.
    • Topics will include MPD’s domestic violence response, duty to intervene and tension with separation ordinance, and equity, which includes general officer obligations, non-discriminatory policing, MPD’s commitment to impartial enforcement of laws and ongoing issues with MPD culture.
  • The commission unanimously approved a request to the Civil Rights Department to present on the current budget for their work and the expected 2027 budget.
    • The Civil Rights Department advised CCPO commissioners would be unable to attend the National Association for Civilian Oversight of Law Enforcement (NACOLE) Conference this year due to lack of funding.
    • Commissioner James Canaday noted this was the first he had heard of the lack of funds, and said the conference is the only venue for meeting other people doing this work. Vice Chair Olsen commented that the CCPO’s budget and staff are small, and that commissioners only know this from speaking with others who do this type of work at the conference.
  • The commission compiled recommendations and comments on the High-Pressure Air (HPA) Projectiles and Launchers (Pepper Balls) policy.
    • The Police Policy Research and Recommendations committee will finalize the report on behalf of the CCPO and send it to the city.
    • Commissioner Michael McElhinney said he would share several suggestions with the commission, which included: banning retaliatory use, banning use near people uninvolved with the situation, and requiring protection for vulnerable people.
    • Commissioners McElhinney and Baker both recommended that the policy not allow preventing property damage as a justification for using the weapons.
    • Commissioner Canaday asked for more detail on training as there was only one line describing any training. 

Observations and follow up questions: 

Accessibility: Did you face any challenges that made it harder to document the meeting or that may have made it difficult for others to attend? For example: trouble accessing the location, difficulty hearing the discussion, lack of nameplates for elected officials, or the agenda being unclear, disorganized, or incomplete.

  • Commissioner names weren’t shown on screen, which made identification harder.

Scene: About how many members of the public attended the meeting? If watching virtually, what was the livestream count (if applicable)? Was anyone protesting outside? 

  • The livestream had 187 views a day after the meeting.

Notable: Do you have any follow up questions or other observations to share? What stood out to you as interesting or confusing? Is there anything you’d like to see reporters look further into? Were there any particularly memorable quotes?

  • A lot of the commission’s work appears to happen in committees, whose meetings are not broadcast.

How to get involved:

When is the next meeting for this board/committee? Any upcoming public hearings? Online surveys? 

  • The CCPO will hold a public hearing on Monday, May 11 at 6 p.m. at the Public Service Center. Topics include MPD’s domestic violence response, duty to intervene and tension with separation ordinance, and equity, which includes general officer obligations, non-discriminatory policing, MPD’s commitment to impartial enforcement of laws and ongoing issues with MPD culture.

More context:

Read Documenter Glen Johnson’s full notes here. The notes include links to the full video, agenda and timestamps to help you navigate the recording. Want to become a Documenter? You can start by making an account here.

For more updates from Documenters, follow us on Facebook, Bluesky and Instagram.




