Pentagon Names Winners Of Air Dominance ‘Gauntlet’

If the 21st century has revealed anything about the way wars are fought, it’s that drones are the path forward. Since the United States redesignated the RQ-1 as the hunter-killer MQ-1 Predator, making it a combat aircraft, drone warfare has become commonplace. There are all manner of drones that fly, crawl, run, and swim, and the most recent innovations in the space involve designing smaller, more versatile machines with multiple capabilities.

The Department of Defense has launched several initiatives to scout new drone technologies, with so-called "Drone Dominance" as the name of the game. In March 2026, the department revealed the winners of the Gauntlet I uncrewed exercise, which pitted numerous platforms against one another in a series of tests. The goal of Gauntlet I, which saw 25 drone companies converging on Fort Benning, Georgia, was to identify the best one-way drones for various military operations and then order 30,000 for future use. The top three performers were Skycutter, Neros, and Nepatree.

Skycutter is a U.K. contractor that produces 3D-printed drones of all kinds, including heavy-lift, fuel-cell-powered, and ISR units that fit inside a backpack. Neros, similarly, develops small, versatile kinetic-strike drones. Nepatree's Bumblebee and Hornet drones have already been identified for outfitting the Washington National Guard, and the company scored well in the competition alongside the other two contractors.

Gauntlet I and the U.S.' drone ambitions

Gauntlet I was a two-week exercise, with 25 companies’ products being put through their paces. The chosen winners will provide a combined total of 30,000 drones, which will be delivered over five months. Gauntlet wasn’t merely a demonstration of drones, as around 100 servicemembers from the U.S. Army and Marine Corps, alongside some from the Special Operations community, were also involved. The tests included sending drones out to hit targets 6.2 miles away. Operators were limited to two hours of training on each system before testing.

The Pentagon intends to spend around $5,000 per drone, though it would prefer the cost to drop closer to $2,000 per unit over the Drone Dominance program's lifetime. Typically, defense procurement prices start high and fall as production becomes more streamlined. Gauntlet I is part of the first phase of that program, which will continue for some time as the DoD identifies, chooses, and procures tens of thousands of drones for its inventories.

Other initiatives, like the U.S. Game of Drones at Edwards Air Force Base, California, have similar objectives. Choosing new one-way attack drones is paramount to the U.S. drone initiative. One-way attack drones have proven vital in both the Russo-Ukrainian War and the United States-Iran War; Iranian Shahed drones have proven their worth in the latter conflict, and the Department of Defense understands the U.S.’ need to add similar lethal options to its inventory.

A new class-action lawsuit, filed on Monday by three teenage girls and their guardians, alleges that Elon Musk’s xAI created and distributed child sexual abuse material featuring their faces and likenesses with its Grok AI tech.

“Their lives have been shattered by the devastating loss of privacy, dignity, and personal safety that the production and dissemination of this CSAM have caused,” the filing says. “xAI’s financial gain through the increased use of its image- and video-making product came at their expense and well-being.”

From December to early January, Grok allowed X social media users to create AI-generated nonconsensual intimate images, sometimes known as deepfake porn. Reports estimate that Grok users made 4.4 million "undressed" or "nudified" images over a period of nine days, accounting for 41% of all images generated.

X, xAI, and their safety and child-safety divisions did not immediately respond to a request for comment.

The wave of “undressed” images stirred outrage around the world. The European Commission quickly launched an investigation, while Malaysia and Indonesia banned X within their borders. Some US government representatives called on Apple and Google to remove the app from their app stores for violating their policies, but no federal investigation into X or xAI has been opened. A similar, separate class-action lawsuit was filed (PDF) by a South Carolina woman in late January.

The dehumanizing trend highlighted just how capable modern AI image tools are at creating content that seems realistic. The new complaint compares Grok’s self-proclaimed “spicy AI” generation to the “dark arts” with its ease of subjecting children to “any pose, however sick, however fetishized, however unlawful.”

“To the viewer, the resulting video appears entirely real. For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse,” the complaint reads.

The complaint says xAI is at fault because it did not employ industry-standard guardrails that would prevent abusers from making this content. It says xAI licensed use of its tech to third-party companies abroad, which sold subscriptions that led abusers to make child sexual abuse images featuring the faces and likenesses of the victims. The requests ran through xAI’s servers, which makes the company liable, the complaint argues.

The lawsuit was filed by three Jane Does, pseudonyms given to the teens to protect their identities. Jane Doe 1 first learned that abusive, AI-generated sexual material of her was circulating on the web through an anonymous Instagram message in early December. The filing says the anonymous Instagram user told her about a Discord server where the material was being shared. That led Jane Doe 1 and her family, and eventually law enforcement, to find and arrest one perpetrator.

Ongoing investigations led the families of Jane Does 2 and 3 to learn their children’s images had been transformed with xAI tech into abusive material.
