Archive for the ‘AI / ML’ Category

Air Force Experiments with AI, Boosts Battle Management Speed, Accuracy

Saturday, October 11th, 2025

LAS VEGAS (AFNS) —  

The Air Force wrapped up the second Decision Advantage Sprint for Human-Machine Teaming, known as DASH 2, a fast-paced experiment exploring how artificial intelligence can help operators make faster, smarter decisions in complex battlespaces.

DASH 2 took place at the Shadow Operations Center-Nellis’ unclassified location in downtown Las Vegas and was led by the Advanced Battle Management System Cross-Functional Team. The effort was conducted in partnership with the Air Force Research Lab’s 711th Human Performance Wing, the Integrated Capabilities Command and the 805th Combat Training Squadron, also known as the ShOC-N. 

“DASH 2 proved human-machine teaming is no longer theoretical,” said Col. Jonathan Zall, ABMS Capability Integration chief. “By fusing operator judgment with AI speed, the Air Force is shaping the future of decision advantage in joint and coalition operations.” 

AI Speeds Decision Advantage 

Initial results showed that machines produced recommendations in less than ten seconds and generated 30 times more options than human-only teams. Two vendors each produced more than 6,000 solutions for roughly 20 problems in just one hour. The software’s accuracy was on par with human performance, despite only two weeks of development. In one case, a single algorithm adjustment would have raised recommendation validity from 70 percent to more than 90 percent. 
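
The reported figures are easy to sanity-check. A back-of-envelope sketch in Python, treating the article's round numbers as exact (an assumption):

```python
# Back-of-envelope check of the DASH 2 throughput figures reported above.
# The article's round numbers are treated as exact, which is an assumption.
problems = 20                # problems posed during the one-hour window
solutions_per_vendor = 6000  # solutions each of two vendors produced
minutes = 60

per_problem = solutions_per_vendor / problems  # solutions per problem, per vendor
per_minute = solutions_per_vendor / minutes    # sustained generation rate

print(f"{per_problem:.0f} solutions per problem, {per_minute:.0f} per minute")
# → 300 solutions per problem, 100 per minute
```

Even at the observed 70 percent validity, that rate leaves on the order of 200 valid candidate options per problem for operators to triage, which is where the human side of the team earns its place.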

“This level of output gives commanders options to execute multiple kill chains simultaneously and we’re excited about our next experiment to generate the courses of action with the machines to help illuminate risk, opportunity gain/loss, material gain/loss, among others,” said Col. John Ohlund, ABMS CFT director. 

Inside DASH 2 

The DASH series is part of the Air Force’s campaign to modernize command and control and gain decision advantage through human-machine teaming. Each sprint refines a specific decision function and informs future Department of the Air Force C2 development. The series also supports the Pentagon’s Combined Joint All-Domain Command and Control initiative. 

“Human-machine teaming is critical to accelerating the speed and quality of decisions across the joint force, and DASH 2 provides the insights we need to make that a reality,” Zall said. 

Human-Machine Teaming in Action

Seven teams participated in DASH 2, including six industry teams and one ShOC-N innovation team. Their challenge was to design AI-enabled microservices capable of assisting operators with the “match effectors” function, which determines the best available weapon system to destroy an identified target. 

Developers observed battle management crews operating without machine assistance, then iteratively designed and tested tools to augment human decision-making. Final demonstrations compared human-only performance against human-machine performance, measuring speed, quantity and quality.
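
The comparison the final demonstrations ran can be pictured as a small scoring harness. This sketch is illustrative only: the field names and the validity ratio are assumptions, not the experiment's actual instrumentation.

```python
# Hypothetical sketch of the DASH 2 comparison: score a human-only baseline
# and a human-machine team on speed, quantity, and quality. All names and
# numbers here are illustrative assumptions, not experiment data.
from dataclasses import dataclass

@dataclass
class TrialResult:
    seconds_to_first_option: float  # speed
    options_generated: int          # quantity
    valid_options: int              # quality numerator

def score(result: TrialResult) -> dict:
    # Quality as the fraction of generated options that pass validation.
    validity = result.valid_options / max(result.options_generated, 1)
    return {
        "speed_s": result.seconds_to_first_option,
        "quantity": result.options_generated,
        "validity": round(validity, 2),
    }

human_only = score(TrialResult(seconds_to_first_option=120, options_generated=10, valid_options=8))
human_machine = score(TrialResult(seconds_to_first_option=9, options_generated=300, valid_options=210))
```

The interesting design choice is that quality is scored as a ratio rather than a raw count, so a machine that floods the queue with invalid options does not win on volume alone.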

“Being part of DASH 2 showed us how human-machine teaming can enhance performance without losing operator judgment,” said Capt. Steven Mohan III, 726th Air Control Squadron chief of standards and evaluations.

Industry and Air Force Collaboration 

Evaluation focused on whether these tools helped operators make more effective decisions, not just process more data. 

DASH 2 also reaffirmed the value of co-development with both industry and Air Force developers. Companies retained intellectual property rights while the Air Force gained insight into integration and functional requirements for future C2 software. 

“At the ShOC-N, our mission is to put new capabilities into operators’ hands and test them under conditions that resemble real-world battle management,” said Lt. Col. Shawn Finney, 805th CTS/ShOC-N commander. “DASH 2 demonstrated how the battle lab enables rigorous testing while maintaining operational fidelity, bridging the gap between concept and capability.” 

Early Results and Lessons Learned 

The 711th HPW collected data on operator performance, workload and teaming dynamics. Findings confirmed that AI can accelerate decision-making while keeping humans at the center of the process. 

“Collaboration with AFRL, the ABMS program office and industry allowed us to rapidly experiment, refine requirements and accelerate the path from concept to capability delivery,” Ohlund said. 

Shaping the Future of C2 

The DASH series is a key step in modernizing Air Force command and control. By combining human judgment with AI, the service is preparing operators to make faster, more informed decisions in future contested environments. 

“DASH 2 proved human-machine teaming is no longer theoretical,” Zall said. “By fusing operator judgment with AI speed, the Air Force is shaping the future of decision advantage in joint and coalition operations.” 

By Deb Henley, 505th Command and Control Wing Public Affairs

SOFWERX – Computer Vision (CV) Inference Engine and Model Training for Unmanned Systems (UMS) Assessment Event (AE)

Monday, September 29th, 2025

SOFWERX, in collaboration with the USSOCOM PEO-SOF Digital Applications (SDA) Unmanned Systems Autonomy and Interoperability (UxSAI) Program, will host an Assessment Event (AE) to identify technology providers capable of delivering cutting-edge computer vision capabilities for detection and classification across all USSOCOM unmanned systems. The event will evaluate whitepaper responses and down-select innovative solutions for a computer vision inference engine and model training solution that can autonomously detect, classify, and adapt to new targets and environments, ultimately enhancing the operational effectiveness of UxS in resource-constrained and communication-denied environments in line with the goals and objectives of the UxSAI Program.

Computer vision rarely performs as intended, resulting in missed detections or improperly classified objects. Challenges include obtaining training data, running models on constrained resources, and deploying models over the air. UxSAI is asking computer vision developers to improve its artificial intelligence and machine learning capabilities by developing highly effective computer vision inference engines and pretrained models as part of an Enterprise machine learning operations (MLOps) pipeline. Providers selected for participation will receive additional information and funding to prepare for the AE. The UxSAI Program intends to evaluate solutions through its Enterprise MLOps pipeline as part of this event.

The UxSAI Program seeks a computer vision inference engine and model retraining solution that:

  • Demonstrates robust object detection and classification capabilities.
  • Optimizes Size, Weight, Power, and Cost (SWaP-C) for deployment on a variety of UxS platforms.
  • Enables training and adaptation to new objects and environments.
  • Offers a modular architecture suitable for seamless integration with existing architectures.
  • Enhances the autonomous capabilities of unmanned systems operating in challenging environments.
  • Informs the necessary interfaces, protocols, and data formats for integration, contributing to the development of an Interface Control Document (ICD).

The intent is for UxSAI to work with providers selected from the AE to further develop their models through collaboration with the UxSAI Program.
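
As a shape sketch of the modular architecture the solicitation asks for, one way to picture the inference engine is a detector and classifier behind a narrow interface, so either stage can be swapped out after over-the-air retraining. Everything below (class names, the brightness-threshold stub) is an illustrative assumption, not program guidance.

```python
# Structural sketch of a modular detect/classify pipeline for edge UxS.
# The stages sit behind a narrow interface so retrained models can be
# swapped in without touching the rest of the system. Stub logic only.
import numpy as np

class Detector:
    """Stub detector: flags pixel locations brighter than a threshold."""
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold

    def detect(self, frame: np.ndarray) -> list[tuple[int, int]]:
        ys, xs = np.nonzero(frame > self.threshold)
        return list(zip(ys.tolist(), xs.tolist()))

class Classifier:
    """Stub classifier: labels every detection 'unknown' until retrained."""
    def classify(self, frame: np.ndarray, detections: list) -> list:
        return [("unknown", y, x) for y, x in detections]

def run_pipeline(frame: np.ndarray, detector: Detector, classifier: Classifier) -> list:
    # Detect first, then classify only the flagged regions.
    return classifier.classify(frame, detector.detect(frame))

frame = np.zeros((4, 4))
frame[1, 2] = 0.9  # one bright pixel standing in for a target
print(run_pipeline(frame, Detector(), Classifier()))
# → [('unknown', 1, 2)]
```

The point of the narrow interface is the SWaP-C and over-the-air requirements above: if the classifier is a self-contained module, a field update replaces one small artifact rather than the whole stack.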

Submit via events.sofwerx.org/uxsai-cvmodeldev NLT 13 October 2025 11:59 PM ET

ITAR Restricted

Scout AI Partners with Hendrick Motorsports Technical Solutions on NOMAD – Defense UGV Automated by Fury

Friday, September 26th, 2025

SUNNYVALE, Calif., September 2025 — Scout AI Inc. (“Scout”) and Hendrick Motorsports Technical Solutions (“HMS”) today announced a partnership on NOMAD, HMS’s next-generation unmanned ground vehicle (“UGV”) controlled by Scout’s Fury autonomy system. NOMAD represents Fury’s second UGV form factor and debuts Scout’s fastest foundation model to date: lightweight, low-latency, and purpose-built for compact robotic platforms.

NOMAD also introduces Scout’s second-generation Fury hardware stack, which is more than 90% smaller and significantly more power-efficient than prior versions. The system remains low-signature and passive-sensing, enabling NOMAD to operate autonomously beyond line of sight, follow a human teammate from a safe distance, and integrate a wide range of payloads for light tactical ground missions.

“NOMAD is an incredible platform: fast, adaptable, and designed for the toughest mission environments,” said Colby Adcock, Co-Founder and CEO of Scout AI. “Our partnership with Hendrick Motorsports underscores the extensibility of Fury across multiple form factors. We’re just beginning to unlock its potential across ground, air, sea, and space domains.”

Building on Scout’s mission to deliver next-generation, camera-only autonomy, NOMAD combines Vision-Language-Action (VLA) reasoning with rugged, commercial off-the-shelf hardware. By eliminating costly and fragile sensors and relying on fully learned models, Fury provides human-like judgment in real time while maintaining a low-cost, low-signature footprint.

“At Hendrick Motorsports, our greatest strength has always been our people—the engineers, builders, and innovators who thrive under pressure and push technology to its limits,” said Rhegan Flanagan, Director of Government Programs at HMS. “We are proud to bring that same dedication to supporting the warfighter, whose mission and safety drive everything we do. Partnering with Scout AI allows us to combine world-class vehicle engineering with cutting-edge autonomy to deliver NOMAD—a commercial platform designed to give our servicemembers greater capability, protection, and confidence on the battlefield.”

Lethality, Innovation, and Transformation Through AI Education at the U.S. Army School of Advanced Military Studies

Sunday, September 21st, 2025

THE ARMY UNIVERSITY, FORT LEAVENWORTH, Kansas – In late July 2025, the Advanced Military Studies Program at the School of Advanced Military Studies, known as SAMS, launched its first-ever experimental, three-day, Practical Application of Artificial Intelligence module.

The mission was simple: transform the program with an innovative, hands-on AI learning experience for students and faculty. The purpose was to enable warfighter lethality through AI education and training.

“AI is changing the character of warfare. Our graduates have got to be ready to lead formations powered by AI—and that’s why we did something about it,” said Col. Dwight Domengeaux, SAMS director.

Dr. Bruce Stanley, Director, AMSP, envisioned a module that pushed institutional norms about how mid-career officers learn about AI and learn with AI.

“Did we accept risk? Yes. We did—to create a critical learning opportunity for our students,” Stanley remarked. “We knew what was at stake, and we trusted our faculty and students to make it work.”

And make it work they did.

According to AMSP faculty, the module’s experimental instructional design was key, consisting of ten-and-a-half hours of total classroom contact time divided over three lessons.

“We covered a lot of ground with our students in three days,” Dr. Jacob Mauslein, associate professor, AMSP, said. “Subjects ranged from AI theory and ethical considerations of AI, to applying AI tools, and leading AI-enabled organizations.”

A novel feature of the module was that it was developed by AMSP students. As a task in their Future Operational Environment course, six students from the Class of 2025, mentored by two faculty, developed the AI module that would be taught to the Class of 2026. The students’ final draft was adopted almost without change by the faculty.

“Incorporating students as full participants in the process allowed us to co-develop lesson objectives and materials that deeply mattered to them,” Dr. Luke Herrington, one of the faculty leads for the module, shared.

Meeting students where they were in terms of their AI skills and then taking them to the next level was part of the academic approach for the AI module, Herrington explained.

Maj. Justin Webb, PhD, an AY 2025 AMSP student and one of the module’s developers, explained it this way: “SAMS is a warfighting school—so we chose learning activities that would help us become more lethal warfighters with AI. Using AI tools like CamoGPT, Ask Sage, and others for several hours over three days helped us get there.”

Some students in the AY 2026 class were initially skeptical of using AI.

“At first, I didn’t know what I didn’t know,” Army Maj. Stuart Allgood, an Armor officer SAMS student said. “But by the end of the first day my thinking about AI had changed. After the second day, I could use AI tools I had never even heard of.”

Maj. Callum Knight, an intelligence officer from the United Kingdom, summed up his experience.

“Before this course I viewed AI as just a data point,” Knight said. “Now that I’ve experienced what’s possible with AI, I realize it’s an imperative that is going to impact everything I do going forward.”

So, what’s next for AI at SAMS?

“Based on what our students got out of this, we intend to add more AI learning moments across the program,” Stanley said. “The priority now is to integrate AI into our upcoming operational warfare practical exercise.”

AMSP is one of the three distinct academic programs within SAMS.

The other two SAMS programs are the Advanced Strategic Leadership Studies Program (ASLSP), a Senior Service College equivalent, and the Advanced Strategic Planning and Policy Program (ASP3), also known as the Goodpaster Scholars, a post-graduate degree program.

Matthew Yandura is an AMSP assistant professor and retired Army colonel.

By Matt Yandura, Assistant Professor, School of Advanced Military Studies

Anduril’s Menace-I Brings Petabyte-Scale Processing to the Warfighter at the Tactical Edge

Monday, September 15th, 2025

On August 11, 2025, U.S. Marines sling loaded Anduril’s Menace-I via a CH-53K King Stallion helicopter, demonstrating new levels of mobility for expanded expeditionary mission planning and coordination. From a distance, it looked like any other grey shipping container. In reality, it was a deployable node for planning, coordination, and data processing—equipped with the power, climate control, compute, connectivity, and security of a fixed facility.

Menace-I is a turnkey command, control, compute, communications, cyber, and intelligence, surveillance and reconnaissance (C5ISR) solution accredited for use as both a Sensitive Compartmented Information Facility (SCIF) and a Special Access Program Facility (SAPF). Less than ten minutes after setup, Menace-I is fully operational and supporting missions in forward, contested environments.

The challenge is delivering large quantities of processing power—secure, accredited, and reliable—to the tactical edge. Today, anything involving classified data in a SCIF or SAPF can only be done in fixed facilities or in Temporary Sensitive Compartmented Information Facilities (T-SCIFs) that require a day or more to set up. That timeline doesn’t work for expeditionary forces that maneuver in hours, not days.

Traditional approaches rely on reach-back to distant data centers over SATCOM links that may be degraded or denied in conflict. At the tactical edge, connectivity cannot be assumed, yet forces still require AI, analytics, mission planning, briefing, and debriefing in seconds. Menace-I solves this by bringing the compute with you.

Menace-I delivers a powerful, secure, accredited SCIF/SAPF set of edge nodes wherever forces are operating—enabling classified mission planning, force generation, and battle management at the point of need. What once took a day or more to set up can now be established in under ten minutes. Every Menace-I runs on Lattice, Anduril’s AI-powered software, is powered by Voyager’s rugged edge computing platform, and is connected through Lattice Mesh, our secure networking fabric.

Proven Real-World Mobility Options

The recent sling load operation validated Menace-I as the only fully integrated mission planning solution for fifth-generation aircraft that is transportable by all organic Marine Corps assets: truck, KC-130J Super Hercules, and rotary wing aircraft.

This mobility matters. Expeditionary forces can now reposition a fully accredited planning node as quickly as they maneuver, ensuring secure command centers move in lockstep with the fight. What once required hours of setup or reach-back can now move forward with the unit, giving commanders immediate access to secure facilities wherever the mission takes them.

Petabyte-Scale AI at the Edge

Artificial intelligence, advanced analytics, and cross-domain data processing demand massive compute capacity—rarely available at the tactical edge. To meet this need, in July, Anduril delivered the first Menace-I in a petabyte-scale configuration, powered by Voyager.

The configuration quadruples compute capacity with tens of thousands of cores, brings petabyte-scale storage, and delivers high performance computing (HPC) and graphics processing unit (GPU) acceleration to the edge. It provides the same expeditionary capabilities of Menace-I, scaled to handle AI workloads, data fusion, mission planning, briefing, and debriefing—all without relying on fragile reach-back to distant data centers.

In a D-Day environment where connectivity is uncertain, Menace-I brings the data center with you.

At the heart of Menace-I is Voyager, Anduril’s family of rugged edge communications and computing solutions. Voyager is engineered to withstand extreme environments, electronic attack, and jamming. Its modular design makes it easily adaptable to different mission needs.

Voyager is deployed in austere environments worldwide, trusted by thousands of customers, and is the preferred solution for rugged computing for militaries and special operations forces.

Cross-Domain Operations with Everfox

Conflicts are contested across land, air, sea, space, and cyber. Winning requires seamless data movement across classification levels.

Voyager is now the preferred edge server hardware platform for Everfox’s cross-domain solutions, enabling enterprise-grade data transfer between classification levels in expeditionary environments. This partnership ensures that forces operating at the tactical edge can move intelligence across domains and networks without sacrificing security or speed. Imagery, targeting data, and mission plans can flow seamlessly from unclassified to classified environments—and back—enabling faster, more informed decisions in contested battlespaces.

Everfox, powered by Voyager, will be deployed across Anduril’s Menace family of systems, enabling customers to conduct cross-domain operations at the edge.

In the Field Today

Menace-I is deployed with customers and partners today, enabling forward-deployed forces to plan, process, and fight with the speed, security, and mobility needed to stay connected wherever the fight takes them.

DSEI 2025: AimLock and Teledyne FLIR Defense Collaborate on Autonomous Kinetic Capabilities

Wednesday, September 10th, 2025

At this year’s DSEI event in London, AimLock’s autonomous targeting and engagement systems will be showcased on Teledyne FLIR’s SUGV and Kobra ground robots 

LONDON, 10 September 2025 – Today at DSEI UK 2025, AimLock, a pioneer in autonomous targeting and engagement systems, announced it is collaborating with Teledyne FLIR Defense to provide autonomous kinetic capabilities for the company’s SUGV™ 325 and Kobra™ 725 ground robots.  

AimLock’s systems, powered by the company’s Core Targeting Modules (CTMs), will support Teledyne FLIR’s robots in delivering decision-accelerating kinetic autonomy across key mission sets in Counter-UAS, Force Protection, Direct Action, Integrated Defense, Strike Anti Armor, and Support by Fire. As the autonomous battlefield continues to evolve, both companies will be displaying these mission-critical systems at their booths during DSEI. 

On Display at Teledyne’s Booth (S3-110):  

Teledyne FLIR’s SUGV 325, Integrated with AimLock’s RS-2 Solution with Dual 40mm Grenade Launchers

The SUGV 325, a backpackable robot that offers a versatile solution for a variety of missions, will be integrated with AimLock’s RS-2, a remote engagement speed and accuracy system that powers automated target acquisition and firing solutions. The RS-2 can be affixed with a range of weapon systems: dual 40mm grenade launchers, quad M72 rocket launchers, a lightweight machine gun, a .50 caliber semi-automatic rifle, and more. 

The RS-2 provides:  

  • Multi-sensor AI targeting: The system uses edge-processed, multi-modal sensors with AI-enhanced target detection, classification, recognition, and identification. 
  • Automated stabilized firing: Two-axis stabilized positioners provide automated firing solutions with corrections for target and platform motion, environmental conditions, and targeting ranges. 
  • Flexible integration: Open-architecture control interfaces support integration into larger combat systems or standalone use, with platform kits available for manned and unmanned air and ground vehicles. 

On Display at AimLock’s Booth (N5-260):  

Teledyne FLIR’s Kobra 725, Integrated with AimLock’s RM-1

The Kobra 725, a powerful, heavy-payload robot, will be integrated with AimLock’s RM-1, a semi-autonomous remote weapon station for medium machine gun platforms. The RM-1 provides:  

  • An advanced targeting system: The RM-1 combines day/night sensors, laser range-finding, active stabilization, autonomous detection, classification, tracking, and automatic firing to deliver fast, accurate engagement, even while on the move. 
  • Multi-threat capability: It detects and tracks small drones, vehicles, and enemy combatants, enabling 7.62mm and .338 Norma Mag machine gun engagement out to the host weapon’s effective range. 
  • Flexible deployment: The ultra-portable system can be mounted on tripods, buildings, vehicles, boats, and helicopters for maximum operational versatility. 

“As autonomy on the battlefield continues to become the status quo, our unique ability to provide autonomous kinetic capabilities is more important than ever,” said Bryan Bockmon, CEO of AimLock. “We are pleased to collaborate with Teledyne FLIR Defense to power their robots with the kind of technology that helps keep warfighters safe and bring them home from the most precarious of missions.”  

“Globally, our customers are placing a sharper focus on lethality for unmanned systems, which is why our collaboration with AimLock is a win-win,” said Nate Winn, director of product management, Unmanned Systems North America, at Teledyne FLIR Defense. “Whether it’s our SUGV or Kobra robots or Rogue 1 UAS, our highly modular platforms can easily integrate a wide range of kinetic effect systems that are proving to be difference-makers in modern warfare.”   

Elbit Systems Launches Frontier: Next-Generation AI-Based System to Tackle Evolving Border Defense Challenges

Tuesday, September 9th, 2025

By elevating persistent surveillance with real-time, AI-driven threat detection and decision-making support, Frontier reduces the workload on operating teams, lowers operational costs, and enhances mission success rates.

London, UK, September 9, 2025 – Elbit Systems Ltd. introduces Frontier, its cutting-edge wide-area persistent surveillance system, designed to address the increasing complexity and intensity of border defense challenges. Presented for the first time at the DSEI 2025 exhibition, Frontier is built to autonomously detect, classify, and assess threats in real-time. Frontier leverages advanced artificial intelligence (AI) to optimize intelligence gathering and decision-making in land, air, and maritime domains.

As global threats continue to evolve, intelligence teams are burdened with monitoring and managing thousands of alerts and massive amounts of data in real time. This challenge places significant demands on operational teams, often requiring substantial resources and risking potential errors. To address these needs, Elbit Systems developed Frontier, a smart, AI-based edge system designed to enhance surveillance operations.

Key features of Frontier include:

  • AI-Based Adaptive Routine Learning: Using cutting-edge computing capabilities, the system continuously learns and adapts to routine operations by analyzing vast amounts of data to detect anomalies and deviations.
  • Autonomous Threat Classification: Leveraging AI, the system autonomously identifies and classifies threats in real time, enabling fast and accurate responses.
  • Smart Decision-Making Support: Frontier prioritizes and assesses the risk of potential threats, providing operators with clear, actionable insights.
  • Sensor Integration: Maximizes the capabilities of multiple sensors, turning data overflow into a coherent understanding of the perimeter.
  • Operational Efficiency: Reduces the workload on operating teams and lowers operational costs, while improving mission success rates.
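
The adaptive routine learning idea (learn what routine activity looks like, then flag deviations) can be sketched with a rolling baseline. The z-score test below is a stand-in assumption for illustration; Elbit does not disclose Frontier's actual method.

```python
# Illustrative sketch of routine-learning anomaly detection on a sensor
# feed: maintain a rolling baseline of recent activity and flag samples
# that deviate sharply from it. A z-score stands in for the real method.
from statistics import mean, stdev

def anomalies(counts, window=5, z_threshold=3.0):
    flagged = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]          # the learned "routine"
        mu, sigma = mean(baseline), stdev(baseline)
        # Guard against a zero-variance baseline, then apply the z-test.
        if sigma and abs(counts[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Routine traffic of ~10 events per interval, with one spike at index 9.
traffic = [10, 11, 9, 10, 10, 11, 10, 9, 10, 40, 10]
print(anomalies(traffic))
# → [9]
```

The operational payoff described above falls out of this shape: only the flagged indices (and the sensor data behind them) need to reach an operator, instead of the entire feed.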

Frontier autonomously operates various types of sensors to visually confirm and classify threats, transmitting only the most relevant and analyzed information to the appropriate forces.

Army Awards TurbineOne Contract for AI-Powered Edge Target Recognition

Friday, September 5th, 2025

SAN FRANCISCO – TurbineOne announced today it has been awarded a five-year IDIQ contract with a $98.9M ceiling from the U.S. Army to deliver and demonstrate its Frontline Perception System (FPS) as part of the Army’s Intelligence Enterprise modernization.

With FPS, warfighters can build, retrain, and deploy custom machine learning models at the edge without coding, leveraging multiple sensor feeds in degraded communications environments.

Founded in 2021, TurbineOne successfully completed a Phase II Small Business Innovation Research (SBIR) contract, which led to this SBIR Phase III IDIQ, a milestone reached by only a fraction of small businesses. In addition to the successful Phase II SBIR, TurbineOne earned a Defense Innovation Unit (DIU) “success memo” for prototype performance, a key factor in the Army’s decision to award this contract.

TurbineOne will deliver AI/ML-driven automated target recognition, counter-UAS, and collaborative autonomy capabilities to accelerate intelligence and targeting cycles, including processing, exploitation, and dissemination (PED) to Army units.
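
The claim of retraining models in the field without coding can be illustrated with the simplest model that supports it: a nearest-centroid classifier whose classes are extended from a handful of newly labeled samples, with no code changes deployed. This is a hedged stand-in for the concept, not TurbineOne's actual method.

```python
# Hypothetical sketch of no-code edge retraining: operators supply a few
# labeled feature vectors, and the classifier gains or updates a class by
# recomputing a centroid. Stand-in model, not TurbineOne's implementation.
import numpy as np

class EdgeClassifier:
    def __init__(self):
        self.centroids: dict[str, np.ndarray] = {}

    def retrain(self, label: str, samples: np.ndarray) -> None:
        """Add or update a class from new labeled feature vectors."""
        self.centroids[label] = samples.mean(axis=0)

    def predict(self, feature: np.ndarray) -> str:
        # Nearest centroid by Euclidean distance.
        return min(self.centroids,
                   key=lambda c: np.linalg.norm(feature - self.centroids[c]))

clf = EdgeClassifier()
clf.retrain("vehicle", np.array([[1.0, 0.0], [0.9, 0.1]]))
clf.retrain("person", np.array([[0.0, 1.0], [0.1, 0.9]]))
print(clf.predict(np.array([0.8, 0.2])))
# → vehicle
```

A design like this also suits degraded communications: the model update is a few vectors per class, small enough to synchronize over a constrained link rather than shipping a full retrained network.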

“We are grateful to partner with innovative Army units that pushed us through years of iterative exercises to deliver a software-first, hardware-agnostic capability ready for global distribution and rapid scale,” said Ian Kalin, CEO of TurbineOne. “We look forward to supporting the Army’s Transformation Initiative in collaboration with the innovative intelligence leaders in the Army G-2 and Program Executive Office-Intelligence, Electronic Warfare & Sensors.”

www.turbineone.com

Editor’s note: We wrote about this technology in November 2022: soldiersystems.net/2022/11/10/mww-23-turbine-one-frontline-perception-system.