B5 Systems

Archive for the ‘AI / ML’ Category

Army Invests Nearly $50 Million in Artificial Intelligence and Machine Learning

Tuesday, April 23rd, 2024

WASHINGTON — The U.S. Army is investing nearly $50 million in small and nontraditional businesses to develop a variety of artificial intelligence and machine learning solutions under its AI/ML open-topic solicitation.

Released in December 2022, the U.S. Army Small Business Innovation Research Program’s solicitation sought to enhance the Army’s operational capabilities and address broader national security efforts by tackling critical information gaps via AI technologies. With the help of industry, the Army prioritized the development of solutions ranging from radio-frequency identification to language translation.

During the Phase I performance period, 39 small and nontraditional vendors delivered concepts within these priority areas that highlighted their technologies’ commercial viability and technical feasibility. Now the Army will award 26 of the selected businesses a total of nearly $50 million to transform their concepts into prototypes ready for demonstration.

The Army SBIR Program offers Phase I contract opportunities to small and nontraditional vendors exhibiting commercial viability, feasibility and technical merit. The program provides Phase II and Direct to Phase II contracts to vendors with mature technologies capable of gaining increased federal support and solving Army needs.

Vendors receive access to technical, acquisition and operational Army experts. These specialists offer information on the Army’s critical needs while providing guidance from within the Army research and development ecosystem. Selectees capitalize on this by collaborating with technical points of contact that serve as a resource for vendors as they mature their technologies for insertion into Army acquisition programs.

The Army SBIR Program releases contract opportunities on a rolling basis to respond to current and anticipated Soldier technology needs. The program will continue to promote new contract releases via topic announcements and email. We encourage you to follow U.S. Army SBIR|STTR on Facebook, X (formerly Twitter) and LinkedIn for the latest program announcements, updates and solicitation opportunities.

The Office of the Assistant Secretary of the Army for Acquisition, Logistics and Technology leverages technologies and capabilities to provide U.S. Soldiers a decisive advantage in any environment by developing, acquiring, fielding and sustaining the world’s finest equipment and services. For more information, visit the ASA(ALT) web page and follow @ArmyASAALT.

Please contact the Army SBIR mailbox if you have any questions.

By Daniel Smoot, Office of Army Prize Competitions and Army SBIR Program

Army, Industry Discuss Future Implications of Augmenting Humans with AI

Friday, March 22nd, 2024

AUSTIN, Texas — As artificial intelligence becomes increasingly integrated into a number of industries, organizational leaders from the public and private sectors are considering both the opportunities and risks posed by this rapidly evolving field of technology.

During a March 12 South by Southwest Conference panel in Austin, tech enthusiasts from the U.S. Army and industry discussed how advances in AI-augmented humans and humanoids — non-human entities, such as robots, that possess human characteristics — have the potential to reshape how humans work and accomplish complex tasks.

The panelists additionally discussed the importance of pursuing responsible AI, so that the new technology will serve to improve human lives and abilities.

“AI is not a panacea,” said Army Futures Command Director of Integration Col. Troy Denomy, who participated in the panel. Denomy clarified that AI can be a useful tool in optimizing the capabilities of humans and machines but is not a replacement for human brainpower or skill. He added that the Army does not want to create situations in which humans are working for robots but rather seeks to enable robots to work for humans.

To better understand the advantages AI can offer, the Army is evaluating new AI assistance methods through its Soldier-centered design model, which places Soldier participation and feedback at the core of experimentation efforts. The method takes inspiration from private industry best practices shaped around ensuring end-user satisfaction, such as Microsoft’s human-centered design methods.

Panelist Steven Bathiche, who leads Microsoft’s Applied Sciences Group, highlighted how AI developers are shifting away from remote-controlled programming toward task-based programming, which allows humans to complete more complicated tasks by automating the repetitive ones. Bathiche commented on the Army’s historical ability to enable greater innovation and problem-solving in emerging fields of technology through mutually beneficial partnerships with entrepreneurs and industry.

Fellow speaker Young Bang, who serves as Principal Deputy Assistant Secretary of the Army for Acquisition, Logistics and Technology, emphasized that interfaces with technology must evolve alongside the technology itself, so that analysts and Soldiers can more easily and intuitively interact with AI systems. Carefully assessing risk is also critical, and the Army continues to apply frameworks to identify and counteract risks, including when adopting third-party generated algorithms. The Army also plans — with the help of industry — to deepen its understanding of how integrating new AI capabilities may impact Soldiers’ well-being and behaviors, with an aim of improving personal, professional and operational outcomes.

“It’s about innovation and failing quickly. We don’t want programs that last 10 years and then decide to kill it. We want to learn faster and faster from our mistakes,” Bang said.

By Army Futures Command

Battlefield Technology Focus: Featuring OKSI, KAGWERKS, Firebird Electro-Optics & ONYX Industries

Friday, February 9th, 2024

During SHOT Show 2024, leading tech experts curated a parlor space to showcase elite technologies pushing the boundaries of innovation. They all share the same goal: bringing the warfighter the technical solutions required to overcome the challenges of an ever-evolving battlefield.

Let’s cover down on Tech:

OKSI – Their Autonomous Precision Weapon Systems portfolio includes Passive Ranging, the Sentry Remote Weapon System, an EO/IR Seeker for APKWS, and an 81mm Precision Guidance Kit. They also have an Unmanned Autonomous Systems & Networks portfolio, which includes Autonomous Vehicle Kits, GPS-denied Navigation, Coordinated Drone Teaming & Swarming, and ATD/ATR.

KÄGWERKS – Their chest-mounted radio systems featuring Silvus Technologies MN-MIMO tech were on display, along with the Dock Ultra body-worn compute system. The system enables operators to perform real-time processing of map data, image recognition and other AI/ML capabilities.

Firebird Electro-Optics – Their weapon-mounted and handheld LED & LEP illuminators were on display, along with their MAID MFAL dual-beam, single-aperture laser and focusable VCSEL illuminator. They also showcased their SWIR and LWIR solutions, with active and passive range finding and designation.

Onyx Industries – The Sentry Remote Weapon System was on display in the parlor and on the show floor in partnership with Persistent Systems, LLC, showcasing its multifunctional ATD/ATR human-in-the-loop capabilities in both its kinetic and ISR variants, ready to be deployed in overwatch or terrain-denial positions.

GA-ASI Uses Autonomy to Close F2T2EA Engagement Chain

Thursday, January 11th, 2024

- Avenger Flight Demonstrates Multi-Objective Collaborative Combat Mission

- GA-ASI Combines Skills of Multiple Autonomy Providers to Advance UCAV Ecosystem

SAN DIEGO – 09 January 2024 – General Atomics Aeronautical Systems, Inc. (GA-ASI) demonstrated its rapidly maturing open standards-based autonomy ecosystem for Unmanned Combat Air Vehicles (UCAVs) on an MQ-20 Avenger® as part of a live flight test on Nov. 2, 2023. The flight combined three autonomy providers, government-provided human-machine interface (HMI) hardware, and GA-ASI’s autonomy core to meet multiple objectives for collaborative combat missions and closed the Find, Fix, Track, Target, Engage, and Assess (F2T2EA) engagement chain using a mix of Live, Virtual, and Constructive (LVC) entities.

The flight, which took place from GA-ASI’s Desert Horizon Flight Operations Facility in El Mirage, Calif., illustrates the company’s commitment to maturing its open standards-based autonomy software ecosystem for Autonomous Collaborative Platforms (ACPs). Designing the system around government-owned and -maintained standards avoids vendor lock and allows rapid integration of best-of-breed capabilities in areas such as Artificial Intelligence (AI), HMIs, and other skills from third-party providers.

“This flight underscores GA-ASI’s commitment to proving combat operational readiness for vendor-agnostic autonomy architecture for UCAV platforms,” said GA-ASI Vice President of Advanced Programs Michael Atwood. “Ultimately, GA-ASI’s series of flight tests demonstrate our unmatched ability to deploy best-of-breed mission software, autonomy, and hardware capabilities on unmanned platforms, accelerating the operationalization of this critical technology for the warfighter. This most recent test shows multi-service compatibility of the autonomy core through the integration of USAF and Navy software skills together.”

Another important goal of GA-ASI’s flights is to demonstrate the company’s commitment to developing an open government standards-based autonomy ecosystem that enables rapid integration and validation of third-party tactical software applications. GA-ASI is focused on supporting the emerging App Store-based model that allows organizations to rapidly develop and deploy software while maintaining safety of flight and ensuring warfighters have up-to-date access to the industry’s best capabilities.

Autonomy skills for the recent flight test were provided by GA-ASI, Scientific Systems Company, Inc. (SSCI), and NAVAIR PMA-281’s ARCANE (Architecture and Capabilities for Autonomy in Naval Enterprise) Team. The PMA-281 ARCANE Team accomplishes Intelligent Autonomy & AI integration, compliance, and sustainment objectives for Naval Aviation UAV Tactical Operations. Different skills on the aircraft were activated based on the F2T2EA phase or via human-on-the-loop interaction using the FOX tablet HMI. A government-furnished autonomy core and Open Mission Systems (OMS) messaging protocols were used to coordinate between provider skills during different F2T2EA phases. Rapid integration of these disparate skills was made possible by utilizing government standards, such as OMS, and adhering to state-of-the-art government autonomy design methods.
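In heavily simplified form, the phase-based activation described above, where different provider skills switch on as the F2T2EA chain advances, can be sketched as a dispatcher that maps phases to registered handlers. Everything below (class names, provider labels, message strings) is a hypothetical illustration, not GA-ASI's autonomy core or the actual OMS interfaces.

```python
from enum import Enum, auto

class Phase(Enum):
    """The six phases of the F2T2EA engagement chain."""
    FIND = auto()
    FIX = auto()
    TRACK = auto()
    TARGET = auto()
    ENGAGE = auto()
    ASSESS = auto()

class AutonomyCore:
    """Toy orchestrator: third-party 'skills' register for the phases
    they handle, and the core activates them as the chain advances."""
    def __init__(self):
        self.skills = {phase: [] for phase in Phase}
        self.log = []

    def register(self, provider, phases, handler):
        for phase in phases:
            self.skills[phase].append((provider, handler))

    def advance(self, phase, track):
        # Activate every skill registered for this phase, in order.
        for provider, handler in self.skills[phase]:
            self.log.append((phase.name, provider, handler(track)))

core = AutonomyCore()
core.register("SSCI-CMA", [Phase.FIND, Phase.FIX, Phase.TRACK],
              lambda t: f"tracking {t}")
core.register("GA-WTP", [Phase.TARGET], lambda t: f"paired weapon to {t}")
core.register("GA-EW", [Phase.ENGAGE], lambda t: f"jamming near {t}")

# Walk the full engagement chain against one simulated track.
for phase in Phase:
    core.advance(phase, "bandit-01")
```

A human-on-the-loop interface like the FOX tablet would sit above a dispatcher of this kind, selecting or vetoing which registered skill fires in each phase.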

Collaborative mission autonomy capabilities provided by SSCI successfully commanded a fully autonomous multi-vehicle Defensive Counter Air (DCA) mission—from Combat Air Patrol (CAP) through detection, identification, tracking, and multiple successful engagements.

“Our Collaborative Mission Autonomy (CMA) development kit enables the team to perform development and integration in short time frames in a tactically relevant way,” said David “Heat” Lyons, SSCI’s Vice President of Business Development and former F-16 Weapons Officer and combat fighter pilot. “For the warfighter, we are demonstrating mission-ready behaviors on GA-ASI’s UCAV that are trustworthy, understandable, and explainable.”

GA-ASI provided weapon-target pairing (WTP) and electronic warfare (EW) autonomy skills for the flight. These were developed using GA-ASI’s deep reinforcement learning (RL) framework. The mission skills were activated like play calls in real time, and their status was monitored by the pilot via the FOX tablet.

NAVAIR PMA-281’s ARCANE program delivered a cooperative weave skill, whereby a live lead MQ-20 was paired with a simulated follower MQ-20 to demonstrate a collaborative flight formation technique aimed at increasing survivability. This demonstration showcased the flexibility of GA-ASI’s autonomy core to rapidly integrate third-party best-of-breed skills in support of a wide range of evolving mission types.

Collectively, these skills were integrated into and orchestrated by the government-furnished autonomy core architecture that was enhanced by GA-ASI. The flexibility of the government-managed autonomy core software stack enabled rapid and seamless integration of multi-UAS third-party behaviors.

Dominus Technological at SHOT Show

Monday, January 8th, 2024

Meetings are by appointment only – email SHOTshow@oksi.ai

First Army Taps AI to Enhance Command and Control

Thursday, December 14th, 2023

ROCK ISLAND ARSENAL, Ill. — First Army is leveraging the potential of artificial intelligence during large scale mobilization exercises and other missions.

Lt. Col. Melissa Sayers, First Army operations research systems analyst, or ORSA, said the first use will come during the 2024 iteration of Pershing Strike, First Army’s annual exercise to validate the Army’s ability to mobilize forces in support of large-scale combat operations.

The use will be limited, but by the following year the plan is for it to be an integral part of the exercise and other First Army operations; it is forecast to eventually become a routine part of how First Army does business. The hope, Sayers said, is that “not only will we have a simulation that we can run a million scenarios on, but it’s part of our everyday operations, helping us get to decisions faster.”

Artificial intelligence is the use of computer systems to perform tasks that traditionally require human input and do them much faster. For First Army, faster information would lead to a boost in efficiency during operations that move a multitude of Soldiers and equipment to an assortment of locations across the country for training and mobilization.

“The machine can’t do it without the human,” Sayers noted. “Say we had a large-scale mobilization operation and we had all these units ready to head out the door and the medical unit shows up at 50 percent strength. With AI, we have the ability to pre-calculate solutions. We estimate what is going to happen if you make this decision, and we can go ahead and run it and calculate all those different decisions and have the best three or four recommended to the commander. The commander still makes the decisions but we can get there a lot faster if we have it pre-calculated and ready to run when something happens.”
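Sayers' description, pre-computing the outcomes of many candidate decisions and surfacing the best few to the commander, amounts to scoring courses of action against a simulation and ranking them. The sketch below is a hypothetical illustration of that pattern; the courses of action, delay figures, and risk model are invented for the example and are not First Army's actual tooling.

```python
import random

# Hypothetical courses of action when a medical unit arrives at half strength.
COURSES_OF_ACTION = [
    {"name": "backfill from reserve pool", "base_delay": 3.0, "risk": 0.2},
    {"name": "split unit across two sites", "base_delay": 5.0, "risk": 0.4},
    {"name": "delay movement until full strength", "base_delay": 9.0, "risk": 0.1},
    {"name": "proceed understrength", "base_delay": 2.0, "risk": 0.8},
]

def simulate_delay(coa, trials=1000, seed=42):
    """Monte Carlo estimate of expected delay in days: the base delay,
    plus a fixed penalty whenever the per-trial risk materialises."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        penalty = 4.0 if rng.random() < coa["risk"] else 0.0
        total += coa["base_delay"] + penalty
    return total / trials

def recommend(coas, top_n=3):
    """Pre-calculate every option and hand the commander the top_n
    lowest-expected-delay courses of action."""
    scored = sorted((simulate_delay(c), c["name"]) for c in coas)
    return [name for _, name in scored[:top_n]]

print(recommend(COURSES_OF_ACTION))
```

The human stays in the loop: the function only ranks pre-calculated options, and the commander still makes the decision.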

AI is used in all manner of situations, from customer service to medical diagnoses to traffic patterns. At First Army, the plan is for it to create more efficient operations in exercises and mobilizations, including a large-scale mobilization operation if such an event arises.

“That’s what First Army cares about,” Sayers said. “We want to be able to push out the Reserve Component in a timely manner in event of a large-scale conflict. Once you have the model created, you can start playing with it. It helps leaders at very high levels figure out what levers to pull and what resources to apply to maximize what’s happening on the ground.”

Sayers noted the positive impact this can have for units of any size and the individual Soldiers.

“We have units full of people that need to be processed,” she said. “They need to arrive at their home station, they need to make sure they have all their equipment. What does it take to get the equipment fully maintained? What does the shipping network look like? How many observer coach/trainers should we have and of what flavor — do we need aviation, medical, infantry? How many medical stations? What if one of those stations goes down? What if one shows up at only half-strength? What happens at that location, and what are our options to react to that problem? We can plot all this out.”

Partially due to the value of AI, First Army added an ORSA this year. “Anytime First Army has needed to do advanced analytics, it’s had to outsource it,” Sayers said. “They’ve never had anyone in-house to advise the command and do the work.”

It’s a microcosm of what’s taking place across the Army.

“The ORSA role has exploded in the last couple of years,” Sayers said. “We’ve been limited on what we’re able to provide to the commands because the amount of data was not there to do deep qualitative analysis. Suddenly all this data is able to be collected because we have the hardware to be able to store it and we have the hardware to be able to collect it.”

Because of that, First Army and its partners will be better equipped to provide combatant commanders with trained and ready Reserve Component Soldiers.

By Warren Marlow

Department of the Air Force Leaders Emphasize Adapting AI for Warfighting Success

Wednesday, December 6th, 2023

ARLINGTON, Va. (AFNS) — Air Force Secretary Frank Kendall made it clear Dec. 2 that the Air Force and Space Force are fully committed — and pushing hard — to developing and deploying artificial intelligence as a key element in meeting the security challenges posed by China and other adversaries.

Kendall’s remarks were not new, but by voicing them during a session at the influential Reagan National Defense Forum, he added additional weight to the Department of the Air Force’s efforts to use AI as part of a larger push to modernize.

“I care a lot about civil society and the law of armed conflict,” Kendall said. “Our policies are written around those laws. You don’t enforce laws against machines, you enforce them against people. Our challenge is not to limit what we can do with AI but to find how to hold people accountable for what the AI does. The way we should approach is to figure out how to apply the laws of armed conflict to the applications of AI. Who do we hold responsible for the performance of that AI and what do we require institutions to do before we field these kinds of capabilities and use them operationally.”

Kendall pointed out that China and other adversaries are aggressively using AI, and while the U.S. maintains an edge, it is shrinking. Kendall’s comments dovetailed with those from Air Force Chief of Staff Gen. David Allvin, who said at a separate session during the conference that the Air Force must modernize to properly meet the security threats of today.

Part of that effort, Allvin said, is diligently working to integrate AI and machine learning into new capabilities that mesh seamlessly with mission needs and proven technologies, while understanding performance tradeoffs.

“I do believe the future is going to be about human-machine teaming,” Allvin said. “Optimizing the performance and being able to operate at speed. That investment in our collaborative combat aircraft program is what is going to get us there.”

Speed and automation of AI systems have vastly shortened decision timelines. That’s why the DoD’s National Defense Strategy focuses on accelerating decision making and the way information is analyzed and shared.

“We are leveraging algorithms and starting with data fusion and being able to gain insights,” Allvin said. “The changing character of war is speed. If we are going to be privileging speed and have massive amounts of data, the ability to have algorithms and the tools that support and let the analysts do what only humans can do which is make that human decision.”

“Our job on the government side more than anything else is to thoroughly understand this technology, have the expertise we need to really get into the details of it and appreciate how it really works,” Kendall said. “To be creative about helping industry find new applications for that technology and developing ways to evaluate it and get the confidence we’re going to need to ensure that it can be used ethically and reliably when it is in the hands of our warfighters.”

A priority for the DAF, falling under Kendall’s Operational Imperatives, is replacing obsolete legacy systems by harnessing emerging information, communications and AI technologies to provide operational targeting and decision support with the speed, adaptability and resilience needed to fight in a highly contested environment.

“The critical parameter on the battlefield is time,” Kendall said. “The AI will be able to do much more complicated things much more accurately and much faster than human beings can. If the human is in the loop, you will lose. You can have human supervision and watch over what the AI is doing, but if you try to intervene you are going to lose. The difference in how long it takes a person to do something and how long it takes the AI to do something is the key difference.”

Rapid AI development requires DAF to be agile and adaptable in its approach, focusing on rapid testing, experimentation and deployment. The Department of Defense continues to maintain a robust regulatory and ethical framework to ensure the responsible use of AI in defense.

Both men stressed the importance of innovation. Allvin said that innovation is a critical element of modernization and is necessary for maintaining readiness.

“War is a human thing, and the ability to leverage technology with human innovation is something we can never walk away from as we continue to develop more and more sophisticated systems,” Allvin said.

The Reagan National Defense Forum, celebrating “10 Years of Promoting Peace Through Strength,” brings together leaders from across the political spectrum and key stakeholders in the defense community, including members of Congress, current and former presidential administration officials, senior military leadership, industry executives, technology innovators and thought leaders. Their mission is to review and assess policies that strengthen America’s national defense in the context of the global threat environment.

Secretary of the Air Force Public Affairs

Launch of SensorFusionAI

Monday, October 23rd, 2023

• DroneShield launches SensorFusionAI (SFAI), a sensor-agnostic, 3D data fusion engine for complex environments

• Currently deployed as a module in DroneSentry-C2, DroneShield’s Command-and-Control (C2) system

• This launch enables SFAI as a standalone module which can integrate into third-party C2 systems on a SaaS basis, providing smart fusion capability from diverse sensor arrays

DroneShield (“DroneShield” or the “Company”) is pleased to launch SensorFusionAI (SFAI), a sensor-agnostic, 3D data fusion engine for complex environments.

Angus Bean, DroneShield’s CTO, commented: “Detection of drones or Unmanned Aerial Systems (UAS) is moving towards a multi-sensor approach for fixed sites (and, in certain situations, vehicle and ship systems) where the space and budget allow for such an approach, due to the ability to provide better detection results with multiple sensor modalities, such as radiofrequency, radar, acoustic and camera systems, deployed in a single node or across multiple nodes.”

“However, the multi-sensor approach only generates better results with an intelligent software engine to fuse the sensor outputs into an intelligent set of outputs – otherwise adding more sensors is counterproductive, as it creates more data without a clear way to manage it.”

DroneShield has developed a true AI-based sensor fusion engine, initially for its own DroneSentry-C2 command-and-control system, including all common drone detection modalities (RF, radar, acoustics, camera).

Releasing SFAI as a standalone module enables third-party C2 manufacturers (including primes) to add it to their C2 systems on a subscription (SaaS) basis, thus improving their performance.

Oleg Vornik, DroneShield’s CEO, added: “DroneShield seeks to be either the complete supplier of C-UAS solutions where possible, or a subcontractor where it makes sense. There will be numerous situations globally where the customer has an existing preference for another C2 supplier, based on their existing relationships or other requirements. Providing SFAI to such third-party suppliers maximises our market share and further monetises the IP that we have developed.”

Key features of SFAI include:

• Behaviour Analysis – Track an object to determine classification and predict trajectory.

• Threat Assessment – Intelligently determine threat level based on a wide range of data types.

• Confidence Levels – Designed for complex, high noise environments, with inconsistent data inputs.

• After-Action Reporting – Sophisticated analytics presented in easy to interpret graphical dashboards.

• Edge Processing – Utilises an edge processing device (SmartHub) for reduced network load and high scalability.

• Versatile, Adaptable Inputs – New sensors use existing software adaptors to shorten integration time.

• Output to Any Platform – Visualisation on DroneSentry-C2 or third-party C2 platforms, data analysis, alert systems or security management software.

SFAI has significant advantages over traditional multi-sensor C2 engines, whereby each sensor is utilised for its strengths and its weaknesses are offset by the strengths of other sensor types:

• System intelligently builds a model informed by all inputs over time.

• Confidence values allow for soft sensitivity selection, reducing false positives or false negatives.

• Prediction model can interpolate paths for consistent tracking even with sparse data.

• Incomplete or contradictory data is mediated by a comprehensive object model.

• All sensor data fused into one consistent intelligence packet.
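Two of the claims above, confidence values mediating noisy inputs and a prediction model interpolating sparse tracks, can be illustrated generically. The sketch below is not DroneShield's SFAI implementation; the weighting scheme, 1-D positions, and function names are assumptions made for the example.

```python
import math

def fuse(detections):
    """Confidence-weighted fusion of per-sensor position estimates.
    Each detection is (sensor_name, position, confidence in [0, 1])."""
    total = sum(conf for _, _, conf in detections)
    if total == 0:
        return None, 0.0
    position = sum(pos * conf for _, pos, conf in detections) / total
    # Fused confidence rises as independent sensors agree, capped below 1.
    confidence = 1.0 - math.prod(1.0 - c for _, _, c in detections)
    return position, confidence

def interpolate(track, t):
    """Linearly interpolate between the two (time, position) samples
    bracketing t, keeping the track consistent despite sparse data."""
    for (t0, p0), (t1, p1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            return p0 + (p1 - p0) * (t - t0) / (t1 - t0)
    raise ValueError("t outside track window")

# RF, radar and camera disagree slightly; radar is trusted most.
detections = [("rf", 102.0, 0.6), ("radar", 98.0, 0.8), ("camera", 101.0, 0.3)]
position, confidence = fuse(detections)   # position ≈ 99.9, confidence ≈ 0.94
```

The fused confidence exceeds any single sensor's, which is what lets a soft sensitivity threshold trade off false positives against false negatives across the whole array rather than per sensor.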