
Archive for the ‘Digitization’ Category

Persistent Systems Improves MPU5 Radio – Releases Software Update

Thursday, January 16th, 2020


NEW YORK, NY. – Dec 04, 2019 – Persistent Systems, LLC (“Persistent”) is excited to release Firmware Version 19.5.3 for the MPU5 and Embedded Module. This firmware addresses feedback from a readiness exercise conducted by the U.S. Army 101st Airborne Division.

“Thank you to the Rakkasans for the in-depth after-action report. Your feedback is greatly appreciated and helps Persistent continue to improve the MPU5 in ways that are most beneficial to the warfighter,” said Eric Stern, Director of Engineering at Persistent.

Firmware Version 19.5.3 contains significant performance improvements specifically intended to benefit dismounted end users. Improvements include:

• Improved Battery Life: an approximately three-hour increase in MPU5 runtime on a standard 6.8 Ah battery pack reduces the number of batteries soldiers must carry to support their mission duration. Power consumption improvements also benefit Embedded Module users.

• Improved Audio Quality: a new Opus audio codec implementation delivers crystal-clear audio and an 8x reduction in network utilization for voice traffic. Improved audio clarity further reduces the cognitive load on the soldier.

• Rotary Knob implementation: users can now modify Audio Volume or select their Intercom Talk Group via the 8-position knob on the MPU5.

• LED Blackout Mode: users can now disable the status LED to support low-visibility operations.

• Simplified Web Management: the web interface is now streamlined based on the cables currently connected to the radio and displays only the settings relevant to the user, improving ease of use and helping users maximize the capabilities of the MPU5.

• Multicast Firmware Upgrade: firmware upgrades to large MPU5 networks now occur via multicast, enabling a rapid upgrade of a large number of nodes.

• Rapid Configuration Tool: implementation of a tool to help automate the mass configuration of MPU5s. As users continue to create larger networks, rapid configuration becomes even more critical. 

“We want to empower warfighters with industry-leading capabilities, and receiving direct user feedback from operational units is extremely beneficial. Their feedback allows us to focus on improving existing capabilities and developing new ones to address capability gaps that can only be discovered in real-world deployments of the network,” Stern added.

Firmware Version 19.5.3 is immediately available. Existing customers will receive an email notification and can download the firmware from the new Persistent Customer Support Portal. All customers are encouraged to upgrade.

See the Latest from Propel, LLC at the Consumer Electronics Show

Monday, January 6th, 2020

Propel, LLC has been doing some spectacular work in eTextiles and they’ve been invited by the Small Business Administration to exhibit at CES.

See them in booth #50000 at this week’s CES in Las Vegas.

Mobile Battlefield Devices Show Great Potential Thanks to Army Research

Saturday, January 4th, 2020

ADELPHI, Md. — Soldiers on the battlefield cannot rely on bulky, high-powered devices or the cloud to conduct operations, so how can they efficiently run the programs and algorithms needed to succeed in their missions?

A collaborative effort between Army researchers has resulted in a tool that will enable the Army to model, characterize and predict the performance of current and future machine learning-based applications on mobile devices, enabling the deployment of advanced analytics to the tactical edge to support Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance operations.

This research is being conducted by Dr. Kevin Chan from the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory, Pennsylvania State University and IBM, a collaborative effort made possible by the lab’s Network Science Collaborative Technology Alliance that is slated to conclude this year after a 10-year run.

The researchers detail their achievements in papers recently accepted to the Institute of Electrical and Electronics Engineers Transactions on Mobile Computing, titled “Augur: Modeling the Resource Requirements of ConvNets on Mobile Devices,” and to the IEEE/ACM Transactions on Networking, titled “NetVision: On-demand Video Processing in Wireless Networks.”

This research studies how convolutional neural networks on mobile devices such as smartphones are being used for various applications like object detection, language translation and audio classification, Chan said.

“Given the rapid advances and development of artificial intelligence and machine learning techniques, most of the research in deep learning is studied using devices or platforms that have a lot more resources to include processing, energy and storage, and commercial applications use the cloud for some of these complex computations,” Chan said. “As a result, there’s a great deal of uncertainty in the performance and resource requirements of these algorithms on mobile devices, for instance if they’ll take forever to run or use up all of the battery.”

The researchers profiled several different commonly used deep learning algorithms on numerous different current mobile computing platforms, including smartphones and mobile graphics processing units, and characterized how they performed.
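The kind of on-device profiling described above can be sketched in miniature. The following Python snippet is a purely illustrative stand-in (not code from the Augur paper): it times one naive convolution pass, a proxy for a single ConvNet layer, and records peak memory, two of the metrics the researchers characterized.

```python
import time
import tracemalloc

def conv2d(image, kernel):
    """Naive 2D convolution (valid padding), a toy stand-in for one ConvNet layer."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            acc = 0.0
            for i in range(kh):
                for j in range(kw):
                    acc += image[r + i][c + j] * kernel[i][j]
            row.append(acc)
        out.append(row)
    return out

def profile_layer(image, kernel):
    """Return (runtime_seconds, peak_memory_bytes, output) for one forward pass."""
    tracemalloc.start()
    start = time.perf_counter()
    out = conv2d(image, kernel)
    runtime = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return runtime, peak, out

if __name__ == "__main__":
    # Hypothetical workload: a 64x64 single-channel input and a 3x3 box filter.
    image = [[float(r * c % 7) for c in range(64)] for r in range(64)]
    kernel = [[1.0 / 9.0] * 3 for _ in range(3)]
    runtime, peak, out = profile_layer(image, kernel)
    print(f"runtime: {runtime * 1e3:.2f} ms, peak memory: {peak / 1024:.1f} KiB")
```

Repeating such measurements across models and devices yields the kind of per-platform profile the researchers describe; energy measurement, which they also captured, requires hardware instrumentation beyond a software sketch like this.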

The primary collaborator of this work was Professor Thomas La Porta, director, School of Electrical Engineering and Computer Science, and Evan Pugh Professor and William E. Leonhard Professor at Pennsylvania State University.

“We characterized the runtime, memory usage and energy usage of these platforms, whereas typical studies are concerned with runtime and performance,” La Porta said. “Edge analytics requires us to study how these algorithms work on mobile devices. Obviously, commercial applications and vendors are interested in having applications work on smartphones, but they can more readily go to the cloud for help.”

With this, the researchers developed a tool called Augur that is able to predict the performance and resource usage of future algorithms on future mobile devices.

“The result of this research can readily be used on future generations of algorithms and mobile devices,” Chan said.
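Augur's core idea, predicting how an unseen algorithm will behave from measurements of known ones, can be illustrated with a heavily simplified sketch. Everything here is hypothetical: real cost models work per-layer and cover memory and energy as well, but a one-variable least-squares fit of runtime against operation count conveys the approach.

```python
def fit_runtime_model(profiles):
    """
    Least-squares fit of runtime = a * flops + b from profiled measurements.
    profiles: list of (flops, runtime_seconds) pairs from real device runs.
    """
    n = len(profiles)
    sx = sum(f for f, _ in profiles)
    sy = sum(t for _, t in profiles)
    sxx = sum(f * f for f, _ in profiles)
    sxy = sum(f * t for f, t in profiles)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def predict_runtime(model, flops):
    """Extrapolate the fitted cost curve to an unseen network's operation count."""
    a, b = model
    return a * flops + b
```

Given a handful of profiled (FLOPs, runtime) pairs for one device, `predict_runtime` then estimates how long a future, larger network would take on that same device, which is the decision the tool supports.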

Understanding how these applications and algorithms work on mobile devices such as tablets, head-mounted displays and handhelds will be crucial to enabling analytics at the edge, he said.

Further, the research also shows how the analytics can run on mobile devices, and how these operations can leverage other more capable computing platforms deployed near the tactical edge to support the complex analytics.

“Tactical networks have proposed the deployment of such capabilities called microclouds, for example server class machines in the back of humvees,” Chan said. “The work on NetVision employs tactical microcloud capabilities in which mobile edge devices offload (parts of) the analytics workflow to these devices to speed up processing of the data.”

Chan stated the approach finds optimal processing of the data between the mobile and microcloud computing resources as it still has to deal with a limited bandwidth network to transfer the data.
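The trade-off Chan describes, splitting analytics between the mobile device and a microcloud over a bandwidth-limited link, can be captured in a toy model. All numbers and parameter names below are illustrative assumptions, not values from the NetVision paper: the sketch picks the fraction of data to offload that minimizes the slower of the two parallel pipelines.

```python
def best_offload_fraction(total_mb, local_rate, remote_rate, bandwidth, steps=1000):
    """
    Choose what fraction of the data to offload to the microcloud.

    total_mb:    amount of data to analyze (MB)
    local_rate:  mobile device processing speed (MB/s)
    remote_rate: microcloud processing speed (MB/s)
    bandwidth:   link capacity to the microcloud (MB/s)

    Local and remote pipelines run in parallel, so completion time is the
    maximum of the two; offloaded data must first cross the limited link.
    """
    best_f, best_t = 0.0, float("inf")
    for k in range(steps + 1):
        f = k / steps
        local_time = (1 - f) * total_mb / local_rate
        remote_time = f * total_mb / bandwidth + f * total_mb / remote_rate
        t = max(local_time, remote_time)
        if t < best_t:
            best_f, best_t = f, t
    return best_f, best_t
```

With a hypothetical 100 MB of video, a slow handheld (2 MB/s), a fast microcloud (50 MB/s) and a 5 MB/s link, the sweep offloads roughly 70% of the data and finishes several times faster than processing everything locally, even though the link, not the microcloud, is the bottleneck.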

“The Army will want to employ the latest AI&ML capabilities,” Chan said. “As algorithms and the devices running them improve, it will be important to understand what can run and what sort of performance to expect.”

For Chan, having this work published in an IEEE journal is a huge accomplishment.

“Publication in ToN and TMC is an indication that the work is high-quality and well-regarded,” Chan said. “In our field, these are considered the top-tier journals in which we aim for our research to be published. Earlier versions of this work were published at the 25th ACM International Conference on Multimedia and the Conference on Communications and Networks, which are both highly rated networking computer science conferences and accomplishments in their own right.”

This work was specifically performed within the NSCTA under the distributed video analytics task, and NetVision, in particular, was shown at the NSCTA Expo as a research highlight of the Quality of Information — Semantically Adaptive Networks thrust area.

“As a result of the second half of the program, we had a research task on video analytics,” Chan said. “This research, a collaboration with Penn State and IBM, was very productive, enabling CCDC ARL to work with academic and industrial partners, both world-class researchers. This highly collaborative research leveraged diverse technical expertise – even shared equipment!”

Chan stated that this project and all research conducted under the NSCTA is crucial as the Army continues to develop science and technology for the future fight.

Since the Army has identified communications and networks as critical capabilities for current and future operations, Chan stated, researchers must consider how networked systems behave.

“The concept of multi domain operations implies that operational domains are inherently interconnected,” Chan said. “The Army must understand and develop new technology and capabilities to enable a new way of operations. This will require, for example, understanding on how to execute multi domain command and control, and to create situational awareness through exchange of information across and within operational domains. ARL’s research in network science has resulted in advancement in the state-of-the-art of these capabilities to support multi domain operations for a variety of the Army’s functions.”

For La Porta, this collaboration and research established a foundation for great things to come.

“This work was a valuable building block that allowed us as academic partners to build even deeper collaboration with CCDC ARL and develop systems and algorithms that allow for very fast object and action recognition in videos that are stored on mobile cameras,” La Porta said.

Looking to the future, laboratory officials said they will continue to engage the CCDC C5ISR (Command, Control, Computers, Communications, Cyber, Intelligence, Surveillance and Reconnaissance) Center and the U.S. Army Futures and Concepts Center to best understand where this research can be transitioned to get it one step closer to a Soldier’s hands.

By US Army CCDC Army Research Laboratory Public Affairs

Soldiers Test New Integrated Visual Augmentation System

Friday, November 22nd, 2019

FORT BENNING, Ga. — Soldiers at Fort Pickett, Va., are testing a Microsoft-designed prototype goggle, the Integrated Visual Augmentation System (IVAS), that offers the capabilities they need to regain and maintain overmatch in multi-domain operations on battlefields that are increasingly urban, congested, dark and unpredictable.

The event is called a Soldier touch point, or STP, and it is fast becoming the standard for the new Army Futures Command’s (AFC) rapid acquisitions methodology. STPs allow industry partners to field test system prototypes repeatedly throughout the research and development process to ensure the final product, in this case the multi-functional IVAS goggle, is met with enthusiasm and truly useful when it is fielded to the force.

The Soldier Lethality Cross Functional Team (SL CFT) and their partners in military and industry are hosting the STP at Fort Pickett, a National Guard post known for relevant training sites, like the urban village used to replicate combat scenarios that have become commonplace in Middle Eastern operations. The object is to make sure the warfighter drives the design and development based on need and utility. The concept is called Soldier Centered Design, and though it’s not a new concept, it is the first time it has been institutionalized, the first time it has been applied systemically to increase speed and efficiency.

In the spring, Soldiers and Marines from various line and special forces units tested an early IVAS prototype based on Microsoft’s HoloLens 2 heads-up display. That first STP was geared toward proving concept and utility.

The STP underway today at Pickett, the second of four STPs in the 24-month development schedule, is a tougher test designed to assess new capabilities at the platoon level and increase demands on the system in more complex training environments. At this point, about half-way through STP 2, Microsoft has gathered feedback from more than 3,200 hours of user experience.

The SL CFT is one of AFC’s eight CFTs tasked with modernizing the Army after the 2018 National Defense Strategy identified an erosion in close combat capabilities relative to pacing threats around the world. The SL CFT focuses on developing weapons for the Close Combat Force — those who close with and destroy the enemy — to make them more successful in battle. Success is defined in terms of survivability, lethality, situational awareness and maneuverability.

Of all the products and programs in the SL CFT portfolio, IVAS is arguably the most intriguing, as it is the result of complex, non-traditional partnerships and unconventional funding methods (contracts with Microsoft funded through Other Transaction Agreements), and it harnesses a variety of next-generation technologies unlike anything the American Soldier has ever employed before.

The final product — officials say it will likely be fielded in the fourth quarter of FY21 — will include a variety of features: a color see-through digital display that makes it possible for the user to access information without taking his eye off the battlefield; thermal and low-light sensors that make it possible to see in the dark, literally; rapid target acquisition and aided target identification; augmented reality and artificial intelligence, to name just a few. IVAS is billed as a fight-rehearse-train system, meaning its function on the battlefield is priority, but its augmented reality capabilities, like real-time mapping, will make it useful for training and rehearsing operations anywhere at any time. And though it’s said to “enhance the survivability” of combatants, its target identification technology will save civilian lives, too.

“When terms like ‘situational awareness’ get thrown around time after time, it’s easy to lose sight of what it really means,” said MAJ Brad Winn, the CFT’s lead action officer for IVAS. “In this case, one of the greatest capabilities of IVAS is Aided Target Recognition, a feature that gives users the ability to quickly identify anything or anyone in sight, which means they can tell the difference between a threat and a civilian non-combatant.”

Winn is one of many members of Team IVAS, a diverse group of Soldiers, civilian employees, academics and industry partners who leverage their respective organizations’ expertise to expedite the development and fielding process. Aside from the SL CFT, Team IVAS includes experts from Microsoft, other CFTs, PEO Soldier, ATEC, the Army’s Combat Capabilities Development Command Research Lab and Soldier Center, and a half dozen other members of that complex integrated network of mostly military command-level organizations known as the Futures Force Modernization Enterprise.

Microsoft “deployed” a team from the West Coast to live at Fort Pickett for the duration of this STP, more than a month, to gather feedback and make changes to the goggle every day. They’ll repeat the process next summer, when they put the next iteration of IVAS, an all-weather, ruggedized, militarized and form-fitting prototype, to the test in company-level operations.

STP 4 will follow in 2021.

By Bridgett Siter

Thales Develops the Future of Soldier Weapon Systems in Lithgow

Wednesday, November 6th, 2019

Paris, Ile-de-France, France – In order to maintain a capability advantage for Australia’s Defence Forces, the soldier systems of the future will integrate disruptive digital technologies, advanced sensor and targeting equipment and networked communications – ThalesGroup.com. Euronext: HO

Thales is building on more than a century of small arms manufacture in Lithgow in developing the soldier weapon systems of the future.
• The digitised battlespace will require a fundamental technology leap to ensure Australian soldiers maintain a capability edge against emerging threats.
• This future weapon system is an evolution of the individual weapon and will provide soldiers with an enduring battlefield overmatch.

Drawing together advanced manufacturing techniques and materials, Thales’s advanced future soldier weapon system will integrate:

• cutting-edge sensors and targeting systems;
• biometric security safeguards;
• tactical network links to enable collaborative engagement; and
• enhanced command, control and situational awareness for both individual soldiers and commanders.

Thales’s Lithgow small arms manufacturing facility has been proudly supporting Australia’s soldiers on battlefields around the world since 1912. The future advanced individual weapon system will continue this heritage of manufacturing the world’s most advanced systems as the battlespace becomes more digitised and networked.

Building on this century of sovereign capability, Thales’s development of the future soldier weapon system is undertaken in Lithgow, NSW and aligns with the Australian Government’s recognition that the research, design, development and manufacture of small arms is a priority sovereign industrial capability.

“Rapid advances in digital technology bring increasing threats as well as new capabilities. Thales’s future weapon system accelerates the development process for an era of networked warfare.”
Chris Jenkins, CEO, Thales Australia

Integrated Visual Augmentation System Brings AI to Soldier Training

Tuesday, October 22nd, 2019

WASHINGTON — The Army is now testing virtual-reality goggles that will allow Soldiers to rehearse combat missions that they are about to undertake.

The Integrated Visual Augmentation System, known as IVAS, will be tested by 82nd Airborne Division troops next month at Fort Pickett, Virginia. The IVAS goggles will allow Soldiers to see simulated images superimposed over the actual terrain.

The Soldiers will wear the goggles and miniature computer equipment as they negotiate obstacle courses, run land navigation and conduct other missions, said officials from Program Executive Office Soldier.

Called Soldier Touchpoint 2, the test is designed to provide feedback to PEO Soldier so the IVAS heads-up display can be further enhanced before 200,000 of the headsets begin to be fielded in 2021.

IVAS has been touted by senior leaders as a “game-changer” for Soldier lethality and a quick win for the modernization priority.

The IVAS headsets are a good example of how artificial intelligence is being used to enhance Soldier lethality, said Brig. Gen. Matthew Easley, director of the Army’s AI Task Force.

Each pair of IVAS goggles has “significant amounts of high-tech sensors onboard and processors,” Easley said at a Warriors Corner presentation Monday afternoon during the Association of the U.S. Army Annual Meeting and Exposition.

Each IVAS headset has integrated AI chips built into the system, he said.

“Those chips are doing visual recognition,” he said. “They’re tracking a Soldier’s eye movements, they’re tracking a Soldier’s hand as it interfaces with the system, and they’re tracking a Soldier’s voice.”

The IVAS headset “uses a customized AI piece” to make it work, he said.

AI will be an enabler for all of the Army’s modernization programs over the next decade, Easley said.

“Each one of those systems needs AI,” he said, from Future Vertical Lift to Long-Range Precision Fires to the Next Generation Combat Vehicle.

“AI, as you know, is becoming a pervasive part of our society,” he said.

“Every system that you can think of — from self-driving cars to ride-sharing applications, to restaurant recommendation systems to healthcare systems — they span every area of our society.

“They need to span every battlefield system that we have,” as well, he said, from maneuver to fire control.

By Gary Sheftick, Army News Service

AUSA 19 – WL Gore & Assoc Integrated Cabling for Soldier Systems

Wednesday, October 16th, 2019

I first saw Gore’s Integrated Cabling for Soldier Systems at DSEI last month in London. I was quite pleased to see that they had brought the technology across the pond to the US. Gore’s cable systems are, across the board, lighter, more flexible and less prone to breakage than alternatives, thanks to their ePTFE exteriors. Using them to provide power and a databus within an armored vest was a logical step.

The armor vest itself was manufactured by WL Gore partner brand Costas Siamidis, which is based in Greece. The actual Gore cabling is inside this vest. It is connector agnostic, which is important considering there are at least four different connectors on the market.

This is what their cable bundles look like, and Gore will configure them as needed. Compared to other systems, they are less than half the weight and much less bulky.


Soldier Integrated Protective Ensemble

Saturday, October 12th, 2019

The Soldier Integrated Protective Ensemble Advanced Technology Demonstration was conducted in the fall of 1992 at Fort Benning, Georgia.

These photos of SIPE components were taken by Natick Research, Development, and Engineering Center.

Download the report here.