SureFire

Archive for the ‘Digitization’ Category

Mobile Battlefield Devices Show Great Potential Thanks to Army Research

Saturday, January 4th, 2020

ADELPHI, Md. — Soldiers on the battlefield cannot rely on bulky, high-powered devices or the cloud to conduct operations, so how can they efficiently run the programs and algorithms needed to succeed in their missions?

A collaborative effort between Army researchers has resulted in a tool that will enable the Army to model, characterize and predict the performance of current and future machine learning-based applications on mobile devices, enabling the deployment of advanced analytics to the tactical edge to support Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance operations.

This research is being conducted by Dr. Kevin Chan of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory, together with Pennsylvania State University and IBM, a collaboration made possible by the lab’s Network Science Collaborative Technology Alliance, which is slated to conclude this year after a 10-year run.

The researchers detail their achievements in papers recently accepted to the Institute of Electrical and Electronics Engineers (IEEE) Transactions on Mobile Computing, titled “Augur: Modeling the Resource Requirements of ConvNets on Mobile Devices,” and to the IEEE/ACM Transactions on Networking, titled “NetVision: On-demand Video Processing in Wireless Networks.”

This research studies how convolutional neural networks on mobile devices such as smartphones are being used for various applications like object detection, language translation and audio classification, Chan said.

“Given the rapid advances and development of artificial intelligence and machine learning techniques, most of the research in deep learning is studied using devices or platforms that have a lot more resources to include processing, energy and storage, and commercial applications use the cloud for some of these complex computations,” Chan said. “As a result, there’s a great deal of uncertainty in the performance and resource requirements of these algorithms on mobile devices, for instance if they’ll take forever to run or use up all of the battery.”

The researchers profiled several different commonly used deep learning algorithms on numerous different current mobile computing platforms, including smartphones and mobile graphics processing units, and characterized how they performed.

The primary collaborator of this work was Professor Thomas La Porta, director, School of Electrical Engineering and Computer Science, and Evan Pugh Professor and William E. Leonhard Professor at Pennsylvania State University.

“We characterized the runtime, memory usage and energy usage of these platforms, whereas typical studies are concerned with runtime and performance,” La Porta said. “The edge analytics requires us to study how these algorithms work on mobile devices. Obviously, commercial applications and vendors are interested in having applications work on smartphones, but they can more readily go to the cloud for help.”
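The kind of per-layer characterization La Porta describes — runtime, memory and energy on resource-constrained hardware — can be sketched in pure Python. In this illustration a naive 2-D convolution stands in for a ConvNet layer, and `time.perf_counter` and `tracemalloc` stand in for the runtime and memory instrumentation; the actual Augur measurement hooks and models are not reproduced here.

```python
import time
import tracemalloc

def conv2d(image, kernel):
    """Naive 2-D convolution (valid padding), standing in for one ConvNet layer."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            acc = 0.0
            for i in range(kh):
                for j in range(kw):
                    acc += image[r + i][c + j] * kernel[i][j]
            row.append(acc)
        out.append(row)
    return out

def profile_layer(fn, *args):
    """Run one layer and return (result, runtime in seconds, peak memory in bytes)."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = fn(*args)
    runtime = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, runtime, peak

# A 64x64 single-channel "image" and a 3x3 averaging kernel (illustrative data).
image = [[float((r * 31 + c) % 7) for c in range(64)] for r in range(64)]
kernel = [[0.1] * 3 for _ in range(3)]
out, runtime, peak = profile_layer(conv2d, image, kernel)
print(f"output: {len(out)}x{len(out[0])}, runtime: {runtime * 1e3:.1f} ms, peak mem: {peak / 1024:.1f} KiB")
```

Repeating such measurements across layers, models and devices yields the profile data from which a tool like Augur can fit predictive models; energy would additionally require platform-specific power counters not shown here.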

With this, the researchers developed a tool called Augur that is able to predict the performance and resource usage of future algorithms on future mobile devices.

“The result of this research can readily be used on future generations of algorithms and mobile devices,” Chan said.

Understanding how these applications and algorithms work on mobile devices such as tablets, head-mounted displays and handhelds will be crucial to enabling analytics at the edge, he said.

Further, the research also shows how the analytics can run on mobile devices, and how these operations can leverage other more capable computing platforms deployed near the tactical edge to support the complex analytics.

“Tactical networks have proposed the deployment of such capabilities called microclouds, for example server class machines in the back of humvees,” Chan said. “The work on NetVision employs tactical microcloud capabilities in which mobile edge devices offload (parts of) the analytics workflow to these devices to speed up processing of the data.”

Chan said the approach finds the optimal split of processing between the mobile and microcloud computing resources, since it must still move the data over a limited-bandwidth network.
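At its simplest, the offload-or-not decision Chan describes comes down to comparing local compute time against transfer time plus remote compute time. The sketch below illustrates that arithmetic; the device speeds, link rate and workload sizes are hypothetical, not figures from the NetVision paper.

```python
def best_placement(data_bits, local_flops_per_s, remote_flops_per_s,
                   workload_flops, bandwidth_bits_per_s):
    """Pick where to run an analytics task: on the mobile device, or on a
    nearby microcloud reachable over a limited-bandwidth tactical link."""
    local_time = workload_flops / local_flops_per_s
    remote_time = (data_bits / bandwidth_bits_per_s
                   + workload_flops / remote_flops_per_s)
    if local_time <= remote_time:
        return "local", local_time
    return "offload", remote_time

# Hypothetical numbers: a 4 MB frame batch and a 10 GFLOP workload,
# a 50 GFLOP/s handheld vs. a 500 GFLOP/s vehicle server over a 20 Mbit/s link.
place, t = best_placement(
    data_bits=4 * 8e6,
    local_flops_per_s=50e9,
    remote_flops_per_s=500e9,
    workload_flops=10e9,
    bandwidth_bits_per_s=20e6,
)
print(place, round(t, 3))
```

With these numbers the slow link dominates (1.6 s of transfer versus 0.2 s of local compute), so the task stays on the device; shrink the data or fatten the link and the decision flips, which is exactly the bandwidth sensitivity the research addresses.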

“The Army will want to employ the latest AI&ML capabilities,” Chan said. “As algorithms and the devices running them improve, it will be important to understand what can run and what sort of performance to expect.”

For Chan, having this work published in an IEEE journal is a huge accomplishment.

“ToN and TMC are an indication that the work is high-quality and well-regarded,” Chan said. “In our field, these are considered the top-tier journals in which we aim for our research to be published. Earlier versions of this work were published at the 25th ACM International Conference on Multimedia and the Conference on Communications and Networks, which are both highly rated networking and computer science conferences and accomplishments on their own.”

This work was specifically performed within the NSCTA under the distributed video analytics task, and NetVision, in particular, was shown at the NSCTA Expo as a research highlight of the Quality of Information — Semantically Adaptive Networks thrust area.

“As a result of the second half of the program, we had a research task on video analytics,” Chan said. “This research, a collaboration with Penn State and IBM, was very productive, enabling CCDC ARL to work with academic and industrial partners, both world-class researchers. This highly collaborative research leveraged diverse technical expertise – even shared equipment!”

Chan stated that this project and all research conducted under the NSCTA is crucial as the Army continues to develop science and technology for the future fight.

Since the Army has identified communications and networks as critical capabilities for current and future operations, Chan said, researchers must consider how networked systems behave.

“The concept of multi-domain operations implies that operational domains are inherently interconnected,” Chan said. “The Army must understand and develop new technology and capabilities to enable a new way of operating. This will require, for example, understanding how to execute multi-domain command and control, and how to create situational awareness through the exchange of information across and within operational domains. ARL’s research in network science has advanced the state of the art of these capabilities to support multi-domain operations for a variety of the Army’s functions.”

For La Porta, this collaboration and research established a foundation for great things to come.

“This work was a valuable building block that allowed us as academic partners to build even deeper collaboration with CCDC ARL and develop systems and algorithms that allow for very fast object and action recognition in videos that are stored on mobile cameras,” La Porta said.

Looking to the future, laboratory officials said they will continue to engage the CCDC C5ISR (Command, Control, Computers, Communications, Cyber, Intelligence, Surveillance and Reconnaissance) Center and the U.S. Army Futures and Concepts Center to best understand where this research can be transitioned to get it one step closer to a Soldier’s hands.

By US Army CCDC Army Research Laboratory Public Affairs

Soldiers Test New Integrated Visual Augmentation System

Friday, November 22nd, 2019

FORT BENNING, Ga. — Soldiers at Fort Pickett, Va., are testing a Microsoft-designed prototype goggle, the Integrated Visual Augmentation System (IVAS), that offers the capabilities they need to regain and maintain overmatch in multi-domain operations on battlefields that are increasingly urban, congested, dark and unpredictable.

The event is called a Soldier touch point, or STP, and it is fast becoming the standard for the new Army Futures Command’s (AFC) rapid acquisitions methodology. STPs allow industry partners to field test system prototypes repeatedly throughout the research and development process to ensure the final product, in this case the multi-functional IVAS goggle, is met with enthusiasm and truly useful when it’s fielded to the force.

The Soldier Lethality Cross Functional Team (SL CFT) and their partners in military and industry are hosting the STP at Fort Pickett, a National Guard post known for relevant training sites, like the urban village used to replicate combat scenarios that have become commonplace in Middle Eastern operations. The objective is to make sure the warfighter drives the design and development based on need and utility. The concept is called Soldier Centered Design, and though it is not a new concept, this is the first time it has been institutionalized and applied systematically to increase speed and efficiency.

In the spring, Soldiers and Marines from various line and special forces units tested an early IVAS prototype based on Microsoft’s heads-up display, which was designed using Microsoft’s HoloLens 2. That first STP was geared toward proving concept and utility.

The STP underway today at Pickett, the second of four STPs in the 24-month development schedule, is a tougher test designed to assess new capabilities at the platoon level and increase demands on the system in more complex training environments. At this point, about half-way through STP 2, Microsoft has gathered feedback from more than 3,200 hours of user experience.

The SL CFT is one of AFC’s eight CFTs tasked with modernizing the Army after the 2018 National Defense Strategy identified an erosion in close combat capabilities relative to pacing threats around the world. The SL CFT focuses on developing weapons for the Close Combat Force — those who close with and destroy the enemy — to make them more successful in battle. Success is defined in terms of survivability, lethality, situational awareness and maneuverability.

Of all the products and programs in the SL CFT portfolio, IVAS is arguably the most intriguing, as it is the result of complex, non-traditional partnerships and unconventional funding methods (contracts with Microsoft funded through Other Transaction Agreements), and it harnesses a variety of next-generation technologies unlike anything the American Soldier has ever employed before.

The final product — officials say it will likely be fielded in the fourth quarter of FY21 — will include a variety of features: a color see-through digital display that makes it possible for the user to access information without taking his eye off the battlefield; thermal and low-light sensors that make it possible to see in the dark, literally; rapid target acquisition and aided target identification; and augmented reality and artificial intelligence, to name just a few. IVAS is billed as a fight-rehearse-train system, meaning its function on the battlefield is the priority, but its augmented reality capabilities, like real-time mapping, will make it useful for training and rehearsing operations anywhere at any time. And though it’s said to “enhance the survivability” of combatants, its target identification technology will save civilian lives, too.

“When terms like ‘situational awareness’ get thrown around time after time, it’s easy to lose sight of what it really means,” said MAJ Brad Winn, the CFT’s lead action officer for IVAS. “In this case, one of the greatest capabilities of IVAS is Aided Target Recognition, a feature that gives users the ability to quickly identify anything or anyone in sight, which means they can tell the difference between a threat and a civilian non-combatant.”

Winn is one of many members of Team IVAS, a diverse group of Soldiers, civilian employees, academics and industry partners who leverage their respective organizations’ expertise to expedite the development and fielding process. Aside from the SL CFT, Team IVAS includes experts from Microsoft, other CFTs, PEO Soldier, ATEC, the Army’s Combat Capabilities Development Command Research Lab and Soldier Center, and a half dozen other members of that complex integrated network of mostly military command-level organizations known as the Futures Force Modernization Enterprise.

Microsoft “deployed” a team from the west coast to live at Fort Pickett for the duration of this STP, more than a month, to gather feedback and make changes to the goggle every day. They’ll repeat the process next summer, when they put the next iteration of IVAS, the all-weather, ruggedized and militarized, form-fitting prototype to the test in company level operations.

STP 4 will follow in 2021.

By Bridgett Siter

Thales Develops the Future of Soldier Weapon Systems in Lithgow

Wednesday, November 6th, 2019

Paris, Ile-de-France, France – In order to maintain a capability advantage for Australia’s Defence Forces, the soldier systems of the future will integrate disruptive digital technologies, advanced sensor and targeting equipment and networked communications – ThalesGroup.com. Euronext: HO

Thales is building on more than a century of small arms manufacture in Lithgow in developing the soldier weapon systems of the future.
• The digitised battlespace will require a fundamental technology leap to ensure Australian soldiers maintain a capability edge against emerging threats.
• This future weapon system is an evolution of the individual weapon and will provide soldiers with an enduring battlefield overmatch.

Drawing together advanced manufacturing techniques and materials, Thales’s advanced future soldier weapon system will integrate:

• cutting edge sensors and targeting systems;
• biometric security safeguards;
• tactical network links to enable collaborative engagement; and
• enhanced command, control and situational awareness for both individual soldiers and commanders.

Thales’s Lithgow small arms manufacturing facility has been proudly supporting Australia’s soldiers on battlefields around the world since 1912. The future advanced individual weapon system will continue this heritage of manufacturing the world’s most advanced systems as the battlespace becomes more digitised and networked.

Building on this century of sovereign capability, Thales’s development of the future soldier weapon system is undertaken in Lithgow, NSW and aligns with the Australian Government’s recognition that the research, design, development and manufacture of small arms is a priority sovereign industrial capability.

“Rapid advances in digital technology bring increasing threats as well as new capabilities. Thales’s future weapon system accelerates the development process for an era of networked warfare.”
Chris Jenkins, CEO, Thales Australia

Integrated Visual Augmentation System Brings AI to Soldier Training

Tuesday, October 22nd, 2019

WASHINGTON — The Army is now testing virtual-reality goggles that will allow Soldiers to rehearse combat missions that they are about to undertake.

The Integrated Visual Augmentation System, known as IVAS, will be tested by 82nd Airborne Division troops next month at Fort Pickett, Virginia. The IVAS goggles will allow Soldiers to see simulated images superimposed over the actual terrain.

The Soldiers will wear the goggles and miniature computer equipment as they negotiate obstacle courses, run land navigation and conduct other missions, said officials from Program Executive Office Soldier.

Called Soldier Touchpoint 2, the test is designed to provide feedback to PEO Soldier so the IVAS heads-up display can be further enhanced before 200,000 of the headsets begin to be fielded in 2021.

IVAS has been touted by senior leaders as a “game-changer” for Soldier lethality and a quick win for the modernization priority.

The IVAS headsets are a good example of how artificial intelligence is being used to enhance Soldier lethality, said Brig. Gen. Matthew Easley, director of the Army’s AI Task Force.

Each pair of IVAS goggles has “significant amounts of high-tech sensors onboard and processors,” Easley said at a Warriors Corner presentation Monday afternoon during the Association of the U.S. Army Annual Meeting and Exposition.

Each IVAS headset has integrated AI chips built into the system, he said.

“Those chips are doing visual recognition,” he said. “They’re tracking a Soldier’s eye movements, they’re tracking a Soldier’s hand as it interfaces with the system, and they’re tracking a Soldier’s voice.”

The IVAS headset “uses a customized AI piece” to make it work, he said.

AI will be an enabler for all of the Army’s modernization programs over the next decade, Easley said.

“Each one of those systems needs AI,” he said, from Future Vertical Lift to Long-Range Precision Fires to the Next Generation Combat Vehicle.

“AI, as you know, is becoming a pervasive part of our society,” he said.

“Every system that you can think of — from self-driving cars to ride-sharing applications, to restaurant recommendation systems to healthcare systems — they span every area of our society.

“They need to span every battlefield system that we have,” as well, he said, from maneuver to fire control.

By Gary Sheftick, Army News Service

AUSA 19 – WL Gore & Assoc Integrated Cabling for Soldier Systems

Wednesday, October 16th, 2019

I first saw Gore’s Integrated Cabling for Soldier Systems at DSEI last month in London. I was quite pleased to see that they had brought the technology across the pond to the US. Gore’s cable systems are, across the board, lighter, more flexible and less prone to breakage than alternatives, thanks to their ePTFE exteriors. Using them to provide power and a databus within an armored vest was a logical step.

The armor vest itself was manufactured by WL Gore partner brand Costas Siamidis, which is based in Greece. The actual Gore cabling runs inside this vest. The cables are connector-agnostic, which is important considering there are at least four different connectors on the market.

This is what their cable bundles look like, and Gore will configure them as needed. Compared to other systems, they are less than half the weight and much less bulky.

www.gore.com/militarylandsystems

Soldier Integrated Protective Ensemble

Saturday, October 12th, 2019

The Soldier Integrated Protective Ensemble Advanced Technology Demonstration was conducted in the fall of 1992 at Fort Benning, Georgia.

These photos of SIPE components were taken by Natick Research, Development, and Engineering Center.

Download the report here.

Aircrews to Get Hand-Held Devices Linked Via Secure WiFi for Improved Air-to-Ground Operations

Tuesday, October 8th, 2019

INDIANAPOLIS, Oct. 7, 2019 — Raytheon Company (NYSE: RTN) received a $48 million engineering services contract to support the integration and qualification of hand-held devices into platform-mounted WiFi systems secured up to secret. Loaded with situational awareness and mission planning applications, the mobile devices will improve air-to-ground communication between combat teams, enhancing situational awareness as the mission unfolds.

“We’re helping aircrews and ground forces better communicate and collaborate in real time on the battlefield,” said Matt Gilligan, vice president at Raytheon’s Intelligence, Information and Services business. “Right now Blackhawk crews and dismounted soldiers rely heavily on voice communications during a mission, and when dynamics are changing in the air and on the ground minute by minute, that’s a huge challenge.”

The contract is part of the U.S. Army’s Air Soldier System (Air SS), the service’s effort to equip their rotary-wing aircrews with wearable electronics that increase their mission effectiveness and survivability.

Under the contract, Raytheon will load mission applications on commercial off-the-shelf phones and tablets to allow air and ground users to access and share current weather updates, friendly force trackers, and secure text messages.

The video features the Tennessee National Guard using the system during Shaken Fury, a recent FEMA exercise.

Video by SPC Joshua Syberg
120th Public Affairs Detachment

Army Project Brings Quantum Internet Closer To Reality

Saturday, September 28th, 2019

RESEARCH TRIANGLE PARK, N.C. — A U.S. Army research result brings the quantum internet a step closer. Such an internet could offer the military security, sensing and timekeeping capabilities not possible with traditional networking approaches.

Under the Center for Distributed Quantum Information, a program of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory funded and managed by the lab’s Army Research Office, researchers at the University of Innsbruck achieved a record for the transfer of quantum entanglement between matter and light: a distance of 50 kilometers over fiber optic cable.

Entanglement is a correlation that can be created between quantum entities such as qubits. When two qubits are entangled and a measurement is made on one, it will affect the outcome of a measurement made on the other, even if that second qubit is physically far away.
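The correlation described above can be illustrated with a minimal simulation of a two-qubit Bell state, (|00⟩ + |11⟩)/√2: measuring either qubit collapses the pair, so the two outcomes always agree, even though each individual outcome is random. This is a textbook illustration, not a model of the ion-photon state used in the Innsbruck experiment.

```python
import random

def measure_bell_pair(rng):
    """Simulate measuring both qubits of the Bell state (|00> + |11>)/sqrt(2)
    in the computational basis: joint outcomes 00 and 11 each occur with
    probability 1/2, and the two results are perfectly correlated."""
    outcome = rng.choice([0, 1])  # collapse: both qubits take the same value
    return outcome, outcome

rng = random.Random(0)
samples = [measure_bell_pair(rng) for _ in range(10000)]
assert all(a == b for a, b in samples)  # the outcomes always agree
ones = sum(a for a, _ in samples) / len(samples)
print(f"P(1) ~= {ones:.2f}")  # each branch appears about half the time
```

The point the sketch makes is the one in the article: neither qubit has a definite value before measurement, yet the measurement results are perfectly correlated regardless of how far apart the qubits are.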

“This [50 kilometers] is two orders of magnitude further than was previously possible and is a practical distance to start building inter-city quantum networks,” said Dr. Ben Lanyon, experimental physicist at the University of Innsbruck and the principal investigator for the project, whose findings are published in npj Quantum Information.

Intercity quantum networks would be composed of distant network nodes of physical qubits, which are, despite the large physical separation, nevertheless entangled. This distribution of entanglement is essential for establishing a quantum internet, researchers said.

“The demonstration is a major step forward for achieving large scale distributed entanglement,” said Dr. Sara Gamble, co-manager of the Army program supporting the research. “The quality of the entanglement after traveling through fiber is also high enough at the other end to meet some of the requirements for some of the most difficult quantum networking applications.”

The research team started the experiment with a calcium atom trapped in an ion trap. Using laser beams, the researchers wrote a quantum state onto the ion and simultaneously excited it to emit a photon in which quantum information is stored. As a result, the quantum states of the atom and the light particle were entangled.

The challenge is to transmit the photon over fiber optic cables.

“The photon emitted by the calcium ion has a wavelength of 854 nanometers and is quickly absorbed by the optical fiber,” Lanyon said.

His team therefore initially sent the light particle through a nonlinear crystal illuminated by a strong laser. The photon wavelength was converted to the optimal value for long-distance travel — the current telecommunications standard wavelength of 1,550 nanometers.
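In difference-frequency generation, one common way to implement such a conversion, energy conservation fixes the pump laser's wavelength: 1/λ_pump = 1/λ_in − 1/λ_out. The short calculation below illustrates that relation for the wavelengths quoted in the article; the resulting pump value is computed from the formula, not taken from the paper.

```python
def pump_wavelength_nm(lambda_in_nm, lambda_out_nm):
    """Pump wavelength needed to convert lambda_in to lambda_out via
    difference-frequency generation, from energy conservation:
    1/lambda_pump = 1/lambda_in - 1/lambda_out."""
    return 1.0 / (1.0 / lambda_in_nm - 1.0 / lambda_out_nm)

# Convert the 854 nm ion photon to the 1550 nm telecom band.
print(round(pump_wavelength_nm(854, 1550)), "nm")  # pump near 1902 nm
```

Because the conversion is coherent, the entanglement written onto the 854 nm photon survives the shift to 1,550 nm, which is what the subsequent measurements confirm.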

The researchers then sent this photon through the 50-kilometer-long optical fiber line. Their measurements show that atom and light particles were still entangled even after the wavelength conversion and the distance traveled.

“The choice to use calcium means these results also provide a direct path to realizing an entangled network of atomic clocks over a large physical distance, since calcium can be co-trapped with a high-quality ‘clock’ qubit. Large scale entangled clock networks are of great interest to the Army for precision position, navigation, and timing applications,” said Dr. Fredrik Fatemi, an Army researcher who also co-manages the program.

Story by U.S. Army CCDC Army Research Laboratory Public Affairs

Photo courtesy IQOQI Innsbruck/Harald Ritsch