Sensors, Army STTR, Phase I

Standoff Detection of Hidden Objects and Personnel In and Around Foliage

Release Date: 04/17/2024
Solicitation: 24.B
Open Date: 05/15/2024
Topic Number: A24B-T016
Application Due Date: 06/12/2024
Duration: Up to 6 months
Close Date: 06/12/2024
Amount: Up to $194,000

Objective

Businesses must develop a solution for the autonomous standoff detection of hidden objects and personnel in or around foliage at 50 to 250 feet.

Description

This topic seeks technology capabilities to autonomously detect hidden objects and personnel in and around foliage and roadsides at standoff distances from 50 to 250 feet, and provide a warning. Current commercial screening technologies include millimeter wave, terahertz sensors, magnetometers, x-rays and neutron scattering.

These technologies can effectively detect target objects, but primarily at near-field distances (inches to feet). The Army wants to detect and track target objects at standoff distances of 50 to 250 feet in support of the “agile node” concept, including expeditionary airfields, survivable command and control, and agile support.

Autonomy will facilitate maneuverability, enhance force protection, reduce cognitive loads and minimize training burdens on the operators. Autonomous detection and alarms reduce cognitive burden on operators by preventing screen fatigue and highlighting suspicious objects in a scene.

Autonomous software can reduce training demands by supporting and assisting the operator during system start-up and operation, while suggesting courses of action in response to a given alarm. Autonomy also enables the operator to work at distances of 300 to 450 feet or more; the operator does not have to stay next to the sensor to see information and alarms, which enhances force protection.

This topic does not include leave-behind components such as point and vibration sensors.

Phase I

Businesses must demonstrate detection of metal and plastic objects of various sizes and shapes (approximately the size of soup cans, gallon paint cans and small manhole covers), as well as personnel, at distances of 50, 100, 150, 200 and 250 feet, measured from a starting point on the ground representing the sensor position to the target. If the sensor is elevated, for example 30 feet in the air or on a post, drop a line to the ground to establish the starting point.

The objects and personnel should be positioned in and around different types of foliage: spring, summer and fall brush, roadside brush and trees. Firms should collect sufficient target data to develop and demonstrate the feasibility of target object detection, classification and tracking using machine learning, artificial intelligence and signal-processing innovations. Vendors should develop and deliver a sensor design from which Government scientists, engineers and Soldiers can build a Phase II experimental prototype sensor for a field experiment.

Businesses must consider false alarms. They should develop an approach that can characterize system performance for detection and false alarms. An example would be to develop a randomized or semi-randomized experimental design and test matrix that can operate within the budget boundaries of Phase I. It should also provide data sufficient for a preliminary, limited receiver operating characteristic (ROC) curve while demonstrating the feasibility of the sensor design concept.
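The semi-randomized test matrix described above can be sketched in Python. The factor levels below are drawn from this topic's distance, target and foliage lists; `trials_per_cell` and the seed are illustrative assumptions, not requirements:

```python
import itertools
import random

# Factor levels taken from the topic's Phase I requirements.
distances_ft = [50, 100, 150, 200, 250]
targets = ["soup can", "gallon paint can", "small manhole cover", "person"]
foliage = ["spring brush", "summer brush", "fall brush", "roadside brush", "trees"]

def build_test_matrix(trials_per_cell=3, seed=42):
    """Enumerate every distance/target/foliage cell, replicate each cell,
    then shuffle the run order so the trial sequence is not confounded
    with any single factor (a semi-randomized full-factorial design)."""
    rng = random.Random(seed)
    cells = list(itertools.product(distances_ft, targets, foliage))
    runs = [cell for cell in cells for _ in range(trials_per_cell)]
    rng.shuffle(runs)
    return runs

matrix = build_test_matrix()
print(len(matrix))  # 5 distances x 4 targets x 5 foliage x 3 trials = 300 runs
```

Fixing the seed makes the run order reproducible, which helps when a field day is cut short and the remaining runs must be resumed later.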

The Army needs to consider false alarm states and their mitigation. The Phase I deliverable should include both the sensor design and experimental data that support the design and mitigate Phase II risk. The Army will consider proposals offering only market surveys with later down-selection to be non-responsive.

Phase II

Vendors should build and demonstrate a smart prototype sensor based on the design and algorithms developed in Phase I. It must enable government scientists, engineers and Soldiers to operate it so they can participate in an Army Expeditionary Warrior Experiment or equivalent user experiment.

Businesses should collect target data in sufficient quantity to develop and demonstrate machine learning and artificial intelligence. This will help to scan, detect, classify, locate and track target objects and personnel, and to develop receiver operating characteristic (ROC) curves or similar statistical analyses. The Phase II smart sensor should issue a visual alarm on a screen that an operator can see.
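A minimal sketch of how empirical ROC points might be computed from trial outcomes, assuming each trial yields a scalar detector score. The scores below are synthetic stand-ins for illustration; real curves must come from measured field data:

```python
import random

def roc_points(target_scores, clutter_scores, thresholds):
    """For each alarm threshold, estimate probability of detection (P_d)
    from target-present trials and probability of false alarm (P_fa)
    from clutter-only trials. Returns (P_fa, P_d) pairs."""
    points = []
    for t in thresholds:
        p_d = sum(s >= t for s in target_scores) / len(target_scores)
        p_fa = sum(s >= t for s in clutter_scores) / len(clutter_scores)
        points.append((p_fa, p_d))
    return points

# Synthetic detector scores: targets score higher on average than foliage clutter.
rng = random.Random(0)
target = [rng.gauss(2.0, 1.0) for _ in range(500)]   # target-present trials
clutter = [rng.gauss(0.0, 1.0) for _ in range(500)]  # foliage-only trials
curve = roc_points(target, clutter, [t / 2 for t in range(-6, 11)])
```

Sweeping the threshold traces the trade between detection rate and false alarm rate; the budget-limited Phase I data would populate only a few such points, hence "preliminary, limited" ROC curve.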

The screen may be either a monitor attached to the sensor or a remote screen, such as a cell phone. The prototype should demonstrate covert autonomous standoff detection from an agile node at 50 to 250 feet, tracking a variety of metal shapes and personnel in and around foliage. Examples of an agile node might be expeditionary airfields, survivable command and control, or covert agile support.

The prototype should demonstrate preliminary feasibility of operation from a moving vehicle traveling 1 to 20 miles per hour. Using multiple sensors to scan the surrounding area is acceptable. Innovations in AI/ML may scan, detect, classify, locate and track target objects and personnel.

The Phase II deliverable should be a prototype demonstration in the contractor’s facilities and a warfighter experiment, such as an Army Expeditionary Warrior Experiment or equivalent. The business must deliver the Phase II prototype sensor “in place” to the Government. “In place” means that the prototype remains in the contractor’s facility but is accessible for future work by the Government.

Phase III

Firms will direct further research and development during Phase III efforts toward refining the final deployable equipment and procedures. Businesses will incorporate design modifications based on results from tests conducted during Phase III. The Army will examine manufacturability specific to the Counter Improvised Explosive Devices Program Concept of Operations and end-user requirements.

Submission Information

All eligible businesses must submit proposals by noon ET on the application due date.

To view full solicitation details, click here.

For more information, and to submit your full proposal package, visit the DSIP Portal.

STTR Help Desk: usarmy.rtp.devcom-arl.mbx.sttr-pmo@army.mil

A24B | Phase I

References:

  • David A. Andrews, Stuart William Harmer, Nicholas J. Bowring, Nacer D. Rezgui, and Matthew J. Southgate. “Active millimeter wave sensor for standoff concealed threat detection.” IEEE Sensors Journal 13, no. 12 (2013): 4948-4954.
  • Zhongmin Wang, Tianying Chang, and Hong-Liang Cui. “Review of active millimeter wave imaging techniques for personnel security screening.” IEEE Access 7 (2019): 148336-148350.
  • Boris Kapilevich and Moshe Einat. “Detecting hidden objects on human body using active millimeter wave sensor.” IEEE Sensors Journal 10, no. 11 (2010): 1746-1752.
  • Federico García-Rial, Daniel Montesano, Ignacio Gómez, Carlos Callejero, Francis Bazus, and Jesús Grajal. “Combining commercially available active and passive sensors into a millimeter-wave imager for concealed weapon detection.” IEEE Transactions on Microwave Theory and Techniques 67, no. 3 (2018): 1167-1183.
  • Bram van Berlo, Amany Elkelany, Tanir Ozcelebi, and Nirvana Meratnia. “Millimeter wave sensing: A review of application pipelines and building blocks.” IEEE Sensors Journal 21, no. 9 (2021): 10332-10368.
  • Roger Appleby, Duncan A. Robertson, and David Wikner. “Millimeter wave imaging: a historical review.” In Passive and Active Millimeter-Wave Imaging XX, vol. 10189, p. 1018902. SPIE, 2017.
  • Ting Liu, Yao Zhao, Yunchao Wei, Yufeng Zhao, and Shikui Wei. “Concealed object detection for activate millimeter wave image.” IEEE Transactions on Industrial Electronics 66, no. 12 (2019): 9909-9917.
  • Jeffrey A. Nanzer. Microwave and millimeter-wave remote sensing for security applications. Artech House, 2012.
  • Boris Y. Kapilevich, Stuart W. Harmer, and Nicholas J. Bowring. Non-imaging microwave and millimetre-wave sensors for concealed object detection. CRC Press, 2014.
  • A. Huizing, M. Heiligers, B. Dekker, J. de Wit, L. Cifola, and R. Harmanny. “Deep Learning for Classification of Mini-UAVs Using Micro-Doppler Spectrograms in Cognitive Radar.” IEEE Aerospace and Electronic Systems Magazine 34, no. 11 (2019): 46-56. doi: 10.1109/MAES.2019.2933972.
  • Abhishek Gupta, Alagan Anpalagan, Ling Guan, and Ahmed Shaharyar Khwaja. “Deep learning for object detection and scene perception in self-driving cars: Survey, challenges, and open issues.” Array, Volume

Keywords: Standoff detection 20-250 feet, open unstructured environment, moving targets, millimeter wave, autonomous identification and tracking of hidden threats and personnel
