Artificial Intelligence/Machine Learning, ASA(ALT), Phase I

Correlation of Detected Objects from Multiple Sensor Platforms

Release Date: 03/30/2021
Solicitation: 21.4
Open Date: 04/14/2021
Topic Number: A214-020
Application Due Date: 05/18/2021
Duration: 6 months
Close Date: 05/18/2021
Amount Up To: $256,000

Topic Objective 

Research methodologies, frameworks, and processes to ingest, process, and correlate object detections and tracks from multiple imaging sensors, including but not limited to ground, aerial, and overhead imagery and video. Outputs from the proposed study and eventual system will enable improved accuracy and confidence of detections, as well as geolocation, across the ever-increasing number of sensors employed by Army Force Protection Systems (FPS). 

Description  

Army Force Protection Systems (FPS) currently employs ground-based video sensors for situational awareness around bases; however, multiple other forms of video- and imagery-based systems exist that can provide additional situational awareness and intelligence to operators and commanders. These additional data sources offer the possibility of improved accuracy of detections, tracks, and geolocation of objects of interest. This SBIR is intended to study (Phase I) how current detections and analytic products are used and how additional data sources can be added, and then to produce a software prototype (Phase II) that improves correlation across the data sources identified in Phase I. 
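To make the data being correlated concrete, a single "analytic product" from one sensor might be represented as a record like the following. This is a minimal illustrative sketch; the field names, types, and example values are assumptions for discussion, not a format specified by this topic:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Detection:
    """One object detection reported by a single sensor's analytics."""
    sensor_id: str      # which platform produced the detection
    modality: str       # e.g. "ground", "aerial", "overhead"
    timestamp: float    # UTC seconds since the epoch
    lat: float          # estimated geolocation of the detected object
    lon: float
    label: str          # detected object class, e.g. "vehicle"
    confidence: float   # detector confidence in [0, 1]

# Example: the same vehicle seen by a ground camera and an overhead sensor
d1 = Detection("tower-cam-03", "ground",   1618400000.0, 31.1341, -97.7781, "vehicle", 0.91)
d2 = Detection("uas-12",       "overhead", 1618400001.5, 31.1342, -97.7779, "vehicle", 0.84)
```

Correlating such records across sensors then reduces to deciding when two detections from different platforms refer to the same real-world object.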

Phase I 

Conduct a study of the analytics and sensor types used in systems currently deployed by FPS to determine additional sensor types that can be incorporated in Phase II. The study should include architecture diagrams of the data flow, possible outputs, and approaches to increasing the accuracy of the analytics, with the option to develop an initial software prototype design. 

Phase II 

This phase will produce a software prototype based on the Phase I study and outcomes. It is expected that the effort will define an architecture and/or framework to ingest analytic products (i.e., metadata) and correlate them across multiple sensor types and modalities (e.g., aerial, ground, tower). It may be necessary to develop additional computer vision and machine learning algorithms to determine that an object observed from multiple sensor platform perspectives (e.g., by a ground sensor and an overhead system) is the same object. The prototype developed under Phase II may be run on previously collected data to prove out the concept, but it is desired to run in real time during Phase III. 
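One plausible baseline for the correlation step described above is spatiotemporal gating: group detections that share an object class, come from different sensors, and fall within time and distance thresholds. The sketch below is illustrative only; the thresholds, dictionary keys, and greedy grouping strategy are assumptions, and a fielded system would likely replace them with learned appearance embeddings (as in the re-identification literature cited in the references) and a proper assignment solver:

```python
import math

def geo_distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def correlate(detections, max_dt_s=2.0, max_dist_m=30.0):
    """Greedily group detections from different sensors that plausibly
    refer to the same object: same label, close in time, close in space.
    `detections` is a list of dicts with keys
    sensor_id, timestamp, lat, lon, label, confidence."""
    groups = []
    for det in sorted(detections, key=lambda d: d["timestamp"]):
        for g in groups:
            ref = g[-1]  # compare against the most recent member
            if (det["label"] == ref["label"]
                    and det["sensor_id"] != ref["sensor_id"]
                    and abs(det["timestamp"] - ref["timestamp"]) <= max_dt_s
                    and geo_distance_m(det["lat"], det["lon"],
                                       ref["lat"], ref["lon"]) <= max_dist_m):
                g.append(det)
                break
        else:
            groups.append([det])  # no compatible group: start a new one
    return groups
```

A multi-sensor group can then report a fused confidence (e.g., treating sensors as independent, 1 minus the product of the per-sensor miss probabilities), which is one way such correlation could "improve accuracy and confidence of detections" as the objective states.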

Phase III 

The end state of the proposed research is a system that improves upon the current baseline of sensor systems and analytic products used by FPS, enhancing the situational understanding of operators using deployed systems. During Phase III, the software prototype should be matured to run in real time on more than two data sources and be integrated with FPS programs. 

Submission Information  

To submit full proposal packages and for more information, visit the DSIP Portal. 

References:

Ghaffary, S. (7 Feb 2020). “The Smarter Wall: How drones, sensors, and AI are patrolling the border.” Vox.com. URL: https://www.vox.com/recode/2019/5/16/18511583/smart-border-wall-drones-sensors-ai (last accessed: 16 Jun 2020).

H.-X. Yu, A. Wu, and W.-S. Zheng. (2019) Unsupervised person re-identification by deep asymmetric metric embedding. TPAMI (DOI 10.1109/TPAMI.2018.2886878). 

J. Wang, X. Zhu, S. Gong, and W. Li. (2018) Transferable joint attribute-identity deep learning for unsupervised person re-identification. CVPR. 

J. Wang, C. Jiang, Z. Han, Y. Ren, R. G. Maunder, and L. Hanzo. (2016) Cooperative distributed unmanned aerial vehicular networks: Small and mini drones. IEEE Vehicular Technology Magazine, 1-18. (In Press)

Z. Zhong, L. Zheng, Z. Luo, S. Li, and Y. Yang. (2019) Invariance matters: Exemplar memory for domain adaptive person re-identification. CVPR. 
