TY - GEN
T1 - Deep ensemble for rotorcraft attitude prediction
AU - Khan, Hikmat
AU - Rasool, Ghulam
AU - Bouaynaya, Nidhal Carla
AU - Travis, Tyler
AU - Thompson, Lacey
AU - Johnson, Charles C.
N1 - Funding Information:
This work was supported by the Federal Aviation Administration (FAA) Cooperative Agreement Number 16-G-015 and NSF Awards OAC-2008690 and DUE-1610911. Ghulam Rasool was also partly supported by NSF Award OAC-2008690. This publication was in part supported by a sub-award from Rutgers University, Center for Advanced Infrastructure & Transportation, under Grant no. 69A3551847102 from the U.S. Department of Transportation, Office of the Assistant Secretary for Research and Technology (OST-R).
Funding Information:
Ghulam Rasool is an Assistant Professor of Electrical and Computer Engineering at Rowan University. He received a B.S. in Mechanical Engineering from the National University of Sciences and Technology (NUST), Pakistan, in 2000, an M.S. in Computer Engineering from the Center for Advanced Studies in Engineering (CASE), Pakistan, in 2010, and a Ph.D. in Systems Engineering from the University of Arkansas at Little Rock in 2014. He was a postdoctoral fellow with the Rehabilitation Institute of Chicago and Northwestern University from 2014 to 2016. He joined Rowan University as an adjunct professor and later became a lecturer in 2018. Currently, he is the co-director of the Rowan AI Lab. His current research focuses on machine learning, artificial intelligence, data analytics, and signal, image, and video processing. His research is funded by the National Science Foundation (NSF), the U.S. Department of Education, the U.S. Department of Transportation (through the University Transportation Center (UTC), Rutgers University), the Federal Aviation Administration (FAA), the New Jersey Health Foundation (NJHF), and Lockheed Martin, Inc. His recent work on Bayesian machine learning won the Best Student Award at the 2019 IEEE Machine Learning for Signal Processing Workshop.
Publisher Copyright:
Copyright © 2021 by the Vertical Flight Society. All rights reserved.
PY - 2021
Y1 - 2021
N2 - Historically, the rotorcraft community has experienced a higher fatal accident rate than other aviation segments, including commercial and general aviation. To date, traditional methods applied to reduce incident rates have not proven highly successful for the rotorcraft community. Recent advancements in artificial intelligence (AI) and the application of these technologies in different areas of our lives are both intriguing and encouraging. When developed appropriately for the aviation domain, AI techniques may provide an opportunity to help design systems that can address rotorcraft safety challenges. Our recent work demonstrated that AI algorithms could use video data from onboard cameras to correctly identify different flight parameters from cockpit gauges, e.g., indicated airspeed. These AI-based techniques provide a potentially cost-effective solution, especially for small helicopter operators, to record flight state information and perform post-flight analyses. We also showed that carefully designed and trained AI systems can accurately predict rotorcraft attitude (i.e., pitch and yaw) from outside scenes (images or video data). Ordinary off-the-shelf video cameras were installed inside the rotorcraft cockpit to record the outside scene, including the horizon. The AI algorithm was able to correctly identify rotorcraft attitude with an accuracy of approximately 80%. In this work, we combined five different onboard camera viewpoints to improve attitude prediction accuracy to 94%. Our current approach, referred to as ensembled prediction, significantly increased the reliability of the predicted attitude (i.e., pitch and yaw). For example, in some camera views, the horizon may be obstructed or not visible. The proposed ensemble method can combine visual details recorded from other cameras and predict the attitude with high reliability.
In our setup, the five onboard camera views included the pilot windshield, co-pilot windshield, pilot Electronic Flight Instrument System (EFIS) display, co-pilot EFIS display, and the attitude indicator gauge. Using video data from each camera view, we trained a variety of convolutional neural networks (CNNs), which achieved prediction accuracies in the range of 79% to 90%. We subsequently ensembled the learned knowledge from all CNNs and achieved an ensembled accuracy of 93.3%. Our efforts could potentially provide a cost-effective means to supplement traditional Flight Data Recorders (FDR), a technology that to date has been challenging to incorporate into the fleets of most rotorcraft operators due to cost and resource constraints. Such cost-effective solutions can gradually increase the rotorcraft community's participation in various safety programs, enhancing safety and opening up helicopter flight data monitoring (HFDM) to historically underrepresented segments of the vertical flight community.
UR - http://www.scopus.com/inward/record.url?scp=85108951994&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85108951994&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85108951994
T3 - 77th Annual Vertical Flight Society Forum and Technology Display, FORUM 2021: The Future of Vertical Flight
BT - 77th Annual Vertical Flight Society Forum and Technology Display, FORUM 2021
PB - Vertical Flight Society
T2 - 77th Annual Vertical Flight Society Forum and Technology Display: The Future of Vertical Flight, FORUM 2021
Y2 - 10 May 2021 through 14 May 2021
ER -