Mobility-Aware Multi-Sensor UAV Health Monitoring via Vision and Vibration Fusion
Ensuring operational safety and reliability in Unmanned Aerial Vehicles (UAVs) necessitates advanced onboard fault detection. This paper presents a mobility-aware multi-sensor health monitoring framework that fuses visual (camera) and vibration (IMU) data for near real-time inference of rotor and structural faults. The approach targets resource-constrained flight controllers (e.g., Pixhawk) and requires no auxiliary hardware, relying only on standard flight logs. Validated on a 40 kg-class UAV with induced rotor damage (10% blade loss) over 100+ minutes of flight, the system demonstrated strong performance: a Multi-Layer Perceptron (MLP) achieved an RMSE of 0.1414 and an R² of 0.92 for rotor imbalance estimation, while a Convolutional Neural Network (CNN) detected visual anomalies. Notably, incorporating UAV mobility context reduced false positives by over 30%. This work demonstrates a practical pathway to deploying lightweight diagnostic models on standard UAV hardware, supporting real-time onboard fault inference and paving the way for more autonomous, resilient, health-aware aerial systems.
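To illustrate the mobility-aware idea behind the reported false-positive reduction, the sketch below shows one plausible (hypothetical, not the paper's actual) implementation: a vibration feature (RMS of accelerometer residuals) is compared against an anomaly threshold that is raised during aggressive maneuvers, so maneuver-induced vibration is not mistaken for a rotor fault. All function names, the scaling factor `k`, and the thresholds are illustrative assumptions.

```python
import math

def vibration_rms(accel_samples):
    """Root-mean-square of accelerometer residuals about their mean,
    a simple proxy for rotor-induced vibration energy."""
    mean = sum(accel_samples) / len(accel_samples)
    return math.sqrt(sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples))

def mobility_aware_threshold(base_threshold, angular_rate, k=0.5):
    """Scale the anomaly threshold with the vehicle's angular rate (rad/s)
    so that vibration expected during maneuvers is tolerated.
    The linear form and k=0.5 are illustrative choices."""
    return base_threshold * (1.0 + k * abs(angular_rate))

def is_anomalous(accel_samples, angular_rate, base_threshold=0.3):
    """Flag a vibration anomaly only if RMS exceeds the mobility-adjusted threshold."""
    return vibration_rms(accel_samples) > mobility_aware_threshold(base_threshold, angular_rate)

# Hover with healthy rotors: low RMS, no alarm.
hover = [0.0, 0.1, -0.1, 0.05, -0.05]
print(is_anomalous(hover, angular_rate=0.0))   # False

# Same high-vibration signature: flagged in hover...
shaky = [0.5, -0.5, 0.6, -0.6, 0.55]
print(is_anomalous(shaky, angular_rate=0.0))   # True

# ...but suppressed during an aggressive maneuver (2 rad/s yaw rate).
print(is_anomalous(shaky, angular_rate=2.0))   # False
```

In a real deployment the mobility context would come from the flight controller's state estimates (attitude rates, commanded maneuvers) logged alongside the IMU stream, and the gating could equally be realized as an extra input feature to the MLP rather than a hand-tuned threshold.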