In this paper, we develop a water rescue mission system for water safety management areas that combines unmanned mobility (drone systems) with AI-based visual recognition to automatically detect and localize drowning persons, enabling a timely response within the golden time. We first detect suspected human subjects in daytime and nighttime video, then estimate skeleton-based human poses and classify the resulting feature sequences with LSTM models. Once a drowning person is detected, a proposed algorithm derives accurate GPS coordinates of the person to support rescue operations. In our experiments, drowning detection achieves an F1-score of 80.1%, and the average position-estimation error is about 0.29 m.
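The abstract does not give the classifier's exact configuration, so the following is a minimal sketch of the described approach: an LSTM over sequences of skeleton keypoints per frame, producing a drowning/not-drowning decision. The 17-keypoint (COCO-style) layout, sequence length, and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DrowningLSTM(nn.Module):
    """Sketch: classify a clip of pose keypoints as drowning vs. normal."""
    def __init__(self, num_keypoints=17, hidden=128, num_classes=2):
        super().__init__()
        # Each frame is flattened to (x, y) per keypoint.
        self.lstm = nn.LSTM(input_size=num_keypoints * 2,
                            hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):              # x: (batch, frames, keypoints*2)
        _, (h_n, _) = self.lstm(x)     # h_n: (num_layers, batch, hidden)
        return self.head(h_n[-1])      # logits: (batch, num_classes)

model = DrowningLSTM()
poses = torch.randn(4, 30, 17 * 2)     # 4 clips of 30 pose frames each
logits = model(poses)
print(logits.shape)                    # torch.Size([4, 2])
```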
Ensuring operational safety and reliability in Unmanned Aerial Vehicles (UAVs) necessitates advanced onboard fault detection. This paper presents a novel, mobility-aware multi-sensor health monitoring framework that fuses visual (camera) and vibration (IMU) data for enhanced near real-time inference of rotor and structural faults. Our approach is tailored for resource-constrained flight controllers (e.g., Pixhawk) without auxiliary hardware, utilizing standard flight logs. Validated on a 40 kg-class UAV with induced rotor damage (10% blade loss) across more than 100 minutes of flight, the system demonstrated strong performance: a Multi-Layer Perceptron (MLP) achieved an RMSE of 0.1414 and an R² of 0.92 for rotor imbalance, while a Convolutional Neural Network (CNN) detected visual anomalies. Notably, incorporating UAV mobility context reduced false positives by over 30%. This work demonstrates a practical pathway to deploying sophisticated, lightweight diagnostic models on standard UAV hardware, supporting real-time onboard fault inference and paving the way for more autonomous and resilient health-aware aerial systems.
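As a rough illustration of the two ideas named in the abstract (an MLP regressing rotor imbalance from vibration features, and mobility context suppressing false positives), consider the sketch below. The feature set, network sizes, thresholds, and the rate-based gating rule are assumptions for illustration, not the paper's exact method.

```python
import torch
import torch.nn as nn

class ImbalanceMLP(nn.Module):
    """Sketch: regress a scalar rotor-imbalance score from IMU features."""
    def __init__(self, num_features=12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1))          # scalar imbalance score

    def forward(self, x):
        return self.net(x).squeeze(-1)

def gated_alert(score, angular_rate, rate_limit=1.0, score_thresh=0.5):
    """Suppress alerts during hard maneuvers, where high commanded rates
    naturally raise vibration levels (the mobility-context idea)."""
    if abs(angular_rate) > rate_limit:   # rad/s; hypothetical limit
        return False
    return score > score_thresh

model = ImbalanceMLP()
features = torch.randn(1, 12)            # e.g. band-limited IMU RMS values
score = model(features).item()
print(gated_alert(score, angular_rate=0.2))
```

Gating the alert on flight state rather than retraining the model keeps the diagnostic network small, which fits the resource-constrained flight-controller target described in the abstract.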