Development of a Hybrid Quadrotor Control Framework Integrating Classical Control and Vision-Based Deep Learning for Autonomous Takeoff, Landing, and Object Tracking
Abstract
The object of this study is a quadrotor unmanned aerial vehicle (UAV) equipped with classical PID (proportional-integral-derivative) and backstepping controllers for stabilization, together with deep learning-based vision modules for perception and tracking: YOLOv8 (You Only Look Once) for object detection and ByteTrack (a multi-object tracking algorithm). The research addresses the limited robustness and adaptability of traditional control systems when operating in dynamic environments with external disturbances, sensor noise, and moving targets. As a result of the study, a hybrid quadrotor control framework integrating classical control and vision-based deep learning was developed and experimentally validated. The proposed system enables autonomous takeoff, smooth landing, stable hovering, and reliable multi-object tracking under varying illumination and occlusion conditions. These results were achieved through the layered coupling of model-based controllers with visual feedback, which compensates in real time for sensor drift and external perturbations and ensures stable flight trajectories. The developed framework can be effectively applied in practice to infrastructure monitoring, environmental observation, search-and-rescue missions, and intelligent transportation, especially in GPS-denied (Global Positioning System) or visually complex environments.
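The layered coupling described above can be illustrated with a minimal sketch: a classical PID loop regulates altitude while a vision-derived position estimate is fused with a drifting onboard sensor to keep the feedback signal unbiased. All gains, the toy plant dynamics, the drift model, and the fusion weights below are illustrative assumptions, not values or code from the paper.

```python
class PID:
    """Textbook PID controller with derivative taken on the measurement
    (avoids the derivative kick on setpoint steps such as takeoff)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_meas = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        deriv = (measurement - self.prev_meas) / self.dt
        self.prev_meas = measurement
        return self.kp * error + self.ki * self.integral - self.kd * deriv


def simulate_takeoff(target_alt=5.0, steps=400, dt=0.02):
    """Toy double-integrator altitude plant driven by PID thrust.
    A barometer-like estimate drifts over time; a vision fix (assumed
    drift-free here) is blended in to compensate, as in the layered
    controller/vision coupling the abstract describes."""
    pid = PID(kp=4.0, ki=0.5, kd=2.5, dt=dt)
    alt, vel, drift = 0.0, 0.0, 0.0
    for _ in range(steps):
        drift += 0.001                      # slow sensor drift (assumed)
        baro = alt + drift                  # drifting onboard estimate
        vision = alt                        # vision-based fix (assumed exact)
        fused = 0.5 * baro + 0.5 * vision   # simple complementary fusion
        thrust = pid.update(target_alt, fused)
        acc = thrust - 0.3 * vel            # toy dynamics with damping
        vel += acc * dt
        alt += vel * dt
    return alt
```

With visual feedback in the loop, the fused measurement stays close to the true altitude, so the PID loop settles near the 5 m target despite the accumulating sensor drift; feeding the drifting estimate alone into the controller would bias the hover altitude by the full drift instead.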