Abstract
Scene matching navigation is an effective localization method for UAVs (Unmanned Aerial Vehicles) in denied environments; however, most existing studies require the optical axis of the on-board camera to be perpendicular to the ground. In practice, most of the acquired images are oblique views because of environmental constraints and the UAV's flight attitude. With this in mind, this paper designs a new oblique-view perspective conversion method to reconstruct a vertical-view image, which in turn enables effective scene matching navigation. Specifically, the designed scene matching navigation scheme contains two parts. First, the captured oblique view is rectified to the nadir view according to the camera's intrinsic and extrinsic parameters. Second, after feature point detection and matching against the remote sensing map, the image points can be registered to the map without computing a homography transformation matrix, thereby achieving scene matching navigation. Finally, the effectiveness of the proposed scheme is verified by building the camera model on the Gazebo simulation platform. The simulation results show that the proposed algorithm performs better than a scene matching navigation algorithm that relies only on the UAV's oblique vision information.
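As an illustrative sketch only, and not the paper's actual implementation, the first step (restoring an oblique view to a nadir view from the camera's intrinsic and extrinsic parameters) can be realized as a pure-rotation homography warp when the virtual nadir camera shares the same optical centre. The intrinsic matrix `K`, the camera attitude `R_cam`, the assumed nadir reference rotation, and the OpenCV-based warp below are assumptions for illustration under a particular axis convention.

```python
import numpy as np
import cv2


def rectify_to_nadir(image, K, R_cam):
    """Warp an oblique frame to a virtual downward-looking (nadir) view.

    K     : 3x3 camera intrinsic matrix
    R_cam : 3x3 rotation of the on-board camera in the world frame,
            e.g. derived from the UAV attitude and gimbal angles
    """
    # Virtual nadir camera: same optical centre, z-axis pointing straight down.
    # The exact reference rotation depends on the chosen world/camera axis
    # convention; this one is an assumption for illustration.
    R_nadir = np.array([[1.0,  0.0,  0.0],
                        [0.0, -1.0,  0.0],
                        [0.0,  0.0, -1.0]])

    # Because both cameras share the optical centre, the mapping between the
    # two image planes is the rotation-induced homography
    #   H = K * R_nadir * R_cam^T * K^{-1}
    H = K @ R_nadir @ R_cam.T @ np.linalg.inv(K)

    # Keep the input resolution for simplicity; a real pipeline would also
    # translate/scale H so the warped footprint stays inside the output frame.
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```

The rectified frame can then be fed to an ordinary feature detection and matching stage against the reference remote sensing map, which is the second step summarized above.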