• Title/Summary/Keyword: vision-aided navigation

Development of a Test Environment for Performance Evaluation of the Vision-aided Navigation System for VTOL UAVs (수직 이착륙 무인 항공기용 영상보정항법 시스템 성능평가를 위한 검증환경 개발)

  • Sebeen Park;Hyuncheol Shin;Chul Joo Chung
    • Journal of Advanced Navigation Technology, v.27 no.6, pp.788-797, 2023
  • In this paper, we introduce a test environment for a vision-aided navigation system, an alternative navigation system for vertical take-off and landing (VTOL) unmanned aerial vehicles when the global positioning system (GPS) is unavailable. It is efficient to use a virtual environment to test and evaluate a vision-aided navigation system under development, but no suitable equipment has yet been developed in Korea. The proposed test environment therefore evaluates the performance of the navigation system by generating input signals that model and simulate the system's operating environment, and by monitoring its output signals. This paper comprehensively describes the research procedure, from the derivation of requirement specifications through hardware/software design to the production of the test environment. The test environment was used to evaluate the vision-aided navigation algorithm we are developing and to conduct simulation-based pre-flight tests; a minimal harness sketch follows below.
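
A minimal illustration of the harness pattern this abstract describes: generate modeled input signals, drive the system under test, and monitor its output error. Everything here is an assumption for illustration; the trajectory model, the noise level, and the `nav_step` interface are invented stand-ins, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(0)

def truth_trajectory(t):
    """Reference VTOL position in NED coordinates (m): a straight, level leg."""
    return np.array([10.0 * t, 0.0, -50.0])

def run_test(nav_step, duration=10.0, dt=0.01):
    """Feed modeled inputs to the navigation step and monitor its output error."""
    errors = []
    for k in range(int(duration / dt)):
        t = k * dt
        pos_true = truth_trajectory(t)
        pos_est = nav_step(t)                    # system under test
        errors.append(np.linalg.norm(pos_est - pos_true))
    return float(np.mean(errors)), float(np.max(errors))

# Stand-in navigation system (truth corrupted by 0.5 m noise), so the
# harness runs end-to-end without real hardware in the loop.
mean_err, max_err = run_test(lambda t: truth_trajectory(t) + rng.normal(0.0, 0.5, 3))
print(f"mean error {mean_err:.2f} m, max error {max_err:.2f} m")
```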

SLAM Aided GPS/INS/Vision Navigation System for Helicopter (SLAM 기반 GPS/INS/영상센서를 결합한 헬리콥터 항법시스템의 구성)

  • Kim, Jae-Hyung;Lyou, Joon;Kwak, Hwy-Kuen
    • Journal of Institute of Control, Robotics and Systems, v.14 no.8, pp.745-751, 2008
  • This paper presents a framework for a GPS/INS/Vision-based navigation system for helicopters. A coupled GPS/INS algorithm has weak points such as GPS blockage and jamming, and a helicopter is a fast, highly dynamic vehicle prone to losing the GPS signal. A vision sensor, in contrast, is not affected by signal jamming, and its navigation error does not accumulate. We have therefore implemented a GPS/INS/Vision-aided navigation system that provides robust localization suitable for helicopters operating in various environments. The core algorithm is a vision-based simultaneous localization and mapping (SLAM) technique (a generic EKF-SLAM skeleton is sketched below). To verify the SLAM algorithm, we performed flight tests, which confirm that the developed system remains robust under GPS blockage. The system design, software algorithm, and flight test results are described.
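
For readers unfamiliar with the SLAM core mentioned above, here is a compact, textbook-style 2D range-bearing EKF-SLAM skeleton. It is a generic sketch, not the authors' helicopter implementation; the state layout (vehicle pose plus stacked landmark positions) and the noise handling are assumptions.

```python
import numpy as np

def predict(x, P, v, w, dt, Q):
    """Propagate the vehicle pose (x, y, heading); landmark states are static."""
    th = x[2]
    x = x.copy()
    x[0] += v * dt * np.cos(th)
    x[1] += v * dt * np.sin(th)
    x[2] += w * dt
    F = np.eye(len(x))
    F[0, 2] = -v * dt * np.sin(th)
    F[1, 2] = v * dt * np.cos(th)
    P = F @ P @ F.T
    P[:3, :3] += Q                      # process noise on vehicle states only
    return x, P

def update(x, P, z, j, R):
    """EKF update with a range-bearing measurement z of landmark j."""
    lx, ly = x[3 + 2 * j], x[4 + 2 * j]
    dx, dy = lx - x[0], ly - x[1]
    q = dx ** 2 + dy ** 2
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - x[2]])
    H = np.zeros((2, len(x)))
    H[:, :3] = [[-dx / r, -dy / r, 0.0], [dy / q, -dx / q, -1.0]]
    H[:, 3 + 2 * j:5 + 2 * j] = [[dx / r, dy / r], [-dy / q, dx / q]]
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap bearing innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P
```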

Visual Target Tracking and Relative Navigation for Unmanned Aerial Vehicles in a GPS-Denied Environment

  • Kim, Youngjoo;Jung, Wooyoung;Bang, Hyochoong
    • International Journal of Aeronautical and Space Sciences, v.15 no.3, pp.258-266, 2014
  • We present a system for the real-time visual relative navigation of a fixed-wing unmanned aerial vehicle in a GPS-denied environment. An extended Kalman filter is used to construct a vision-aided navigation system by fusing the image processing results with barometer and inertial sensor measurements. Using a mean-shift object tracking algorithm, an onboard vision system provides pixel measurements to the navigation filter. The filter is slightly modified to deal with delayed measurements from the vision system (one common state-buffering scheme is sketched below). The image processing algorithm and the navigation filter are verified by flight tests. The results show that the proposed aerial system is able to maintain a circling flight around a target without using GPS data.
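
The paper does not spell out its delay-handling modification here, so below is one common scheme under stated assumptions: buffer filter snapshots, form the innovation against the state stored at image-capture time, and apply the correction to the current state. The class and argument names are illustrative.

```python
from collections import deque
import numpy as np

class StateBuffer:
    """Ring buffer of (timestamp, state, covariance) filter snapshots."""
    def __init__(self, horizon=100):
        self._buf = deque(maxlen=horizon)

    def store(self, t, x, P):
        self._buf.append((t, x.copy(), P.copy()))

    def closest(self, t_capture):
        """Snapshot whose timestamp is nearest the image-capture time."""
        return min(self._buf, key=lambda s: abs(s[0] - t_capture))

def delayed_update(buf, t_capture, z, h, H, R, x_now, P_now):
    """Compute the innovation at capture time, then correct the current
    state. This ignores process noise accumulated over the delay, which
    is a first-order shortcut, not the paper's exact modification."""
    _, x_old, P_old = buf.closest(t_capture)
    y = z - h(x_old)                        # innovation at capture time
    S = H @ P_old @ H.T + R
    K = P_old @ H.T @ np.linalg.inv(S)
    return x_now + K @ y, (np.eye(len(x_now)) - K @ H) @ P_now
```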

Loosely-Coupled Vision/INS Integrated Navigation System

  • Kim, Youngsun;Hwang, Dong-Hwan
    • Journal of Positioning, Navigation, and Timing, v.6 no.2, pp.59-70, 2017
  • Since GPS signals are vulnerable to interference and obstruction, many alternative aiding systems have been proposed for integration with an inertial navigation system. Among these, the vision-aided method has become attractive due to its benefits in weight, cost, and power consumption. This paper proposes a loosely-coupled vision/INS integrated navigation method that can work in GPS-denied environments. The proposed method improves navigation accuracy by correcting INS navigation and sensor errors using the position and attitude outputs of a landmark-based vision navigation system (a minimal correction-step sketch follows below). Furthermore, it has the advantage of providing a redundant navigation output independent of the INS output. Computer simulations and van tests were carried out to show the validity of the proposed method. The results show that the proposed method works well and gives reliable navigation output with improved performance.
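
The defining trait of a loosely-coupled integration is that the vision system's own position/attitude solution, rather than raw image features, is differenced against the INS output and fed to an error-state filter. Below is a minimal sketch of that step, assuming a hypothetical 6-state error vector (position and attitude errors); the paper's full error model has more states.

```python
import numpy as np

def loosely_coupled_update(x_err, P, ins_pos, ins_att, vis_pos, vis_att, R):
    """x_err: 6-vector INS error state [dp(3); datt(3)].
    The INS-minus-vision difference directly measures the INS error."""
    z = np.concatenate([ins_pos - vis_pos, ins_att - vis_att])
    H = np.eye(6)                  # vision observes the error states directly
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_err = x_err + K @ (z - H @ x_err)
    P = (np.eye(6) - K @ H) @ P
    return x_err, P

# The corrected navigation output is then the INS output minus the
# estimated error, e.g. pos_corrected = ins_pos - x_err[:3].
```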

Integrated Navigation Design Using a Gimbaled Vision/LiDAR System with an Approximate Ground Description Model

  • Yun, Sukchang;Lee, Young Jae;Kim, Chang Joo;Sung, Sangkyung
    • International Journal of Aeronautical and Space Sciences, v.14 no.4, pp.369-378, 2013
  • This paper presents a vision/LiDAR integrated navigation system that provides accurate relative navigation performance over a general ground surface in GNSS-denied environments. The ground surface considered during flight is approximated as a piecewise continuous model with flat and sloped surface profiles. The presented system consists of a strapdown IMU and an aiding sensor block comprising a vision sensor and a LiDAR on a stabilized gimbal platform. Two-dimensional optical flow vectors from the vision sensor and LiDAR range-to-ground information are used to overcome the performance limit of a tactical-grade inertial navigation solution without a GNSS signal. In the filter realization, the INS error model is employed, with measurement vectors containing two-dimensional velocity errors and one differenced altitude in the navigation frame. In computing the altitude difference, the ground slope angle is estimated in a novel way through two bisectional LiDAR signals, under a practical assumption representing a general ground profile (the basic two-beam geometry is sketched below). Finally, the overall integrated system is implemented in an extended Kalman filter framework, and the performance is demonstrated through a simulation study with an aircraft flight trajectory scenario.
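
To make the two-beam slope idea concrete, here is the basic geometry under a locally-planar-ground assumption: two ranges at known depression angles give two ground hit points, and the slope is the angle of the line joining them. This is an illustrative reconstruction; the paper's bisectional formulation may differ in detail.

```python
import numpy as np

def slope_from_two_beams(r1, elev1, r2, elev2):
    """r: range (m), elev: depression angle below horizontal (rad) per beam.
    Returns the ground slope along the beam plane (rad); x forward, z down,
    so 0 means level ground and negative means ground rising ahead."""
    p1 = np.array([r1 * np.cos(elev1), r1 * np.sin(elev1)])  # hit point 1
    p2 = np.array([r2 * np.cos(elev2), r2 * np.sin(elev2)])  # hit point 2
    (xa, za), (xb, zb) = sorted([p1, p2], key=lambda p: p[0])
    return np.arctan2(zb - za, xb - xa)

# Level floor 10 m below, beams depressed 45 and 60 degrees: slope ~ 0.
h = 10.0
print(slope_from_two_beams(h / np.sin(np.radians(45)), np.radians(45),
                           h / np.sin(np.radians(60)), np.radians(60)))
```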

Mobile Robot Destination Generation by Tracking a Remote Controller Using a Vision-aided Inertial Navigation Algorithm

  • Dang, Quoc Khanh;Suh, Young-Soo
    • Journal of Electrical Engineering and Technology, v.8 no.3, pp.613-620, 2013
  • A new remote control algorithm for a mobile robot is proposed, in which the remote controller consists of a camera and inertial sensors. Initially, the relative position and orientation of the robot are estimated by capturing four circle landmarks on the robot's plate (a minimal pose-from-landmarks sketch using OpenCV follows below). When the remote controller moves to point at the destination, the camera's pointing trajectory is estimated using an inertial navigation algorithm. The destination is transmitted wirelessly to the robot, and the robot is then controlled to move to it. Quick movement of the remote controller is possible since the destination is estimated using inertial sensors. Also, unlike vision-only control, the robot can operate even outside the camera's field of view.
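
Estimating relative pose from four known coplanar points is a standard perspective-n-point (PnP) problem. The sketch below uses OpenCV's solvePnP for that step; the landmark layout, pixel coordinates, and camera intrinsics are made-up example values, and the paper's own circle-detection front end is not shown.

```python
import cv2
import numpy as np

# Known landmark positions on the robot plate (m), all in the z = 0 plane.
object_pts = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0],
                       [0.2, 0.2, 0.0], [0.0, 0.2, 0.0]], dtype=np.float64)
# Detected circle centers in the image (px); placeholder values.
image_pts = np.array([[320, 240], [420, 242],
                      [418, 338], [318, 336]], dtype=np.float64)
# Assumed pinhole intrinsics (fx = fy = 800 px, principal point at center).
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)       # rotation: plate frame -> camera frame
    print("camera position in plate frame:", (-R.T @ tvec).ravel())
```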

A Real-Time NDGPS/INS Navigation System Based on Artificial Vision for Helicopter (인공시계기반 헬기용 3차원 항법시스템 구성)

  • Kim, Jae-Hyung;Lyou, Joon;Kwak, Hwy-Kuen
    • Journal of the Korea Institute of Military Science and Technology, v.11 no.3, pp.30-39, 2008
  • An artificial-vision-aided NDGPS/INS system has been developed and tested in the dynamic environments of ground and flight vehicles to evaluate the overall system performance. The results show significant advantages in position accuracy and situational awareness. The accuracy meets CAT-I precision approach and landing requirements using NDGPS/INS integration. We also confirm that the proposed system is effective enough to improve flight safety through artificial vision. The system design, software algorithm, and flight test results are presented in detail.

Observability Analysis of a Vision-INS Integrated Navigation System Using Landmark (비전센서와 INS 기반의 항법 시스템 구현 시 랜드마크 사용에 따른 가관측성 분석)

  • Won, Dae-Hee;Chun, Se-Bum;Sung, Sang-Kyung;Cho, Jin-Soo;Lee, Young-Jae
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.38 no.3, pp.236-242, 2010
  • A GNSS/INS integrated system cannot provide navigation solutions if no satellites are available. To overcome this problem, a vision sensor is integrated into the system. Since a vision-aided integration system generally uses only feature points to compute navigation solutions, it has an observability problem. In this case, additional landmarks, which are a priori known points, can improve the observability. In this paper, the observability is evaluated using the TOM/SOM (total/stripped observability matrix) analysis and its eigenvalues (a minimal stacked-matrix rank check is sketched below). The feature-point-only case always has observability problems, whereas the landmark case becomes fully observable after the second update. Consequently, the landmarks ensure full observability, and the system performance can be improved.
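
The core of this kind of analysis is building an observability matrix from the per-segment transition and measurement matrices and inspecting its rank (or the eigenvalues of its Gramian, whose zero eigenvalues flag unobservable modes). The sketch below shows the basic stacked construction; the paper's TOM/SOM method is a segment-wise refinement of this idea, not reproduced here.

```python
import numpy as np

def observability_matrix(Phi_list, H_list):
    """Stack H_k * (Phi_{k-1} ... Phi_0) blocks over measurement segments."""
    blocks, Phi_prod = [], np.eye(Phi_list[0].shape[0])
    for Phi, H in zip(Phi_list, H_list):
        blocks.append(H @ Phi_prod)
        Phi_prod = Phi @ Phi_prod
    return np.vstack(blocks)

def is_fully_observable(O):
    n = O.shape[1]
    eigvals = np.linalg.eigvalsh(O.T @ O)   # zeros = unobservable directions
    return np.linalg.matrix_rank(O) == n, eigvals

# Example: only the first of two states is measured, but the dynamics
# couple them, so a second update restores full rank.
Phi = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
print(is_fully_observable(observability_matrix([Phi, Phi], [H, H])))
```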

Multiple Templates and Weighted Correlation Coefficient-based Object Detection and Tracking for Underwater Robots (수중 로봇을 위한 다중 템플릿 및 가중치 상관 계수 기반의 물체 인식 및 추종)

  • Kim, Dong-Hoon;Lee, Dong-Hwa;Myung, Hyun;Choi, Hyun-Taek
    • The Journal of Korea Robotics Society, v.7 no.2, pp.142-149, 2012
  • A camera has the limitation of poor visibility in underwater environments due to limited light sources and the noise of the medium. However, its usefulness at close range has been proven in many studies, especially for navigation. In this paper, vision-based object detection and tracking techniques using artificial objects for underwater robots are studied. We employ template matching and mean-shift algorithms for object detection and tracking. We also propose an adaptive-threshold-based weighted correlation coefficient and a color-region-aided approach to enhance object detection performance under various illumination conditions (a weighted correlation sketch follows below). The color information is incorporated into the template-matched area, and the features of the template are used to robustly calculate correlation coefficients. The objects are recognized using a multi-template matching approach. Finally, water basin experiments were conducted to demonstrate the performance of the proposed techniques using the underwater robot platform yShark, made by KORDI.
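
To illustrate the flavor of score such a detector builds on, here is a weighted normalized cross-correlation: per-pixel weights (e.g., emphasizing color-consistent regions) modulate the standard correlation coefficient, and uniform weights reduce it to plain NCC. The weighting scheme here is an assumption, not the paper's exact adaptive-threshold formulation.

```python
import numpy as np

def weighted_ncc(patch, template, w):
    """patch, template, w: equal-shape float arrays; w holds nonnegative
    per-pixel weights. Returns a correlation score in [-1, 1]."""
    wsum = w.sum()
    mp = (w * patch).sum() / wsum            # weighted means
    mt = (w * template).sum() / wsum
    dp, dt = patch - mp, template - mt
    num = (w * dp * dt).sum()
    den = np.sqrt((w * dp ** 2).sum() * (w * dt ** 2).sum())
    return num / den if den > 0 else 0.0

def best_of_templates(patch, templates, w):
    """Multi-template matching: keep the best-scoring template."""
    return max(weighted_ncc(patch, t, w) for t in templates)
```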

Development of Real-Time Vision Aided Navigation Using EO/IR Image Information of Tactical Unmanned Aerial System in GPS Denied Environment (GPS 취약 환경에서 전술급 무인항공기의 주/야간 영상정보를 기반으로 한 실시간 비행체 위치 보정 시스템 개발)

  • Choi, SeungKie;Cho, ShinJe;Kang, SeungMo;Lee, KilTae;Lee, WonKeun;Jeong, GilSun
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.48 no.6, pp.401-410, 2020
  • This study describes a real-time tactical UAS position compensation system based on image information, developed to compensate for the weakness of positional navigation during GPS signal interference and jamming/spoofing attacks. The tactical UAS (KUS-FT) is capable of automatic flight by switching from GPS/INS integrated navigation to DR/AHRS mode when the GPS signal is lost. In positional navigation, however, errors accumulate over time under dead reckoning (DR) using airspeed and azimuth, which causes problems such as degraded UAS positioning and data-link antenna tracking. To minimize the accumulation of position error, we developed a system that, based on target data of a specific region from the image sensor, calculates the position using the UAS attitude, the EO/IR (Electro-Optic/Infra-Red) azimuth and elevation, and numerical map data, and corrects the calculated position in real time (the basic line-of-sight geometry is sketched below). The function and performance of this image-based real-time UAS position compensation system have been verified by ground tests using a GPS simulator and by flight tests in DR mode.
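
The underlying geometry is a line-of-sight intersection: the UAS attitude and the gimbal azimuth/elevation define a ray, and its intersection with the terrain gives the target's ground coordinates. The sketch below uses a flat-terrain simplification in place of the paper's numerical map lookup, and the frame conventions are assumptions; in the paper's use, the same geometry is inverted against a target of known map coordinates to correct the UAS position.

```python
import numpy as np

def los_ground_point(uas_pos_ned, yaw, az, el, ground_alt=0.0):
    """uas_pos_ned: [north, east, down] (m); az: gimbal azimuth relative to
    the body, added to yaw (rad); el: depression angle below horizontal,
    positive down (rad). Returns the NED ground intersection of the LOS."""
    heading = yaw + az
    los = np.array([np.cos(heading) * np.cos(el),
                    np.sin(heading) * np.cos(el),
                    np.sin(el)])                   # unit LOS vector in NED
    height_agl = -uas_pos_ned[2] - ground_alt      # down is positive z
    s = height_agl / los[2]                        # scale to the ground plane
    return uas_pos_ned + s * los

# Example: UAS at 1000 m AGL, looking due north, 30 degrees down;
# the target lands ~1732 m north of the aircraft.
print(los_ground_point(np.array([0.0, 0.0, -1000.0]), 0.0, 0.0, np.radians(30)))
```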