Title/Summary/Keyword: Vision-based Guidance

Vision-based Guidance for Loitering over a Target

  • Park, Sanghyuk
    • International Journal of Aeronautical and Space Sciences, v.17 no.3, pp.366-377, 2016
  • This paper presents a vision-based guidance method that allows a fixed-wing aircraft to orbit around a target at a given radius. The guidance method uses a simple formula that regulates a relative side-bearing angle estimated by a vision system. The global asymptotic stability of the associated guidance law is demonstrated, and a linear analysis is performed to facilitate the proper selection of the relevant control parameters. A flight experiment is presented to demonstrate the feasibility and performance of the proposed guidance method.
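
As a rough illustration of the kind of side-bearing regulation the abstract describes, the sketch below commands a turn rate that holds the side-bearing angle near a range-dependent reference; the specific reference shape and the gain k are assumptions, not the paper's exact law.

```python
import numpy as np

def loiter_turn_rate(eta, rng, v=20.0, r_d=100.0, k=0.8):
    """Turn-rate command for orbiting a target at radius r_d (sketch).

    eta : side-bearing angle to the target from the vision system [rad]
          (0 = target dead ahead, pi/2 = target abeam)
    rng : estimated range to the target [m]
    v   : airspeed [m/s]
    """
    # On the desired circle the target sits abeam (eta = pi/2); off the
    # circle, bias the reference so the path converges back to radius r_d.
    eta_ref = np.pi / 2 + np.arctan(k * (rng - r_d) / r_d)
    # Nominal circling rate plus a proportional correction on the error.
    return v / r_d + k * (eta_ref - eta)
```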

Guidance Law for Vision-Based Automatic Landing of UAV

  • Min, Byoung-Mun; Tahk, Min-Jea; Shim, Hyun-Chul David; Bang, Hyo-Choong
    • International Journal of Aeronautical and Space Sciences, v.8 no.1, pp.46-53, 2007
  • In this paper, a guidance law for vision-based automatic landing of unmanned aerial vehicles (UAVs) is proposed. Automatic landing is a challenging but crucial capability for UAVs to achieve fully autonomous flight. In an autonomous landing maneuver, deciding where to land and generating the guidance commands to achieve a successful landing are significant problems. This paper focuses on the design of a guidance law applicable to the automatic landing problem of both fixed-wing and rotary-wing UAVs. The proposed guidance law generates an acceleration command as a control input, derived from a specified time-to-go ($t_{go}$) polynomial function. The coefficients of the $t_{go}$ polynomial are determined to satisfy terminal constraints. Nonlinear simulation results using fixed-wing and rotary-wing UAV models are presented.
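
A minimal single-axis version of a time-to-go polynomial guidance law, assuming an acceleration profile a = c0 + c1*t_go with terminal position and velocity constraints; the paper's polynomial order and constraint set may differ.

```python
import numpy as np

def tgo_poly_accel(y, v, y_f, v_f, t_go):
    """Acceleration command a = c0 + c1*t_go meeting y_f, v_f at t_go = 0.

    Integrating y'' = c0 + c1*t_go backward from the terminal point gives
        v(t_go) = v_f + c0*t_go + c1*t_go**2 / 2
        y(t_go) = y_f - v_f*t_go - c0*t_go**2 / 2 - c1*t_go**3 / 6
    so (c0, c1) follow from the current state by solving a 2x2 system.
    """
    A = np.array([[t_go,           t_go**2 / 2],
                  [-t_go**2 / 2,  -t_go**3 / 6]])
    b = np.array([v - v_f, y - y_f + v_f * t_go])
    c0, c1 = np.linalg.solve(A, b)
    return c0 + c1 * t_go   # command evaluated at the current time-to-go
```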

Guidance Line Extraction Algorithm using Central Region Data of Crop for Vision Camera based Autonomous Robot in Paddy Field (비전 카메라 기반의 무논환경 자율주행 로봇을 위한 중심영역 추출 정보를 이용한 주행기준선 추출 알고리즘)

  • Choi, Keun Ha; Han, Sang Kwon; Park, Kwang-Ho; Kim, Kyung-Soo; Kim, Soohyun
    • The Journal of Korea Robotics Society, v.11 no.1, pp.1-8, 2016
  • In this paper, we propose a new guidance line extraction algorithm for a vision camera-based autonomous agricultural robot in a paddy field. Finding the central point or area of a rice row is an important step in guidance line extraction. To improve the accuracy of the guidance line, we use the central region data of the crop, exploiting the fact that rice leaves converge toward the central area of the rice row. The guidance line is extracted from the intersection points of extended virtual lines using a modified robust regression. The extended virtual lines are obtained by extending each segmented straight line detected on the edges of the rice plants in the image using the Hough transform. The accuracy of the proposed algorithm was verified by experiments in a real wet paddy.
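
A compressed sketch of the pipeline the abstract outlines, using OpenCV's probabilistic Hough transform; the median-absolute-deviation gate stands in for the paper's modified robust regression, and all thresholds are assumptions.

```python
import cv2
import numpy as np

def guidance_line(edge_img):
    """Fit a guidance line through intersections of extended leaf lines."""
    segs = cv2.HoughLinesP(edge_img, 1, np.pi / 180, threshold=40,
                           minLineLength=20, maxLineGap=10)
    if segs is None:
        return None
    # Each extended virtual line in homogeneous form: l = p1 x p2.
    lines = [np.cross([x1, y1, 1.0], [x2, y2, 1.0])
             for x1, y1, x2, y2 in segs[:, 0]]
    # Pairwise intersections of the extended lines (cross product again).
    pts = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            p = np.cross(lines[i], lines[j])
            if abs(p[2]) > 1e-9:              # skip near-parallel pairs
                pts.append(p[:2] / p[2])
    pts = np.array(pts)
    # Crude robust fit: drop outliers by MAD, then least-squares x = a*y + b.
    med = np.median(pts, axis=0)
    mad = np.median(np.abs(pts - med), axis=0) + 1e-9
    keep = np.all(np.abs(pts - med) < 3 * mad, axis=1)
    if keep.sum() < 2:
        return None
    a, b = np.polyfit(pts[keep, 1], pts[keep, 0], 1)
    return a, b   # guidance line: x = a*y + b in image coordinates
```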

Implementation of Virtual Instrumentation based Realtime Vision Guided Autopilot System and Onboard Flight Test using Rotary UAV (가상계측기반 실시간 영상유도 자동비행 시스템 구현 및 무인 로터기를 이용한 비행시험)

  • Lee, Byoung-Jin; Yun, Suk-Chang; Lee, Young-Jae; Sung, Sang-Kyung
    • Journal of Institute of Control, Robotics and Systems, v.18 no.9, pp.878-886, 2012
  • This paper investigates the implementation and flight test of a realtime vision guided autopilot system based on a virtual instrumentation platform. A graphical design process on the virtual instrumentation platform is used throughout for image processing, communication between systems, vehicle dynamics control, and vision-coupled guidance algorithms. A significant objective of the algorithm is to achieve an environment-robust autopilot despite wind and irregular image acquisition conditions. For robust vision guided path tracking and hovering performance, the flight path guidance logic is combined on a multi-conditional basis with a position estimation algorithm coupled with the vehicle attitude dynamics. An onboard flight test equipped with the developed realtime vision guided autopilot system was performed using a rotary UAV system with full attitude control capability. The outdoor flight test demonstrated that the designed vision guided autopilot system succeeded in hovering the UAV over a ground target to within several meters under generally windy conditions.
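
The paper's guidance logic is built in a graphical virtual-instrumentation environment, but the hover behavior it describes can be sketched in a few lines: a proportional velocity command from the target's pixel offset, with a hold-last-command branch for image dropouts. The pinhole conversion, gains, and dropout handling here are assumptions, not the implemented multi-conditional logic.

```python
import numpy as np

def hover_velocity_cmd(px, py, alt, f_px, last_cmd, target_seen,
                       k=0.6, v_max=1.5):
    """Horizontal velocity command to center a ground target (sketch).

    px, py : target offset from the image center [pixels]
    alt    : altitude above the target [m]
    f_px   : camera focal length [pixels] (assumed pinhole model)
    """
    if not target_seen:
        return last_cmd                    # coast through image dropouts
    ground_off = np.array([px, py]) * alt / f_px   # pixels -> meters
    cmd = k * ground_off                   # proportional position loop
    n = np.linalg.norm(cmd)
    return cmd if n <= v_max else cmd * (v_max / n)   # saturate
```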

Monocular Vision-Based Guidance and Control for a Formation Flight

  • Cheon, Bong-kyu; Kim, Jeong-ho; Min, Chan-oh; Han, Dong-in; Cho, Kyeum-rae; Lee, Dae-woo; Seong, Kie-jeong
    • International Journal of Aeronautical and Space Sciences, v.16 no.4, pp.581-589, 2015
  • This paper describes a monocular vision-based formation flight technology using two fixed-wing unmanned aerial vehicles. To measure the relative position and attitude of a leader aircraft, a monocular camera installed in the front of the follower aircraft captures an image of the leader, and position and attitude are estimated from the image using the KLT feature point tracker and the POSIT algorithm. To verify the feasibility of this vision processing algorithm, a field test was performed using two light sport aircraft, and the experimental results show that the proposed monocular vision-based measurement algorithm is feasible. Performance verification of the proposed formation flight technology was carried out using the X-Plane flight simulator. The formation flight simulation system consists of two PCs playing the roles of leader and follower. When the leader flies under user commands, the follower tracks it using the designed guidance and a PI control law, with all information about the leader measured by monocular vision. The simulation shows that guidance using relative attitude information tracks the leader better than guidance without it, with absolute average errors for the relative position of 2.88 m (X-axis), 2.09 m (Y-axis), and 0.44 m (Z-axis).
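
The measurement chain (KLT tracking followed by model-based pose estimation) can be sketched with OpenCV as below; modern OpenCV has no POSIT binding, so cv2.solvePnP stands in for the paper's POSIT step, and the leader feature model model_pts is an assumed input.

```python
import cv2
import numpy as np

def track_and_pose(prev_gray, gray, prev_pts, model_pts, K):
    """Track leader features (KLT) and estimate relative pose (sketch).

    prev_pts  : Nx1x2 float32 image points from the previous frame
    model_pts : Nx3 float32 coordinates of the same points on the leader
    K         : 3x3 camera intrinsic matrix
    """
    pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                              prev_pts, None)
    ok = status.ravel() == 1
    if ok.sum() < 6:
        return None                         # too few tracked points
    _, rvec, tvec = cv2.solvePnP(model_pts[ok], pts[ok], K, None)
    R, _ = cv2.Rodrigues(rvec)              # relative attitude matrix
    return R, tvec                          # attitude + relative position
```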

Development of an IGVM Integrated Navigation System for Vehicular Lane-Level Guidance Services

  • Cho, Seong Yun
    • Journal of Positioning, Navigation, and Timing, v.5 no.3, pp.119-129, 2016
  • This paper presents an integrated navigation system for accurate navigation solution-based safety and convenience services in a vehicular augmented reality (AR) head-up display (HUD) system. For lane-level guidance services in particular, an accurate navigation system is essential. To achieve this, an inertial navigation system (INS)/global positioning system (GPS)/vision/digital map (IGVM) integrated navigation system has been developed. In this paper, the concept of the integrated navigation system is introduced, and it is implemented based on a multi-model switching filter and a vehicle status determined using GPS data and inertial measurement unit (IMU) measurements. The performance of the implemented navigation system is verified experimentally.
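
One way to read the multi-model switching idea is a mode selector that picks which aiding sources the INS error filter fuses, driven by the GPS/IMU-derived vehicle status; the thresholds and mode set below are illustrative assumptions, not the paper's filter design.

```python
def select_filter_mode(gps_ok, speed_mps, yaw_rate_rps):
    """Pick an aiding model for the INS error filter (illustrative)."""
    if gps_ok and speed_mps > 1.0:
        return "INS/GPS/VISION/MAP"    # full IGVM update while driving
    if gps_ok:
        return "INS/GPS"               # near standstill: lane cues weak
    if speed_mps < 0.2 and abs(yaw_rate_rps) < 0.05:
        return "ZUPT"                  # stationary: zero-velocity update
    return "INS/VISION/MAP"            # GPS denied: vision + map aiding
```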

Integrated System for Autonomous Proximity Operations and Docking

  • Lee, Dae-Ro; Pernicka, Henry
    • International Journal of Aeronautical and Space Sciences, v.12 no.1, pp.43-56, 2011
  • An integrated guidance, navigation, and control (GNC) system for autonomous proximity operations and the docking of two spacecraft was developed. The position maneuvers were determined through the integration of the state-dependent Riccati equation, formulated from nonlinear relative motion dynamics, and relative navigation using rendezvous laser vision (Lidar) and a vision sensor system. In the vision sensor system, a switch between sensors was made along the approach phase in order to provide continuously effective navigation. As an extension of the rendezvous laser vision system, an automated terminal guidance scheme based on the Clohessy-Wiltshire state transition matrix was used to formulate a "V-bar hopping approach" reference trajectory. A proximity operations strategy was then adapted from the approach strategy used with the automated transfer vehicle. The attitude maneuvers, determined from a linear quadratic Gaussian-type control including quaternion-based attitude estimation using star trackers or a vision sensor system, provided precise attitude control and robustness under uncertainties in the moments of inertia and external disturbances. These functions were then integrated into an autonomous GNC system that can perform proximity operations and meet all conditions for successful docking. A six-degree-of-freedom simulation was used to demonstrate the effectiveness of the integrated system.
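
The V-bar hopping step can be illustrated with the closed-form Clohessy-Wiltshire state transition matrix: solve for the departure velocity that reaches the next hold point in time T and apply the difference as an impulse. This is the textbook CW targeting relation, not the paper's full guidance scheme; T must avoid transfer times where the matrix block becomes singular (e.g., full orbital periods).

```python
import numpy as np

def vbar_hop_dv(r0, v0_minus, r_f, T, n):
    """First-impulse delta-v for a CW two-point transfer (sketch).

    r0, v0_minus : current relative position/velocity in the Hill frame
                   (x radial, y along-track, z cross-track) [m, m/s]
    r_f          : desired relative position after time T [m]
    n            : target orbit mean motion [rad/s]
    """
    s, c = np.sin(n * T), np.cos(n * T)
    phi_rr = np.array([[4 - 3 * c,       0, 0],
                       [6 * (s - n * T), 1, 0],
                       [0,               0, c]])
    phi_rv = np.array([[s / n,            2 * (1 - c) / n,         0],
                       [-2 * (1 - c) / n, (4 * s - 3 * n * T) / n, 0],
                       [0,                0,                   s / n]])
    v0_plus = np.linalg.solve(phi_rv, r_f - phi_rr @ r0)
    return v0_plus - v0_minus          # impulsive delta-v to apply now
```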

Guidance Line Extraction for Autonomous Weeding robot based-on Rice Morphology Characteristic in Wet Paddy (논 잡초 방제용 자율주행 로봇을 위한 벼의 형태학적 특징 기반의 주행기준선 추출)

  • Choi, Keun Ha; Han, Sang Kwon; Han, Sang Hoon; Park, Kwang-Ho; Kim, Kyung-Soo; Kim, Soohyun
    • The Journal of Korea Robotics Society, v.9 no.3, pp.147-153, 2014
  • In this paper, we propose a new guidance line extraction algorithm for an autonomous weeding robot based on an infrared vision sensor in a wet paddy. Finding the central point or area of a rice row is a critical step in guidance line extraction. To improve the accuracy of the guidance line, we use the morphological characteristic of rice that the leaves converge toward the central area of the rice row. Using the Hough transform, the curved leaves are represented as a combination of segmented straight lines on a binary image that has been skeletonized and segmented. The slope of the guidance line is obtained by averaging the slopes of all segmented lines. The initial point of the guidance line is determined as the column with the maximum accumulated white-pixel count in the binary image after rotating it by the guidance line slope in the opposite direction. The accuracy of the proposed algorithm was verified by experiments in a real wet paddy.
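
The slope-then-initial-point procedure in the abstract can be sketched as follows; the Hough parameters and the de-rotation convention are assumptions, and segment angles are folded into [0, pi) before averaging so opposite endpoint orders do not cancel.

```python
import cv2
import numpy as np

def guidance_line_wet_paddy(binary):
    """Guidance line slope and initial column from a skeleton image."""
    segs = cv2.HoughLinesP(binary, 1, np.pi / 180, threshold=30,
                           minLineLength=15, maxLineGap=5)
    if segs is None:
        return None
    # Average slope of all segmented lines, folded into [0, pi).
    angles = [np.arctan2(y2 - y1, x2 - x1) % np.pi
              for x1, y1, x2, y2 in segs[:, 0]]
    slope = float(np.mean(angles))
    # Rotate by the slope in the opposite direction so the rice row is
    # vertical, then take the column with the largest white-pixel count.
    h, w = binary.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2),
                                np.degrees(slope) - 90, 1.0)
    upright = cv2.warpAffine(binary, M, (w, h))
    col = int(np.argmax(upright.sum(axis=0)))   # initial point column
    return slope, col
```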

Linear Velocity Control of the Mobile Robot with the Vision System at Corridor Navigation (비전 센서를 갖는 이동 로봇의 복도 주행 시 직진 속도 제어)

  • Kwon, Ji-Wook; Hong, Suk-Kyo; Chwa, Dong-Kyoung
    • Journal of Institute of Control, Robotics and Systems, v.13 no.9, pp.896-902, 2007
  • This paper proposes a vision-based kinematic control method for mobile robots with an onboard camera. In the previous literature on the control of mobile robots using camera vision information, the forward velocity is set to a constant, and only the rotational velocity of the robot is controlled. More efficient motion, however, requires controlling the forward velocity depending on the position in the corridor. Thus, both forward and rotational velocities are controlled in the proposed method, so that the mobile robot can move faster when the corner of the corridor is far away and slow down as it approaches the dead end of the corridor. In this way, smooth turning motion along the corridor becomes possible. To this end, visual information from the camera is used to obtain the perspective lines and the distance from the current robot position to the dead end. Then, the vanishing point and the pseudo desired position are obtained, and the forward and rotational velocities are controlled by the LOS (Line Of Sight) guidance law. Both numerical and experimental results are included to demonstrate the validity of the proposed method.
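
The speed/steering split the abstract describes reduces to two short expressions: an LOS-style heading cue from the vanishing-point offset, and a forward speed that tapers with the remaining corridor length. The gains and taper distance below are assumptions.

```python
import numpy as np

def corridor_cmd(vp_x, dist_to_end, img_w, f_px,
                 v_max=0.5, k_w=1.2, d_slow=2.0):
    """Forward and rotational velocity from corridor vision cues (sketch).

    vp_x        : vanishing point x-coordinate [pixels]
    dist_to_end : estimated distance to the corridor dead end [m]
    f_px        : camera focal length [pixels]
    """
    heading_err = np.arctan((vp_x - img_w / 2) / f_px)   # LOS angle
    w_cmd = -k_w * heading_err            # steer toward the vanishing point
    v_cmd = v_max * min(1.0, dist_to_end / d_slow)   # slow near the end
    return v_cmd, w_cmd
```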

Light Source Target Detection Algorithm for Vision-based UAV Recovery

  • Won, Dae-Yeon; Tahk, Min-Jea; Roh, Eun-Jung; Shin, Sung-Sik
    • International Journal of Aeronautical and Space Sciences, v.9 no.2, pp.114-120, 2008
  • In the vision-based recovery phase, terminal guidance for a blended-wing UAV requires highly accurate visual information. This paper presents a light source target design and detection algorithm for vision-based UAV recovery. We propose a recovery target design with red and green LEDs; this frame provides the relative position between the target and the UAV. The target detection algorithm includes HSV-based segmentation, morphology, and blob processing. These techniques are employed to give efficient detection results in day and night net recovery operations. The performance of the proposed target design and detection algorithm is evaluated through ground-based experiments.
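
The detection chain (HSV segmentation, morphology, blob processing) maps directly onto OpenCV primitives; the HSV thresholds and minimum blob area below are assumptions that would need tuning for the actual LEDs and lighting conditions.

```python
import cv2
import numpy as np

def detect_led_targets(bgr):
    """Centroids of red/green LED blobs in a BGR frame (sketch)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0, so combine two bands; green is one band.
    red = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) \
        | cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
    green = cv2.inRange(hsv, (45, 120, 120), (75, 255, 255))
    kernel = np.ones((3, 3), np.uint8)
    blobs = []
    for mask in (red, green):
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # despeckle
        n, _, stats, cents = cv2.connectedComponentsWithStats(mask)
        for i in range(1, n):                     # label 0 = background
            if stats[i, cv2.CC_STAT_AREA] >= 5:   # reject tiny noise
                blobs.append(tuple(cents[i]))     # LED centroid (x, y)
    return blobs
```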