• Title/Summary/Keyword: visual tracking


The development of a visual tracking algorithm for the stable grasping of a moving object (움직이는 물체의 안정한 파지를 위한 시각추적 알고리즘 개발)

  • Cha, In-Hyuk;Sun, Yeong-Gab;Han, Chang-Soo
    • Journal of Institute of Control, Robotics and Systems / v.4 no.2 / pp.187-193 / 1998
  • This paper proposes an advanced visual tracking algorithm for the stable grasping of a moving 2D target. The algorithm finds the grasping points of an unknown polygonal object while executing visual tracking. A Kalman filter (KF) based on the singular value decomposition (SVD) is applied to track the moving object; the SVD-based KF improves tracking accuracy and the robustness of the estimates of the state variables and noise statistics, and it avoids the numerical instability that can occur in a visual tracking system based on the standard Kalman filter. In the grasping system, a parameterized family is constructed, and the stable grasping points of the unknown object are found from the geometric properties of that family. Whereas previous studies focused only on how to track a moving target, this paper addresses both how to track and how to grasp, applying grasping theory to a visual tracking system. (A minimal Kalman-filter tracking sketch follows this entry.)

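The abstract above rests on Kalman-filter prediction of the target position between frames. The paper's SVD-based formulation is not reproduced here; the following is a minimal numpy sketch of a plain constant-velocity KF tracking a 2D image centroid, with all matrices and noise levels chosen for illustration only.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for a 2D image centroid.
# State x = [u, v, du, dv]; measurement z = [u, v].
# All noise covariances here are illustrative, not values from the paper.

dt = 1.0                                   # one frame between updates
F = np.array([[1, 0, dt, 0],               # state transition (constant velocity)
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # we only observe the pixel position
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                       # process noise (assumed)
R = 4.0 * np.eye(2)                        # measurement noise (assumed)

def kf_step(x, P, z):
    """One predict/update cycle; returns the new state estimate and covariance."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Tiny demo on a synthetic linearly moving target with noisy centroid measurements.
x = np.array([0.0, 0.0, 0.0, 0.0])
P = np.eye(4)
rng = np.random.default_rng(0)
for k in range(1, 30):
    true_pos = np.array([2.0 * k, 1.5 * k])       # synthetic ground truth
    z = true_pos + rng.normal(0, 2.0, size=2)     # noisy centroid measurement
    x, P = kf_step(x, P, z)
print("final estimate:", x[:2], "true:", np.array([2.0 * 29, 1.5 * 29]))
```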

Robust Position Tracking for Position-Based Visual Servoing and Its Application to Dual-Arm Task (위치기반 비주얼 서보잉을 위한 견실한 위치 추적 및 양팔 로봇의 조작작업에의 응용)

  • Kim, Chan-O;Choi, Sung;Cheong, Joo-No;Yang, Gwang-Woong;Kim, Hong-Seo
    • The Journal of Korea Robotics Society / v.2 no.2 / pp.129-136 / 2007
  • This paper introduces a position-based robust visual servoing method developed for the operation of a human-like robot with two arms. The proposed method uses the SIFT algorithm for object detection and the CAMSHIFT algorithm for object tracking. While CAMSHIFT has conventionally been used for object tracking in a 2D image plane, we extend it to object tracking in 3D space by combining the CAMSHIFT results from the two image planes of a stereo camera, which yields a robust and dependable result. Once the robot's task is defined based on the extracted 3D information, the robot is commanded to carry out the task. We conduct several position-based visual servoing tasks and compare performance under different conditions. The results show that the proposed visual tracking algorithm is simple but very effective for position-based visual servoing. (A stereo CAMSHIFT sketch follows this entry.)

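As a rough illustration of the stereo extension described above, the sketch below runs OpenCV's CamShift independently on the left and right images of a rectified stereo pair and recovers depth from the horizontal disparity of the two window centres. The back-projection target model, the camera parameters, and the calling code are assumptions made for the example, not details from the paper.

```python
import numpy as np
import cv2

def camshift_center(frame_bgr, window, roi_hist):
    """Track one frame with CamShift; return the window centre and updated window."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    rot_rect, window = cv2.CamShift(back_proj, window, criteria)
    cx, cy = rot_rect[0]                       # centre of the fitted rotated rect
    return np.array([cx, cy]), window

def stereo_point(center_left, center_right, focal_px, baseline_m, cx_px, cy_px):
    """Triangulate a 3D point from matched window centres in a rectified stereo pair."""
    disparity = center_left[0] - center_right[0]
    if abs(disparity) < 1e-6:
        return None                            # no usable disparity
    Z = focal_px * baseline_m / disparity      # depth along the optical axis
    X = (center_left[0] - cx_px) * Z / focal_px
    Y = (center_left[1] - cy_px) * Z / focal_px
    return np.array([X, Y, Z])

# In use, roi_hist would come from a detected target region (e.g. found via SIFT):
#   hsv_roi = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2HSV)
#   roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
#   cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
# and camshift_center would then be called on each new left and right frame.
```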

An Advanced Visual Tracking and Stable Grasping Algorithm for a Moving Object (시각센서를 이용한 움직이는 물체의 추적 및 안정된 파지를 위한 알고리즘의 개발)

  • Cha, In-Hyuk;Sun, Yeong-Gab;Han, Chang-Soo
    • Journal of the Korean Society for Precision Engineering / v.15 no.6 / pp.175-182 / 1998
  • An advanced visual tracking and stable grasping algorithm for a moving object is proposed. The stable grasping points of a moving 2D polygonal object are obtained through a visual tracking system that combines a Kalman filter with an image prediction technique, improving accuracy and efficiency over other prediction algorithms for object tracking. During visual tracking, the shape predictors construct the parameterized family, and the grasp planner finds the grasping points of the unknown object from the geometric properties of that family. The algorithm thus performs stable grasping and real-time tracking in a single process. (A simplified grasp-point heuristic is sketched after this entry.)

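The paper's parameterized-family construction is specific to its grasp planner and is not reconstructed here. Purely as a stand-in illustration of choosing grasp points on a polygon, the sketch below uses a much simpler antipodal-edge heuristic: it looks for a pair of edges whose outward normals are nearly opposite and proposes their midpoints as a two-finger grasp. Every threshold and helper in it is an assumption for the example, not the paper's method.

```python
import numpy as np

def edge_normals(vertices):
    """Outward unit normals and midpoints of a counter-clockwise polygon's edges."""
    v = np.asarray(vertices, dtype=float)
    edges = np.roll(v, -1, axis=0) - v
    normals = np.stack([edges[:, 1], -edges[:, 0]], axis=1)     # rotate -90 degrees
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    midpoints = v + 0.5 * edges
    return normals, midpoints

def antipodal_grasp(vertices, angle_tol_deg=15.0):
    """Return midpoints of the best nearly-antipodal edge pair, or None."""
    normals, mids = edge_normals(vertices)
    best, best_score = None, -1.0
    n = len(normals)
    for i in range(n):
        for j in range(i + 1, n):
            # Edges are antipodal when their outward normals point in
            # nearly opposite directions (dot product close to -1).
            score = -normals[i] @ normals[j]
            if score > np.cos(np.radians(angle_tol_deg)) and score > best_score:
                best, best_score = (mids[i], mids[j]), score
    return best

# Example: an axis-aligned rectangle (counter-clockwise vertex order).
rect = [(0, 0), (4, 0), (4, 2), (0, 2)]
print(antipodal_grasp(rect))   # two opposite edge midpoints, e.g. (2,0) and (2,2)
```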

Trends on Visual Object Tracking Using Siamese Network (Siamese 네트워크 기반 영상 객체 추적 기술 동향)

  • Oh, J.;Lee, J.
    • Electronics and Telecommunications Trends / v.37 no.1 / pp.73-83 / 2022
  • Visual object tracking can be utilized in various applications and has attracted considerable attention in the field of computer vision. Visual object tracking technology is classified in various ways based on the number of tracked objects and the methodologies employed by the tracking algorithms. This report briefly introduces the visual object tracking challenge that contributes to the development of single object tracking technology. Furthermore, we review ten Siamese network-based algorithms that have attracted attention, owing to their high tracking speed (despite the use of neural networks). In addition, we discuss the prospects of the Siamese network-based object tracking algorithms. (A minimal Siamese cross-correlation sketch follows this entry.)
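
Siamese trackers of the kind surveyed above share one core operation: the same convolutional backbone embeds both the template (exemplar) patch and the search region, and the two feature maps are cross-correlated to produce a response map whose peak locates the target. The sketch below shows that operation with a toy backbone in PyTorch; the architecture and input sizes are placeholders, not any specific tracker from the survey.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyBackbone(nn.Module):
    """Toy fully-convolutional embedding shared by the template and search branches."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=1), nn.ReLU(),
        )

    def forward(self, x):
        return self.features(x)

def siamese_response(backbone, template, search):
    """Cross-correlate template features over search features (single image pair)."""
    z = backbone(template)          # (1, C, hz, wz) exemplar embedding
    x = backbone(search)            # (1, C, hx, wx) search-region embedding
    # Use the exemplar embedding as a correlation kernel over the search map.
    return F.conv2d(x, z)           # (1, 1, hx-hz+1, wx-wz+1) response map

backbone = TinyBackbone()
template = torch.randn(1, 3, 63, 63)     # exemplar crop around the target
search = torch.randn(1, 3, 127, 127)     # larger search region in the next frame
response = siamese_response(backbone, template, search)
peak = torch.nonzero(response[0, 0] == response[0, 0].max())[0]
print("response map", tuple(response.shape), "peak at", tuple(peak.tolist()))
```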

Real-time Visual Tracking System and Control Method for Laparoscope Manipulator (복강경 수술용 도구의 실시간 영상 추적 및 복강경 조종기의 지능형 제어 방법)

  • 김민석;허진석;이정주
    • Journal of the Korean Society for Precision Engineering / v.21 no.11 / pp.83-90 / 2004
  • In this paper we present a new real-time visual servoing unit for laparoscopic surgery. The unit automatically controls a laparoscope manipulator through visual tracking of the laparoscopic surgical tool. For the tracking, we present a two-stage adaptive CONDENSATION (conditional density propagation) algorithm that extracts the accurate position of the surgical tool tip from a surgical image sequence in real time and adapts to abrupt changes in laparoscope illumination. For the control, we present a virtual damper system that moves the laparoscope manipulator safely and stably: the laparoscope moves under the constraint of virtual dampers linked to the four sides of the image. The visual servoing unit operates the manipulator in real time, keeping the surgical tool at the center of the image. The experimental results show that the proposed visual tracking algorithm is highly robust and that the controlled manipulator safely provides a stable view. (A generic particle-filter sketch follows this entry.)
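
CONDENSATION is a particle (sequential Monte Carlo) filter: a set of weighted samples of the tool-tip position is propagated by a motion model, re-weighted by how well each sample matches the image, and resampled. The sketch below is a generic single-point particle filter with a synthetic Gaussian likelihood standing in for the paper's two-stage adaptive image measurement; the motion noise, likelihood, and particle count are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def condensation_step(particles, weights, measurement, motion_std=5.0, meas_std=8.0):
    """One resample / propagate / re-weight cycle for 2D position particles."""
    n = len(particles)
    # 1. Resample in proportion to the previous weights.
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx]
    # 2. Propagate with a random-walk motion model (assumed).
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # 3. Re-weight with a Gaussian likelihood around the image measurement.
    #    (The paper instead scores each sample against the tool appearance.)
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = np.exp(-0.5 * d2 / meas_std**2)
    weights /= weights.sum()
    return particles, weights

# Tiny demo: track a synthetic tool tip drifting across a 640x480 image.
n_particles = 300
particles = rng.uniform([0, 0], [640, 480], size=(n_particles, 2))
weights = np.full(n_particles, 1.0 / n_particles)
for t in range(50):
    true_tip = np.array([100 + 8 * t, 240 + 2 * t])
    z = true_tip + rng.normal(0, 5, size=2)          # noisy tool-tip detection
    particles, weights = condensation_step(particles, weights, z)
estimate = weights @ particles                       # weighted mean position
print("estimated tip:", estimate, "true tip:", true_tip)
```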

Target Tracking of the Wheeled Mobile Robot using the Combined Visual Servo Control Method (혼합 비주얼 서보 제어 기법을 이용한 이동로봇의 목표물 추종)

  • Lee, Ho-Won;Kwon, Ji-Wook;Hong, Suk-Kyo;Chwa, Dong-Kyoung
    • The Transactions of The Korean Institute of Electrical Engineers / v.60 no.6 / pp.1245-1254 / 2011
  • This paper proposes a target tracking algorithm for wheeled mobile robots used in various fields. For stable tracking, we equip the mobile robot with a vision system that extracts targets through image processing algorithms. The paper also presents an algorithm that positions the mobile robot at the desired location relative to the target by estimating the target's relative position and attitude. We show the problem with tracking based on Position-Based Visual Servo (PBVS) control alone and propose a tracking method that achieves stable tracking performance by combining PBVS control with Image-Based Visual Servo (IBVS) control. When the target lies near the edge of the camera image, it can leave the field of view; the proposed algorithm therefore blends the two control inputs using a switching function of hyperbolic form to solve this problem. Through both simulations and experiments with the mobile robot, we confirm that the proposed visual servo control method enhances stability compared with using only the PBVS or the IBVS control method. (A sketch of the hyperbolic blending follows this entry.)
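
The key idea above is to blend a PBVS command and an IBVS command with a smooth switching weight that favours IBVS as the tracked feature drifts toward the image border, where the target risks leaving the field of view. The sketch below implements one plausible reading of that idea with a tanh-shaped weight; the weight function, margins, and velocity commands are illustrative assumptions, not the paper's control law.

```python
import numpy as np

def border_distance(feature_px, width, height):
    """Distance (in pixels) from the feature to the nearest image border."""
    u, v = feature_px
    return min(u, width - u, v, height - v)

def switching_weight(feature_px, width, height, margin=60.0, sharpness=0.1):
    """Smooth weight in [0, 1]: ~1 near the image centre, ~0 near the border.

    Hyperbolic-tangent shape; 'margin' and 'sharpness' are assumed tuning values.
    """
    d = border_distance(feature_px, width, height)
    return 0.5 * (1.0 + np.tanh(sharpness * (d - margin)))

def blended_command(v_pbvs, v_ibvs, feature_px, width=640, height=480):
    """Combine the two visual-servo velocity commands with the smooth weight."""
    w = switching_weight(feature_px, width, height)
    return w * np.asarray(v_pbvs) + (1.0 - w) * np.asarray(v_ibvs)

# Example: near the image centre the PBVS command dominates; near the border, IBVS.
v_pbvs = np.array([0.20, 0.05])      # (linear, angular) command from PBVS
v_ibvs = np.array([0.05, 0.30])      # (linear, angular) command from IBVS
print(blended_command(v_pbvs, v_ibvs, feature_px=(320, 240)))  # centre: ~PBVS
print(blended_command(v_pbvs, v_ibvs, feature_px=(620, 240)))  # near border: ~IBVS
```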

The development of a visual tracking system for the stable grasping of a moving object (움직이는 물체의 안정한 Grasping을 위한 시각추적 시스템 개발)

  • Cha, In-Hyuk;Sun, Yeong-Gab;Han, Chang-Soo
    • Institute of Control, Robotics and Systems: Conference Proceedings / 1996.10b / pp.543-546 / 1996
  • We propose a new visual tracking system for grasping that can find the grasping points of an unknown polygonal object. The system combines an image prediction technique with an Extended Kalman Filter (EKF). The SVD-based EKF improves the accuracy and processing time of the nonlinear state estimation and resolves the numerical instability that can occur in a visual tracking system based on the standard Kalman filter. The image prediction algorithm reduces the effect of noise and the image processing time. During visual tracking, the parameterized family is constructed and the grasping points of the unknown object are found from its geometric properties. (An EKF sketch with a nonlinear camera measurement follows this entry.)

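To complement the plain Kalman-filter sketch given earlier, the following is a minimal Extended Kalman Filter in which the measurement is a nonlinear pinhole projection of a 3D constant-velocity target; the innovation covariance is inverted through an SVD-backed pseudo-inverse (np.linalg.pinv) purely to echo the numerical-robustness theme. This is not the paper's SVD-based EKF formulation, and the camera model and noise values are assumptions.

```python
import numpy as np

f = 500.0                                  # assumed focal length in pixels
dt = 1.0 / 30.0                            # assumed frame period
Q = 1e-3 * np.eye(6)                       # process noise (assumed)
R = 2.0 * np.eye(2)                        # pixel measurement noise (assumed)

# State: [X, Y, Z, dX, dY, dZ] of the target in the camera frame.
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)                 # constant-velocity transition

def h(x):
    """Nonlinear measurement: pinhole projection of the 3D position to pixels."""
    X, Y, Z = x[:3]
    return np.array([f * X / Z, f * Y / Z])

def H_jac(x):
    """Jacobian of h with respect to the state, evaluated at x."""
    X, Y, Z = x[:3]
    H = np.zeros((2, 6))
    H[0, 0] = f / Z;  H[0, 2] = -f * X / Z**2
    H[1, 1] = f / Z;  H[1, 2] = -f * Y / Z**2
    return H

def ekf_step(x, P, z):
    """One EKF predict/update cycle for one pixel measurement z."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.pinv(S)   # pinv uses an SVD internally
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new

# Tiny demo on a synthetic target drifting in front of the camera.
rng = np.random.default_rng(2)
x_true = np.array([0.10, 0.05, 2.0, 0.20, 0.00, -0.02])
x_est, P = np.array([0.0, 0.0, 1.5, 0.0, 0.0, 0.0]), np.eye(6)
for k in range(90):
    x_true = F @ x_true
    z = h(x_true) + rng.normal(0, 1.0, size=2)
    x_est, P = ekf_step(x_est, P, z)
# Depth is only weakly observable from a single camera, so compare reprojections.
print("estimated pixel:", h(x_est), "true pixel:", h(x_true))
```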

Robust Visual Tracking for 3-D Moving Object using Kalman Filter (칼만필터를 이용한 3-D 이동물체의 강건한 시각추적)

  • 조지승;정병묵
    • Proceedings of the Korean Society of Precision Engineering Conference / 2003.06a / pp.1055-1058 / 2003
  • The robustness and reliability of vision algorithms are key issues in robotics research and industrial applications. This paper considers robust real-time visual tracking in complex scenes. A common approach to increasing the robustness of a tracking system is to use a model known a priori (a CAD model, etc.); fusion of multiple features likewise facilitates robust detection and tracking of objects in scenes of realistic complexity. Here, voting-based fusion of cues is adopted; in voting, a very simple model, or none at all, is used for the fusion. The approach is tested on a 3D Cartesian robot that tracks a toy vehicle moving along a 3D rail, with a Kalman filter estimating the motion parameters, namely the system state vector of the moving object with unknown dynamics. Experimental results show that cue fusion combined with motion estimation yields robust tracking performance. (A cue-voting sketch follows this entry.)

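One simple reading of voting-based cue fusion is that each cue (colour, edges, motion, etc.) proposes a target position with a confidence, and the fused measurement is the confidence-weighted consensus, which can then feed a Kalman filter such as the one sketched earlier. The sketch below implements that reading; the cue list, confidences, and outlier rule are assumptions, not the paper's fusion scheme.

```python
import numpy as np

def fuse_cues(cue_positions, cue_confidences, outlier_px=40.0):
    """Confidence-weighted vote over per-cue position estimates (2D pixels).

    Cues that disagree with the preliminary consensus by more than
    `outlier_px` are dropped before the final weighted average.
    """
    pos = np.asarray(cue_positions, dtype=float)           # (n_cues, 2)
    conf = np.asarray(cue_confidences, dtype=float)        # (n_cues,)
    conf = conf / conf.sum()
    consensus = conf @ pos                                  # preliminary vote
    keep = np.linalg.norm(pos - consensus, axis=1) <= outlier_px
    if not np.any(keep):                                    # all cues disagree
        return consensus
    w = conf[keep] / conf[keep].sum()
    return w @ pos[keep]                                    # refined vote

# Example: colour and edge cues agree, the motion cue is distracted.
cues = {
    "colour": ((212.0, 150.0), 0.9),
    "edges":  ((208.0, 154.0), 0.7),
    "motion": ((350.0, 80.0),  0.4),    # distracted by another moving object
}
positions = [p for p, _ in cues.values()]
confidences = [c for _, c in cues.values()]
print("fused measurement:", fuse_cues(positions, confidences))
```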

Control of Robot Manipulators Using LQG Visual Tracking Controller (LQG 시각추종제어기를 이용한 로봇매니퓰레이터의 제어)

  • Lim, Tai-Hun;Jun, Hyang-Sig;Choi, Young-Kiu;Kim, Sung-Shin
    • Proceedings of the KIEE Conference / 1999.07g / pp.2995-2997 / 1999
  • Real-time visual tracking control of a robot manipulator is commonly performed using vision feedback sensor information. In this paper, the optical flow is computed in an eye-in-hand robot configuration, and the image Jacobian is employed to calculate the rotational and translational velocity of a 3D moving object. An LQG visual controller generates the real-time visual trajectory, and to improve the visual tracking performance a VSC controller is employed to control the robot manipulator. Simulation results show better visual tracking performance than other methods. (An image-Jacobian sketch follows this entry.)

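The image Jacobian (interaction matrix) mentioned above relates the camera's spatial velocity to the velocity of a point feature in the image. For a point with normalized image coordinates (x, y) at depth Z, the standard interaction matrix is well known; the sketch below builds it and uses it in a plain proportional IBVS-style velocity law as a stand-in, since the paper's LQG and VSC designs are not reproduced here. The gains, feature values, and depths are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard 2x6 image Jacobian for a point feature.

    (x, y) are normalized image coordinates (pixels divided by the focal length,
    offset by the principal point); Z is the feature depth. Columns correspond
    to the camera velocity components (vx, vy, vz, wx, wy, wz).
    """
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,     -(1 + x * x),  y],
        [0.0,      -1.0 / Z, y / Z, 1 + y * y, -x * y,       -x],
    ])

def ibvs_velocity(features, depths, desired, gain=0.5):
    """Stacked-Jacobian proportional law: v = -gain * pinv(L) @ error."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Example: drive two tracked features toward their desired image locations.
features = [(0.10, 0.05), (-0.08, 0.12)]      # current normalized coordinates
desired = [(0.00, 0.00), (-0.10, 0.10)]       # desired normalized coordinates
depths = [1.2, 1.4]                            # assumed depths in metres
print("camera velocity command:", ibvs_velocity(features, depths, desired))
```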

On Addressing Network Synchronization in Object Tracking with Multi-modal Sensors

  • Jung, Sang-Kil;Lee, Jin-Seok;Hong, Sang-Jin
    • KSII Transactions on Internet and Information Systems (TIIS) / v.3 no.4 / pp.344-365 / 2009
  • The performance of a tracking system is greatly increased if multiple types of sensors are combined to achieve the tracking objective instead of relying on a single type of sensor. To conduct such multi-modal tracking, we previously developed a multi-modal sensor-based tracking model in which acoustic sensors mainly track the objects and visual sensors compensate for the tracking errors [1]. In this paper, we identify a network synchronization problem that appears in the developed tracking system. The problem is caused by the different locations and traffic characteristics of the multi-modal sensors and the non-synchronized arrival of the captured sensor data at a processing server. To deliver the sensor data effectively, we propose a time-based packet aggregation algorithm in which the acoustic sensor data are aggregated based on the sampling time and sent to the server. The delivered acoustic sensor data are then compensated by visual images to correct the tracking errors, and this compensation improves the tracking accuracy in the ideal case. In real situations, however, the tracking improvement from visual compensation can be severely degraded by the aforementioned network synchronization problem, whose impact is analyzed by simulations in this paper. To resolve the problem, we differentiate the service level of sensor traffic using Weighted Round Robin (WRR) scheduling at the routers, with the weighting factor allocated to each queue calculated by a proposed Delay-based Weight Allocation (DWA) algorithm. The simulations show that the traffic differentiation model can mitigate the desynchronization of the sensor data. Finally, we analyze the expected traffic behavior of the tracking system in terms of the acoustic sampling interval and the visual image size. (A short aggregation and WRR scheduling sketch follows this entry.)
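
Two mechanisms in the abstract above lend themselves to a short sketch: aggregating acoustic samples into packets by sampling-time window, and serving the sensor queues with weighted round-robin so that the delay-sensitive traffic is not starved. The code below illustrates both in a generic way; the window length, queue weights, and packet format are assumptions, and the paper's Delay-based Weight Allocation (DWA) rule is only mimicked by a placeholder weight assignment.

```python
from collections import deque

def aggregate_by_time(samples, window_s=0.1):
    """Group (timestamp, value) acoustic samples into packets per time window."""
    packets = {}
    for t, value in samples:
        packets.setdefault(int(t / window_s), []).append((t, value))
    # One packet per window, ordered by window index.
    return [packets[k] for k in sorted(packets)]

def wrr_schedule(queues, weights, rounds=3):
    """Weighted round-robin: per round, queue i may send up to weights[i] packets."""
    sent = []
    for _ in range(rounds):
        for name, q in queues.items():
            for _ in range(weights[name]):
                if q:
                    sent.append((name, q.popleft()))
    return sent

# Example: acoustic traffic gets a higher weight (placeholder for the DWA rule,
# which would derive the weights from the measured per-path delays).
queues = {
    "acoustic": deque(f"a{i}" for i in range(8)),
    "visual":   deque(f"v{i}" for i in range(4)),
}
weights = {"acoustic": 2, "visual": 1}
samples = [(0.01, 0.3), (0.04, 0.5), (0.12, 0.1), (0.19, 0.7), (0.23, 0.2)]
print("aggregated packets:", aggregate_by_time(samples))
print("transmission order:", wrr_schedule(queues, weights))
```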