• Title/Summary/Keyword: Sensor Fusion

Sensor Data Fusion for Navigation of Mobile Robot With Collision Avoidance and Trap Recovery

  • Jeon, Young-Su;Ahn, Byeong-Kyu;Kuc, Tae-Yong
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings
    • /
    • 2003.10a
    • /
    • pp.2461-2466
    • /
    • 2003
  • This paper presents a simple neural-network-based sensor fusion algorithm for navigation of mobile robots with obstacle avoidance and trap recovery. Data from the multiple sensors are fed to the input layer of the neural network, activating its input nodes. The sensors used include optical encoders, ultrasonic sensors, infrared sensors, a magnetic compass sensor, and GPS sensors. The proposed sensor fusion algorithm is combined with the VFH (Vector Field Histogram) algorithm for obstacle avoidance and the AGPM (Adaptive Goal Perturbation Method), which sets adaptive virtual goals to escape trap situations. Experimental results show that the proposed low-level fusion algorithm is effective for real-time navigation of a mobile robot.

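As a rough illustration of the kind of low-level fusion the abstract describes, the sketch below feeds a combined vector of encoder, ultrasonic, infrared, compass, and GPS readings through a small feed-forward network to produce a motion command. The layer sizes, random weights, and variable names are placeholders for illustration, not the paper's trained network.

```python
# Minimal sketch: a small feed-forward network mapping raw multi-sensor
# readings to a motion command, standing in for the low-level fusion layer.
# Weights are random placeholders; no training is reproduced here.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input vector: [2 encoder rates, 3 ultrasonic ranges,
#                             3 infrared ranges, 1 compass heading, 2 GPS offsets]
x = rng.normal(size=11)

# One hidden layer, two outputs (linear and angular velocity commands).
W1, b1 = rng.normal(scale=0.1, size=(16, 11)), np.zeros(16)
W2, b2 = rng.normal(scale=0.1, size=(2, 16)), np.zeros(2)

h = np.tanh(W1 @ x + b1)        # hidden-layer activations
v_cmd, w_cmd = W2 @ h + b2      # fused motion command
print(v_cmd, w_cmd)
```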

A Study on the Fail Safety Logic of Smart Air Conditioner using Model based Design (모델 기반 설계 기법을 이용한 지능형 공조 장치의 이중 안전성 로직 연구)

  • Kim, Ji-Ho;Kim, Byeong-Woo
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.28 no.12
    • /
    • pp.1372-1378
    • /
    • 2011
  • The smart air-conditioning system is superior to the conventional system in control accuracy and environmental preservation, and it is a foundation for intelligent vehicles such as electric and fuel-cell vehicles. In this paper, failure analyses of the smart air-conditioning system are performed, and a sensor fusion technique is proposed for its fail safety. A sensor fusion logic using CO, CO₂, VOC, and NOₓ sensors is developed and simulated by fault-injection simulation. The fusion logic is implemented experimentally, and a performance analysis of the fusion algorithms is conducted. The proposed algorithm incorporates the error characteristic of each sensor as a conditional probability value and achieves greater accuracy by performing track fusion with the most reliable sensors.
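
A minimal sketch of the reliability-weighted idea described above, assuming each sensor's error characteristic is summarized as a variance and that the readings have been normalized to a common scale; the numbers and the inverse-variance weighting are illustrative, not the paper's fusion logic.

```python
# Minimal sketch of reliability-weighted fusion: each sensor's error
# characteristic enters as a weight (here inverse variance), so the most
# reliable sensors dominate the fused estimate. Values are illustrative.
readings = {"CO": 9.2, "CO2": 9.8, "VOC": 11.5, "NOx": 10.1}   # normalized to a common index
variances = {"CO": 0.5, "CO2": 0.2, "VOC": 2.0, "NOx": 1.0}    # assumed error characteristics

weights = {k: 1.0 / variances[k] for k in readings}            # reliability weights
fused = sum(weights[k] * readings[k] for k in readings) / sum(weights.values())
print(f"fused estimate: {fused:.2f}")
```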

Attitude Estimation for the Biped Robot with Vision and Gyro Sensor Fusion (비전 센서와 자이로 센서의 융합을 통한 보행 로봇의 자세 추정)

  • Park, Jin-Seong;Park, Young-Jin;Park, Youn-Sik;Hong, Deok-Hwa
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.17 no.6
    • /
    • pp.546-551
    • /
    • 2011
  • A tilt sensor is required to control the attitude of a biped robot walking on uneven terrain. A vision sensor, normally used for recognizing humans or detecting obstacles, can also serve as a tilt sensor by comparing the current image with a reference image. However, the vision sensor alone has many technological limitations for biped robot control, such as a low sampling frequency and estimation time delay. To verify these limitations, an experimental inverted-pendulum setup, which represents the pitch motion of a walking or running robot, is used, and it is shown that the vision sensor alone cannot control the inverted pendulum, mainly because of the time delay. In this paper, to overcome the limitations of the vision sensor, a Kalman filter for multi-rate sensor fusion is applied together with a low-quality gyro sensor. This compensates for the limitations of the vision sensor and also eliminates the drift of the gyro sensor. Through an inverted-pendulum control experiment, it is found that the tilt estimation performance of the fused sensors is improved enough to control the attitude of the inverted pendulum.
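
A minimal sketch of multi-rate Kalman fusion of the kind described above: the gyro drives the high-rate prediction of a [tilt, gyro-bias] state, and the slower vision-derived tilt measurement corrects it when it arrives. The rates, noise covariances, and the simplified handling of delay are assumptions for illustration.

```python
# Minimal multi-rate fusion sketch: gyro-driven prediction at 100 Hz,
# vision tilt correction at 10 Hz. State = [tilt angle, gyro bias].
import numpy as np

dt = 0.01                                   # gyro sample period (100 Hz)
x, P = np.zeros(2), np.eye(2)
F = np.array([[1.0, -dt], [0.0, 1.0]])
B = np.array([dt, 0.0])
Q = np.diag([1e-5, 1e-7])
H = np.array([[1.0, 0.0]])
R = np.array([[1e-2]])                      # vision tilt measurement noise

def predict(x, P, gyro_rate):
    x = F @ x + B * gyro_rate               # integrate bias-corrected gyro rate
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, vision_angle):
    y = vision_angle - H @ x                # innovation from the vision measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(1000):
    x, P = predict(x, P, gyro_rate=0.1)     # every gyro sample
    if k % 10 == 0:                         # vision frame arrives at 10 Hz
        x, P = update(x, P, vision_angle=0.1 * k * dt)
print(x)
```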

Sensor Fusion and Neural Network Analysis for Drill-Wear Monitoring (센서퓨젼 기반의 인공신경망을 이용한 드릴 마모 모니터링)

  • Prasopchaichana, Kritsada;Kwon, Oh-Yang
    • Transactions of the Korean Society of Machine Tool Engineers
    • /
    • v.17 no.1
    • /
    • pp.77-85
    • /
    • 2008
  • The objective of this study is to construct a sensor fusion system for tool-condition monitoring (TCM) that leads to more efficient and economical drill usage. Drill-wear monitoring is important in automatic machining processes, as it helps prevent damage to tools and workpieces and optimizes drill usage. In this study, we present the architecture of a multi-layer feed-forward neural network, trained with the Levenberg-Marquardt algorithm and based on sensor fusion, for monitoring the drill-wear condition. The input features to the neural network were extracted from AE, vibration, and current signals using wavelet packet transform (WPT) analysis. Training and testing were performed over a moderate range of cutting conditions in the dry drilling of steel plates. The results show good drill-wear monitoring performance by the proposed method of sensor fusion and neural network analysis.
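
A minimal sketch of the feature-extraction stage, assuming the PyWavelets (pywt) package: wavelet packet band energies computed from one raw sensor channel (AE, vibration, or current) would form the input vector of the feed-forward network. The network itself and the Levenberg-Marquardt training are not reproduced here.

```python
# Minimal sketch: wavelet packet energies of one sensor channel as
# candidate network inputs. Signal is a random placeholder.
import numpy as np
import pywt

signal = np.random.randn(1024)                         # placeholder for one sensor channel

wp = pywt.WaveletPacket(data=signal, wavelet="db4", maxlevel=3)
nodes = wp.get_level(3, order="freq")                  # 8 frequency-ordered sub-bands
energies = np.array([np.sum(np.square(n.data)) for n in nodes])
features = energies / energies.sum()                   # normalized band energies
print(features)
```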

Design and Performance Analysis of Energy-Aware Distributed Detection Systems with Multiple Passive Sonar Sensors (다중 수동 소나 센서 기반 에너지 인식 분산탐지 체계의 설계 및 성능 분석)

  • Kim, Song-Geun;Hong, Sun-Mog
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.13 no.1
    • /
    • pp.9-21
    • /
    • 2010
  • In this paper, the optimum design of distributed detection is considered for a parallel sensor network consisting of a fusion center and multiple passive sonar nodes. Nonrandom fusion rules are employed as the fusion rules of the sensor network. For these rules, it is shown that a threshold rule at each sensor node has uniformly most powerful properties. The optimum threshold for each sensor is investigated so as to maximize the probability of detection under a constraint on the energy consumed by false alarms. It is also investigated through numerical experiments how signal strength, false alarm probability, and the distances between the three sensor nodes affect the system detection performance.
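
A minimal sketch of the parallel detection structure: each sonar node applies its own threshold test and reports a binary decision, and the fusion center applies a fixed (nonrandom) k-out-of-n rule. The thresholds and signal statistics below are illustrative, not the optimized values from the paper.

```python
# Minimal sketch: local threshold tests at three nodes, k-out-of-n fusion.
import numpy as np

rng = np.random.default_rng(1)
thresholds = np.array([1.5, 1.8, 1.2])        # per-node detection thresholds (illustrative)
k = 2                                         # fusion rule: declare target if >= k nodes fire

def fuse(measurements):
    local_decisions = measurements > thresholds
    return int(local_decisions.sum() >= k)

z = rng.normal(loc=2.0, scale=1.0, size=3)    # node statistics under a target-present hypothesis
print(fuse(z))
```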

Centralized Kalman Filter with Adaptive Measurement Fusion: its Application to a GPS/SDINS Integration System with an Additional Sensor

  • Lee, Tae-Gyoo
    • International Journal of Control, Automation, and Systems
    • /
    • v.1 no.4
    • /
    • pp.444-452
    • /
    • 2003
  • An integration system with multiple measurement sets can be realized via combined application of centralized and federated Kalman filters. It is more difficult for a centralized Kalman filter to remove a failed sensor than for a federated Kalman filter. All varieties of Kalman filter monitor the innovation sequence (residual) for detection and isolation of a failed sensor. The innovation sequence, which serves as an indicator of the real-time estimation error, plays an important role in adaptive mechanism design. In this study, a centralized Kalman filter with adaptive measurement fusion based on the innovation sequence is introduced. The objectives of adaptive measurement fusion are automatic isolation of and recovery from sensor failures as well as an inherent monitoring capability. The proposed adaptive filter is applied to a GPS/SDINS integration system with an additional sensor. Simulation studies attest that the proposed adaptive scheme is effective for isolation of and recovery from immediate sensor failures.
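
A minimal sketch of innovation-based monitoring inside a centralized Kalman update, in the spirit of the adaptive measurement fusion described above: a chi-square gate on the normalized innovation isolates a suspect measurement before it can corrupt the estimate. The GPS/SDINS models, gate value, and recovery logic are simplified assumptions.

```python
# Minimal sketch: Kalman update that skips a measurement whose normalized
# innovation fails a chi-square gate (treated as a possible sensor fault).
import numpy as np

def gated_update(x, P, z, H, R, gate=9.0):
    y = z - H @ x                              # innovation (residual)
    S = H @ P @ H.T + R
    d2 = float(y @ np.linalg.inv(S) @ y)       # normalized innovation squared
    if d2 > gate:                              # likely sensor fault: isolate this measurement
        return x, P, False
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P, True

# toy 2-state example with one scalar measurement
x, P = np.zeros(2), np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.1]])
x, P, used = gated_update(x, P, np.array([0.2]), H, R)
print(used, x)
```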

Improvement of Control Performance by Data Fusion of Sensors

  • Na, Seung-You;Shin, Dae-Jung
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.4 no.1
    • /
    • pp.63-69
    • /
    • 2004
  • In this paper, we propose a general framework for sensor data fusion applied to control systems. Since many kinds of disturbances enter a control system, it is necessary to rely on multisensor data fusion to maintain control performance in spite of the disturbances. Multisensor data fusion for a control system is regarded as a sequence of decisions on how to combine sensor data into a proper control input under uncertain disturbance effects on the sensors. The proposed method is applied to a typical control system for a flexible link, in which oscillation is reduced using a photo sensor at the tip of the link. The control performance, however, depends heavily on the ambient light conditions. To overcome the light-disturbance difficulties, an accelerometer is used in addition to the existing photo sensor. The improvement of control performance across various output responses demonstrates the feasibility of the proposed multisensor data fusion method.
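
A minimal sketch of the disturbance-aware combination suggested by the abstract: the photo-sensor estimate of tip deflection is trusted less as the lighting quality degrades, and the accelerometer-based estimate takes over. The weighting function and signal names are assumptions for illustration, not the paper's decision sequence.

```python
# Minimal sketch: blend photo-sensor and accelerometer-based deflection
# estimates with a weight driven by an assumed lighting-quality indicator.
def fused_deflection(photo_est, accel_est, light_quality):
    """light_quality in [0, 1]: 1 = nominal lighting, 0 = photo sensor unusable."""
    w_photo = max(0.0, min(1.0, light_quality))
    return w_photo * photo_est + (1.0 - w_photo) * accel_est

print(fused_deflection(photo_est=2.1, accel_est=2.4, light_quality=0.3))
```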

Command Fusion for Navigation of Mobile Robots in Dynamic Environments with Objects

  • Jin, Taeseok
    • Journal of information and communication convergence engineering
    • /
    • v.11 no.1
    • /
    • pp.24-29
    • /
    • 2013
  • In this paper, we propose a fuzzy inference model for a navigation algorithm for a mobile robot that intelligently searches for the goal location in unknown dynamic environments. Our model uses sensor fusion based on situational commands from an ultrasonic sensor. Instead of the "physical sensor fusion" method, which generates the robot trajectory from an environment model and sensory data, a "command fusion" method is used to govern the robot motions. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance within a hierarchical behavior-based control architecture. To identify the environment, a command fusion technique is introduced in which the sensory data of the ultrasonic sensors and a vision sensor are fused into the identification process. The experimental results highlight interesting aspects of the goal-seeking, obstacle-avoiding, and decision-making processes that arise from the navigation interaction.
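
A minimal sketch of command-level fusion: the goal-approach and obstacle-avoidance behaviors each propose a steering command, and a simple fuzzy-style membership derived from the nearest ultrasonic range blends them. The membership shape and parameters are illustrative, not the tuned rule base from the paper.

```python
# Minimal sketch: blend two behavior commands with a fuzzy "near obstacle"
# membership computed from the closest ultrasonic range reading.
def command_fusion(goal_heading, avoid_heading, nearest_range, near=0.5, far=2.0):
    # degree to which the obstacle is "near": 1 at <= near, 0 at >= far (meters, assumed)
    mu_near = min(1.0, max(0.0, (far - nearest_range) / (far - near)))
    return mu_near * avoid_heading + (1.0 - mu_near) * goal_heading

print(command_fusion(goal_heading=0.4, avoid_heading=-0.8, nearest_range=0.9))
```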

Study of Sensor Fusion for Attitude Control of a Quad-rotor (쿼드로터 자세제어를 위한 센서융합 연구)

  • Yu, Dong-Hyeon;Lim, Dae Young;Sel, Nam O;Park, Jong Ho;Chong, Kil to
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.21 no.5
    • /
    • pp.453-458
    • /
    • 2015
  • This paper presents the design of a quad-rotor control algorithm using sensor fusion. The controller combines a PD controller with a Kalman filter and a compensation algorithm to increase the stability and reliability of the quad-rotor attitude. We propose an attitude estimation algorithm for the quad-rotor based on sensor fusion using the Kalman filter. To this end, we first study the platform configuration and operating principle of the quad-rotor. Second, the bias errors of the gyro, acceleration, and geomagnetic sensors are compensated. The measured values of the sensors are then fused via a Kalman filter. Finally, the performance of the proposed algorithm is evaluated on experimental attitude estimation data. The proposed sensor fusion algorithm showed superior attitude estimation performance and demonstrated that robust attitude estimation is possible even under disturbances.
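
Since the estimation side resembles the vision/gyro Kalman sketch given earlier, the sketch below illustrates only the control side: a PD law acting on a fused roll estimate and its rate. The gains and variable names are assumptions, not the values used in the paper.

```python
# Minimal sketch: PD attitude control on the fused (Kalman-estimated) roll
# angle and rate, producing a roll-torque command. Gains are illustrative.
def pd_attitude_control(roll_est, roll_rate_est, roll_ref=0.0, kp=4.0, kd=0.8):
    error = roll_ref - roll_est
    return kp * error - kd * roll_rate_est     # torque (mixer input) for the roll axis

print(pd_attitude_control(roll_est=0.05, roll_rate_est=0.2))
```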

Implementation of a sensor fusion system for autonomous guided robot navigation in outdoor environments (실외 자율 로봇 주행을 위한 센서 퓨전 시스템 구현)

  • Lee, Seung-H.;Lee, Heon-C.;Lee, Beom-H.
    • Journal of Sensor Science and Technology
    • /
    • v.19 no.3
    • /
    • pp.246-257
    • /
    • 2010
  • Autonomous guided robot navigation, which consists of following unknown paths and avoiding unknown obstacles, is a fundamental technique for unmanned robots in outdoor environments. Following an unknown path requires techniques such as path recognition, path planning, and robot pose estimation. In this paper, we propose a novel sensor fusion system for autonomous guided robot navigation in outdoor environments. The proposed system consists of three monocular cameras and an array of nine infrared range sensors. The two cameras mounted on the robot's right and left sides are used to recognize unknown paths and estimate the relative robot pose on these paths through a Bayesian sensor fusion method, and the camera mounted at the front of the robot is used to recognize abrupt curves and unknown obstacles. The infrared range sensor array improves the robustness of obstacle avoidance, and the forward camera and the infrared range sensor array are fused through a rule-based method for obstacle avoidance. Experiments in outdoor environments show that a mobile robot with the proposed sensor fusion system successfully performed real-time autonomous guided navigation.
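
A minimal sketch of the two fusion stages described above: a Bayesian (product-of-Gaussians) combination of the left and right cameras' lateral-offset estimates, and a rule-based OR of the forward camera and the infrared range array for obstacle flagging. All numerical values are illustrative.

```python
# Minimal sketch: (1) Bayesian fusion of two Gaussian offset estimates,
# (2) rule-based obstacle flag from the forward camera and IR ranges.
def bayes_fuse(mu_left, var_left, mu_right, var_right):
    var = 1.0 / (1.0 / var_left + 1.0 / var_right)        # fused variance
    mu = var * (mu_left / var_left + mu_right / var_right)  # variance-weighted mean
    return mu, var

def obstacle_flag(camera_detects, ir_ranges, ir_threshold=0.4):
    return camera_detects or any(r < ir_threshold for r in ir_ranges)

offset, variance = bayes_fuse(0.12, 0.04, 0.08, 0.02)
print(offset, variance, obstacle_flag(False, [0.9, 0.35, 1.2]))
```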