• Title/Abstract/Keyword: Sensor-Fusion

Search results: 811

Design of Multi-Sensor Data Fusion Filter for a Flight Test System

  • 이용재;이자성
    • 대한전기학회논문지:시스템및제어부문D / Vol. 55, No. 9 / pp. 414-419 / 2006
  • This paper presents the design of a multi-sensor data fusion filter for a flight test system. The multi-sensor data consist of positional information on the target from radars and a telemetry system. The data fusion filter has the structure of a federated Kalman filter and is based on the Singer dynamic target model. It consists of a dedicated local filter for each sensor, generally operating in parallel, plus a master fusion filter. Fault detection and correction algorithms are included in the local filters for treating bad measurements and sensor faults. The data fusion is carried out in the fusion filter by using a maximum likelihood estimation algorithm. The performance of the designed fusion filter is verified using both simulation data and real data.
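
To make the fusion step concrete, below is a minimal sketch (not the paper's implementation) of the track-to-track combination a federated architecture typically performs: each dedicated local filter reports a state estimate and covariance, and the master filter combines them with inverse-covariance weighting, which coincides with the maximum likelihood estimate when the local errors are independent and Gaussian. The Singer model, the local Kalman filters, and the fault detection logic are omitted; all names and numbers are illustrative.

```python
import numpy as np

def fuse_tracks(estimates, covariances):
    """Information-weighted (ML) fusion of independent local estimates."""
    info_sum = np.zeros_like(covariances[0])
    info_state = np.zeros_like(estimates[0])
    for x, P in zip(estimates, covariances):
        P_inv = np.linalg.inv(P)
        info_sum += P_inv             # accumulate information matrices
        info_state += P_inv @ x       # accumulate information-weighted states
    P_fused = np.linalg.inv(info_sum)
    return P_fused @ info_state, P_fused

# Example: fuse a radar track and a telemetry track of (x, y) position
x_radar, P_radar = np.array([105.0, 48.0]), np.diag([25.0, 25.0])
x_telem, P_telem = np.array([101.0, 50.5]), np.diag([4.0, 9.0])
x_fused, P_fused = fuse_tracks([x_radar, x_telem], [P_radar, P_telem])
print(x_fused, np.diag(P_fused))
```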

A Development of Wireless Sensor Networks for Collaborative Sensor Fusion Based Speaker Gender Classification

  • 권호민
    • 융합신호처리학회논문지 / Vol. 12, No. 2 / pp. 113-118 / 2011
  • In this paper, speaker gender classification using collaborative sensor fusion in a wireless sensor network is proposed. Sensor nodes perform BER (Band Energy Ratio) based voice activity detection, discarding irrelevant input data and processing only highly relevant data into local hard decisions. The hard decisions produced at the individual sensor nodes are transmitted to a fusion center, where a global decision fusion is constructed, thereby reducing power consumption and saving network resources. As sensor fusion techniques for speaker gender classification, Bayesian sensor fusion and global weighted decision fusion schemes are proposed. In the Bayesian sensor fusion case, the hard decisions obtained at the individual sensor node level are processed through the operating points of the ROC (Receiver Operating Characteristic) curve as the number of deployed sensor nodes varies, and the optimal classification fusion is determined. BER and MCL (Mutual Confidence Level) are adopted as weights for the global decision, so that the individual local hard decisions are combined and fused efficiently. Experiments confirmed that classification performance improves as the number of sensor nodes increases, and that the improvement is larger in low SNR (Signal-to-Noise Ratio) environments.
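
As a rough illustration of how hard decisions from individual nodes can be fused at a fusion center, the sketch below applies a Chair-Varshney style log-likelihood-ratio rule, where each node's contribution is weighted by its ROC operating point (detection and false-alarm probabilities). This is a generic decision-fusion example under assumed probabilities, not the paper's BER/MCL weighting.

```python
import numpy as np

def bayesian_decision_fusion(decisions, p_d, p_f, prior=0.5):
    """Chair-Varshney style fusion of local hard decisions.

    decisions : 0/1 local decisions (1 = 'male' hypothesis, say)
    p_d       : per-node probability of deciding 1 when the hypothesis is true
    p_f       : per-node probability of deciding 1 when the hypothesis is false
    Returns the global decision (0 or 1).
    """
    decisions, p_d, p_f = map(np.asarray, (decisions, p_d, p_f))
    # Log-likelihood-ratio contribution of each node's hard decision
    llr = np.where(decisions == 1,
                   np.log(p_d / p_f),
                   np.log((1 - p_d) / (1 - p_f)))
    threshold = np.log((1 - prior) / prior)
    return int(llr.sum() > threshold)

# Three nodes with different ROC operating points vote 1, 1, 0
print(bayesian_decision_fusion([1, 1, 0],
                               p_d=[0.9, 0.8, 0.7],
                               p_f=[0.1, 0.2, 0.3]))
```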

Visual Control of Mobile Robots Using Multisensor Fusion System

  • Kim, Jung-Ha;Sugisaka, Masanori
    • 제어로봇시스템학회:학술대회논문집 / ICCAS 2001 / pp. 91.4-91 / 2001
  • In this paper, the development of a sensor fusion algorithm for visual control of a mobile robot is presented. The output data from the visual sensor include a time lag due to the image-processing computation, and the sampling rate of the visual sensor is considerably low, so it should be used together with other sensors to control fast motion. The main purpose of this paper is to develop a method that constitutes a sensor fusion system giving optimal state estimates. The proposed sensor fusion system combines the visual sensor and an inertial sensor using a modified Kalman filter, a kind of multi-rate Kalman filter which treats the slow sampling rate ...
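
As a loose sketch of the multi-rate idea, the loop below propagates a simple constant-velocity state with a high-rate inertial input and applies a Kalman correction only when a low-rate vision measurement arrives. The delay compensation discussed in the paper is not modeled, and the model, rates, and noise values are assumptions.

```python
import numpy as np

# Constant-velocity model; fast inertial rate dt, vision every `vision_every` steps
dt, n_steps, vision_every = 0.01, 200, 20
F = np.array([[1.0, dt], [0.0, 1.0]])        # state: [position, velocity]
B = np.array([0.5 * dt**2, dt])              # effect of measured acceleration
H = np.array([[1.0, 0.0]])                   # vision measures position only
Q = 1e-4 * np.eye(2)
R_vision = np.array([[0.05]])

rng = np.random.default_rng(0)
x_true = np.array([0.0, 1.0])                # true position and velocity
x, P = np.zeros(2), np.eye(2)
for k in range(n_steps):
    accel = rng.normal(0.0, 0.2)             # stand-in for an IMU sample
    x_true = F @ x_true + B * accel
    # Fast prediction step driven by the inertial sensor
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Slow correction step whenever a low-rate vision measurement arrives
    if k % vision_every == 0:
        z = np.array([x_true[0] + rng.normal(0.0, 0.2)])   # noisy vision fix
        S = H @ P @ H.T + R_vision
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
print(x_true, x)
```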

Robust Hierarchical Data Fusion Scheme for Large-Scale Sensor Network

  • Song, Il Young
    • 센서학회지 / Vol. 26, No. 1 / pp. 1-6 / 2017
  • An advanced driver assistance system (ADAS) requires the collection of a large amount of information, including road conditions, environment, vehicle status, condition of the driver, and other useful data. In this regard, large-scale sensor networks can be an appropriate solution, since they have been designed for this purpose. Recent advances in sensor network technology have enabled the management and monitoring of large-scale tasks such as monitoring the road surface temperature on a highway. In this paper, we consider the estimation and fusion problems of the large-scale sensor networks used in ADAS. A hierarchical fusion architecture is proposed for an arbitrary topology of the large-scale sensor network, and a robust cluster estimator is proposed to make the network robust against outliers or sensor failures. Lastly, a robust hierarchical data fusion scheme is proposed for the communication channel between the clusters and the fusion center, considering the non-Gaussian channel noise that is typical in communication systems.
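
One common way to obtain the robustness against outlying or failed sensors that the abstract refers to is an M-estimator inside each cluster; the sketch below down-weights readings with large residuals (Huber weights) before a cluster value is forwarded to the fusion center. It is an illustrative stand-in, not the paper's estimator, and the numbers are made up.

```python
import numpy as np

def robust_cluster_estimate(readings, k=1.345, iters=10):
    """Huber-weighted mean of one cluster's scalar sensor readings."""
    z = np.asarray(readings, dtype=float)
    est = np.median(z)                                   # robust starting point
    for _ in range(iters):
        r = z - est
        scale = 1.4826 * np.median(np.abs(r)) + 1e-9     # robust scale (MAD)
        w = np.clip(k * scale / np.maximum(np.abs(r), 1e-9), None, 1.0)
        est = np.sum(w * z) / np.sum(w)                  # down-weight large residuals
    return est

# One faulty sensor (45.0) among road-surface temperature readings
cluster = [12.1, 11.8, 12.4, 45.0, 12.0]
print(robust_cluster_estimate(cluster))   # stays near 12, not pulled toward 45

# The fusion center could then combine the per-cluster estimates, e.g. by a
# (possibly robust) weighted average over clusters.
```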

A Study on the Performance Improvement of Position Estimation using the Multi-Sensor Fusion in a Combat Vehicle

  • 남윤욱;김성호;김기태;김형남
    • 품질경영학회지 / Vol. 49, No. 1 / pp. 1-15 / 2021
  • Purpose: The purpose of this study was to propose a sensor fusion algorithm that integrates a vehicle motion sensor (VMS) into the hybrid navigation system. Methods: Navigation performance was evaluated by comparison tests between the hybrid navigation system alone and the proposed sensor fusion method. Results: The effects of the sensor fusion method and of the α value estimation were found to be significant, and applying them greatly improves the navigation performance. Conclusion: The proposed sensor fusion method improves the navigation performance of a combat vehicle and thereby the reliability of its navigation system.
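
The abstract does not spell out the fusion rule or how the α value is obtained; purely as an illustration of weighted blending with a factor α, the snippet below mixes a navigation velocity with a VMS-derived velocity, with α chosen by inverse-variance weighting. All names and values are hypothetical, not taken from the paper.

```python
# Generic alpha-blend of a navigation velocity with a VMS-derived velocity.
# alpha close to 1 trusts the hybrid navigation solution; close to 0 trusts the VMS.
def blend_velocity(v_nav, v_vms, alpha):
    return alpha * v_nav + (1.0 - alpha) * v_vms

# alpha could itself be estimated, e.g. from the relative error variances
def estimate_alpha(var_nav, var_vms):
    return var_vms / (var_nav + var_vms)

alpha = estimate_alpha(var_nav=0.04, var_vms=0.01)   # smaller VMS variance -> lower alpha
print(alpha, blend_velocity(v_nav=10.3, v_vms=10.1, alpha=alpha))
```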

Development of Data Logging Platform of Multiple Commercial Radars for Sensor Fusion With AVM Cameras

  • 진영석;전형철;신영남;현유진
    • 대한임베디드공학회논문지 / Vol. 13, No. 4 / pp. 169-178 / 2018
  • Currently, various sensors are used in advanced driver assistance systems. To overcome the limitations of individual sensors, sensor fusion has recently attracted attention in the field of intelligent vehicles, and vision-radar sensor fusion in particular has become a popular concept. The typical approach involves a vision sensor that recognizes targets within ROIs (Regions Of Interest) generated by radar sensors. Because AVM (Around View Monitor) cameras, with their wide-angle lenses, have limited detection performance at near distances and around the edges of the field of view, exact ROI extraction from the radar sensor is essential for high-performance fusion of AVM cameras and radar sensors. To address this problem, we propose a sensor fusion scheme based on commercial radar modules from the vendor Delphi. First, we configured a multiple-radar data logging system together with AVM cameras. We also designed radar post-processing algorithms to extract the exact ROIs. Finally, using the developed hardware and software platforms, we verified the post-processing algorithm in indoor and outdoor environments.
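
The ROI-generation step described above can be pictured as projecting each radar detection into the camera image and taking a box around the projected pixel. The sketch below does this with placeholder calibration values and a flat-ground assumption; it is not the paper's Delphi-specific post-processing.

```python
import numpy as np

# Placeholder intrinsic / extrinsic calibration of one AVM camera
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])          # camera intrinsics (pixels)
R, t = np.eye(3), np.array([0.0, 1.2, 0.0])    # camera pose in the vehicle frame

def radar_target_to_roi(range_m, azimuth_rad, box_px=60):
    """Project a radar detection (range, azimuth) to an image-space ROI."""
    # Radar detection in vehicle coordinates (flat-ground assumption)
    p_vehicle = np.array([range_m * np.sin(azimuth_rad),   # lateral
                          0.0,                             # height
                          range_m * np.cos(azimuth_rad)])  # forward
    p_cam = R @ p_vehicle + t                  # into the camera frame
    uvw = K @ p_cam                            # perspective projection
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    half = box_px // 2
    return (int(u) - half, int(v) - half, int(u) + half, int(v) + half)

print(radar_target_to_roi(8.0, np.deg2rad(5.0)))
```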

A Study on the Fail Safety of Electronics Power Steering Using Sensor Fusion

  • 김병우;허진;조현덕;이영석
    • 전기학회논문지 / Vol. 57, No. 8 / pp. 1371-1376 / 2008
  • A Steer-by-Wire (SBW) system has so many advantages over a conventional mechanical steering system that it is expected to play a key role in future environmentally friendly vehicles and intelligent transportation systems: the mechanical connection between the hand wheel and the front axle becomes obsolete. The SBW system provides many benefits in terms of functionality but, at the same time, presents significant challenges such as fault tolerance and fail safety. In this paper, a failure analysis of the SBW system is performed, and a sensor fusion technique is then proposed for the fail safety of the SBW system. A sensor fusion logic for the steering angle, using the steering angle sensor, torque sensor, and rack position sensor, is developed and evaluated by fault-injection simulation.
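
As a simplified illustration of the kind of cross-check such a fusion logic enables, the snippet below derives a second steering-angle estimate from the rack position (via a hypothetical rack-to-angle ratio) and flags a fault when it disagrees with the steering angle sensor beyond a threshold; in practice the torque sensor mentioned in the abstract would be used to help isolate which channel failed. The ratio and threshold are placeholders, not values from the paper.

```python
def check_steering_angle(angle_sensor_deg, rack_pos_mm,
                         rack_ratio_deg_per_mm=2.0, thresh_deg=5.0):
    """Compare the steering-angle sensor with a rack-position-derived angle.

    Returns (fused_angle_deg, fault_detected). The rack ratio and threshold
    are illustrative placeholders.
    """
    rack_angle_deg = rack_pos_mm * rack_ratio_deg_per_mm   # redundant estimate
    residual = abs(angle_sensor_deg - rack_angle_deg)
    fault_detected = residual > thresh_deg
    if fault_detected:
        # A third signal (e.g. the torque sensor trend) would be needed to
        # isolate the failed channel; here we fall back to the rack estimate.
        return rack_angle_deg, True
    return 0.5 * (angle_sensor_deg + rack_angle_deg), False

print(check_steering_angle(angle_sensor_deg=30.0, rack_pos_mm=14.8))
```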

Sliding Window Filtering for Ground Moving Targets with Cross-Correlated Sensor Noises

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • 센서학회지 / Vol. 28, No. 3 / pp. 146-151 / 2019
  • This paper reports a sliding window filtering approach for ground moving targets with cross-correlated sensor noises and uncertainty. In addition, the effect of uncertain parameters on the tracking performance of the model is considered. A distributed fusion sliding window filter is also proposed. The distributed fusion filtering algorithm represents the optimal linear combination of the local filters under the minimum mean-square error criterion, and the derivation of the error cross-covariances between the local sliding window filters is the key to the proposed method. Simulation results for the motion of a ground moving target demonstrate the high accuracy and computational efficiency of the distributed fusion sliding window filter.
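
The "optimal linear combination of local filters" with known error cross-covariance is commonly written as the Bar-Shalom-Campo fusion rule; the sketch below fuses two correlated local estimates that way. The sliding window local filters themselves are not reproduced, and the example matrices are arbitrary.

```python
import numpy as np

def fuse_correlated(x1, P11, x2, P22, P12):
    """Bar-Shalom-Campo fusion of two correlated estimates.

    x_f = x1 + (P11 - P12)(P11 + P22 - P12 - P12^T)^{-1} (x2 - x1)
    """
    D = np.linalg.inv(P11 + P22 - P12 - P12.T)
    W = (P11 - P12) @ D
    x_f = x1 + W @ (x2 - x1)
    P_f = P11 - W @ (P11 - P12).T
    return x_f, P_f

# Two local estimates of [position, velocity] with a small error cross-covariance
x1, P11 = np.array([10.0, 1.0]), np.array([[0.5, 0.1], [0.1, 0.4]])
x2, P22 = np.array([10.4, 0.8]), np.array([[0.3, 0.0], [0.0, 0.6]])
P12 = np.array([[0.05, 0.0], [0.0, 0.05]])
print(fuse_correlated(x1, P11, x2, P22, P12))
```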

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 10, No. 4 / pp. 281-286 / 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm based on sensor fusion for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of a number of sensing systems such as sonar or vision. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in unstructured as well as structured environments, and the experimental results demonstrate the performance of the landmark recognition.
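
The core of the space-and-time idea (transforming data sensed at a previous pose into the current data set) can be sketched as a simple 2D frame change using the odometry accumulated since the earlier measurement, followed by averaging with the current observation. The numbers below are invented, and the paper's full STSF scheme is not reproduced.

```python
import numpy as np

def to_current_frame(point_prev, dx, dy, dtheta):
    """Transform a point sensed in the robot's previous frame into the current
    frame, given the odometry (dx, dy, dtheta) travelled since then."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, s], [-s, c]])              # rotate into the new heading
    return R @ (np.asarray(point_prev) - np.array([dx, dy]))

# Landmark seen one step ago at (2.0, 0.5) in the old frame; the robot has
# since moved 0.3 m forward and turned 5 degrees.
prev_in_current = to_current_frame([2.0, 0.5], dx=0.3, dy=0.0,
                                   dtheta=np.deg2rad(5.0))
current_obs = np.array([1.68, 0.33])             # hypothetical current detection
fused = 0.5 * (prev_in_current + current_obs)    # simple space-time average
print(prev_in_current, fused)
```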

Distributed Fusion Estimation for Sensor Network

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • 센서학회지 / Vol. 28, No. 5 / pp. 277-283 / 2019
  • In this paper, we propose a distributed fusion estimation scheme for sensor networks using a receding horizon strategy. Communication channels were modelled as Markov jump systems, and a posterior probability distribution over the channel characteristics was calculated and incorporated into the filter, allowing the distributed fusion estimation to handle observation losses caused by path loss automatically. To implement the distributed fusion estimation, a Kalman-Consensus filter was then used to obtain the average consensus, based on the estimates of sensors randomly distributed across the sensor network. The advantages of the proposed algorithms were verified using a large-scale sensor network example.
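
A bare-bones Kalman-Consensus iteration looks like the loop below: every node runs a local Kalman update with its own measurement and then moves its estimate toward its neighbours' estimates with a consensus gain. The Markov-jump channel model and the receding-horizon part of the paper are not included; the scalar model, ring topology, and gains are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, steps = 5, 50
F, H, Q, R = 1.0, 1.0, 0.01, 0.25            # scalar target model
consensus_gain = 0.3

# Ring network: every node talks to its two neighbours
neighbours = {i: [(i - 1) % n_nodes, (i + 1) % n_nodes] for i in range(n_nodes)}

x_true = 0.0
x_hat = np.zeros(n_nodes)                    # each node's estimate
P = np.ones(n_nodes)                         # each node's error variance
for _ in range(steps):
    x_true = F * x_true + rng.normal(0.0, np.sqrt(Q))
    z = H * x_true + rng.normal(0.0, np.sqrt(R), size=n_nodes)
    # Local Kalman predict + update at every node
    x_pred = F * x_hat
    P_pred = F * P * F + Q
    K = P_pred * H / (H * P_pred * H + R)
    x_kal = x_pred + K * (z - H * x_pred)
    P = (1.0 - K * H) * P_pred
    # Consensus step: move toward the neighbours' estimates
    x_new = x_kal.copy()
    for i in range(n_nodes):
        x_new[i] += consensus_gain * sum(x_kal[j] - x_kal[i] for j in neighbours[i])
    x_hat = x_new
print(x_true, x_hat)
```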