• Title/Summary/Keyword: parallel inference

Search results: 71

Parallel Fuzzy Inference Method for Large Volumes of Satellite Images

  • Lee, Sang-Gu
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.1 no.1
    • /
    • pp.119-124
    • /
    • 2001
  • In pattern recognition on large volumes of remote sensing satellite images, the inference time increases greatly. For the remote sensing data [5] having 4 wavebands, 778 training patterns are learned, and each land cover pattern is classified using 159,900 patterns including the trained patterns. For the fuzzy classification, 778 fuzzy rules are generated, each with 4 fuzzy variables in the condition part. Therefore, a high-performance parallel fuzzy inference system is needed. In this paper, we propose a novel parallel fuzzy inference system on the T3E parallel computer, in which fuzzy rules are distributed and executed simultaneously. The ONE_TO_ALL algorithm is used to broadcast the fuzzy input to all nodes, and the results of the MIN/MAX operations are transferred to the output processor by the ALL_TO_ONE algorithm. By processing the fuzzy rules in parallel, the parallel fuzzy inference algorithm extracts match parallelism and achieves a good speedup factor. This system can be used in a large expert system that has many inference variables in the condition and consequent parts.
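
The abstract above outlines the communication pattern but not the code; a minimal sketch of that ONE_TO_ALL broadcast / local MIN-MAX / ALL_TO_ONE gather flow is given below, written with mpi4py in place of the paper's Cray T3E message-passing routines. The triangular membership functions, random rule parameters, and node layout are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the ONE_TO_ALL / ALL_TO_ONE parallel fuzzy inference pattern,
# using mpi4py in place of the paper's Cray T3E message-passing routines.
# Rule parameters and membership functions are illustrative assumptions.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_RULES, N_VARS = 778, 4          # figures quoted in the abstract

def tri(x, a, b, c):
    """Triangular membership value of x for the fuzzy set (a, b, c)."""
    return max(0.0, min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)))

# Each node holds its own block of rules (antecedent triangles, random here).
rng = np.random.default_rng(rank)
my_rules = rng.uniform(0.0, 1.0, size=(N_RULES // size, N_VARS, 3))
my_rules.sort(axis=2)             # ensure a <= b <= c for each fuzzy set

# ONE_TO_ALL: the root broadcasts the crisp input vector to every node.
x = np.array([0.2, 0.5, 0.7, 0.9]) if rank == 0 else None
x = comm.bcast(x, root=0)

# Local MIN: firing strength of each local rule is the minimum membership
# over its 4 antecedent variables; a local MAX aggregates over the block.
local_max = 0.0
for rule in my_rules:
    strength = min(tri(x[v], *rule[v]) for v in range(N_VARS))
    local_max = max(local_max, strength)

# ALL_TO_ONE: reduce the per-node maxima back to the output processor.
global_max = comm.reduce(local_max, op=MPI.MAX, root=0)
if rank == 0:
    print("aggregated MAX firing strength:", global_max)
```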

Fuzzy Inference of Large Volumes in Parallel Computing Environment (병렬컴퓨팅 환경에서의 대용량 퍼지 추론)

  • 김진일;박찬량;이동철;이상구
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2000.05a
    • /
    • pp.13-16
    • /
    • 2000
  • In fuzzy expert systems or database systems that have huge volumes of fuzzy data or large fuzzy rule bases, the inference time increases greatly. Therefore, a high-performance parallel fuzzy computing environment is needed. In this paper, we propose a parallel fuzzy inference mechanism for a parallel computing environment, in which fuzzy rules are distributed and executed simultaneously. The ONE_TO_ALL algorithm is used to broadcast the fuzzy input vector to all nodes, and the results of the MIN/MAX operations are transferred to the output processor by the ALL_TO_ONE algorithm. By processing fuzzy rules or data in parallel, the parallel fuzzy inference algorithm extracts effective parallelism and achieves a good speedup factor.
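
The per-node work in this scheme is the local MIN/MAX evaluation over the block of rules assigned to that node. The following is a small sequential sketch of that local step only; the triangular membership functions and the example rule block are assumptions, not details fixed by the abstract.

```python
# Local MIN/MAX step that each node would run on its own block of rules.
# Triangular membership functions and the example rule block are assumed
# for illustration; the abstract does not fix these details.
import numpy as np

def triangular(x, abc):
    a, b, c = abc
    return max(0.0, min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)))

def local_min_max(x, rule_block):
    """MIN across each rule's antecedents, MAX across the rule block.

    x          : crisp input vector, one value per fuzzy variable
    rule_block : array of shape (rules, variables, 3) of triangle parameters
    """
    best = 0.0
    for rule in rule_block:
        firing = min(triangular(xi, abc) for xi, abc in zip(x, rule))
        best = max(best, firing)
    return best

# Example: 3 rules over 2 fuzzy variables.
block = np.array([
    [[0.0, 0.2, 0.4], [0.1, 0.3, 0.5]],
    [[0.3, 0.5, 0.7], [0.4, 0.6, 0.8]],
    [[0.6, 0.8, 1.0], [0.5, 0.7, 0.9]],
])
print(local_min_max([0.45, 0.55], block))   # 0.75: best-matching rule wins
```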

Fuzzy Inference of Large Volumes in Parallel Computing Environments (병렬컴퓨팅 환경에서의 대용량 퍼지 추론)

  • 김진일;이상구
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.10 no.4
    • /
    • pp.293-298
    • /
    • 2000
  • In fuzzy expert systems or database systems that have huge volumes of fuzzy data or large fuzzy rule bases, the inference time increases greatly. Therefore, a high-performance parallel fuzzy computing environment is needed. In this paper, we propose a parallel fuzzy inference mechanism for parallel computing environments, in which fuzzy rules are distributed and executed simultaneously. The ONE_TO_ALL algorithm is used to broadcast the fuzzy input vector to all nodes, and the results of the MIN/MAX operations are transferred to the output processor by the ALL_TO_ONE algorithm. By processing fuzzy rules or data in parallel, the parallel fuzzy inference algorithm extracts effective parallelism and achieves a good speedup factor.
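
One detail the abstract leaves implicit is how the rules are split across nodes; a minimal block-partitioning helper is sketched below. The near-even contiguous split is an assumption, since the abstract only states that the rules are distributed and executed simultaneously.

```python
# Minimal sketch of distributing R fuzzy rules over P nodes in nearly equal
# contiguous blocks; the even split is an assumption, not the paper's scheme.
def rule_block(rank, num_nodes, num_rules):
    """Return the half-open (start, end) range of rules owned by `rank`."""
    base, extra = divmod(num_rules, num_nodes)
    start = rank * base + min(rank, extra)
    end = start + base + (1 if rank < extra else 0)
    return start, end

# 778 rules over 8 nodes -> blocks of 98 or 97 rules.
print([rule_block(r, 8, 778) for r in range(8)])
```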

Design of Fault Diagnostic System based on Neuro-Fuzzy Scheme (퍼지-신경망 기반 고장진단 시스템의 설계)

  • Kim, Sung-Ho;Kim, Jung-Soo;Park, Tae-Hong;Lee, Jong-Ryeol;Park, Gwi-Tae
    • The Transactions of the Korean Institute of Electrical Engineers A
    • /
    • v.48 no.10
    • /
    • pp.1272-1278
    • /
    • 1999
  • A fault is considered a variation of physical parameters; therefore, the design of fault detection and identification (FDI) can be reduced to the parameter identification of a nonlinear system and to the association of the set of estimated parameters with the modes of faults. A neuro-fuzzy inference system that contains multiple linear models in its consequent part is used to model nonlinear systems. Generally, the linear parameters in a neuro-fuzzy inference system can be effectively utilized for fault diagnosis. In this paper, we propose an FDI system for nonlinear systems using a neuro-fuzzy inference system. The proposed diagnostic system consists of two neuro-fuzzy inference systems which operate in two different modes (parallel and series-parallel mode). It generates the parameter residuals associated with each mode of faults, which can be further processed by an additional RBF (Radial Basis Function) network to identify the faults. The proposed FDI scheme has been tested by simulation on a two-tank system.
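
The two modes named here are the usual identification configurations: a series-parallel model fed with measured plant outputs and a parallel model fed with its own past predictions. The sketch below illustrates that distinction for residual generation; the one-step linear predictor is a placeholder, not the paper's neuro-fuzzy model, and the RBF-based fault isolation stage is omitted.

```python
# Sketch of residual generation in series-parallel vs. parallel mode.
# `model` stands in for the trained neuro-fuzzy predictor; here it is a
# placeholder linear one-step map, purely for illustration.
import numpy as np

def model(y_prev, u_prev, theta=(0.8, 0.5)):
    """One-step-ahead prediction y_hat(k) = a*y(k-1) + b*u(k-1)."""
    a, b = theta
    return a * y_prev + b * u_prev

def residuals(y, u, mode="series-parallel"):
    y_hat = np.zeros_like(y)
    for k in range(1, len(y)):
        # series-parallel: feed back the measured plant output y(k-1)
        # parallel:        feed back the model's own estimate y_hat(k-1)
        feedback = y[k - 1] if mode == "series-parallel" else y_hat[k - 1]
        y_hat[k] = model(feedback, u[k - 1])
    return y - y_hat                      # residual sequence for fault isolation

u = np.ones(50)
y = np.zeros(50)
for k in range(1, 50):                    # simulated healthy plant
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]
print(np.abs(residuals(y, u, "parallel")).max())   # ~0 when no fault is present
```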

Design and Implementation of a PCI-based Parallel Fuzzy Inference System (PCI 기반 병렬 퍼지추론 시스템과 설계 및 구현)

  • 이병권;이상구
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.11 no.8
    • /
    • pp.764-770
    • /
    • 2001
  • In this paper, we propose a novel PCI-bus-based parallel fuzzy inference system for transferring and inferencing large volumes of fuzzy data at high speed. For this, the PCI 9050 interface chip is used to connect a local bus, designed as a PCI target core on an FPGA, to the PCI bus. We design and implement the PCI target core in VHDL so that processing proceeds in parallel, parallelizing each element of the membership functions and each block of the condition and/or consequent parts. The proposed system can be used in systems requiring a rapid inference time, such as real-time systems or pattern recognition on large volumes of satellite images with many inference variables in the condition and consequent parts.

Integrated GUI Environment of Parallel Fuzzy Inference System for Pattern Classification of Remote Sensing Images

  • Lee, Seong-Hoon;Lee, Sang-Gu;Son, Ki-Sung;Kim, Jong-Hyuk;Lee, Byung-Kwon
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.2 no.2
    • /
    • pp.133-138
    • /
    • 2002
  • In this paper, we propose an integrated GUI environment of a parallel fuzzy inference system for pattern classification of remote sensing data. As 4 fuzzy variables in the condition part and 104 fuzzy rules are used, a real-time, parallel approach is required. For fast fuzzy computation, we use the scan line conversion algorithm to convert the lines of each fuzzy linguistic term to the closest integer pixels. We design 4 fuzzy processor units operating in parallel, implemented on an FPGA. In the GUI environment, PCI transmission, image data pre-processing, integer pixel mapping and fuzzy membership tuning are considered. This system can be used in a pattern classification system requiring a rapid inference time in real time.
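
The scan line conversion step can be pictured as rasterizing each linguistic term's membership function onto an integer grid so that later comparisons use integer values; a rough sketch follows, with the 256-pixel grid and the triangular term chosen purely for illustration.

```python
# Sketch of rasterizing one triangular linguistic term onto an integer grid,
# in the spirit of the scan line conversion step the abstract mentions.
# The 256-point grid and the triangle (a, b, c) are illustrative assumptions.
import numpy as np

def rasterize_triangle(a, b, c, x_pixels=256, y_levels=255):
    """Map membership degrees in [0, 1] to nearest integer levels 0..y_levels."""
    xs = np.linspace(0.0, 1.0, x_pixels)
    rising  = (xs - a) / (b - a + 1e-12)
    falling = (c - xs) / (c - b + 1e-12)
    mu = np.clip(np.minimum(rising, falling), 0.0, 1.0)
    return np.rint(mu * y_levels).astype(np.int32)   # closest integer pixel

term = rasterize_triangle(0.0, 0.4, 0.8)
print(term.min(), term.max())      # 0 255: membership values as integer levels
```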

High-speed Fuzzy Inference System in Integrated GUI Environment

  • Lee, Sang-Gu
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.4 no.1
    • /
    • pp.50-55
    • /
    • 2004
  • We propose an integrated GUI environment system having only integer fuzzy operations in the consequent part and the defuzzification stage. In this paper, we also propose an integrated GUI environment system with 4 fuzzy processing units operating in parallel on the classification of sensed image data. We address the long computation times caused by real-valued fuzzy computations over [0, 1] by using the integer pixel conversion algorithm to convert the lines of each fuzzy linguistic term to the closest integer pixels. This procedure is performed automatically in the GUI application program. In the GUI environment, PCI transmission, image data pre-processing, integer pixel mapping and fuzzy membership tuning are considered. This system can be operated in a parallel manner for MIMO or MISO systems.
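
One reading of the integer-only consequent and defuzzification stage is a weighted-average defuzzifier computed entirely in integer arithmetic; a hedged sketch of that idea is shown below, where the 0..255 scaling and the singleton consequents are assumptions.

```python
# Sketch of a defuzzification stage done purely in integer arithmetic, as one
# reading of the abstract's "only integer fuzzy operations" in the consequent
# and defuzzification stages. The 0..255 scaling is an assumption.
def integer_defuzzify(firing_levels, consequent_levels):
    """Weighted average of integer consequent singletons by integer weights.

    firing_levels     : rule firing strengths already mapped to 0..255
    consequent_levels : consequent singleton positions mapped to 0..255
    """
    num = sum(w * c for w, c in zip(firing_levels, consequent_levels))
    den = sum(firing_levels)
    return num // den if den else 0        # integer division only

# Three rules firing at integer strengths 64, 192 and 128.
print(integer_defuzzify([64, 192, 128], [50, 150, 220]))
```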

Parallel Fuzzy Information Processing System - KAFA : KAist Fuzzy Accelerator -

  • Kim, Young-Dal;Lee, Hyung-Kwang;Park, Kyu-Ho
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 1993.06a
    • /
    • pp.981-984
    • /
    • 1993
  • During the past decade, several dedicated hardware systems for fast fuzzy inference have been developed. Most of them are dedicated to a specific inference method and thus cannot support other inference methods. In this paper, we present a hardware architecture called KAFA (KAist Fuzzy Accelerator) which provides various fuzzy inference methods and fuzzy set operators. The architecture has a SIMD structure consisting of two parts: a system control/interface unit (Main Controller) and arithmetic units (FPEs). Using parallel processing technology, KAFA achieves high performance for fuzzy information processing. The speed of KAFA holds promise for the development of new fuzzy application systems.
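
In software terms, the SIMD organization sketched here (a Main Controller issuing one operation that every FPE applies to its own data) resembles applying a single fuzzy set operator elementwise across many membership vectors at once. The following is a loose numpy analogy; the operator table is a generic assumption, not KAFA's actual instruction set.

```python
# Loose software analogy of the SIMD structure: one "instruction" (fuzzy set
# operator) applied by every processing element to its own slice of data.
# The operator table below is a generic assumption, not KAFA's operator set.
import numpy as np

FUZZY_OPS = {
    "min":  np.minimum,                          # standard intersection
    "max":  np.maximum,                          # standard union
    "prod": np.multiply,                         # algebraic product
    "bsum": lambda a, b: np.minimum(1.0, a + b), # bounded sum
}

def simd_apply(op_name, A, B):
    """Main-controller role: pick one operator, apply it across all lanes."""
    return FUZZY_OPS[op_name](A, B)

# 8 "FPE lanes", each holding a 5-point discretized membership vector.
rng = np.random.default_rng(0)
A, B = rng.random((8, 5)), rng.random((8, 5))
print(simd_apply("bsum", A, B).shape)      # (8, 5): all lanes in one step
```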

Rule-based parallel inference system (반도체 지능형 코드관리 시스템)

  • 유명관;정봉주;박성근
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1995.04a
    • /
    • pp.492-495
    • /
    • 1995
  • In existing semiconductor factories, product codes have been created as needed by the code manager of each department to express the required product characteristics. As a result, similar but independent generation rules exist for the characteristics of different products, and no comprehensive management scheme could be established. To solve this problem and build an expert system that automatically generates codes and supplies them to the production line, we developed an intelligent code management system in which each rule is defined and processed in parallel.

Fuzzy-Neural Networks with Parallel Structure and Its Application to Nonlinear Systems (병렬구조 FNN과 비선형 시스템으로의 응용)

  • Park, Ho-Sung;Yoon, Ki-Chan;Oh, Sung-Kwun
    • Proceedings of the KIEE Conference
    • /
    • 2000.07d
    • /
    • pp.3004-3006
    • /
    • 2000
  • In this paper, we propose an optimal design method for a Fuzzy-Neural Networks (FNN) model with a parallel structure for complex and nonlinear systems. The proposed model consists of multiple FNNs connected in parallel. The proposed parallel-structure FNN is based on Yamakawa's FNN; it uses simplified inference as the fuzzy inference method and the error back-propagation algorithm as the learning rule. We use HCM clustering and GAs to identify the structure and the parameters of the proposed model. Also, a performance index with a weighting factor is presented to achieve a sound balance between the approximation and generalization abilities of the model. To evaluate the performance of the proposed model, we use the time series data for a gas furnace and numerical data of a nonlinear function.
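
Simplified inference, as used here, means rules with constant (singleton) consequents combined by a firing-strength-weighted average, which is what the back-propagation rule then tunes. A minimal forward-pass sketch follows; the Gaussian membership functions and parameter values are assumptions.

```python
# Minimal forward pass of a parallel-structure FNN with simplified inference:
# constant consequents combined by a normalized weighted average. Gaussian
# membership functions and the parameter values below are assumptions.
import numpy as np

def fnn_forward(x, centers, sigmas, consequents):
    """x: (n_inputs,), centers/sigmas: (n_rules, n_inputs), consequents: (n_rules,)"""
    mu = np.exp(-((x - centers) ** 2) / (2.0 * sigmas ** 2))    # memberships
    w = mu.prod(axis=1)                       # rule firing strengths (product)
    return float(np.dot(w, consequents) / (w.sum() + 1e-12))    # weighted mean

centers     = np.array([[0.2, 0.2], [0.5, 0.5], [0.8, 0.8]])
sigmas      = np.full((3, 2), 0.15)
consequents = np.array([-1.0, 0.0, 1.0])
print(fnn_forward(np.array([0.6, 0.6]), centers, sigmas, consequents))
```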
