• Title/Summary/Keyword: normalization


A Comparison on the Image Normalizations for Image Information Estimation

  • Kang, Hwan-Il;Lim, Seung-Chul;Kim, Kab-Il;Son, Young-I
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2005.06a / pp.2385-2388 / 2005
  • In this paper, we propose estimation methods for image affine information in computer vision. The first estimation method is based on the XYS image normalization and the second on the image normalization by Pei and Lin. The XYS normalization method turns out to have better performance than the method by Pei and Lin. In addition, we show that rotation and aspect-ratio information can be obtained using the central moments of both the original image and the sensed image. Finally, we propose a modified version of the normalization method so that the size of the image can be controlled.

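The use of second-order central moments to recover orientation information, as described in the abstract, can be sketched as follows. This is a minimal NumPy illustration of the general moment-based idea, not the authors' XYS implementation; the function names are hypothetical.

```python
import numpy as np

def central_moments(img):
    """Centroid and second-order central moments of a 2-D intensity image."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
    mu20 = ((x - xc) ** 2 * img).sum()
    mu02 = ((y - yc) ** 2 * img).sum()
    mu11 = ((x - xc) * (y - yc) * img).sum()
    return xc, yc, mu20, mu02, mu11

def orientation(img):
    """Principal-axis angle (radians) from the second-order central moments."""
    _, _, mu20, mu02, mu11 = central_moments(img)
    return 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
```

For a horizontally elongated blob the angle is near 0; transposing the image rotates the principal axis to ±π/2, which is the kind of rotation information the paper extracts by comparing original and sensed images.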

Design and Implementation of Binary Image Normalization Hardware for High Speed Processing (고속 처리를 위한 이진 영상 정규화 하드웨어의 설계 및 구현)

  • 김형구;강선미;김덕진
    • Journal of the Korean Institute of Telematics and Electronics B / v.31B no.5 / pp.162-167 / 1994
  • The binary image normalization method in image processing can be used in several fields; in particular, its high-speed processing and hardware implementation are especially useful. Normalizing each character in character recognition requires considerable processing time. Therefore, this research was done as part of a high-speed OCR (optical character reader) implementation, built in hardware as a pipeline structure with the host computer to exploit temporal parallelism. A general-purpose CPU, the MC68000, was used to implement the normalization process. Experimental results show that the normalization speed of the hardware is sufficient to implement a high-speed OCR whose recognition speed exceeds 140 characters per second.

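In software terms, the per-character size normalization that this hardware accelerates amounts to resampling a cropped binary character onto a fixed grid. A minimal NumPy sketch of that step (nearest-neighbour resampling under assumed 32×32 output; this is not the MC68000 pipeline itself):

```python
import numpy as np

def normalize_binary(img, out_h=32, out_w=32):
    """Nearest-neighbour size normalization of a cropped binary character image."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h  # source row index for each output row
    cols = np.arange(out_w) * w // out_w  # source column index for each output column
    return img[rows][:, cols]
```

Because every character must pass through this step, a per-character software loop like this is exactly what motivates the paper's pipelined hardware design.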

Physical Artifact Correction in Nuclear Medicine Imaging: Normalization and Attenuation Correction (핵의학 영상의 물리적 인공산물보정: 정규화보정 및 감쇠보정)

  • Kim, Jin-Su;Lee, Jae-Sung;Cheon, Gi-Jeong
    • Nuclear Medicine and Molecular Imaging / v.42 no.2 / pp.112-117 / 2008
  • Artifact corrections, including normalization and attenuation correction, are important for quantitative analysis in nuclear medicine imaging. Normalization is the process of ensuring that all lines of response (LORs) joining detectors in coincidence have the same effective sensitivity. Failure to account for variations in LOR sensitivity leads to bias and high-frequency artifacts in the reconstructed images. Attenuation correction compensates for the fact that photons emitted by the radiopharmaceutical interact with tissue and other materials as they pass through the body. In this paper, we review several approaches to normalization and attenuation correction strategies.
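The two corrections described above combine, per line of response, into a simple scaling: divide by the LOR's relative sensitivity (normalization) and multiply by exp(Σᵢ μᵢlᵢ) (attenuation correction). A hedged NumPy sketch with hypothetical names, assuming attenuation coefficients μᵢ and path lengths lᵢ through each material are known:

```python
import numpy as np

def correct_lor_counts(counts, sensitivity, mu, path_lengths):
    """Normalization plus attenuation correction for measured LOR counts.

    measured = true * sensitivity * exp(-sum_i mu_i * l_i)
    so the corrected estimate is measured / sensitivity * exp(+sum_i mu_i * l_i).
    """
    acf = np.exp(np.sum(mu * path_lengths, axis=-1))  # attenuation correction factor
    return counts / sensitivity * acf
```

With unit sensitivity and zero attenuation the function returns the counts unchanged, which is a useful sanity check for any implementation of this kind.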

Normalization and Search of the UV/VIS Spectra Measured from TLC/HPTLC (TLC/HPTLC에서 측정된 자외/가시부 스펙트럼의 표준화 및 검색)

  • Kang, Jong-Seong
    • YAKHAK HOEJI / v.38 no.4 / pp.366-371 / 1994
  • To improve the identification power of TLC/HPTLC, in situ reflectance spectra obtained directly from plates with a commercial scanner are used. Spectrum normalization should be carried out before comparing spectra and searching the library for compound identification. Because reflectance does not obey the Lambert-Beer law, some problems arise in normalization. These problems can be solved to some extent by normalizing the spectra with regression methods: the spectra are transformed with the regression function of a curve obtained from the correlation plot. When a parabola was used as the transformation function, the spectra were identified with an accuracy of 97%, a better result than that of the conventionally used point and area normalization methods.

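The regression-based normalization described above, fitting a curve (e.g., a parabola) to the correlation plot between a measured spectrum and a library spectrum and then transforming the measured spectrum with it, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact procedure:

```python
import numpy as np

def regression_normalize(sample, reference, degree=2):
    """Map a sample spectrum onto the reference scale via a polynomial
    (a parabola for degree=2) fitted to the sample-vs-reference correlation plot."""
    coeffs = np.polyfit(sample, reference, degree)
    return np.polyval(coeffs, sample)
```

When the true relationship between the two spectra is at most quadratic, the parabola recovers the reference exactly, which illustrates why a curved fit can outperform simple point or area scaling.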

Selective pole filtering based feature normalization for performance improvement of short utterance recognition in noisy environments (잡음 환경에서 짧은 발화 인식 성능 향상을 위한 선택적 극점 필터링 기반의 특징 정규화)

  • Choi, Bo Kyeong;Ban, Sung Min;Kim, Hyung Soon
    • Phonetics and Speech Sciences / v.9 no.2 / pp.103-110 / 2017
  • The pole filtering concept has been successfully applied to cepstral feature normalization techniques for noise-robust speech recognition. In this paper, we propose applying pole filtering selectively, only to the speech intervals, in order to further improve recognition performance for short utterances in noisy environments. Experimental results on the AURORA 2 task with clean-condition training show that the proposed selectively pole-filtered cepstral mean normalization (SPFCMN) and selectively pole-filtered cepstral mean and variance normalization (SPFCMVN) yield error rate reductions of 38.6% and 45.8%, respectively, compared to the baseline system.
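The baseline normalizations that the proposed method builds on, CMN and CMVN with statistics restricted to speech frames, can be sketched as follows. The pole-filtering step itself (smoothing dominant LPC poles before computing the statistics) is omitted here, and the mask-based selectivity is a simplified stand-in for the paper's speech-interval detection:

```python
import numpy as np

def cmvn(features, speech_mask=None):
    """Cepstral mean and variance normalization over (frames, coeffs).

    Statistics are computed only over frames flagged as speech when a
    boolean mask is given, then applied to all frames.
    """
    sel = features if speech_mask is None else features[speech_mask]
    mean = sel.mean(axis=0)
    std = sel.std(axis=0) + 1e-8  # guard against zero variance
    return (features - mean) / std
```

Restricting the statistics to speech frames matters most for short utterances, where a few noise-only frames can badly skew the global mean and variance.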

Normalization Factor for Three-Level Hierarchical 64QAM Scheme (3-level 계층 64QAM 기법의 정규화 인수)

  • You, Dongho;Kim, Dong Ho
    • The Journal of Korean Institute of Communications and Information Sciences / v.41 no.1 / pp.77-79 / 2016
  • In this paper, we consider hierarchical modulation (HM), which has been widely exploited in digital broadcasting systems. In HM, each independent data stream is mapped to modulation symbols with different transmission power, so the normalization factors of conventional M-QAM cannot be used. We derive the method and formula for the exact normalization factor of three-level hierarchical 64QAM.
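For conventional uniform square M-QAM, the normalization factor the abstract contrasts against can be computed from the average constellation energy; for 64QAM it is 1/√42. The hierarchical three-level case in the paper generalizes this to unequal power levels, which this sketch does not cover:

```python
import numpy as np

def qam_normalization_factor(M):
    """1/sqrt(E_avg) for a uniform square M-QAM constellation
    with odd-integer amplitude levels ±1, ±3, ... on each axis."""
    m = int(np.sqrt(M))
    levels = np.arange(-(m - 1), m, 2)        # e.g. -7..7 step 2 for 64QAM
    re, im = np.meshgrid(levels, levels)      # all M constellation points
    e_avg = np.mean(re**2 + im**2)            # average symbol energy
    return 1.0 / np.sqrt(e_avg)
```

This reproduces the familiar 1/√10 for 16QAM and 1/√42 for 64QAM; in hierarchical 64QAM the level spacing depends on the per-stream power split, so the factor must be re-derived as the paper does.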

Word Similarity Calculation by Using the Edit Distance Metrics with Consonant Normalization

  • Kang, Seung-Shik
    • Journal of Information Processing Systems / v.11 no.4 / pp.573-582 / 2015
  • Edit distance metrics are widely used for many applications such as string comparison and spelling error correction. Hamming distance is a metric for two equal-length strings, and Damerau-Levenshtein distance is a well-known metric for making spelling corrections through string-to-string comparison. Previous distance metrics seem appropriate for alphabetic languages like English and other European languages. However, the conventional edit distance criterion is not the best method for agglutinative languages like Korean. The reason is that two or more letter units make up a Korean character, which is called a syllable. This mechanism of syllable-based word construction in the Korean language makes edit distance calculation inefficient. Therefore, we have explored a new edit distance method using consonant normalization and a normalization factor.
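The conventional edit distance that the paper extends is the Levenshtein distance; a compact dynamic-programming version is shown below. The paper's consonant normalization would decompose Korean syllables into letter units (jamo) before a step like this, which is not implemented here:

```python
def levenshtein(a, b):
    """Standard edit distance with unit-cost insert, delete, and substitute."""
    prev = list(range(len(b) + 1))          # distances for the empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i]                          # deleting i characters of a
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,            # delete ca
                            curr[j - 1] + 1,        # insert cb
                            prev[j - 1] + (ca != cb)))  # match or substitute
        prev = curr
    return prev[-1]
```

Comparing whole syllables, one substitution can hide agreement in two of three jamo, which is the inefficiency the syllable decomposition addresses.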

Local-Based Iterative Histogram Matching for Relative Radiometric Normalization

  • Seo, Dae Kyo;Eo, Yang Dam
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.37 no.5 / pp.323-330 / 2019
  • Radiometric normalization of multi-temporal satellite images is essential for time-series analysis and change detection. Generally, relative radiometric normalization, an image-based method, is performed, and histogram matching is a representative method for normalizing non-linear properties. However, since it utilizes only global statistical information, local information is not considered at all. Thus, this paper proposes a histogram matching method that considers local information. The proposed method divides histograms based on the density, mean, and standard deviation of image intensities, and performs histogram matching locally on each sub-histogram. The matched histogram is then further partitioned and the process is repeated iteratively, controlled by the Wasserstein distance. Finally, the proposed method is compared to global histogram matching. The experimental results show that the proposed method is visually and quantitatively superior to the conventional method, which indicates its applicability to the radiometric normalization of multi-temporal images with non-linear properties.
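The global histogram matching used as the baseline comparison can be sketched as a CDF-based intensity mapping; the proposed method then applies this locally to density-partitioned sub-histograms. A minimal NumPy illustration of the global baseline only:

```python
import numpy as np

def histogram_match(source, reference):
    """Global histogram matching: remap source intensities so their
    empirical CDF follows that of the reference image."""
    s_vals, s_idx, s_cnt = np.unique(source.ravel(),
                                     return_inverse=True, return_counts=True)
    r_vals, r_cnt = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_cnt) / source.size       # CDF at each source value
    r_cdf = np.cumsum(r_cnt) / reference.size    # CDF at each reference value
    mapped = np.interp(s_cdf, r_cdf, r_vals)     # invert the reference CDF
    return mapped[s_idx].reshape(source.shape)
```

Because this mapping is driven entirely by the global CDFs, two regions with different local radiometry receive the same transformation, which is the limitation the paper's local, iterative variant addresses.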

A Local Alignment Algorithm using Normalization by Functions (함수에 의한 정규화를 이용한 local alignment 알고리즘)

  • Lee, Sun-Ho;Park, Kun-Soo
    • Journal of KIISE:Computer Systems and Theory / v.34 no.5_6 / pp.187-194 / 2007
  • A local alignment algorithm compares two strings and finds a substring pair with size l and similarity s. To find a pair with both sufficient size and high similarity, existing normalization approaches maximize the ratio of the similarity to the size. In this paper, we introduce normalization by functions, which maximizes f(s)/g(l), where f and g are non-decreasing functions. These functions, f and g, are determined by experiments comparing DNA sequences, in which our normalization by functions finds appropriate local alignments. For a previous algorithm that evaluates similarity using the longest common subsequence, we show that it can also maximize the score normalized by functions, f(s)/g(l), without loss of time.
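The ingredients above can be illustrated with a longest-common-subsequence similarity and one example choice of normalization functions. The particular f and g below are hypothetical (the paper determines them experimentally on DNA sequences):

```python
import math

def lcs_len(a, b):
    """Length of the longest common subsequence: the similarity score s."""
    dp = [0] * (len(b) + 1)
    for ca in a:
        prev = 0                              # dp value diagonally above-left
        for j, cb in enumerate(b, 1):
            cur = dp[j]
            dp[j] = prev + 1 if ca == cb else max(dp[j], dp[j - 1])
            prev = cur
    return dp[-1]

def normalized_score(s, l):
    """Example normalization by functions: f(s) = s, g(l) = log2(l + 2).

    Both are non-decreasing, so high-similarity pairs are rewarded while
    the penalty for length grows only slowly; these are illustrative choices."""
    return s / math.log2(l + 2)
```

Compared with the plain ratio s/l, a slowly growing g(l) keeps longer alignments competitive with trivially short, perfectly matching ones.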