• Title/Summary/Keyword: New Weighted Variance


WEIGHTED POSSIBILISTIC VARIANCE AND MOMENTS OF FUZZY NUMBERS

  • Pasha, E.;Asady, B.;Saeidifar, A.
    • Journal of applied mathematics & informatics
    • /
    • v.26 no.5_6
    • /
    • pp.1169-1183
    • /
    • 2008
  • In this paper, a method is introduced to find the weighted possibilistic variance and the moments about the mean value of fuzzy numbers by applying a defuzzification that uses the minimizer of the weighted distance between two fuzzy numbers. In this way, we obtain the nearest weighted point with respect to a fuzzy number; this main result is a new and interesting alternative justification for the definition of the weighted mean of a fuzzy number. Considering this point and the weighted distance quantity, we introduce the weighted possibilistic mean (WPM) value and the weighted possibilistic variance (WPV) of fuzzy numbers. This paper shows that the WPM is the nearest weighted point to a fuzzy number and that the WPV preserves more of the properties of variance in probability theory, so that the possibilistic moments about the mean of fuzzy numbers can be introduced without difficulty. The moments of fuzzy numbers play an important role in estimating parameters, skewness, and kurtosis in many fuzzy time series models.

  • PDF
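The weighted possibilistic moments described above can be sketched numerically. The following is a minimal illustration under the Fullér–Majlender-style definitions, in which the γ-level cuts of a fuzzy number are integrated against a weighting function f with ∫₀¹ f(γ) dγ = 1 (here f(γ) = 2γ); the function names and the triangular-fuzzy-number parametrization are illustrative, not taken from the paper.

```python
def tri_level_cut(a, b, c, gamma):
    """gamma-level cut [a1, a2] of a triangular fuzzy number (a, b, c)."""
    return a + gamma * (b - a), c - gamma * (c - b)

def weighted_possibilistic_moments(a, b, c, f=lambda g: 2.0 * g, n=100_000):
    """Midpoint-rule approximation of the weighted possibilistic
    mean and variance over gamma in (0, 1]."""
    dg = 1.0 / n
    mean = var = 0.0
    for i in range(n):
        g = (i + 0.5) * dg
        lo, hi = tri_level_cut(a, b, c, g)
        w = f(g)
        mean += w * 0.5 * (lo + hi) * dg          # weighted midpoint of the cut
        var  += w * (0.5 * (hi - lo)) ** 2 * dg   # weighted squared half-width
    return mean, var
```

For the symmetric triangular number (0, 1, 2) this gives WPM = 1 and WPV = 1/6, matching the closed-form integrals ∫₀¹ 2γ·1 dγ and ∫₀¹ 2γ(1 − γ)² dγ.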

Heuristic Process Capability Indices Using Distribution-decomposition Methods (분포분할법을 이용한 휴리스틱 공정능력지수의 비교 분석)

  • Chang, Youngsoon
    • Journal of Korean Society for Quality Management
    • /
    • v.41 no.2
    • /
    • pp.233-248
    • /
    • 2013
  • Purpose: This study develops heuristic process capability indices (PCIs) using distribution-decomposition methods and evaluates their performance. The heuristic methods decompose the variation of a quality characteristic into upper and lower deviations and adjust the value of the PCIs using the decomposed deviations in accordance with the skewness. The weighted variance (WV), new WV (NWV), scaled WV (SWV), and weighted standard deviation (WSD) methods are considered. Methods: The performances of the heuristic PCIs are investigated under varied situations such as various skewed distributions, sample sizes, and specifications. Results: The WV PCI is best under normal populations, the WSD and SWV PCIs are best under slightly skewed populations, and the NWV PCI is best under moderately and highly skewed populations. Conclusion: Comprehensive analysis shows that the NWV method is the most adequate for practical use.
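As a concrete illustration of the distribution-decomposition idea, the sketch below computes a weighted-variance-style Cpk from a sample: the fraction Px of observations at or below the mean splits the overall standard deviation into upper and lower deviations, σ_U = s·√(2Px) and σ_L = s·√(2(1 − Px)), following the Choobineh–Ballard weighted variance approach. This is a hedged sketch; the exact adjustments used by the NWV, SWV, and WSD variants in the paper differ.

```python
from statistics import mean, stdev

def wv_cpk(data, lsl, usl):
    """Weighted-variance process capability index: split the sample
    standard deviation s into upper/lower deviations using the
    fraction Px = P(X <= mean)."""
    m = mean(data)
    s = stdev(data)
    px = sum(1 for x in data if x <= m) / len(data)
    s_upper = s * (2.0 * px) ** 0.5          # deviation used toward the USL
    s_lower = s * (2.0 * (1.0 - px)) ** 0.5  # deviation used toward the LSL
    cpu = (usl - m) / (3.0 * s_upper)
    cpl = (m - lsl) / (3.0 * s_lower)
    return min(cpu, cpl)
```

For a symmetric sample Px ≈ 0.5, so both deviations collapse back to s and the index reduces to the ordinary Cpk.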

Evaluation of Non - Normal Process Capability by Johnson System (존슨 시스템에 의한 비정규 공정능력의 평가)

  • 김진수;김홍준
    • Journal of the Korea Safety Management & Science
    • /
    • v.3 no.3
    • /
    • pp.175-190
    • /
    • 2001
  • We propose a new process capability index $C_{psk}$(WV) applying the weighted variance control charting method for non-normally distributed processes. The main idea of the weighted variance method (WVM) is to divide a skewed or asymmetric distribution into two normal distributions at its mean, creating two new distributions which have the same mean but different standard deviations. In this paper we propose an example, a distribution generated from the Johnson family of distributions, to demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and the Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to the departure of the process mean/median from the target value for non-normal processes. The second method evaluates the percentage nonconforming using the Pearson, Johnson, and Burr systems. This example shows little difference between the Pearson and Burr systems, but the Johnson system underestimates process capability relative to the other two.

  • PDF
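The Johnson family mentioned above transforms a standard normal variate into a non-normal one; for the SU type, X = ξ + λ·sinh((Z − γ)/δ). A minimal sketch of generating such data and estimating the percentage nonconforming empirically follows; the parameter names use the usual Johnson SU notation, and the specific values are illustrative, not from the paper.

```python
import math, random

def johnson_su(z, gamma, delta, xi, lam):
    """Johnson SU transform of a standard normal variate z."""
    return xi + lam * math.sinh((z - gamma) / delta)

def percent_nonconforming(lsl, usl, gamma, delta, xi, lam, n=100_000, seed=1):
    """Empirical fraction of Johnson SU variates outside [lsl, usl]."""
    rng = random.Random(seed)
    out = sum(1 for _ in range(n)
              if not lsl <= johnson_su(rng.gauss(0.0, 1.0),
                                       gamma, delta, xi, lam) <= usl)
    return out / n
```

With γ = 0, δ = 1, ξ = 0, λ = 1 the transform reduces to sinh(Z), whose median is 0, so roughly half the variates fall above zero.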

A Study on a Measure for Non-Normal Process Capability (비정규 공정능력 측도에 관한 연구)

  • 김홍준;김진수;조남호
    • Proceedings of the Korean Reliability Society Conference
    • /
    • 2001.06a
    • /
    • pp.311-319
    • /
    • 2001
  • All indices that are now in use assume normally distributed data, and any use of the indices on non-normal data results in inaccurate capability measurements. Therefore, $C_{s}$ is proposed, which extends the most useful index to date, the Pearn-Kotz-Johnson $C_{pmk}$, by not only taking into account that the process mean may not lie midway between the specification limits and incorporating a penalty when the mean deviates from its target, but also incorporating a penalty for skewness. We therefore propose a new process capability index $C_{psk}$(WV) applying the weighted variance control charting method for non-normally distributed processes. The main idea of the weighted variance method (WVM) is to divide a skewed or asymmetric distribution into two normal distributions at its mean, creating two new distributions which have the same mean but different standard deviations. In this paper we propose an example, a distribution generated from the Johnson family of distributions, to demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and the Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to the departure of the process mean/median from the target value for non-normal processes.

  • PDF
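The Pearn-Kotz-Johnson $C_{pmk}$ cited above combines the $C_{pk}$ closeness-to-limits numerator with a $C_{pm}$-style penalty for deviation from target: $C_{pmk} = \min(USL - \mu,\ \mu - LSL) / (3\sqrt{\sigma^2 + (\mu - T)^2})$. A direct sketch of that standard formula (the skewness-penalized $C_s$ extension itself is not reproduced here):

```python
def cpmk(mu, sigma, lsl, usl, target):
    """Pearn-Kotz-Johnson C_pmk: C_pk numerator with a C_pm-style
    penalty sqrt(sigma^2 + (mu - target)^2) in the denominator."""
    denom = 3.0 * (sigma ** 2 + (mu - target) ** 2) ** 0.5
    return min(usl - mu, mu - lsl) / denom
```

An on-target centered process (μ = T midway between the limits) recovers the ordinary $C_{pk} = (USL - \mu)/(3\sigma)$; any drift from target shrinks the index through the penalty term.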

An Analysis and Comparison on Efficiency of Load Distribution Algorithm in a Clustered System (클러스터 시스템의 부하분산 알고리즘의 효율성 비교분석)

  • Kim, Seok-Chan;Rhee, Young
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.12 no.2
    • /
    • pp.111-118
    • /
    • 2006
  • In this thesis, we analyze the efficiency of an algorithm to distribute load in a clustered system by comparing it with an existing algorithm. The PWLC algorithm detects each server's load in the system at each weighted period and, following the detection of the loads, assigns a weight to each server. The system then allocates new loads to each server according to its weight. The PWLC algorithm is compared with the DWRR algorithm in terms of variance and waiting time while varying the weighted period. When the weighted period is too short, the system bears a heavy overhead from detecting loads over time. On the other hand, when the weighted period is too long, the load balancing control of the system becomes ineffective. The analysis shows that the PWLC algorithm is more efficient than the DWRR algorithm in terms of variance and waiting time.
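The periodic weighting scheme described above can be sketched as follows: at each weighted period the dispatcher re-measures server loads and assigns inverse-load weights, then routes each new request to the server with the smallest connections-to-weight ratio (a weighted-least-connection rule). The class below is a hypothetical illustration of that idea, not the paper's PWLC implementation; all names are invented.

```python
class PeriodicWeightedLeastConnection:
    """Toy dispatcher: weights are refreshed from measured loads once
    per weighted period; requests go to argmin(connections / weight)."""

    def __init__(self, n_servers, period):
        self.period = period
        self.ticks = 0
        self.weights = [1.0] * n_servers
        self.connections = [0] * n_servers

    def refresh_weights(self, measured_loads):
        # lighter-loaded servers receive larger weights
        self.weights = [1.0 / (1.0 + load) for load in measured_loads]

    def dispatch(self, measured_loads):
        if self.ticks % self.period == 0:
            self.refresh_weights(measured_loads)
        self.ticks += 1
        server = min(range(len(self.weights)),
                     key=lambda i: self.connections[i] / self.weights[i])
        self.connections[server] += 1
        return server
```

A short weighted period refreshes the weights often (accurate but costly, as the abstract notes); a long one lets the weights go stale.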

A Comparative Study for Several Bayesian Estimators Under Squared Error Loss Function

  • Kim, Yeong-Hwa
    • Journal of the Korean Data and Information Science Society
    • /
    • v.16 no.2
    • /
    • pp.371-382
    • /
    • 2005
  • The paper compares the performance of some widely used Bayesian estimators, such as the Bayes estimator, the empirical Bayes estimator, the constrained Bayes estimator, and the constrained empirical Bayes estimator, by means of a new measurement under the squared error loss function for the typical normal-normal situation. The proposed measurement is a weighted sum of the precisions of the first and second moments. As a result, one can obtain a criterion according to the size of the prior variance relative to the population variance.
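For the normal-normal situation mentioned above, the Bayes estimator under squared error loss is the posterior mean, a precision-weighted compromise between the prior mean and the observation: E[θ|x] = (τ²x + σ²μ₀)/(τ² + σ²) for a N(μ₀, τ²) prior and x ~ N(θ, σ²). A small sketch of this standard result (the paper's constrained variants adjust this shrinkage and are not reproduced here):

```python
def normal_normal_posterior(x, sigma2, mu0, tau2):
    """Posterior mean and variance of theta given x ~ N(theta, sigma2)
    and prior theta ~ N(mu0, tau2). The posterior mean is the Bayes
    estimator under squared error loss."""
    shrink = tau2 / (tau2 + sigma2)          # weight placed on the data
    post_mean = shrink * x + (1.0 - shrink) * mu0
    post_var = shrink * sigma2               # = (1/tau2 + 1/sigma2)**-1
    return post_mean, post_var
```

As the prior variance τ² grows relative to the sampling variance σ², the shrinkage weight approaches 1 and the estimator approaches the raw observation, which is exactly the prior-versus-population-variance trade-off the criterion above quantifies.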

  • PDF

A study on optimal variable pole assignment self-tuning control (최적 가변 극점 배치 자기동조 제어에 관한 연구)

  • 전종암;조병선;박민용;이상배
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 1988.10a
    • /
    • pp.246-249
    • /
    • 1988
  • In this paper, a new design technique that uses a weighted least-squares approach for the solution of the pole assignment problem is presented. This technique may be used to assign some closed-loop poles to locations that reduce the large system input and output variance caused by near pole-zero cancellation. The least-squares approach is also applied to the design of a servo self-tuning controller with an integrator.

  • PDF
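The weighted least-squares step underlying the technique above solves min_x ‖W^{1/2}(Ax − b)‖², i.e. the normal equations AᵀWAx = AᵀWb, where larger weights pull the solution toward the corresponding equations. A self-contained sketch for the scalar-unknown case (the pole-assignment formulation itself is not reproduced):

```python
def weighted_least_squares_1d(a, b, w):
    """Solve min_x sum_i w_i * (a_i * x - b_i)**2 in closed form:
    x = sum(w*a*b) / sum(w*a*a)."""
    num = sum(wi * ai * bi for wi, ai, bi in zip(w, a, b))
    den = sum(wi * ai * ai for wi, ai in zip(w, a))
    return num / den
```

With all weights equal this is the ordinary least-squares solution; raising one equation's weight drags x toward satisfying that equation exactly, which is how a designer can emphasize particular pole locations.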

Large-Scale Phase Retrieval via Stochastic Reweighted Amplitude Flow

  • Xiao, Zhuolei;Zhang, Yerong;Yang, Jie
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.14 no.11
    • /
    • pp.4355-4371
    • /
    • 2020
  • Phase retrieval, recovering a signal from phaseless measurements, is generally considered to be an NP-hard problem. This paper adopts an amplitude-based nonconvex optimization cost function to develop a new stochastic gradient algorithm, named stochastic reweighted phase retrieval (SRPR). SRPR is a stochastic gradient iteration algorithm which runs in two stages: first, a truncated-sample stochastic variance reduction algorithm is used to initialize the objective function; the second stage is the gradient refinement stage, which continuously updates the amplitude-based stochastic weighted gradient algorithm to improve the initial estimate. Because of the stochastic method, each iteration of the two stages of SRPR involves only one equation. Therefore, SRPR is simple, scalable, and fast. Compared with state-of-the-art phase retrieval algorithms, simulation results show that SRPR has a faster convergence speed and requires fewer magnitude-only measurements to reconstruct the signal, in both the real- and complex-valued cases.
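The amplitude-based cost underlying algorithms of this family is f(x) = Σᵢ(|aᵢᵀx| − yᵢ)², whose per-sample gradient in the real case is 2(|aᵢᵀx| − yᵢ)·sign(aᵢᵀx)·aᵢ. Below is a toy stochastic-gradient sketch of that refinement step; it is not the SRPR algorithm itself (the truncated variance-reduced initialization and the reweighting are omitted), and all names are illustrative.

```python
import random

def amplitude_refine(A, y, x0, step=0.05, epochs=200, seed=0):
    """Stochastic gradient descent on sum_i (|a_i . x| - y_i)**2,
    sampling one measurement per iteration (real-valued case)."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(epochs * len(A)):
        i = rng.randrange(len(A))
        a = A[i]
        inner = sum(aj * xj for aj, xj in zip(a, x))
        sign = 1.0 if inner >= 0 else -1.0
        grad_scale = 2.0 * (abs(inner) - y[i]) * sign
        x = [xj - step * grad_scale * aj for xj, aj in zip(x, a)]
    return x

def amplitude_loss(A, y, x):
    return sum((abs(sum(aj * xj for aj, xj in zip(a, x))) - yi) ** 2
               for a, yi in zip(A, y))
```

From an initialization near the true signal and with consistent measurements, the per-sample residuals all vanish at the solution, so the stochastic iteration settles there (up to the global sign ambiguity inherent in phase retrieval).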

Weighted Hω and New Paradox of κ (가중 합치도 Hω와 κ의 새로운 역설)

  • Kwon, Na-Young;Kim, Jin-Gon;Park, Yong-Gyu
    • The Korean Journal of Applied Statistics
    • /
    • v.22 no.5
    • /
    • pp.1073-1084
    • /
    • 2009
  • For ordinal categorical $R{\times}R$ tables, a weighted measure of association, $H_{\omega}$, is proposed, and its maximum likelihood estimator and asymptotic variance are derived. We redefine the last paradox of ${\kappa}$ and prove its relation to marginal distributions. We also introduce the new paradox of ${\kappa}$ and summarize the general relationships between ${\kappa}$ and marginal distributions.
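For context, the unweighted κ that the paradoxes concern is computed from an R×R agreement table as κ = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement and p_e the chance agreement implied by the marginals. A minimal sketch of that standard statistic (the weighted $H_{\omega}$ measure itself is not reproduced here):

```python
def cohen_kappa(table):
    """Cohen's kappa for an R x R agreement table given as a list of rows."""
    r = len(table)
    n = sum(sum(row) for row in table)
    p_o = sum(table[i][i] for i in range(r)) / n          # observed agreement
    row_m = [sum(row) / n for row in table]               # row marginals
    col_m = [sum(table[i][j] for i in range(r)) / n for j in range(r)]
    p_e = sum(rm * cm for rm, cm in zip(row_m, col_m))    # chance agreement
    return (p_o - p_e) / (1.0 - p_e)
```

The κ paradoxes arise precisely because p_e depends on the marginals: tables with identical observed agreement p_o can yield very different κ values under skewed marginal distributions.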

Optimal Portfolio Models for an Inefficient Market

  • GINTING, Josep;GINTING, Neshia Wilhelmina;PUTRI, Leonita;NIDAR, Sulaeman Rahman
    • The Journal of Asian Finance, Economics and Business
    • /
    • v.8 no.2
    • /
    • pp.57-64
    • /
    • 2021
  • This research attempts to formulate a new mean-risk model to replace the Markowitz mean-variance model by altering the risk measurement, using the ARCH variance instead of the original variance. In building the portfolio, the samples used are the closing prices of the Indonesia Composite Stock Index and the Indonesia Composite Bonds Index from 2013 to 2018. This is a qualitative study using secondary data from the Indonesia Stock Exchange and the Indonesia Bond Pricing Agency. This research found that Markowitz's model is still superior when applied to daily data, while the mean-ARCH model is appropriate for wider-gap data such as monthly observations. The historical return has also proven to be more appropriate as a benchmark for selecting an optimal portfolio than the risk-free rate in an inefficient market. The research findings show that the portfolio combination produced is inefficient due to the market inefficiency indicated by the meager return of the stocks, which nevertheless bear a notable standard deviation. Therefore, this study proposes replacing the risk-free rate as a benchmark with the historical return, which proved to be more realistic than the risk-free rate in inefficient market conditions.
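The ARCH risk measure referred to above replaces the unconditional sample variance with a conditional variance recursion; for ARCH(1), σ²_t = ω + α·ε²_{t−1}. A small sketch of that recursion with illustrative (not estimated) parameters:

```python
def arch1_variances(returns, omega, alpha):
    """Conditional variances sigma2_t = omega + alpha * r_{t-1}**2 for an
    ARCH(1) model, started at the unconditional variance omega/(1-alpha)."""
    sigma2 = [omega / (1.0 - alpha)]   # unconditional variance as the start
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r)
    return sigma2
```

The last conditional variance can then serve as the risk term in a mean-ARCH counterpart of the Markowitz mean-variance objective, reacting to recent shocks rather than averaging over the whole sample.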