Journal of the Korean Society for Quality Management
Publisher: The Korean Society for Quality Management
Volume 26, Issue 1 - Mar 1998
A Quality Measure for Supplier Selection
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 1~10
Supplier quality plays a major role in the evaluation of suppliers, so it is necessary to develop a proper quality measure for selecting suppliers that can meet the customer's quality specifications. In this paper, we present a measure of overall quality performance defined as a weighted geometric mean of the process capability indices of a supplier's quality characteristics. This measure can be used both by the customer for supplier selection and by the supplier for self-analysis of its quality performance.
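The measure described above can be sketched minimally as follows (an illustration only, not the paper's exact formulation: the function name, example index values, and weights are hypothetical):

```python
import math

def overall_quality(cpks, weights):
    """Overall supplier quality score: a weighted geometric mean of the
    process capability indices of the supplier's quality characteristics."""
    if len(cpks) != len(weights):
        raise ValueError("need one weight per quality characteristic")
    total = sum(weights)
    # Normalize the weights to sum to 1, then combine multiplicatively:
    # Q = prod_i Cpk_i ** (w_i / total)
    return math.prod(c ** (w / total) for c, w in zip(cpks, weights))

# Three quality characteristics with capability indices and importance weights.
score = overall_quality([1.33, 1.0, 1.8], [2, 1, 1])
```

Being a geometric mean, the score always lies between the smallest and largest index, and one very poor characteristic drags the overall score down more strongly than an arithmetic mean would.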
A Study on the Relationship among the Key Dimensions of Quality Management
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 11~26
A key aspect of the theoretical and practical development of quality management is articulating the distinction between quality management practices (input) and quality performance (output), which to date has been blurred under the broad heading of 'quality'. Recent literature has primarily addressed the measurement of quality performance, but very little empirical research focuses specifically on quality management practices. In order to understand quality cause and effect, it is necessary to specify and measure quality management practices (cause) as well as quality performance (effect). This paper identifies and substantiates the key areas of quality management that must be implemented to achieve effective quality management, then analyzes the relationships among those areas. In doing so, it establishes the basis for subsequent research by producing a profile of organization-wide quality management, and provides direction and priorities in quality management practice for practitioners.
Implementation of Quality Management Policy and ISO 9000 Series under Product Liability Law
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 27~47
The primary objective of this research is to provide a basis for total quality management policies by reviewing previous studies that investigated effective ways to reduce product liability exposure. Specifically, the present study intends (1) to examine recent patterns and trends of product liability law in OECD countries, (2) to propose quality management policies for preventing product liability litigation, and (3) to guide the proper implementation of ISO 9000 certification programs. The survey results show that a shift from negligence theory to strict liability is evident in most countries. This trend has made it easier for consumers to bring product liability lawsuits, and the damage awards won by consumers have been increasing drastically. To minimize product liability exposure, manufacturers should reflect comprehensive product safety concepts in establishing total quality management policies. Cooperative activities between departments are also required to reach a safe and satisfactory quality level, and these quality management activities should be performed consistently over the total product life cycle. Failure to comply with the ISO 9000 certifications might be used in court as evidence of negligence or of a design defect. Previous lawsuit cases, however, reveal that the ISO 9001-9003 registration process alone is not sufficient from a product liability prevention perspective. Therefore, manufacturers should take ISO 9004 into account before implementing any other section of the ISO 9000 standards.
A New Measure of Process Capability for Non-Normal Process :
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 48~60
This paper proposes a fourth-generation process capability index, constructed from an earlier-generation index by introducing the factor |μ - T| in the numerator as an extra penalty for the departure of the process mean from the preassigned target value T. The motivation behind introducing this factor is that process shifts away from the target are otherwise evaluated without respect to direction. All indices now in use assume normally distributed data, and any use of these indices on non-normal data results in inaccurate capability measurements. In this paper, a new process capability index is introduced for non-normal processes. The Pearson curve and the Johnson curve are selected for capability index calculation and data modeling, and a normal-based index is used as the model for the non-normal process. A significant result of this research is a ranking of six indices in terms of sensitivity to departure of the process median from the target value, from the most sensitive to the least sensitive.
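For reference, the four standard generations of normal-based capability indices (Cp, Cpk, Cpm, Cpmk, the last commonly cited as the fourth-generation index) can be sketched as follows; the penalized spread term shows how departure of the mean from the target T is charged. The paper's own six indices are not reproduced here.

```python
import math

def capability_indices(mu, sigma, lsl, usl, target):
    """Standard normal-based capability indices, from first generation (Cp)
    to fourth (Cpmk).  Cpm and Cpmk penalize an off-target mean via the
    inflated spread sqrt(sigma^2 + (mu - target)^2)."""
    tau = math.sqrt(sigma**2 + (mu - target)**2)  # spread plus off-target penalty
    cp   = (usl - lsl) / (6 * sigma)
    cpk  = min(usl - mu, mu - lsl) / (3 * sigma)
    cpm  = (usl - lsl) / (6 * tau)
    cpmk = min(usl - mu, mu - lsl) / (3 * tau)
    return cp, cpk, cpm, cpmk

# An off-target process: mean 10.5 against target 10.
cp, cpk, cpm, cpmk = capability_indices(10.5, 1.0, lsl=7.0, usl=13.0, target=10.0)
```

Cpmk is the most sensitive of the four to an off-target mean, which is the kind of sensitivity ranking the abstract reports for its six indices.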
Determination of the Optimal Process Mean and Upper Limit Considering the rpm (Rate per Minute)
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 61~73
The quality control literature contains a substantial number of articles concerned with how to choose control limits optimally in order to minimize production cost. The purpose of this study is to determine the economic setting of the process mean for an industrial process. It is assumed that the lower limit is set by government regulations and that the upper limit and process mean are chosen on economic considerations. Much research has been conducted on this problem under the condition of a fixed rpm (rate per minute). However, the variance can increase in proportion to the level of rpm, and this increase in variance can change the optimal process mean. Therefore, it is desirable to determine the process mean and the level of rpm simultaneously. In this paper, a mathematical model is presented that treats both the upper limit and the rpm as variables.
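The joint choice described above can be sketched under assumed economics (all parameter values, the square-root variance growth, and the profit structure below are illustrative assumptions, not the paper's model):

```python
import math

def phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_profit(mu, rpm, L=100.0, a=1.0, cm=0.004, cr=0.5,
                    s0=1.0, r0=60.0):
    """Expected profit per minute under the assumed model: fill amount
    ~ N(mu, sigma^2), with sigma growing with the machine speed (rpm);
    items below the regulated lower limit L are reworked at cost cr,
    material cost is cm per unit of mean fill, revenue a per good item."""
    sigma = s0 * math.sqrt(rpm / r0)       # variance rises with speed
    p_ok = 1.0 - phi((L - mu) / sigma)     # fraction meeting the limit
    per_item = a * p_ok - cm * mu - cr * (1.0 - p_ok)
    return rpm * per_item

# Joint grid search over the process mean and the machine speed.
best = max((expected_profit(mu, rpm), mu, rpm)
           for mu in [100 + 0.1 * k for k in range(51)]
           for rpm in range(30, 121, 10))
```

Under these assumptions the search sets the mean several standard deviations above the regulated limit, and the optimal mean shifts as the rpm (and hence the variance) changes, which is the interaction the abstract emphasizes.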
Optimal Design of Constant Stress Accelerated Life Tests Using Degradation Phenomenon Based on a Brownian Motion
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 74~87
This study considers the optimal design of constant-stress accelerated life tests, using the fact that the first passage time at which accumulated degradation crosses a critical boundary has an inverse Gaussian distribution when the degradation process follows a Brownian motion with positive drift that is a log-linear function of stress. Optimum plans for Type I censoring are derived by minimizing the asymptotic variance of estimated quantiles at the use stress. Sensitivity analyses are also conducted to see how sensitive the optimality criterion is to the uncertainties involved in the guessed values.
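The degradation-to-lifetime link the abstract relies on can be checked by simulation; this sketch (with illustrative parameter values) verifies that the first passage time of a drifted Brownian degradation path has the inverse Gaussian mean boundary/drift. The optimal-design step itself (choosing stress levels to minimize the asymptotic variance of estimated quantiles) is not reproduced.

```python
import math, random

def first_passage_time(drift, sigma, boundary, dt=0.01, t_max=200.0, rng=random):
    """Simulate a Brownian degradation path W(t) = drift*t + sigma*B(t)
    and return the first time the accumulated degradation crosses the
    critical boundary (the unit's failure time)."""
    w, t = 0.0, 0.0
    while t < t_max:
        w += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if w >= boundary:
            return t
    return t_max  # treat as censored at t_max

random.seed(1)
times = [first_passage_time(drift=0.5, sigma=0.4, boundary=5.0)
         for _ in range(2000)]
mean_t = sum(times) / len(times)
# Inverse Gaussian theory gives E[T] = boundary / drift = 10 here.
```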
Some Process Capability Indices Using Gibbs Sampling
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 88~98
Process capability indices are used to determine whether a production process is capable of producing items within a specified tolerance. Using conditional distributions under a conjugate prior, we study several process capability indices computed with the Gibbs sampling method. We also examine some small-sample properties of these estimators through simulation.
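A minimal sketch of the idea (a vague-prior special case with a hypothetical function name; the paper's specific indices and prior choices are not reproduced): alternate draws from the full conditionals of the process mean and variance, and turn each draw into a posterior draw of Cp.

```python
import random, statistics

def gibbs_cp(data, lsl, usl, n_iter=4000, burn=500, rng=random):
    """Gibbs sampler for (mu, sigma^2) of a normal process under vague
    conjugate-style priors; each retained draw yields a posterior draw
    of the capability index Cp = (USL - LSL) / (6 * sigma)."""
    n = len(data)
    xbar = sum(data) / n
    sigma2 = statistics.pvariance(data)  # starting value
    cps = []
    for i in range(n_iter):
        # mu | sigma^2, data  ~  Normal(xbar, sigma^2 / n)
        mu = rng.gauss(xbar, (sigma2 / n) ** 0.5)
        # sigma^2 | mu, data  ~  Inverse-Gamma(n/2, SS/2)
        ss = sum((x - mu) ** 2 for x in data)
        sigma2 = 1.0 / rng.gammavariate(n / 2.0, 2.0 / ss)
        if i >= burn:
            cps.append((usl - lsl) / (6.0 * sigma2 ** 0.5))
    return cps

random.seed(7)
data = [random.gauss(10.0, 1.0) for _ in range(80)]  # true Cp = 1.0
cps = gibbs_cp(data, lsl=7.0, usl=13.0)
cp_hat = sum(cps) / len(cps)
```

The posterior sample `cps` also gives interval estimates for Cp directly, which is the usual motivation for the Bayesian treatment.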
Modeling Software Reliability with Multiple Failure Types and Imperfect Debugging
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 99~107
This paper presents a software reliability model based on a nonhomogeneous Poisson process. The major contribution of this model is combining multiple failure types with imperfect debugging through the use of an S-shaped mean value function. The model allows for three different types of errors: critical errors are the most difficult to detect and the most expensive to remove; major errors are moderately difficult to detect and fairly expensive to remove; minor errors are easy to detect and inexpensive to remove. The model also allows for the introduction of any of these types of errors during the removal of an error. A numerical example is provided to illustrate the technique.
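One plausible way to combine these ingredients (the delayed S-shaped mean value function is standard; the per-type parameters and the 1/(1-p) imperfect-debugging inflation below are illustrative assumptions, not necessarily the paper's formulation):

```python
import math

def m_delayed_s(t, a, b):
    """Delayed S-shaped NHPP mean value function: the expected number of
    errors detected by time t, for error content a and detection rate b."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

# Assumed per-type parameters: (eventual error content, detection rate).
types = {"critical": (10, 0.05), "major": (30, 0.15), "minor": (60, 0.40)}

def expected_detected(t, imperfect=0.1):
    """Total expected detections by time t across the three error types.
    Imperfect debugging: each removal re-introduces an error with
    probability `imperfect`, so the content scales by 1/(1 - imperfect)."""
    return sum(m_delayed_s(t, a / (1.0 - imperfect), b)
               for a, b in types.values())
```

Critical errors dominate the late part of the curve here because their detection rate is smallest, matching the abstract's "most difficult to detect" ordering.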
EWMA Control Chart for Monitoring a Process Correlation Coefficient
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 108~125
The EWMA (Exponentially Weighted Moving Average) has recently received a great deal of attention in the quality control literature as a process monitoring tool on the shop floor of manufacturing industries, since it is easy to plot and interpret and its control limits are easy to obtain. Most applications of the EWMA for process monitoring have concentrated on detecting shifts of a process mean or a process standard deviation, evaluated through ARL (Average Run Length) properties. But the process operator may also need to monitor a linear relationship in product quality, such as the correlation coefficient between two variables of interest; control managers may want to detect an increase of the process correlation coefficient from a value such as 0. However, there are few studies on this topic. Therefore, we propose EWMA models for a process correlation coefficient using two transformed statistics, the T-statistic and (Fisher's) Z-statistic. We also present some simulation results obtained with SAS/IML and compare the two models.
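A sketch of the Z-statistic variant (the chart constants and data are illustrative; the paper's T-statistic model is not reproduced): apply Fisher's transform to each subgroup correlation, smooth with an EWMA, and signal when the smoothed value leaves the limits.

```python
import math

def fisher_z(r):
    """Fisher's variance-stabilizing transform of a sample correlation."""
    return 0.5 * math.log((1.0 + r) / (1.0 - r))

def ewma_corr_chart(r_values, n, rho0=0.0, lam=0.2, L=3.0):
    """EWMA chart on Fisher z of successive sample correlations, each
    computed from a subgroup of n pairs.  Var(z) is roughly 1/(n-3),
    and the EWMA's asymptotic variance is lam/(2-lam) times that."""
    target = fisher_z(rho0)
    limit = L * math.sqrt(lam / (2.0 - lam) / (n - 3))
    ewma, out = target, []
    for r in r_values:
        ewma = lam * fisher_z(r) + (1.0 - lam) * ewma
        out.append((ewma, abs(ewma - target) > limit))  # (value, signal?)
    return out

# In-control correlations near 0, then a sustained shift toward 0.6.
rs = [0.05, -0.1, 0.08, 0.0, 0.55, 0.6, 0.58, 0.62]
signals = ewma_corr_chart(rs, n=30)
```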
CUSUM of Squares Chart for the Detection of Variance Change in the Process
Lee, Jeong-Hyeong; Cho, Sin-Sup; Kim, Jae-Joo
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 126~142
Traditional statistical process control (SPC) assumes that consecutive observations from a process are independent. In industrial practice, however, observations are often serially correlated. A common approach to building control charts for autocorrelated data is to apply classical SPC to the residuals from a fitted time series model. Unfortunately, one cannot completely escape the effects of autocorrelation by using charts based on the residuals of a time series model. For the detection of a variance change in the process, we propose a CUSUM of squares control chart that does not require model identification. The proposed CUSUM of squares chart and the conventional control charts are compared by Monte Carlo simulation. It is shown that the CUSUM of squares chart is more effective in the presence of dependency in the process.
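The model-free statistic can be sketched as follows (a minimal illustration in the spirit of the CUSUM of squares; the paper's control limits are not reproduced):

```python
import random

def cusum_of_squares(x):
    """Normalized cumulative sum of squares: s_k = sum_{t<=k} x_t^2 / sum x_t^2.
    Under constant variance s_k stays close to k/n; a variance change
    bends the path away from the diagonal."""
    total = sum(v * v for v in x)
    s, run = [], 0.0
    for v in x:
        run += v * v
        s.append(run / total)
    return s

def max_deviation(x):
    """Max |s_k - k/n|: a test statistic for a variance change."""
    n, s = len(x), cusum_of_squares(x)
    return max(abs(sk - (k + 1) / n) for k, sk in enumerate(s))

rng = random.Random(3)
stable  = [rng.gauss(0, 1) for _ in range(200)]
shifted = ([rng.gauss(0, 1) for _ in range(100)]
           + [rng.gauss(0, 3) for _ in range(100)])  # variance jumps mid-series
```

Because the statistic is built from the raw (or residual) squares alone, no time series model has to be identified first, which is the chart's selling point in the abstract.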
A Bayesian Method for Narrowing the Scope of Variable Selection in Binary Response Logistic Regression
Kim, Hea-Jung; Lee, Ae-Kyung
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 143~160
This article is concerned with the selection of subsets of predictor variables to be included in building a binary response logistic regression model. It is based on a Bayesian approach, intended to propose and develop a procedure that uses probabilistic considerations for selecting promising subsets. This procedure reformulates the logistic regression setup as a hierarchical normal mixture model by introducing a set of hyperparameters that identify subset choices, using the fact that the cdf of the logistic distribution is approximately equivalent to that of a t(8)/0.634 distribution. The appropriate posterior probability of each subset of predictor variables is obtained by the Gibbs sampler, which samples indirectly from the multinomial posterior distribution on the set of possible subset choices. Thus, in this procedure, the most promising subset of predictors can be identified as the one with the highest posterior probability. To highlight the merit of this procedure, a couple of illustrative numerical examples are given.
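The scaled-t approximation to the logistic cdf can be checked numerically; the Simpson-rule t cdf below is a self-contained stand-in for a statistics library.

```python
import math

def t_pdf(x, nu=8):
    """Density of Student's t with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1.0 + x * x / nu) ** (-(nu + 1) / 2)

def t_cdf(z, nu=8, steps=2000):
    """t cdf via Simpson's rule on [0, |z|] plus the symmetric half."""
    a, b = 0.0, abs(z)
    h = (b - a) / steps
    s = t_pdf(a, nu) + t_pdf(b, nu)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * t_pdf(a + i * h, nu)
    area = s * h / 3.0
    return 0.5 + area if z >= 0 else 0.5 - area

def logistic_cdf(x):
    return 1.0 / (1.0 + math.exp(-x))

# Max discrepancy over a grid: logistic(x) vs a t(8) variable scaled by 1/0.634,
# i.e. P(T/0.634 <= x) = P(T <= 0.634 x).
gap = max(abs(logistic_cdf(x) - t_cdf(0.634 * x))
          for x in [0.1 * k for k in range(-60, 61)])
```

The gap stays well below 0.01 across the grid, which is why the hierarchical normal mixture (a scale mixture representation of the t) can stand in for the logistic link.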
A Study on Estimating Mean Lifetime After Modifying Censored Observations
Kim, Jin-heum; Kim, Jee-hoon
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 161~171
Kim and Kim (1997) developed a method of estimating the mean lifetime based on augmented data obtained by imputing censored observations. Assuming a linear relationship between lifetime and covariates, and introducing the procedure of Buckley and James (1979) to estimate the mean lifetimes of censored observations, they proposed a mean lifetime estimator and showed its consistency under regularity conditions. In this article, Kim and Kim's estimator is compared through simulation, under various configurations, with the estimator introduced by Gill (1983). The estimator is also illustrated with two real data sets.
Dimension-Tolerance Design with Cost Factors
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 172~191
In this paper, dimension-tolerance design for components is studied. Three cost factors are considered: machining cost, rework cost, and the loss related to product quality that is affected by the tolerances of the components. We propose a procedure to determine the optimal tolerances of components and apply it to designing the tolerances of the fine-motion stage of a semiconductor machine. We compare the proposed procedure with an existing model for determining tolerances economically.
Quickest Path Algorithm for Improving Quality of Services in Communication Networks
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 192~200
The quickest path problem is important for quality of service in communication networks. It is to find a path that sends a given amount of data from the source to the sink with minimum transmission time, where the transmission time depends on both the capacities and the traversal times of the arcs in the network. The quickest path is found in networks where the capacity and the lead time of each arc are predetermined. The quickest path problem is generally solved using shortest path algorithms, and the relevant algorithms proposed so far are based on arc capacities in distributed environments. When the configuration of the network changes, there are two approaches to finding the quickest paths: find new quickest paths from scratch, or update the current quickest paths. For the latter, a distributed quickest path update algorithm has been proposed. This paper aims to propose a distributed algorithm, based on the quickest path tree update algorithm, that is applicable to finding the quickest path when the network configuration changes, and to verify its applicability by analyzing the amount of data transmitted from one node to another from a theoretical point of view.
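A standard way to compute a quickest path in a static network, sketched below (capacity enumeration plus Dijkstra; the example network and data size sigma are illustrative, and the paper's distributed update algorithm is not reproduced): for sigma units of data, a path p with lead time l(p) and bottleneck capacity c(p) transmits in l(p) + sigma/c(p).

```python
import heapq

def quickest_path(arcs, source, sink, sigma):
    """Quickest path by capacity enumeration: for each candidate bottleneck
    capacity c, keep only arcs with capacity >= c, find the minimum-lead-time
    path with Dijkstra, and score it as lead + sigma / c."""
    caps = sorted({c for _, _, _, c in arcs})
    best = (float("inf"), None)
    for c in caps:
        adj = {}
        for u, v, lead, cap in arcs:
            if cap >= c:
                adj.setdefault(u, []).append((v, lead))
        dist, heap = {source: 0.0}, [(0.0, source)]
        while heap:                       # Dijkstra on the restricted network
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, lead in adj.get(u, []):
                nd = d + lead
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        if sink in dist:
            best = min(best, (dist[sink] + sigma / c, c))
    return best  # (minimum transmission time, bottleneck capacity used)

# Arcs as (u, v, lead_time, capacity): a fast low-capacity shortcut
# versus a longer high-capacity route.
net = [("s", "a", 1, 10), ("a", "t", 1, 10), ("s", "t", 1, 2)]
```

Note that the quickest path depends on sigma: large transfers prefer the high-capacity route, small ones the short low-capacity arc, which is why shortest path algorithms alone do not settle the problem.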
Predicting the Tritium Release Accident in a Nuclear Fusion Plant
Journal of the Korean Society for Quality Management, volume 26, issue 1, 1998, Pages 201~212
A methodology for the safety analysis of a fusion power plant is introduced. It starts with an understanding of the physics and engineering of the plant, followed by an assessment of the tritium inventory and flow rate. We apply probabilistic risk assessment: an event tree that explains the propagation of the accident is constructed and then translated into an influence diagram that is statistically equivalent as far as parameter updating is concerned. We follow the Bayesian approach, where model parameters are treated as random variables. We briefly discuss the parameter updating scheme, and finally develop a methodology to obtain the predictive distribution of the time to the next severe accident.