Journal of the Korean Operations Research and Management Science Society
Journal Basic Information
Publisher : The Korean Operations and Management Science Society
Volume & Issues
Volume 21, Issue 3 - Dec 1996
Volume 21, Issue 2 - Aug 1996
Volume 21, Issue 1 - Apr 1996
On the Efficiency of Imbalance in a Class of Manufacturing Systems
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 1~10
In this paper, the problem of simultaneously allocating servers and loadings of stations in a class of manufacturing systems modelled as networks of queues is considered. The throughput function of the closed network of queues is shown to be a Schur convex function of the server allocation, that is, increasing the server allocation vector under majorization increases the performance of the shop in terms of throughput. It also reduces congestion in the open network of queues, in the sense of reducing the total number of jobs under likelihood ratio ordering. These are extensions of the numerical results of Green and Guha (1995) for service systems with independent M/M/c queues to networks of queues. The results can be used to support production planning in certain manufacturing systems.
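The throughput of a closed network of queues can be computed with exact Mean Value Analysis. The sketch below is a simplification of the paper's setting, not its model: single-server fixed-rate stations, where allocating c servers to a station would be approximated by scaling its service rate. The function names and parameters are illustrative.

```python
def mva_throughput(mu, visits, n_jobs):
    """Exact single-server Mean Value Analysis for a closed queueing network.

    mu[i]     : service rate of station i
    visits[i] : visit ratio of station i
    n_jobs    : fixed job population of the closed network
    Returns the network throughput X(n_jobs).
    """
    k = len(mu)
    L = [0.0] * k            # mean queue lengths, starting from an empty network
    X = 0.0
    for n in range(1, n_jobs + 1):
        # Arrival theorem: a job arriving at station i sees L[i] jobs (population n-1)
        R = [(1.0 + L[i]) / mu[i] for i in range(k)]
        X = n / sum(visits[i] * R[i] for i in range(k))
        L = [X * visits[i] * R[i] for i in range(k)]
    return X
```

Evaluating this for different server (rate) allocation vectors is one way to observe the majorization effect on throughput that the paper proves.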
A Study on the Relationship between New Product Development Strategies and New Product Outcomes
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 11~46
The objectives of this research paper are to identify the types of the firm's new product development strategy and their characteristics about business strategy, to examine the effect of each type on new product outcomes, and to explore the contingency variable influencing the relationship between these types and new product outcomes. The result of the research are summarized as follows : First, in terms of both the resource allocation for product innovativeness and technology acquisition method, this study suggests 9 types of the firm's new product development strategies- Type 1 (pursuing low innovative products/relying on external technology), Type 2 (pursuing low innovative products oriented/relying on internal technology), Type 3 (pursuing low innovative products/relying on mixed technology), Type 4 (pursuing high innovative products/relying on internal technology), Type 6(pursuing high innovative products /relying on mixed technology), Type 7 (balancing low and high innovative products/relying on external technology), Type 8 (balancing low and high innovative products/relying on internal technology), Type 9 (balancing low and high innovative products/relying on mixed technology). Second, these 9 types are deeply associated with the firm's business strategic variables such as product differentiation and market differentiation, and exhibit different level of both technical and commercial performance of new products. Finally, the effects of these types on new product outcomes are different according to industrial environment and firms' characteristics with respect to size and technological capability.
A Hybrid Simulation Technique for Cell Loss Probability Estimation of ATM Switch
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 47~61
An ATM switch must deal with various kinds of input sources having different traffic characteristics, and it must guarantee a very small cell loss probability, about 10, to handle loss-sensitive traffic. In order to estimate such a rare event probability by simulation, a variance reduction technique is essential for obtaining an appropriate level of precision at reduced cost. In this paper, we propose a hybrid simulation technique to reduce the variance of the cell loss probability estimator, where hybrid means the combination of an analytical method and a simulation procedure. A discrete time queueing model with multiple input sources and a finite shared buffer is considered, where the arrival process at an input source is governed by an Interrupted Bernoulli Process and the service rate is constant. We deal with heterogeneous input sources as well as the homogeneous case. The performance of the proposed hybrid simulation estimator is compared with those of the raw simulation estimator and the importance sampling estimator in terms of variance reduction ratios.
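A raw simulation estimator of the kind used as the baseline here can be sketched as follows. This is a simplification under stated assumptions: independent Bernoulli arrivals per slot rather than the paper's Interrupted Bernoulli Processes, one cell served per slot, and a shared finite buffer; all names and parameters are illustrative.

```python
import random

def simulate_loss(p_arrival, n_sources, buffer_size, n_slots, seed=0):
    """Raw Monte Carlo estimate of cell loss probability in a discrete-time
    queue: each slot, every source emits a cell with probability p_arrival,
    one cell is served per slot, and arrivals exceeding the buffer are lost."""
    rng = random.Random(seed)
    q = 0                     # cells currently in the buffer
    arrived = lost = 0
    for _ in range(n_slots):
        a = sum(rng.random() < p_arrival for _ in range(n_sources))
        arrived += a
        accepted = min(a, buffer_size - q)
        lost += a - accepted
        q = max(q + accepted - 1, 0)   # serve one cell per slot
    return lost / arrived if arrived else 0.0
```

For realistic loss probabilities this raw estimator needs enormous sample sizes, which is exactly why the paper combines it with analytical results and importance sampling.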
An Efficient Ordering Method and Data Structure of the Interior Point Method (Putting Emphasis on the Minimum Deficiency Ordering)
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 63~74
Ordering plays an important role in solving an LP problem with a sparse matrix by the interior point method. Since the ordering problem is NP-complete, we try to find an efficient heuristic. The objective of this paper is to present an efficient heuristic ordering method for implementation of the minimum deficiency method. Both the ordering method and the data structure play important roles in implementation. First we define a new heuristic pseudo-deficiency ordering method and a data structure for the method: the quotient graph and clique storage. Next we show experimental results, in terms of time and number of nonzeros, on NETLIB problems.
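The exact minimum deficiency rule that the paper's heuristic approximates can be stated in a few lines. This sketch uses naive adjacency sets rather than the paper's quotient-graph and clique storage, so it shows the rule, not the efficient implementation.

```python
def min_deficiency_order(adj):
    """Greedy minimum-deficiency (minimum fill-in) ordering.

    adj: dict mapping each vertex to the set of its neighbours.
    At each step, eliminate the vertex whose elimination creates the fewest
    fill-in edges, then connect its remaining neighbours into a clique."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy

    def deficiency(v):
        nbrs = list(adj[v])
        return sum(1 for i in range(len(nbrs)) for j in range(i + 1, len(nbrs))
                   if nbrs[j] not in adj[nbrs[i]])

    order = []
    while adj:
        v = min(adj, key=deficiency)
        nbrs = list(adj[v])
        for i in range(len(nbrs)):                # add fill-in edges
            for j in range(i + 1, len(nbrs)):
                adj[nbrs[i]].add(nbrs[j])
                adj[nbrs[j]].add(nbrs[i])
        for u in nbrs:
            adj[u].discard(v)
        del adj[v]
        order.append(v)
    return order
```

Recomputing deficiencies from scratch each step is what makes the naive rule expensive, motivating the pseudo-deficiency approximation studied in the paper.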
An Efficient Heuristic Technique for Job Shop Scheduling with Due Dates
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 75~88
This paper presents an efficient heuristic technique for minimizing tardiness-related objectives, such as total tardiness, maximum tardiness and root mean square tardiness, in job shop scheduling. The heuristic technique iteratively improves an active schedule by exploring the schedule's neighborhood using operation move methods. A move operation is defined on an active chain of tardy jobs in the active schedule. To find the move operations which have a high probability of reducing tardiness, we develop move methods by exploiting the properties of active chains. Our technique is compared with two existing heuristic techniques, MEHA (Modified Exchange Heuristic Algorithm) and GSP (Global Scheduling Procedure), under various environments with three levels of due date tightness and problems of several sizes. The experimental results show that the proposed technique outperforms the two existing techniques in terms of solution quality and computation time.
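The three tardiness objectives mentioned in the abstract are easy to state precisely; a minimal sketch (the function name and argument layout are illustrative):

```python
import math

def tardiness_measures(completion, due):
    """Tardiness objectives for a schedule: total, maximum, and root mean
    square tardiness, given per-job completion times and due dates."""
    t = [max(c - d, 0) for c, d in zip(completion, due)]   # tardiness of each job
    total = sum(t)
    tmax = max(t)
    rms = math.sqrt(sum(x * x for x in t) / len(t))
    return total, tmax, rms
```

A neighborhood-search heuristic like the one proposed would evaluate such measures after each candidate operation move and keep moves that reduce the chosen objective.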
The Effect of Factors in The Concurrent Engineering on the Efficiency of the Product Development Processes
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 89~108
During the whole product design and development process, concurrent engineering relies on strong and permanent interactions between the departmental functions. Concurrent or simultaneous engineering in new product development is a new concept which needs to be redefined. This paper deals with a concurrent engineering model which represents how concurrency, as an organizational process, is related to an interfunctional project team. Four dimensions shaping the success of concurrent engineering are suggested, with detailed measurement instruments. Moreover, an empirical study on the effects of the four dimensions on the efficiency of the product development process is carried out in the electronics industry.
Application Effects and Limitations of AHP as a Research Methodology -A Comparison of 3 Statistical Techniques for Evaluating MIS Success Factors-
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 109~125
Biases and errors in human reasoning processes have been studied continuously by researchers, especially psychologists and social scientists. These bias phenomena are classified on the basis of their origin, i.e. motivation and cognition. Furthermore, the necessity of research on bias in the management and management information systems areas, which have their academic backgrounds in psychology and social science, has increased recently. The biased information stream is transformed into systematic error due to human motivational and cognitive biases, with the following resulting phenomena: 1. the availability of salient information, 2. preconceived ideas or theories about people and events, 3. anchoring and perseverance phenomena. In order to reduce such information errors, Saaty suggested the Analytic Hierarchy Process (AHP), which is the subject of this paper and is widely used for the evaluation of complex decision making alternatives. Therefore this paper studies AHP's effects and its limitations when applied to the management area. This paper compares the performance of three models: 1. the traditional additive regression model, 2. the regression model using factor scores, and 3. the regression model with AHP. As a result, the three models produce different outcomes.
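The core AHP computation is extracting a priority vector from a reciprocal pairwise comparison matrix, classically as its principal eigenvector. A minimal sketch using power iteration (no eigenvalue library needed; the function name is illustrative):

```python
def ahp_weights(A, iters=100):
    """Approximate AHP priority vector: power iteration toward the principal
    eigenvector of a positive reciprocal pairwise comparison matrix A
    (A[i][j] = judged importance of criterion i relative to criterion j)."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]      # renormalize so the weights sum to 1
    return w
```

For a perfectly consistent matrix the result is exact; in practice one would also check Saaty's consistency ratio before using the weights in a regression model.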
A Life Cycle Model for Computer Integrated Manufacturing Systems
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 127~141
In this paper, we propose a 7-phase life cycle model which applies to Computer Integrated Manufacturing (CIM) systems. The model emphasizes the product design and manufacturing design activities of CIM to secure the critical success factors of CIM systems, such as high quality, adaptability, productivity, and flexibility. It is argued that the product design aspect should be divided into three phases: conceptual design, embodiment design, and detailed design. The conceptual design phase is to build a conceptual model of the product based on requirements and specifications which reflect "the voice of the customer". The embodiment design phase utilizes specific design tools such as DFM, CAE, and CAD, and results in a concrete model of the product and parts. The detailed design phase is to create a working prototype of the product; design tools such as DFA, CAD and CAM are employed in this phase. The output of the product design activity is the input for the manufacturing design activity. Using the proposed model, one can effectively and efficiently manufacture a high-quality, low-cost product with short delivery time, and above all achieve customer satisfaction.
Lot Sizing and Quality Inspection Schedules with Machine Breakdown
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 143~157
This paper addresses the effects of an imperfect production process on the optimal production quantity and quality inspection policies. The system is assumed to deteriorate during the production process, the result being either the production of a number of defective items or the breakdown of the production machine. A simple rule is suggested to determine whether multiple quality inspections are worthwhile. Furthermore, when the multiple inspection policy is adopted, the optimal inspection schedule is shown to be equally spaced throughout the production cycle. Exact solutions and approximations of the optimal production quantity and of the optimal number of inspections are provided. Finally, to better understand the model of this paper, comparisons between this model and the classical EMQ model are provided.
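The classical EMQ benchmark against which the paper compares its model has a closed form. A minimal sketch (parameter names are illustrative; the paper's own model adds deterioration, breakdowns and inspection costs on top of this):

```python
import math

def emq(demand, production_rate, setup_cost, holding_cost):
    """Classical Economic Manufacturing Quantity:
    Q* = sqrt(2*K*D / (h*(1 - D/P))) for demand rate D, production rate P,
    setup cost K, and unit holding cost h, assuming a perfect process."""
    return math.sqrt(2.0 * setup_cost * demand /
                     (holding_cost * (1.0 - demand / production_rate)))
```

With equally spaced inspections, an n-inspection policy would simply place inspections at production times Q*/n, 2Q*/n, ..., which is the structure the paper proves optimal.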
Combining Judgments for Better Decisions: A Study for Investigating Effective Combining Schemes
Lee, Hoon-Young ;
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 159~174
Facing decision-making tasks, managers frequently make judgments. However, since managers are human beings, the efficiency of their judgments is limited. Two major sources of inefficiency in their judgments have been recognized: one is systematic deviation from normatively preferred decisions, so-called bias or incorrect intuition, and the other is inconsistency in their judgments, i.e. erratic decision making variance. Rather than bias, it is variance that is really expensive or damaging. Thus, if the inconsistency in managers' judgments is removed, performance could be improved by far by virtue of the reduced random variance. One approach to improving managerial judgment is to simply bring managers together, effectively moderating the random variance due to inconsistency. Focusing on combining judgments, this paper addresses relevant issues such as why and how to combine judgments, and suggests methods and models to effectively aggregate subjective judgments. We conduct an experiment to validate the effectiveness of combined judgments over individual judgments. Various combining schemes are also evaluated in terms of their predictive accuracy. Among them, the mean bias based weighting scheme turns out to be the best. However, when available information is not enough to estimate the expertise of judges, simple and robust equal weighting might be more efficient and productive. This urges imperative future research on the issue of how many judges, and which ones, to combine from a large set of experts.
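The two combining schemes contrasted in the abstract, equal weighting and expertise-based weighting, can be sketched as one small function. This is an illustrative implementation, not the paper's exact scheme; how the weights are estimated (e.g. from each judge's mean bias) is left to the caller.

```python
def combine(judgments, weights=None):
    """Aggregate individual point judgments into a single estimate.

    judgments : list of individual numeric judgments
    weights   : optional per-judge weights (e.g. larger for judges with
                smaller historical mean bias); equal weighting by default.
    """
    if weights is None:
        weights = [1.0] * len(judgments)       # simple, robust equal weighting
    s = sum(weights)
    return sum(w * x for w, x in zip(weights, judgments)) / s
```

Averaging cancels the judges' independent random errors, which is the variance-reduction argument the paper makes for combining at all.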
A Methodology for Deriving An Object Model by Using Structured Analysis Results
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 175~195
In conventional analysis methods, data and process are loosely coupled for building information systems. Several object oriented approaches have been proposed to integrate data and process. However, object oriented analysis requires a radical paradigm shift, and thus system analysts find it difficult to generate object models directly from end users. To alleviate these difficulties, this paper proposes a methodology for deriving an object model by using structured analysis results. Objects are obtained primarily from entities in the Entity-Relationship Diagram. Methods are obtained through the analysis of the relationship between processes and data stores in the Data Flow Diagram, and are assigned to the objects by using object/process matrices. A real-life case is illustrated to demonstrate the usefulness of the methodology.
A Study on the Management of Stock Data with an Object Oriented Database Management System
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 197~214
Financial analysis of stock data usually involves extensive computation over large time series data sets. To handle the large size of the data sets and the complexity of the analyses, database management systems have been increasingly adopted for efficient management of stock data. In particular, relational database management systems are employed widely due to their simple data management approach. However, the normalized two-dimensional tables and the structured query language of the relational system turn out to be less effective than expected in accommodating time series stock data and the various computational operations on it. This paper explores a new approach to stock data management on the basis of an object-oriented database management system (ODBMS), and proposes a data model supporting time series data storage and incorporating a set of financial analysis functions. In terms of functional stock data analysis, it focuses especially on a primitive set of operations such as the variance of stock data. In accomplishing this, we first point out the problems of a relational approach to the management of stock data and show the strengths of the ODBMS. We then propose an object model delineating the structural relationships among objects used in stock data management and the behavioral operations involved in the financial analysis. A prototype system is developed using a commercial ODBMS.
K₄-chain Reductions for Computing 2-terminal Reliability in an Undirected Network
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 215~225
For an undirected stochastic network G, the 2-terminal reliability of G, R(G), is the probability that two specified nodes (called terminal nodes) are connected in G. A typical network reliability problem is to compute R(G). It has been shown that the computation of R(G) is NP-hard, so any algorithm to compute R(G) has a running time which is exponential in the size of G. If by some means the size of the problem G is reduced, it can result in immense savings. The means to reduce the size of the problem are reliability preserving reductions and graph decompositions. We introduce a new set of reliability preserving reductions: the K₄ (complete graph on 4 nodes)-chain reductions. The total number of different K₄-chain types in R(G) is 6. We present the reduction formula for each type. In computing R(G), it is also possible that graphs homeomorphic to K₄ occur. We divide the graphs homeomorphic to K₄ into 3 types. We develop reliability preserving reductions for two of the types, and show that the remaining one can be divided into two subgraphs which can be reduced by K₄-chain reductions and polygon-to-chain reductions.
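The quantity R(G) being reduced can be computed exactly, though only for tiny graphs, by enumerating all edge up/down states. This brute-force sketch (illustrative names; exponential in the number of edges) shows why shrinking G with reliability preserving reductions first pays off so much.

```python
from itertools import product

def two_terminal_reliability(edges, s, t):
    """Exact 2-terminal reliability R(G) by state enumeration.

    edges : list of (u, v, p) where p is the probability edge (u, v) works
    s, t  : the two terminal nodes
    Runs in O(2^|E|), so it is only usable after the graph has been reduced."""
    def connected(up):
        seen, frontier = {s}, [s]
        while frontier:
            u = frontier.pop()
            for a, b in up:
                for x, y in ((a, b), (b, a)):
                    if x == u and y not in seen:
                        seen.add(y)
                        frontier.append(y)
        return t in seen

    total = 0.0
    for states in product([0, 1], repeat=len(edges)):
        prob, up = 1.0, []
        for (u, v, p), st in zip(edges, states):
            if st:
                prob *= p
                up.append((u, v))
            else:
                prob *= 1.0 - p
        if connected(up):
            total += prob
    return total
```

Each K₄-chain or polygon-to-chain reduction removes edges while preserving R(G), directly shrinking the 2^|E| term above.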
Constrained Integer Multiobjective Linear Fractional Programming Problem
Thirwani, Deepa ; Arora, S.R. ;
Journal of the Korean Operations Research and Management Science Society, volume 21, issue 3, 1996, Pages 227~236
In this paper, an algorithm based on a cutting plane approach is developed which constructs all the efficient p-tuples of the multiobjective integer linear fractional programming problem. The integer solution is constrained to satisfy h out of n additional constraint sets. A numerical illustration in support of the proposed algorithm is provided.