• Title/Summary/Keyword: Pareto principle


A Study on Inequality Analysis of Academic Information Sharing in University Libraries using Gini's Coefficient and Pareto Ratio (지니계수와 파레토 비율을 활용한 학술정보공유 기여에 대한 대학도서관 격차 분석)

  • Cho, Jane
    • Journal of the Korean BIBLIA Society for Library and Information Science
    • /
    • v.31 no.1
    • /
    • pp.237-255
    • /
    • 2020
  • The Pareto principle states that, for many events, roughly 80% of the effects come from 20% of the causes. This study examined whether the Pareto principle holds in Korean universities' academic information resource sharing network and calculated the Gini coefficient for inequality in the sharing of academic resources. As a result, the top 20% of libraries accounted for 80% of performance, and the degree of inequality was 0.8, a very serious level. The relative Gini coefficient, recalculated to account for the scale of each library, was slightly lower at 0.7. This means that the phenomenon is not caused by differences in university scale, with large universities contributing much and small universities contributing little. In a comparison of inequality between university types, inequality among community colleges and among private universities was more serious than among four-year colleges and national universities, respectively. Finally, visualizing the distribution of participating libraries showed that there were libraries with overwhelming contributions, while libraries with small but relatively high contribution levels were also distributed.
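The two measures used in this abstract, the Gini coefficient and the top-20% contribution share, can be sketched in a few lines of Python (a minimal illustration with hypothetical function names, not the study's actual computation):

```python
def gini(values):
    """Gini coefficient of non-negative contributions (0 = perfect equality).

    Uses the sorted-rank identity: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n,
    where x_i are the values sorted ascending and i runs from 1 to n.
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

def top_share(values, fraction=0.2):
    """Share of the total contributed by the top `fraction` of participants."""
    xs = sorted(values, reverse=True)
    k = max(1, round(fraction * len(xs)))
    return sum(xs[:k]) / sum(xs)
```

For example, with contributions `[100, 50, 10, 5, 5, 3, 2, 1, 1, 1]`, the top 20% (2 of 10 libraries) account for 150 of 178 units, roughly 84% of the total.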

Multi-objective topology and geometry optimization of statically determinate beams

  • Kozikowska, Agata
    • Structural Engineering and Mechanics
    • /
    • v.70 no.3
    • /
    • pp.367-380
    • /
    • 2019
  • The paper concerns topology and geometry optimization of statically determinate beams with arbitrary number of supports. The optimization problem is treated as a bi-criteria one, with the objectives of minimizing the absolute maximum bending moment and the maximum deflection for a uniform gravity load. The problem is formulated and solved using the Pareto optimality concept and the lexicographic ordering of the objectives. The non-dominated sorting genetic algorithm NSGA-II and the local search method are used for the optimization in the Pareto sense, whereas the genetic algorithm and the exhaustive search method for the lexicographic optimization. Trade-offs between objectives are examined and sets of Pareto-optimal solutions are provided for different topologies. Lexicographically optimal beams are found assuming that the maximum moment is a more important criterion. Exact formulas for locations and values of the maximum deflection are given for all lexicographically optimal beams of any topology and any number of supports. Topologies with lexicographically optimal geometries are classified into equivalence classes, and specific features of these classes are discussed. A qualitative principle of the division of topologies equivalent in terms of the maximum moment into topologies better and worse in terms of the maximum deflection is found.
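The Pareto-optimality concept used here, the non-dominated subset for two minimized objectives, can be illustrated with a brute-force filter (a sketch only; NSGA-II adds fast non-dominated sorting and crowding-distance machinery not shown):

```python
def pareto_front(points):
    """Return the non-dominated subset of (f1, f2) pairs, minimizing both.

    A point p is dominated if some other point q is no worse in both
    objectives and strictly better in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            q != p and q[0] <= p[0] and q[1] <= p[1]
            and (q[0] < p[0] or q[1] < p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

A lexicographic ordering of the two objectives, as used for the beams where the maximum moment is the more important criterion, amounts to `min(points)` on the tuples instead of keeping the whole front.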

Design of Small Antennas with Inductively Coupled Feed Using a Pareto Genetic Algorithm (Pareto 유전자 알고리즘을 이용한 초소형 유도결합 안테나 설계)

  • Cho Chihyun;Choo Hosung;Park Ikmo;Kim Youngkil
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.16 no.1 s.92
    • /
    • pp.40-48
    • /
    • 2005
  • In this paper, we explore the inductively coupled feed concept and propose a class of electrically small planar antennas. The antennas are optimized using NEC in conjunction with a Pareto GA. These antennas show good efficiency and bandwidth performance without any additional matching network. Several optimized designs are fabricated and measured. We explain the operating principle of these antennas using a simple lumped-element circuit model. The proposed antennas are realized as printed lines on a Duroid substrate for use as RFID tag antennas.
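The Pareto GA used in this and the later antenna papers can be caricatured as a toy bi-objective genetic algorithm on a single real design variable (all names hypothetical; the actual optimizer couples a GA to NEC field simulations of real antenna geometries):

```python
import random

def dominates(a, b):
    """a dominates b (minimization): no worse in both objectives, better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

def pareto_ga(evaluate, bounds, pop_size=40, generations=60, seed=0):
    """Minimal bi-objective Pareto GA: non-dominated parents breed the next
    generation via midpoint crossover and Gaussian mutation."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(x, evaluate(x)) for x in pop]
        front = [x for x, fx in scored
                 if not any(dominates(fy, fx) for _, fy in scored)]
        pop = []
        while len(pop) < pop_size:
            a, b = rng.choice(front), rng.choice(front)
            child = (a + b) / 2 + rng.gauss(0, 0.1 * (hi - lo))
            pop.append(min(hi, max(lo, child)))  # clip to the design bounds
    scored = [(x, evaluate(x)) for x in pop]
    return [x for x, fx in scored
            if not any(dominates(fy, fx) for _, fy in scored)]
```

With `evaluate = lambda x: (x**2, (x - 2)**2)` the returned front clusters along the trade-off segment between the two single-objective optima at 0 and 2.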

Pareto Analysis of Experimental Data by L18(2 × 3^7) Orthogonal Array (L18(2 × 3^7) 직교배열표 실험자료에 대한 파레토 그림 분석)

  • 임용빈
    • The Korean Journal of Applied Statistics
    • /
    • v.17 no.3
    • /
    • pp.499-505
    • /
    • 2004
  • The Pareto diagram analysis of experimental data from two-level orthogonal arrays has been widely used in practice, since it is a quick and easy graphical method for screening significant effects without resorting to the analysis of variance. For the analysis of experimental data from the $L_{18}(2 \times 3^7)$ orthogonal array, Park (1996) proposed a Pareto ANOVA in which the size of an effect is defined by its mean square and the Pareto principle is applied. In this paper, a new approach to the Pareto diagram analysis of experimental data from the $L_{18}(2 \times 3^7)$ orthogonal array is proposed. The main idea is to partition the size of each three-level effect into the sizes of the linear and quadratic orthogonal contrasts of that effect.
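The partitioning idea, splitting a three-level effect's sum of squares into its linear and quadratic orthogonal contrasts, can be sketched as follows (assuming equal replication r per level and the standard contrast coefficients (-1, 0, 1) and (1, -2, 1); function names are illustrative):

```python
def contrast_ss(level_means, r):
    """Partition a three-level factor's sum of squares (2 df) into linear
    and quadratic orthogonal-contrast components, with r runs per level.

    SS_linear + SS_quadratic equals the usual between-level sum of squares.
    """
    m1, m2, m3 = level_means
    ss_linear = r * (m3 - m1) ** 2 / 2              # contrast (-1, 0, 1)
    ss_quadratic = r * (m1 - 2 * m2 + m3) ** 2 / 6  # contrast (1, -2, 1)
    return ss_linear, ss_quadratic
```

For level means (1, 2, 4) with r = 6 (as in an L18 column), the factor sum of squares is 28, splitting into 27 (linear) plus 1 (quadratic), so the linear component would dominate a Pareto diagram of contrast sizes.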

Pareto Ratio and Inequality Level of Knowledge Sharing in Virtual Knowledge Collaboration: Analysis of Behaviors on Wikipedia (지식 공유의 파레토 비율 및 불평등 정도와 가상 지식 협업: 위키피디아 행위 데이터 분석)

  • Park, Hyun-Jung;Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.3
    • /
    • pp.19-43
    • /
    • 2014
  • The Pareto principle, also known as the 80-20 rule, states that roughly 80% of the effects come from 20% of the causes for many events, including natural phenomena. It has been recognized as a golden rule in business, with wide application of findings such as 20 percent of customers accounting for 80 percent of total sales. On the other hand, the Long Tail theory, pointing out that "the trivial many" produce more value than "the vital few," has gained popularity in recent times with a tremendous reduction of distribution and inventory costs through the development of ICT (Information and Communication Technology). This study set out to illuminate how these two primary business paradigms, the Pareto principle and the Long Tail theory, relate to the success of virtual knowledge collaboration. The importance of virtual knowledge collaboration is soaring in this era of globalization and virtualization transcending geographical and temporal constraints. Many previous studies on knowledge sharing have focused on the factors that affect knowledge sharing, seeking to boost individual knowledge sharing and resolve the social dilemma caused by the fact that rational individuals are likely to consume rather than contribute knowledge. Knowledge collaboration can be defined as the creation of knowledge not only by sharing knowledge, but also by transforming and integrating such knowledge. In this perspective, the relative distribution of knowledge sharing among participants can count as much as the absolute amount of individual knowledge sharing. In particular, whether a greater contribution by the upper 20 percent of participants enhances the efficiency of overall knowledge collaboration is an issue of interest. This study deals with the effect of this kind of knowledge sharing distribution on the efficiency of knowledge collaboration, extended to reflect the characteristics of the work.
All analyses were conducted on actual behavioral data instead of self-reported questionnaire surveys. More specifically, we analyzed the collaborative behaviors of the editors of 2,978 English Wikipedia featured articles, the best quality grade of articles in English Wikipedia. We adopted the Pareto ratio, the ratio of the number of knowledge contributions made by the upper 20 percent of participants to the total number of knowledge contributions made by all participants of an article group, to examine the effect of the Pareto principle. In addition, the Gini coefficient, which represents the inequality of income among a group of people, was applied to reveal the effect of inequality of knowledge contribution. Hypotheses were set up based on the assumption that a higher ratio of knowledge contribution by more highly motivated participants leads to higher collaboration efficiency, but that if the ratio gets too high, collaboration efficiency deteriorates because overall informational diversity is threatened and the knowledge contribution of less motivated participants is discouraged. Cox regression models were formulated for each of the focal variables, the Pareto ratio and the Gini coefficient, with seven control variables such as the number of editors involved in an article, the average time between successive edits of an article, and the number of sections a featured article has. The dependent variable of the Cox models is the time from article initiation to promotion to featured-article status, indicating the efficiency of knowledge collaboration. To examine whether the effects of the focal variables vary depending on the characteristics of a group task, we classified the 2,978 featured articles into two categories, Academic and Non-academic. Academic articles cite at least one paper published in an SCI, SSCI, A&HCI, or SCIE journal.
We assumed that academic articles are more complex, entail more information processing and problem solving, and thus require more skill variety and expertise. The analysis results indicate the following. First, the Pareto ratio and the inequality of knowledge sharing relate in a curvilinear fashion to collaboration efficiency in an online community, promoting it up to an optimal point and undermining it thereafter. Second, this curvilinear effect is more pronounced for the more academic tasks in an online community.
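The Pareto ratio as defined in this abstract, and the vertex of a fitted curvilinear (quadratic) effect, can be sketched as follows (hypothetical function names and toy data; the study's Cox regression models are not reproduced):

```python
def pareto_ratio(edit_counts, top_fraction=0.2):
    """Contributions of the upper `top_fraction` of participants divided by
    the total contributions of the article group."""
    xs = sorted(edit_counts, reverse=True)
    k = max(1, round(top_fraction * len(xs)))
    return sum(xs[:k]) / sum(xs)

def optimal_point(b1, b2):
    """Vertex -b1 / (2 * b2) of a fitted quadratic effect b1*x + b2*x**2
    (b2 < 0): the value of x at which the modeled efficiency peaks."""
    return -b1 / (2 * b2)
```

For instance, edit counts `[50, 30, 10, 5, 3, 1, 1, 0, 0, 0]` give a Pareto ratio of 0.8, matching the 80-20 pattern exactly.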

Design of Crooked Wire Antennas for UHF Band RFID Reader (UHF 대역 RFID 리더용 Crooked Wire 안테나 설계)

  • Choo Jae-Yul;Choo Ho-Sung;Park Ik-Mo;Oh Yi-Sok
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.16 no.5 s.96
    • /
    • pp.472-481
    • /
    • 2005
  • This paper reports the design of RFID reader antennas working in the UHF band. The reader antennas were designed using a Pareto genetic algorithm (Pareto GA). The antennas were optimized to have circular polarization (CP) with less than 3 dB axial ratio, impedance matching with VSWR < 2 within the UHF frequency range, an adequate readable range, and a restricted size (kr < 2.22) reflecting practical conditions. After Pareto GA optimization, we selected and built the most suitable antenna design and compared the measured results with simulations. The operating principle of the antenna is explained by investigating the amplitude and phase of the induced current on the antenna body. We also examined the stability of the antenna with respect to manufacturing error and identified the critical design parameters by applying the random error method to the antenna bend points.

Folded Loop Antennas for RFID Application (RFID 응용을 위한 폴디드-루프 안테나)

  • Choi, Tea-Il
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.2 no.4
    • /
    • pp.199-202
    • /
    • 2007
  • In this paper, we examined the operating principle of a passive tag antenna for UHF-band RFID systems. Based on this study, we propose a novel RFID tag antenna that adopts an inductively coupled feeding structure to match the antenna impedance to a capacitively loaded commercial tag chip. The proposed tag antenna consists of microstrip lines on a thin PET substrate for low-cost fabrication. The detailed structure of the tag antenna was optimized using the full electromagnetic wave simulator IE3D in conjunction with a Pareto genetic algorithm, and the size of the tag antenna can be reduced to kr = 0.27 (2 cm²). We built sample antennas and measured characteristics such as return loss, efficiency, and radiation patterns. The readable range of the tag antenna with a commercial RFID system was about 1 to 3 m.


Optimization of productivity in the rehabilitation of building linked to BIM

  • Boulkenafet Nabil;Boudjellal Khaled;Bouabaz Mohamed
    • Advances in Computational Design
    • /
    • v.8 no.2
    • /
    • pp.179-190
    • /
    • 2023
  • In this paper, building information modelling (BIM) associated with the principle of significant items, applied to quantities and costs in optimizing productivity for building rehabilitation, is proposed and discussed. A quantitative and qualitative field study based on parameters such as pathology diagnosis, project documents, and bills of quantities was used for model development at the preliminary stage of this work. The study identified 14 quantity-significant items, specified by cost value using the 80/20 Pareto rule, through the integration of BIM in the optimization of labour productivity for building rehabilitation. The results of this study reveal the reliability and the improvement of labour productivity achieved by a BIM process integrating quantity- and cost-significant items.
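Selecting cost-significant items by the 80/20 rule, ranking items by cost and keeping those that together reach 80% of the total, can be sketched as follows (illustrative names and data, not the study's actual 14 items):

```python
def cost_significant_items(costs, threshold=0.8):
    """Return item names whose cumulative cost reaches `threshold` of the
    total, scanning from the most expensive item downward (80/20 rule)."""
    total = sum(costs.values())
    selected, running = [], 0.0
    for name, cost in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
        if running >= threshold * total:
            break
        selected.append(name)
        running += cost
    return selected
```

With hypothetical bill-of-quantities data such as `{"concrete": 50, "steel": 25, "paint": 5, "labour": 15, "misc": 5}`, three of the five items already cover 80% of total cost and would be flagged as cost-significant.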

An Empirical Study on the Development Propensity and Quality of the Public Software Project (공공소프트웨어 사업의 개발 성향과 품질에 대한 실증적 연구)

  • Kim Yong Kyong;Kim Pyung Kee
    • Journal of Information Technology Applications and Management
    • /
    • v.11 no.4
    • /
    • pp.147-167
    • /
    • 2004
  • This study was performed empirically to examine the development propensity and quality of public software projects in Korea. The sample employed in this study contains 168 auditing reports on 107 public software projects carried out between 1998 and 2003. The important findings of this study can be summarized as follows. The quality issue in the development process is becoming more important over time. In addition, the importance of end users' convenience increases from year to year. Although the Pareto principle (the 20:80 rule) does not apply strictly, most problems are caused by a few items. Finally, we find evidence that the overall quality of public software is positively influenced by information system auditing.


Modified inverse moment estimation: its principle and applications

  • Gui, Wenhao
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.6
    • /
    • pp.479-496
    • /
    • 2016
  • In this survey, we present a modified inverse moment estimation of parameters and its applications. We use a specific model to demonstrate its principle and how to apply this method in practice. The estimation of unknown parameters is considered. For classical maximum likelihood estimation, a necessary and sufficient condition for the existence and uniqueness of the maximum likelihood estimates of the parameters is obtained. Inverse moment and modified inverse moment estimators are proposed and their properties are studied. Monte Carlo simulations are conducted to compare the performances of these estimators. As far as biases and mean squared errors are concerned, the modified inverse moment estimator performs best in all cases considered for estimating the unknown parameters, followed by the inverse moment estimator and the maximum likelihood estimator, especially for small sample sizes.
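The Monte Carlo methodology described here, comparing estimators by empirical bias and mean squared error, can be illustrated on a simpler, standard case: two estimators of an exponential rate parameter, the MLE n/sum(x) and its bias-corrected variant (n-1)/sum(x). This is a hedged sketch of the comparison method only; it is not the paper's model nor its modified inverse moment estimator:

```python
import random

def compare_estimators(rate=2.0, n=10, reps=20000, seed=1):
    """Monte Carlo comparison of two estimators of an exponential rate:
    the MLE n/sum(x) and the bias-corrected (n-1)/sum(x).
    Returns {name: (empirical bias, empirical MSE)}."""
    rng = random.Random(seed)
    draws = {"mle": [], "corrected": []}
    for _ in range(reps):
        s = sum(rng.expovariate(rate) for _ in range(n))
        draws["mle"].append(n / s)
        draws["corrected"].append((n - 1) / s)
    out = {}
    for name, est in draws.items():
        bias = sum(est) / reps - rate
        mse = sum((e - rate) ** 2 for e in est) / reps
        out[name] = (bias, mse)
    return out
```

Since the corrected estimator is exactly unbiased for the exponential rate, its empirical bias should come out much smaller than the MLE's, mirroring the kind of small-sample ranking the abstract reports.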