• Title/Summary/Keyword: Time Complexity

WHAT CAN WE SAY ABOUT THE TIME COMPLEXITY OF ALGORITHMS?

  • Park, Chin-Hong
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.3
    • /
    • pp.959-973
    • /
    • 2001
  • We discuss one of the techniques needed to analyze algorithms: the big-O function technique. The efficiency of an algorithm is measured in two ways. One is the time a computer takes to solve the problem using the algorithm when the input values are of a specified size. The other is the amount of computer memory required to implement the algorithm for inputs of that size. We restrict our attention mainly to time complexity. Determining the time complexity of nonlinear problems in numerical analysis seems to be almost impossible.
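The big-O comparison the abstract describes can be sketched by counting basic operations as a function of input size n; the two search routines below are illustrative assumptions, not examples from the paper:

```python
# Contrast the growth rates that big-O notation captures by counting the
# worst-case comparisons two textbook algorithms perform on inputs of size n.

def linear_search_ops(n: int) -> int:
    """Worst-case comparisons for linear search: O(n)."""
    return n

def binary_search_ops(n: int) -> int:
    """Worst-case comparisons for binary search: O(log n)."""
    ops = 0
    while n > 1:
        n //= 2  # each comparison halves the remaining search range
        ops += 1
    return ops

for n in (8, 1024, 1_048_576):
    print(n, linear_search_ops(n), binary_search_ops(n))
```

Doubling n doubles the linear count but adds only one step to the logarithmic count, which is the distinction big-O notation makes precise.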

Software Complexity and Management for Real-Time Systems

  • Agarwal Ankur;Pandya A.S.;Lbo Young-Ubg
    • Journal of information and communication convergence engineering
    • /
    • v.4 no.1
    • /
    • pp.23-27
    • /
    • 2006
  • The discipline of software performance is very broad; it influences all aspects of the software development lifecycle, including architecture, design, deployment, integration, management, evolution, and servicing. The complexity of software is therefore an important aspect of development and maintenance activities. Much research has been dedicated to defining software measures that capture what software complexity is. In most cases, complexity is described to humans in the form of numbers, and these quantitative measures reflect human-perceived complexity with varying levels of success. Software complexity growth has been recognized to be beyond human control. In this paper, we focus our discussion on increasing software complexity and the problems faced in managing it. This increasing complexity in turn affects software productivity, which declines as complexity grows.

Korean Maintainability Prediction Methodology Reflecting System Complexity (시스템 복잡도를 반영한 한국형 정비도 예측 방법론)

  • Kwon, Jae-Eon;Hur, Jang-Wook
    • Journal of the Korean Society of Manufacturing Process Engineers
    • /
    • v.20 no.4
    • /
    • pp.119-126
    • /
    • 2021
  • During the development of a weapon system, the concept of maintainability is used to quantitatively predict and analyze maintenance time. However, owing to the complexity of a weapon system, the standard maintenance time predicted during development differs significantly from the time measured while operating the equipment after development. According to the analysis presented in this paper, the maintenance time can be predicted by considering the system's complexity on the basis of the military specifications, using Part B of Procedure II and Method B of Procedure V. The maintenance work elements affected by system complexity were identified with the analytic hierarchy process (AHP) technique, and the weights reflecting system complexity for those work elements were calculated by the Delphi method, which involves expert surveys. Based on MIL-HDBK-470A and MIL-HDBK-472, this paper presents a Korean-style maintainability prediction method that reflects the system complexity of weapon systems.
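The analytic hierarchy process step mentioned in the abstract derives weights from pairwise comparisons of elements; a minimal sketch using the common geometric-mean approximation (the 3x3 judgment matrix below is a made-up example, not the study's data):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix
    via the row geometric mean, normalized to sum to 1."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical pairwise judgments (Saaty's 1-9 scale): entry [i][j] states
# how much more element i contributes to complexity than element j.
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The exact AHP uses the principal eigenvector of the matrix; the geometric mean is a standard, close approximation for small matrices.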

Time Complexity Analysis of MSP Term Grouping Algorithm for Binary Neural Networks (이진신경회로망에서 MSP Term Grouping 알고리즘의 Time Complexity 분석)

  • 박병준;이정훈
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2000.11a
    • /
    • pp.85-88
    • /
    • 2000
  • This paper analyzes the time complexity of the MSP Term Grouping (MTG) algorithm, a method for synthesizing minimized binary neural networks with the Threshold Logic Unit (TLU) as the basic neuron. By comparing it with binary neural network synthesis based on exhaustive pattern search, we show the effectiveness of the MTG algorithm.

A Study on Representation of Ada Tasking Execution Time Complexity using ATSN (ATSN을 이용한 Ada Tasking 실행 시간 복잡도 표현에 관한 연구)

  • 이병복;유철중;김용성;장옥배
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.18 no.5
    • /
    • pp.695-707
    • /
    • 1993
  • The Marked Petri Net (MPN) is used to analyze communication complexity with respect to the maximum number of concurrently active rendezvous. However, because it cannot represent timing and probability characteristics explicitly, it cannot be used to analyze the execution time complexity of that communication. To analyze this execution time complexity effectively, we propose Ada Tasking Structure Nets (ATSN), which introduce restrictive conditions: net reduction rules, execution time, and probability. Finally, we show the power of ATSN for analyzing communication complexity.

Nonlinear Quality Indices Based on a Novel Lempel-Ziv Complexity for Assessing Quality of Multi-Lead ECGs Collected in Real Time

  • Zhang, Yatao;Ma, Zhenguo;Dong, Wentao
    • Journal of Information Processing Systems
    • /
    • v.16 no.2
    • /
    • pp.508-521
    • /
    • 2020
  • We compared a novel encoding Lempel-Ziv complexity (ELZC) with three common complexity algorithms, i.e., approximate entropy (ApEn), sample entropy (SampEn), and classic Lempel-Ziv complexity (CLZC), to determine a suitable complexity measure and corresponding quality indices for assessing the quality of multi-lead electrocardiograms (ECGs). First, we ran the algorithms on six artificial time series to compare their ability to discern randomness and the inherent irregularity within time series. Then, to analyze the algorithms' sensitivity to the level of different noises within the ECG, we investigated their trends on five artificial synthetic noisy ECGs containing different noises at several signal-to-noise ratios. Finally, three quality indices based on the ELZC of the multi-lead ECG were proposed to assess the quality of 862 real 12-lead ECGs from the MIT databases. The results showed that the ELZC could discern randomness and inherent irregularity within the six artificial time series and could also reflect the level of different noises within the five artificial synthetic ECGs. The AUCs of the three ELZC quality indices were statistically significant (>0.500). The ELZC and its three indices were more suitable for multi-lead ECG quality assessment than the other three algorithms.
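The classic Lempel-Ziv complexity (CLZC) used as a baseline above counts the phrases in an exhaustive parsing of a symbol sequence; a minimal sketch of that count follows (the paper's ELZC encoding step and the ECG binarization are not reproduced here):

```python
def lz76_complexity(s: str) -> int:
    """Classic Lempel-Ziv (LZ76) complexity: the number of phrases in the
    exhaustive parsing of a sequence. Each new phrase is the shortest
    substring starting at the current position that has not already
    appeared earlier in the sequence (overlap allowed)."""
    n, i, phrases = len(s), 0, 0
    while i < n:
        j = i + 1
        # Grow the candidate phrase while s[i:j] already occurs in s[:j-1].
        while j < n and s[i:j] in s[:j - 1]:
            j += 1
        phrases += 1  # s[i:j] is a new phrase
        i = j
    return phrases

# The canonical example sequence parses as 0|001|10|100|1000|101.
print(lz76_complexity("0001101001000101"))  # prints 6
```

Higher counts indicate more irregular sequences, which is why this family of measures can separate clean ECG segments from noisy ones.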

The Complexity Evaluation System of Automobile Subassembly for Recycling (자원 재활용을 위한 자동차 조립군의 복잡도 평가시스템)

  • Mok, Hak-Soo;Moon, Kwang-Sup;Kim, Sung-Ho;Moon, Dae-Sung
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.16 no.5 s.98
    • /
    • pp.132-144
    • /
    • 1999
  • In this study, the complexity of a product was evaluated quantitatively considering the product structure, assembly process, and disassembly process. To evaluate product complexity, subassemblies of an automobile were analyzed, and the characteristics of each part and subassembly were determined according to the product structure, assembly process, and disassembly process. Evaluation criteria for complexity were determined from these characteristics. Experiential evaluation was performed using the classified evaluation criteria, and time-motion evaluation was performed using the motion factors related to the part and subassembly characteristics in MTM (Methods-Time Measurement) and WF (Work Factor). The total complexity of the product was determined from the experiential and time-motion evaluations.

A Study on the Propriety of the Medical Insurance Fee Schedule of Surgical Operations - In Regard to the Relative Price System and the Classification of the Price Unit of Insurance Fee Schedule - (수술수가의 적정성에 관한 연구 - 상대가격체계와 항목분류를 중심으로 -)

  • Oh Jin Joo
    • Journal of Korean Public Health Nursing
    • /
    • v.2 no.2
    • /
    • pp.21-44
    • /
    • 1988
  • In Korea, fee-for service reimbursement has been adopted from the begining of medical insurance system in 1977, and the importance of the relative value unit is currently being investigated. The purpose of this study was to find out the level of propriety of the difference in the fees for different surgical services, and the appropriateness of the classification of the insurance fee schedule. For the purpose of this study, specific subjects and the procedural methodology is shown as follows: 1. The propriety of the Relative Price System(RPS). 1) Choice of sample operations. In this study, sample operations were selected and classified by specialists in general surgery, and the number of items they classified were 32. For the same group of operations the Insurance Fee Schedule(IFS) classified the operations into 24 separate items. In order to investigate the propriety of the RPS, one of the purpose of this study, was to examine the 24 items classified by the IFS. 2) Evaluation of the complexity of surgery. The data used in this study was collected The data used in this study was collected from 94 specialists in general surgery by mail survey from November I to 15, 1986. Several independent variables (age, location, number of bed, university hospital, whether the medical institution adopt residents or not) were also investigated for analysis of the characteristics of surgical complexity. 3) Complexity and time calculations. Time data was collected from the records of the Seoul National University' Hospital, and the cost per operation was calculated through cost finding methods. 4) Analysis of the propriety of the Relative Price System of the Insurance Fee Schedule. The Relative Price System of the sample operation was regressed on the cost, time, comlexity relative ,value system (RVS) separately. The coefficient of determination indicates the degree of variation in the RPS of the Insurance Fee Schedule explained by the cost, time, complexity RVS separately. 2. 
The appropriateness of the classification of the Insurance Fee Schedule. 1) Choice of sample operations. The items which differed between the classification of the specialist and the classification of medical, Insurance Fee Schedule were chosen. 2) Comparisons of cost, time and complexity between the items were done to evaluate which classification was more appropriate. The findings of the study can be summarized as follows: 1. The coefficient of determination of the regression of the RPS on-cost RVS was 0.58, on time RVS was 0.65, and on complexity RVS was 0.72. This means that the RPS of Insurance Fee Schedule is improper with respect to the cost, time, complexity separately. Thus this indicates that RPS must be re-shaped according to the standard element. In this study, the correlation coefficients of cost, time, complexity Relative Value System were very high, and this suggests that RPS could be reshaped I according to anyone standard element. Considering of measurement, time was thought to be the most I appropriate. 2. The classifications of specialist and of the Insurance Fee Schedule were compared with respect to cost, time, and complexity separately. For complexity, ANOVA was done and the others were compared to the different values of different classifications. The result was that the classification of specialist was more reasonable and that the classification of Insurance Fee Schedule grouped inappropriately several into one price unit.
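The regression step described above, regressing the RPS on a relative value system and reading off the coefficient of determination, can be sketched as follows (the five data points are made-up illustrative numbers, not the study's data):

```python
# Coefficient of determination (R^2) for a simple linear regression of y on x.
# For one predictor, R^2 equals the squared Pearson correlation.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

rps = [1.0, 1.8, 2.9, 4.2, 5.1]       # hypothetical fee-schedule relative prices
time_rvs = [1.0, 2.0, 3.0, 4.0, 5.0]  # hypothetical time-based relative values
print(round(r_squared(time_rvs, rps), 3))  # share of RPS variation explained
```

An R^2 of 0.65, as the study reports for the time RVS, would mean 65% of the variation in the fee schedule's relative prices is explained by the time-based relative values.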

An Analysis of Effective Throughput in Distributed Wireless Scheduling

  • Radwan, Amr
    • Journal of Korea Multimedia Society
    • /
    • v.19 no.2
    • /
    • pp.155-162
    • /
    • 2016
  • Several distributed scheduling policies have been proposed with the objective of attaining the maximum throughput region or a guaranteed fraction of it. These policies consider only the theoretical throughput and do not account for the throughput lost to the time complexity of implementing an algorithm in practice. Therefore, we propose a novel concept called effective throughput to characterize the actual throughput, taking time complexity into account. Effective throughput can be viewed as the actual transmitted data, excluding the control message overhead. Numerical results demonstrate that in practical scheduling, time complexity significantly affects throughput: performance degrades when the time complexity is high.
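The effective-throughput idea can be illustrated with a simple model in which a fixed fraction of each scheduling frame is consumed by the algorithm's control overhead; the model and the numbers below are assumptions for illustration, not the paper's exact formulation:

```python
# Theoretical throughput discounted by the per-frame time spent on the
# scheduling algorithm's control messages and computation.

def effective_throughput(theoretical_mbps: float, frame_ms: float,
                         overhead_ms: float) -> float:
    """Data actually carried once scheduling overhead time is excluded."""
    useful_fraction = max(0.0, (frame_ms - overhead_ms) / frame_ms)
    return theoretical_mbps * useful_fraction

# A higher-complexity scheduler spends more of each frame on overhead,
# so its effective throughput falls even at the same theoretical rate.
print(effective_throughput(100.0, 10.0, 1.0))  # low-complexity scheduler
print(effective_throughput(100.0, 10.0, 4.0))  # high-complexity scheduler
```

This captures the abstract's point: two policies with the same theoretical throughput region can deliver very different actual throughput once implementation complexity is counted.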

Low Complexity Decoder for Space-Time Turbo Codes

  • Lee Chang-Woo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.4C
    • /
    • pp.303-309
    • /
    • 2006
  • By combining the space-time diversity technique and iterative turbo codes, space-time turbo codes (STTCs) provide powerful error-correction capability. However, the multi-path transmission and iterative decoding structure of STTCs make the decoder very complex. In this paper, we propose a low-complexity decoder that can decode STTCs as well as general iterative codes such as turbo codes. The efficient implementation of the backward recursion and the log-likelihood ratio (LLR) update in the proposed algorithm improves computational efficiency. In addition, if the calculation of the joint LLR is approximated using the approximate ratio (AR) algorithm, the computational complexity can be reduced even further. A complexity analysis and computer simulations over the Rayleigh fading channel show that the proposed algorithm requires less than 40% of the additions required by the conventional Max-Log-MAP algorithm, while providing the same overall performance.
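The Max-Log-MAP baseline in the comparison above rests on a standard simplification: the exact Jacobian logarithm used in full Log-MAP LLR updates is replaced by a plain max. A minimal sketch of that trade-off (a generic illustration, not the paper's decoder):

```python
import math

def jacobian_log(a: float, b: float) -> float:
    """Exact log(e^a + e^b), as used in full Log-MAP LLR updates."""
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_log(a: float, b: float) -> float:
    """Max-Log-MAP simplification: drop the correction term."""
    return max(a, b)

# The approximation error is the dropped correction term, bounded by
# log(2) ~ 0.693 and largest when the two metrics are equal.
print(jacobian_log(0.0, 0.0) - max_log(0.0, 0.0))
print(jacobian_log(5.0, 0.0) - max_log(5.0, 0.0))
```

Dropping the correction term trades a small loss in LLR accuracy for removing the exponential and logarithm from every trellis-metric update, which is where the addition-count savings come from.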