• Title/Summary/Keyword: Validation


Validation of Sterilization Process (멸균제제공정의 Validation)

  • 김길수
    • YAKHAK HOEJI
    • /
    • v.29 no.5
    • /
    • pp.305-315
    • /
    • 1985
  • The term "validation" first came into use in Korea in the 1970s, when the concept of GMP was introduced; although the term is now widely used, its meaning still does not seem to be precisely understood. In the pharmaceutical industry the scope of validation is very broad, and the concept is applied to the management of manufacturing equipment, manufacturing processes, organizations, and personnel. This article therefore first describes the general concept of validation and then discusses in detail the sterilization process among the validation activities for liquid preparations, in particular sterile preparations.


Candidate Points and Representative Cross-Validation Approach for Sequential Sampling (후보점과 대표점 교차검증에 의한 순차적 실험계획)

  • Kim, Seung-Won;Jung, Jae-Jun;Lee, Tae-Hee
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.31 no.1 s.256
    • /
    • pp.55-61
    • /
    • 2007
  • Recently, simulation models have become essential tools for the analysis and design of systems, but they are often expensive and time consuming as they grow more complicated in order to achieve reliable results. Therefore, a high-fidelity simulation model needs to be replaced by an approximate model, the so-called metamodel. Metamodeling techniques comprise three components: sampling, the metamodel, and validation. A cross-validation approach has been proposed to provide new sample points sequentially based on the cross-validation error, but it is very expensive because cross-validation must be evaluated at each stage. To reduce this cost, a sequential sampling method using candidate points and representative cross-validation is proposed in this paper. The candidate-point and representative cross-validation approach to sequential sampling is illustrated for a two-dimensional domain. To verify the performance of the suggested sampling technique, we compare the accuracy of the metamodels for various mathematical functions with that obtained by conventional sequential sampling strategies such as maximum distance, mean squared error, and maximum entropy sequential sampling. Through this research we learn that the proposed approach is computationally inexpensive and provides good prediction performance.
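
The general idea behind cross-validation-driven sequential sampling can be illustrated with a short sketch. The code below is not the authors' candidate-point/representative algorithm; it assumes a Gaussian-process metamodel (scikit-learn) and a simple illustrative scoring rule that ranks random candidate points by the leave-one-out error of their nearest existing sample weighted by distance.

```python
# Hedged sketch: cross-validation-driven sequential sampling for a metamodel.
# Not the paper's exact method; illustrates the general idea only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def true_function(x):                       # toy 2-D test function
    return np.sin(3 * x[:, 0]) + np.cos(2 * x[:, 1])

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(8, 2))          # initial samples
y = true_function(X)

for stage in range(10):
    # Leave-one-out (LOO) errors at the existing samples.
    loo_err = np.empty(len(X))
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        gp_i = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X[mask], y[mask])
        loo_err[i] = abs(gp_i.predict(X[i:i + 1])[0] - y[i])

    # Score random candidate points: large LOO error nearby and far from existing samples.
    candidates = rng.uniform(0, 1, size=(200, 2))
    dists = np.linalg.norm(candidates[:, None, :] - X[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)
    score = loo_err[nearest] * dists.min(axis=1)

    x_new = candidates[score.argmax()][None, :]
    X = np.vstack([X, x_new])
    y = np.append(y, true_function(x_new))

print("final sample count:", len(X))
```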

Finding Unexpected Test Accuracy by Cross Validation in Machine Learning

  • Yoon, Hoijin
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.12spc
    • /
    • pp.549-555
    • /
    • 2021
  • Machine learning (ML) typically splits data into three parts, usually 60% for training, 20% for validation, and 20% for testing. The split is purely quantitative rather than selecting each set of data by a criterion, which is a very important concept for the adequacy of test data. ML measures a model's accuracy by applying a set of validation data and revises the model until the validation accuracy reaches a certain level. After the validation process, the completed model is tested with the set of test data, which the model has not yet seen. If the set of test data covers the model's attributes well, the test accuracy will be close to the validation accuracy of the model. To make sure that ML's set of test data works adequately, we design an experiment and check whether the test accuracy of a model is always close to its validation accuracy, as expected. The experiment builds 100 different SVM models for each of six data sets published in the UCI ML repository. From the test accuracy and validation accuracy of the 600 cases, we find some unexpected cases in which the test accuracy is very different from the validation accuracy. Consequently, it is not always true that ML's set of test data is adequate to assure a model's quality.
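
A minimal version of the split-and-compare experiment described in this abstract can be sketched as follows. The 60/20/20 proportions, the SVM models, and the repeated random splits come from the abstract; the specific data set (here, scikit-learn's copy of the breast-cancer data as a stand-in for a UCI set) and the SVM settings are illustrative assumptions.

```python
# Hedged sketch: compare validation accuracy vs. test accuracy over many random
# 60/20/20 splits, in the spirit of the experiment described in the abstract.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # stand-in for a UCI data set
gaps = []

for seed in range(100):                       # 100 models, as in the abstract
    # 60% train, then split the remaining 40% into 20% validation / 20% test.
    X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=seed)
    X_val, X_te, y_val, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=seed)

    model = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
    gaps.append(abs(model.score(X_val, y_val) - model.score(X_te, y_te)))

print("mean |validation - test| accuracy gap:", np.mean(gaps))
print("worst-case gap:", np.max(gaps))
```

Large worst-case gaps are the "unexpected cases" the abstract refers to: a test set that happens not to cover the model's attributes well.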

An Evaluation Study on Artificial Intelligence Data Validation Methods and Open-source Frameworks (인공지능 데이터 품질검증 기술 및 오픈소스 프레임워크 분석 연구)

  • Yun, Changhee;Shin, Hokyung;Choo, Seung-Yeon;Kim, Jaeil
    • Journal of Korea Multimedia Society
    • /
    • v.24 no.10
    • /
    • pp.1403-1413
    • /
    • 2021
  • In this paper, we investigate automated data validation techniques for artificial intelligence training and describe open-source frameworks, such as Google's TensorFlow Data Validation (TFDV), that support automated data validation in the AI model development process. We also introduce an experimental study using public data sets to demonstrate the effectiveness of the open-source data validation framework. In particular, we present experimental results of the data validation functions for schema testing and discuss the limitations of the current open-source frameworks for semantic data. Last, we introduce the latest studies on semantic data validation using machine learning techniques.
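
The schema-testing workflow evaluated in this paper follows TFDV's standard pattern: compute statistics, infer a schema, and validate new data against it. The sketch below uses TFDV's public API functions, but the DataFrames, column names, and values are hypothetical placeholders.

```python
# Hedged sketch of TFDV schema validation (data and column names are hypothetical).
import pandas as pd
import tensorflow_data_validation as tfdv

train_df = pd.DataFrame({"age": [25, 31, 47], "label": ["yes", "no", "yes"]})
new_df = pd.DataFrame({"age": [29, 35], "label": ["yes", "maybe"]})  # contains an anomaly

# 1. Compute descriptive statistics over the training data.
train_stats = tfdv.generate_statistics_from_dataframe(train_df)

# 2. Infer a schema (types, domains, presence constraints) from those statistics.
schema = tfdv.infer_schema(train_stats)

# 3. Validate statistics of incoming data against the schema.
new_stats = tfdv.generate_statistics_from_dataframe(new_df)
anomalies = tfdv.validate_statistics(statistics=new_stats, schema=schema)
tfdv.display_anomalies(anomalies)  # e.g. unexpected string value "maybe" for "label"
```

Schema testing of this kind catches structural problems (types, missing columns, out-of-domain values); the semantic limitations discussed in the paper are about errors that pass such checks.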

Delegated Attribute Certificate Validation And Protocol (PMI 인증서 검증 위임 및 검증 프로토콜)

  • 이승훈;송주석
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.13 no.1
    • /
    • pp.59-67
    • /
    • 2003
  • PMI (Privilege Management Infrastructure) certificates, like public-key certificates, must be validated before being used. Validation of a PMI certificate requires validation of the PMI certificate path and validation of the PKC (public-key certificate) path for each entity in the PMI certificate path. This validation work is quite complex and burdensome to PMI certificate verifiers. Therefore, this paper suggests a delegated PMI certificate validation that uses a specialized validation server, and defines a validation protocol used between the validation server and the client.
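
The delegation idea, where a client hands the whole attribute-certificate path check to a specialized server and receives a verdict, can be sketched as a simple request/response exchange. All names, message fields, and the placeholder checks below are illustrative assumptions, not the paper's protocol definition.

```python
# Hedged sketch: delegated PMI certificate validation as a request/response exchange.
# Message fields and validation logic are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class ValidationRequest:
    attribute_cert: bytes      # the PMI attribute certificate to validate
    pkc_chain: list            # public-key certificates of entities in the path
    nonce: int                 # replay protection

@dataclass
class ValidationResponse:
    nonce: int
    valid: bool
    reason: str

class ValidationServer:
    """Specialized server that performs PMI and PKC path validation on behalf of clients."""

    def handle(self, req: ValidationRequest) -> ValidationResponse:
        if not self._check_pkc_paths(req.pkc_chain):
            return ValidationResponse(req.nonce, False, "PKC path validation failed")
        if not self._check_pmi_path(req.attribute_cert):
            return ValidationResponse(req.nonce, False, "PMI path validation failed")
        return ValidationResponse(req.nonce, True, "ok")

    def _check_pkc_paths(self, chain) -> bool:
        return all(len(c) > 0 for c in chain)   # placeholder for real PKC path validation

    def _check_pmi_path(self, ac: bytes) -> bool:
        return len(ac) > 0                       # placeholder for real PMI path validation

# Client side: delegate the validation instead of doing all path checks locally.
server = ValidationServer()
resp = server.handle(ValidationRequest(b"attr-cert", [b"pkc-1", b"pkc-2"], nonce=42))
print(resp.valid, resp.reason)
```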

Improvement of Performance for Online Certificate Status Validation (실시간 인증서 상태검증의 성능개선)

  • Jung, Jai-Dong;Oh, Hae-Seok
    • The KIPS Transactions:PartC
    • /
    • v.10C no.4
    • /
    • pp.433-440
    • /
    • 2003
  • As real economic activities are increasingly carried out in the cyber world and the problem of identifying a trading counterpart emerges, digital signatures have come into wide use. Because the Certificate Revocation List, the conventional method for validating digital signatures, is weak for real-time validation, the Online Certificate Status Protocol was introduced. In that scheme, the entire transaction workload for verifying digital signatures is concentrated on a single validation server node. This method is currently used for domestic financial transactions, but its limitations will be revealed sooner or later. In this paper, a validation method is introduced that not only guarantees real-time validation but also allows the node requesting certificate validation to maintain real-time certificate status information. In this method, the revocation management node pushes the certificate status information to the validation nodes in real time whenever a certificate is revoked. Its key characteristic is that the revocation management node must remember the validation nodes that each certificate holder uses. When a certificate holder connects to a validation node for the first time, that validation node requests the certificate status information from the revocation management node, and the revocation management node records the validation node at that time. Afterward, whenever a revocation occurs, the revocation management node informs all registered validation nodes of the revocation information in real time. The benefits of this method are that the validation time is reduced, because certificate validation can be completed at the validation node, and that requests for certificate status information are no longer concentrated on a single revocation node.
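
The push model described in this abstract, in which the revocation management node remembers the validation nodes a certificate holder has used and broadcasts revocations to them, can be sketched as follows. Class and method names are illustrative assumptions, not the paper's specification.

```python
# Hedged sketch: push-based revocation distribution between a revocation management
# node and registered validation nodes. Names and structure are illustrative only.
class ValidationNode:
    def __init__(self, name: str):
        self.name = name
        self.status = {}                      # serial number -> "good" | "revoked"

    def receive_status(self, serial: str, status: str):
        self.status[serial] = status          # pushed update from the revocation manager

    def validate(self, serial: str) -> bool:
        # Local lookup only: no round trip to a central server at validation time.
        return self.status.get(serial, "unknown") == "good"

class RevocationManager:
    def __init__(self):
        self.subscribers = {}                 # serial number -> set of validation nodes

    def first_contact(self, serial: str, node: ValidationNode):
        # A holder's first use of a validation node registers that node for future pushes.
        self.subscribers.setdefault(serial, set()).add(node)
        node.receive_status(serial, "good")

    def revoke(self, serial: str):
        # Push the revocation in real time to every registered validation node.
        for node in self.subscribers.get(serial, ()):
            node.receive_status(serial, "revoked")

manager = RevocationManager()
node_a = ValidationNode("bank-A")
manager.first_contact("cert-1234", node_a)
print(node_a.validate("cert-1234"))   # True: status is known locally
manager.revoke("cert-1234")
print(node_a.validate("cert-1234"))   # False: the revocation was pushed immediately
```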

Bandwidth selections based on cross-validation for estimation of a discontinuity point in density (교차타당성을 이용한 확률밀도함수의 불연속점 추정의 띠폭 선택)

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.4
    • /
    • pp.765-775
    • /
    • 2012
  • Cross-validation is a popular method for selecting the bandwidth in all types of kernel estimation. Maximum likelihood cross-validation, least squares cross-validation, and biased cross-validation have been proposed for bandwidth selection in kernel density estimation. For the case in which the probability density function has a discontinuity point, Huh (2012) proposed a bandwidth selection method using maximum likelihood cross-validation. In this paper, two forms of cross-validation with one-sided kernel functions are proposed for selecting the bandwidths used to estimate the location and jump size of the discontinuity point of a density. These methods are motivated by least squares cross-validation and biased cross-validation. Through simulated examples, the finite-sample performances of the two proposed methods are compared with that of Huh (2012).
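
As a point of reference for the criteria this paper builds on, the classical least squares cross-validation score for an ordinary (two-sided) Gaussian kernel density estimate is LSCV(h) = ∫ f̂_h(x)² dx − (2/n) Σᵢ f̂_{h,−i}(Xᵢ), minimized over h. The sketch below implements that standard criterion only; the paper's one-sided-kernel versions for discontinuity estimation are not reproduced here.

```python
# Hedged sketch: least squares cross-validation (LSCV) bandwidth selection for an
# ordinary Gaussian kernel density estimate (the paper's one-sided kernels are not used).
import numpy as np

def kde(x_eval, data, h):
    """Gaussian kernel density estimate evaluated at the points in x_eval."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def lscv_score(h, data, grid):
    # Integral of the squared estimate, approximated on a fine grid.
    int_f2 = np.trapz(kde(grid, data, h) ** 2, grid)
    # Leave-one-out term: average of the leave-one-out estimates at the data points.
    loo = np.array([kde(data[i:i + 1], np.delete(data, i), h)[0] for i in range(len(data))])
    return int_f2 - 2.0 * loo.mean()

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=200)
grid = np.linspace(data.min() - 3, data.max() + 3, 2000)

bandwidths = np.linspace(0.05, 1.0, 40)
scores = [lscv_score(h, data, grid) for h in bandwidths]
print("LSCV-selected bandwidth:", bandwidths[int(np.argmin(scores))])
```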

Geomechanical and hydrogeological validation of hydro-mechanical two-way sequential coupling in TOUGH2-FLAC3D linking algorithm with insights into the Mandel, Noordbergum, and Rhade effects

  • Lee, Sungho;Park, Jai-Yong;Kihm, Jung-Hwi;Kim, Jun-Mo
    • Geomechanics and Engineering
    • /
    • v.28 no.5
    • /
    • pp.437-454
    • /
    • 2022
  • The hydro-mechanical (HM) two-way sequential coupling in the TOUGH2-FLAC3D linking algorithm is validated completely and successfully in both the M-to-H and H-to-M directions, which are initiated by mechanical surface loading for the geomechanical validation and by hydrological groundwater pumping for the hydrogeological validation, respectively. For this validation, a TOUGH2-FLAC3D linked numerical model is first developed by adopting the TOUGH2-FLAC3D linking algorithm, which uses the two-way (fixed-stress split) sequential coupling scheme and the implicit backward time stepping method. Two geomechanical and two hydrogeological validation problems are then simulated using the linked numerical model together with basic validation strategies and prerequisites. The second geomechanical and second hydrogeological validation problems are associated with the Mandel effect and with the Noordbergum and Rhade effects, respectively, three phenomenally well-known but numerically challenging HM effects. Finally, the sequentially coupled numerical solutions are compared with either analytical solutions (verification) or fully coupled numerical solutions (benchmarking). In all four validation problems, the solutions show very good to almost perfect agreement. In addition, the second geomechanical validation problem clearly displays the Mandel effect and suggests a proper or minimum ratio of height to width for the rectangular domain to maximize agreement between the numerical and analytical solutions. Meanwhile, the second hydrogeological validation problem clearly displays the Noordbergum and Rhade effects and implies that the HM two-way sequential coupling scheme used in the linked numerical model is as rigorous as the HM two-way full coupling scheme used in a fully coupled numerical model.
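
To make the "two-way (fixed-stress split) sequential coupling" mentioned above concrete, the sketch below iterates a single-cell linear poroelastic toy problem: the flow step is solved with the total stress held fixed, the mechanics step then updates the strain from the new pressure, and the pair is repeated within each time step until the pressure increment converges. The material values and the toy constitutive relations are illustrative assumptions, not TOUGH2 or FLAC3D physics.

```python
# Hedged sketch: fixed-stress split, two-way sequential HM coupling on a single-cell
# linear poroelastic toy problem (illustrative values, not TOUGH2/FLAC3D physics).
M, b, K_dr = 1.0e9, 0.8, 2.0e9      # Biot modulus, Biot coefficient, drained bulk modulus [Pa]
q = -1.0e-6                         # fluid sink (pumping) per unit volume [1/s]
sigma = -1.0e5                      # constant total stress from surface loading [Pa]
dt, n_steps, tol = 100.0, 50, 1e-3  # time step [s], number of steps, pressure tolerance [Pa]

p = 0.0
eps_v = (sigma + b * p) / K_dr      # volumetric strain in equilibrium with p

for step in range(n_steps):
    p_old, eps_old = p, eps_v
    p_k, eps_k = p_old, eps_old
    for it in range(50):            # H -> M sequential iterations within one time step
        # Flow step (H): backward-Euler mass balance using the strain from the last
        # mechanics solve; the b**2 / K_dr term is the fixed-stress stabilization.
        lhs = 1.0 / M + b**2 / K_dr
        rhs = dt * q + p_old / M + (b**2 / K_dr) * p_k - b * (eps_k - eps_old)
        p_new = rhs / lhs
        # Mechanics step (M): strain in equilibrium with the new pressure at fixed stress.
        eps_new = (sigma + b * p_new) / K_dr
        converged = abs(p_new - p_k) < tol
        p_k, eps_k = p_new, eps_new
        if converged:               # this linear toy converges in a couple of iterations
            break
    p, eps_v = p_k, eps_k

print(f"pressure after {n_steps} steps: {p:.3e} Pa, volumetric strain: {eps_v:.3e}")
```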

Basic Principles of the Validation for Good Laboratory Practice Institutes

  • Cho, Kyu-Hyuk;Kim, Jin-Sung;Jeon, Man-Soo;Lee, Kyu-Hong;Chung, Moon-Koo;Song, Chang-Woo
    • Toxicological Research
    • /
    • v.25 no.1
    • /
    • pp.1-8
    • /
    • 2009
  • Validation specifies and coordinates all relevant activities to ensure compliance with good laboratory practices (GLP) according to suitable international standards. This includes past, present, and future validation activities for the best possible actions to ensure the integrity of non-clinical laboratory data. Recently, validation has become increasingly important, not only in good manufacturing practice (GMP) institutions but also in GLP facilities. In accordance with the GLP guidelines, all equipment used to generate, measure, or assess data should undergo validation to ensure that it is of appropriate design and capacity and that it will consistently function as intended. Therefore, the implementation of validation processes is considered an essential step for a global institution. This review describes the procedures and documentation required for GLP validation. It introduces basic elements such as the validation master plan, risk assessment, gap analysis, design qualification, installation qualification, operational qualification, performance qualification, calibration, traceability, and revalidation.

Mean-Variance-Validation Technique for Sequential Kriging Metamodels (순차적 크리깅모델의 평균-분산 정확도 검증기법)

  • Lee, Tae-Hee;Kim, Ho-Sung
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.34 no.5
    • /
    • pp.541-547
    • /
    • 2010
  • Rigorous validation of the accuracy of metamodels is an important topic in research on metamodeling techniques. Although the leave-k-out cross-validation technique involves a considerably high computational cost, it cannot be used to measure the fidelity of metamodels. Recently, the mean₀ validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, the use of the mean₀ validation criterion may lead to premature termination of the sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique because, instead of performing numerical integration, the kriging model is explicitly integrated to accurately evaluate the mean and variance of the response. The error in the proposed validation technique resembles a root mean squared error, so it can be used to define a stopping criterion for the sequential sampling of metamodels.
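
A rough analogue of a mean/variance-based stopping criterion for sequential kriging can be sketched with a Gaussian-process surrogate: after each maximum-variance (entropy-like) sample, the predictive standard deviation averaged over the domain serves as an RMSE-like accuracy measure and a stopping signal. This is an illustrative analogue under stated assumptions, not the authors' mean₀ or mean-variance formulation.

```python
# Hedged sketch: sequential kriging with a mean/variance-based stopping criterion.
# An illustrative analogue of the idea, not the paper's exact validation measure.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def true_function(x):
    return np.sin(6 * x).ravel()

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(4, 1))            # initial design
y = true_function(X)
grid = np.linspace(0, 1, 400).reshape(-1, 1)  # dense grid over the design domain

for stage in range(30):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True).fit(X, y)
    mean, std = gp.predict(grid, return_std=True)

    # RMSE-like accuracy measure: root of the mean predictive variance over the domain.
    rms_std = np.sqrt(np.mean(std**2))
    if rms_std < 1e-3:                        # stop when the surrogate looks accurate enough
        break

    # Maximum-variance (entropy-like) sequential sampling: add the most uncertain point.
    x_new = grid[std.argmax()].reshape(1, -1)
    X = np.vstack([X, x_new])
    y = np.append(y, true_function(x_new))

print(f"stopped after {len(X)} samples, RMS predictive std = {rms_std:.2e}")
```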