Title/Summary/Keyword: Validation

Validation Measures of Bicluster Solutions

  • Lee, Young-Rok; Lee, Jeong-Hwa; Jun, Chi-Hyuck
    • Industrial Engineering and Management Systems / v.8 no.2 / pp.101-108 / 2009
  • Biclustering is a method for extracting subsets of objects and features from a dataset that are characterized in some way. In contrast to traditional clustering algorithms, which group objects that are similar across the whole feature set, biclustering methods find groups of objects that have similar values or patterns in only some of the features. In both clustering and biclustering, validating how informative and reliable a result is, is an important task. Whereas validation methods for cluster solutions have been studied actively, only a few measures exist to validate bicluster solutions, and the existing ones have critical problems that limit their use in general cases. In this paper, we review several well-known validation measures for cluster and bicluster solutions and discuss their limitations. We then propose several improved validation indices as modified versions of existing ones.
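
As a concrete illustration of a bicluster homogeneity measure (one of the classic indices this line of work builds on, not the modified indices proposed in the paper), the sketch below computes the mean squared residue of Cheng and Church; the data matrix and index lists are hypothetical.

```python
import numpy as np

def mean_squared_residue(X, rows, cols):
    """Mean squared residue (Cheng & Church) of the bicluster formed by
    `rows` x `cols` in data matrix X; lower values mean a more coherent
    bicluster."""
    B = X[np.ix_(rows, cols)]
    row_means = B.mean(axis=1, keepdims=True)
    col_means = B.mean(axis=0, keepdims=True)
    residue = B - row_means - col_means + B.mean()
    return float((residue ** 2).mean())

# Hypothetical example: a 6x5 matrix with a perfectly additive 3x3 block.
X = np.random.rand(6, 5)
X[np.ix_([0, 1, 2], [0, 1, 2])] = np.add.outer([0.0, 1.0, 2.0], [0.0, 0.5, 1.0])
print(mean_squared_residue(X, [0, 1, 2], [0, 1, 2]))  # ~0 for an additive pattern
```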

CROSS-VALIDATION OF LANDSLIDE SUSCEPTIBILITY MAPPING IN KOREA

  • Lee, Saro
    • Proceedings of the KSRS Conference / pp.291-293 / 2004
  • The aim of this study was to cross-validate a spatial probabilistic model of landslide likelihood ratios at Boun, Janghung and Yongin, in Korea, using a Geographic Information System (GIS). Landslide locations within the study areas were identified by interpreting aerial photographs and satellite images and by field surveys. Maps of the topography, soil type, forest cover, lineaments and land cover were constructed from the spatial data sets. The 14 factors that influence landslide occurrence were extracted from the database, and the likelihood ratio of each factor was computed. Landslide susceptibility maps were drawn for the three areas using likelihood ratios derived not only from the data for that area but also from the likelihood ratios calculated for each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For validation and cross-validation, the results of the analyses were compared, in each study area, with the actual landslide locations. The validation and cross-validation showed satisfactory agreement between the susceptibility maps and the existing landslide locations.
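
A minimal sketch of the likelihood (frequency) ratio computation described above, assuming a rasterized factor map and a boolean landslide inventory; the variable names and data are hypothetical, not the author's code.

```python
import numpy as np

def likelihood_ratios(factor_class, landslide):
    """Likelihood (frequency) ratio per factor class: the share of
    landslide cells falling in the class divided by the share of all
    cells in the class; ratios > 1 indicate higher susceptibility."""
    factor_class = np.asarray(factor_class)
    landslide = np.asarray(landslide, dtype=bool)
    total_slides = max(int(landslide.sum()), 1)
    ratios = {}
    for c in np.unique(factor_class):
        in_class = factor_class == c
        p_slide = landslide[in_class].sum() / total_slides
        p_area = in_class.sum() / in_class.size
        ratios[c] = float(p_slide / p_area)
    return ratios

# A susceptibility map is then the cell-wise combination of the ratios of
# all factors, which can be reapplied to another study area for
# cross-validation.
```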

Introduction to the Validation Module Design for CMDPS Baseline Products

  • Kim, Shin-Young; Chung, Chu-Yong; Ou, Mi-Lim
    • Proceedings of the KSRS Conference / pp.146-148 / 2007
  • CMDPS (COMS Meteorological Data Processing System) is the operational system that extracts meteorological products from data observed by the COMS (Communication, Ocean and Meteorological Satellite) meteorological imager. The CMDPS baseline products consist of 16 parameters covering cloud information, water vapor products, surface information, environmental products and the atmospheric motion vector. CMDPS additionally includes a calibration-monitoring function and a validation mechanism for the baseline products. The main objective of the CMDPS validation module is near-real-time monitoring of the accuracy and reliability of all CMDPS products. Its long-term validation statistics are also used to upgrade CMDPS, for example through algorithm parameter tuning and retrieval algorithm modification. This paper introduces the preliminary design of the CMDPS validation module.
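
As a hedged illustration (not CMDPS code) of the near-real-time match-up statistics such a module would monitor, the sketch below computes bias, RMSE and correlation of a retrieved product against reference observations.

```python
import numpy as np

def validation_stats(retrieved, reference):
    """Basic match-up statistics for a satellite product versus
    reference observations (e.g., surface or radiosonde data)."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    diff = retrieved - reference
    return {
        "n": int(diff.size),
        "bias": float(diff.mean()),
        "rmse": float(np.sqrt((diff ** 2).mean())),
        "corr": float(np.corrcoef(retrieved, reference)[0, 1]),
    }
```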

Validation of UNIST Monte Carlo code MCS for criticality safety calculations with burnup credit through MOX criticality benchmark problems

  • Ta, Duy Long; Hong, Ser Gi; Lee, Deokjung
    • Nuclear Engineering and Technology / v.53 no.1 / pp.19-29 / 2021
  • This paper presents the validation of the MCS code for criticality safety analysis with burnup credit for spent fuel casks. The validation process considers five critical benchmark problem sets comprising a total of 80 critical experiments with MOX fuels from the International Criticality Safety Benchmark Evaluation Project (ICSBEP). A similarity analysis using the sensitivity and uncertainty tool TSUNAMI in SCALE was used to determine the applicable benchmark experiments for each spent fuel cask model, and the Upper Safety Limits (USLs), excluding the isotopic validation, were then evaluated following the guidance of NUREG/CR-6698. The validation was also performed with MCNP6 for comparison with the MCS calculations. The results showed consistency between MCS and MCNP6 for the MOX-fueled criticality benchmarks, supporting the reliability of the MCS calculations.
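
The sketch below is a deliberately simplified single-sided tolerance-limit USL calculation in the spirit of NUREG/CR-6698; the inverse-variance weighting and the fixed tolerance factor are illustrative assumptions, not the tabulated procedure of the guidance.

```python
import numpy as np

def usl_single_sided(keff, sigma, admin_margin=0.05, tol_factor=2.0):
    """Toy Upper Safety Limit: weighted-mean bias of benchmark keff
    results minus a one-sided tolerance band and an administrative
    subcritical margin. `tol_factor` stands in for the tabulated
    one-sided tolerance-limit factor."""
    keff = np.asarray(keff, dtype=float)
    w = 1.0 / np.asarray(sigma, dtype=float) ** 2  # inverse-variance weights
    kbar = (w * keff).sum() / w.sum()              # weighted mean keff
    bias = kbar - 1.0                              # calculation bias
    spread = np.sqrt(((keff - kbar) ** 2 * w).sum() / w.sum())
    return 1.0 + bias - tol_factor * spread - admin_margin
```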

Implementation of Validation Tool for Cryptographic Modules

  • 이종후; 김충길; 이재일; 이석래; 류재철
    • Journal of the Korea Institute of Information Security & Cryptology / v.11 no.2 / pp.45-58 / 2001
  • There are relatively many research results on the validation of cryptographic algorithms, but few studies have addressed the validation of cryptographic implementations. Yet a developer's misunderstanding of a crypto-algorithm, or a mistake in implementing it, undermines the reliability of a security product. Validation of the implementation is therefore as important as validation of the crypto-algorithm itself. The major objective of this paper is to propose a Security Products Validation Tool. Our tool validates implementations of public key algorithms (RSA, KCDSA) and hash algorithms (SHA-1, HAS-160). The validation process is composed of several items, and our tool performs validation tests for conformance to the related standards.
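
A minimal sketch of the conformance-testing idea behind such a tool: a known-answer test that compares an implementation's output against test vectors published in the algorithm specification, here using Python's standard hashlib SHA-1 as the implementation under test.

```python
import hashlib

# Known-answer vectors for SHA-1 (message, expected hex digest).
KAT_VECTORS = [
    (b"abc", "a9993e364706816aba3e25717850c26c9cd0d89d"),
    (b"", "da39a3ee5e6b4b0d3255bfef95601890afd80709"),
]

def validate_sha1(sha1=lambda m: hashlib.sha1(m).hexdigest()):
    """Return True only if the implementation reproduces every known answer."""
    return all(sha1(msg) == digest for msg, digest in KAT_VECTORS)

print(validate_sha1())  # True for a conformant SHA-1 implementation
```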

The Differences of Self-Validation, Regulatory Focus and Information Distortion Between Happiness and Sadness

  • Choi, Nak-Hwan; Chen, Fei; Kim, Min-Ji
    • Science of Emotion and Sensibility / v.20 no.3 / pp.71-88 / 2017
  • This paper compared self-validation and regulatory focus between consumers who felt happy versus sad prior to a decision, and explored the effects of self-validation on regulatory focus and information distortion. The results of the empirical analysis are as follows. First, consumers who felt happy beforehand showed greater self-validation and a stronger promotion focus than those who felt sad. Second, compared with sadness, just-felt happiness had a partially positive impact on promotion focus by means of self-validation, and an entirely positive impact on information distortion through the mediation of self-validation. This study makes theoretical contributions by identifying the differences in the extent of self-validation and promotion focus between happiness and sadness as ambient emotions felt prior to an impending decision, and by investigating the effects of self-validation on information distortion.

Design of an Algorithm for the Validation of SCL in Digital Substations

  • Jang, B.T.; Alidu, A.; Kim, N.D.
    • KEPCO Journal on Electric Power and Energy / v.3 no.2 / pp.89-97 / 2017
  • The substation is a critical node in the power network, where power is transformed within the generation, transmission and distribution system. IEC 61850 is a global standard that enables efficient substation automation by defining interoperable communication and data modelling techniques. To achieve this level of interoperability and automation, IEC 61850 (Part 6) defines the System Configuration description Language (SCL). SCL is an XML-based file format for defining the abstract model of primary and secondary substation equipment, the communication systems, and the relationships between them. It enables the interoperable exchange of data during substation engineering by standardizing the description of applications at different stages of the engineering process. To achieve seamless interoperability, multi-vendor devices must adhere completely to IEC 61850. This paper proposes an efficient algorithm for verifying the interoperability of multi-vendor devices by checking the adherence of their SCL files to the specifications of the standard. The proposed SCL validation algorithm consists of schema validation together with information model validation using a UML data model, Vendor Defined Extension model validation, User Defined Rule validation and IED Engineering Table (IET) consistency validation. It also integrates the standard UCAIUG (Utility Communication Architecture International Users Group) Procedure validation for quality assurance testing. The proposed algorithm is not only flexible and efficient in ensuring the interoperable functionality of tested devices but is also convenient for system integrators and test engineers.
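
A minimal sketch of the schema-validation stage of such a pipeline, checking an SCL file against the IEC 61850-6 XSD; this assumes the third-party lxml library, and the file paths are hypothetical.

```python
from lxml import etree

def validate_scl_schema(scl_path, xsd_path):
    """Validate an SCL (XML) file against the IEC 61850-6 schema and
    return a pass/fail flag plus human-readable error messages."""
    schema = etree.XMLSchema(etree.parse(xsd_path))
    document = etree.parse(scl_path)
    ok = schema.validate(document)
    return ok, [str(err) for err in schema.error_log]

ok, errors = validate_scl_schema("substation.scd", "SCL.xsd")  # hypothetical paths
print("schema-valid" if ok else "\n".join(errors))
```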

Application of Time-series Cross Validation in Hyperparameter Tuning of a Predictive Model for 2,3-BDO Distillation Process

  • An, Nahyeon; Choi, Yeongryeol; Cho, Hyungtae; Kim, Junghwan
    • Korean Chemical Engineering Research / v.59 no.4 / pp.532-541 / 2021
  • Recently, research on applying artificial intelligence to chemical processes has increased rapidly. However, overfitting is a significant problem: it prevents a model from generalizing well to unseen test data beyond the observed training data. Cross validation is one way to mitigate overfitting. In this study, time-series cross validation was applied to optimize the batch size and the number of epochs among the hyperparameters of the temperature prediction model for the 2,3-BDO distillation process, and the result was compared with the commonly used K-fold cross validation. The RMSE of the model tuned with time-series cross validation was 9.06% lower, and its MAPE 0.61% higher, than those of the model tuned with K-fold cross validation, while the calculation time was 198.29 s shorter than with the K-fold method.
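
For illustration, the sketch below shows time-series cross validation with scikit-learn's TimeSeriesSplit, where each fold validates on a window that strictly follows its training data so no future information leaks into training; the arrays are placeholders, not the 2,3-BDO process data.

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

X = np.arange(100).reshape(-1, 1)  # placeholder process measurements
y = np.sin(np.arange(100) / 5.0)   # placeholder target temperature

for fold, (train_idx, val_idx) in enumerate(TimeSeriesSplit(n_splits=5).split(X)):
    # Training indices always precede validation indices in time.
    print(f"fold {fold}: train 0..{train_idx[-1]}, "
          f"validate {val_idx[0]}..{val_idx[-1]}")
```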

A Study on the Statistical Model Validation using Response-adaptive Experimental Design

  • Jung, Byung Chang; Huh, Young-Chul; Moon, Seok-Jun; Kim, Young Joong
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference / pp.347-349 / 2014
  • Model verification and validation (V&V) is a current research topic that aims to build computational models with high predictive capability by addressing general concepts, processes and statistical techniques. The hypothesis test for a validity check is one model validation technique; it gives a guideline for evaluating the validity of a computational model when only limited experimental data exist due to restricted test resources (e.g., time and budget). Such hypothesis tests mainly employ the Type I error, the risk of rejecting a valid computational model, since quantifying the Type II error is generally not feasible in model validation. However, the Type II error, the risk of accepting an invalid computational model, should also be considered for engineered products whose predicted results carry high risk. This paper proposes a technique, called response-adaptive experimental design, that reduces the Type II error by adaptively designing the experimental conditions for the validation experiment. A tire tread block problem and a numerical example are employed to show the effectiveness of the response-adaptive experimental design for the validity evaluation.
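
As a hedged illustration of why the Type II error matters, the sketch below estimates by Monte Carlo the probability of accepting a biased computational model under a one-sample t-test; the normal-error model and the effect size are assumptions for the example, not the paper's method.

```python
import numpy as np
from scipy import stats

def type_ii_error(bias_in_std, n, alpha=0.05, sims=10_000, seed=0):
    """Estimate the probability of failing to reject (i.e. accepting)
    a model whose prediction is off by `bias_in_std` standard
    deviations, given n validation experiments."""
    rng = np.random.default_rng(seed)
    accepted = 0
    for _ in range(sims):
        residuals = rng.normal(loc=bias_in_std, scale=1.0, size=n)
        _, p_value = stats.ttest_1samp(residuals, popmean=0.0)
        accepted += p_value >= alpha
    return accepted / sims

print(type_ii_error(bias_in_std=0.5, n=5))  # large Type II error with few tests
```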

Validation of time domain seakeeping codes for a destroyer hull form operating in steep stern-quartering seas

  • Van Walree, Frans; Carette, Nicolas F.A.J.
    • International Journal of Naval Architecture and Ocean Engineering / v.3 no.1 / pp.9-19 / 2011
  • The paper describes the validation of two time-domain methods for simulating the behaviour of a destroyer operating in steep, stern-quartering seas. The significance of deck-edge immersion and water on deck for the capsize risk is shown, as well as the necessity of accounting for the wave disturbances caused by the ship. A method for reconstructing experimental wave trains is described, and finally two deterministic validation cases are shown.
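
As a hedged sketch of deterministic wave-train reconstruction (a common harmonic-superposition approach, not necessarily the authors' exact method), the snippet below rebuilds a wave elevation record from component amplitudes, frequencies and phases.

```python
import numpy as np

def reconstruct_wave_train(amps, omegas, phases, t):
    """Rebuild a deterministic wave elevation record as a sum of
    harmonic components: eta(t) = sum_i a_i * cos(omega_i * t + phi_i)."""
    amps, omegas, phases = map(np.asarray, (amps, omegas, phases))
    t = np.asarray(t)[:, None]  # broadcast time over components
    return (amps * np.cos(omegas * t + phases)).sum(axis=1)
```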