• Title/Summary/Keyword: NetCDF


Compressing Method of NetCDF Files Based on Sparse Matrix (희소행렬 기반 NetCDF 파일의 압축 방법)

  • Choi, Gyuyeun;Heo, Daeyoung;Hwang, Suntae
    • KIISE Transactions on Computing Practices
    • /
    • v.20 no.11
    • /
    • pp.610-614
    • /
    • 2014
  • Like many types of scientific data, results from simulations of volcanic ash diffusion take the form of clustered sparse matrices stored in the netCDF format. Since these data sets are large, they incur high storage and transmission costs. In this paper, we suggest a new method that reduces the size of volcanic ash diffusion simulation data by converting the multi-dimensional index to a single dimension and keeping only the starting point and length of each run of consecutive zeros. This method achieves compression almost as good as ZIP compression but does not destroy the netCDF structure. The suggested method is expected to allow storage space to be used efficiently by reducing both the data size and the network transmission time.
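
The run-length idea described above (flatten the multi-dimensional index, then record only the start and length of each zero run) can be sketched in a few lines. This is an illustrative reconstruction with hypothetical function names, not the paper's implementation or its on-disk netCDF layout:

```python
import numpy as np

def compress_zero_runs(grid):
    """Flatten a multi-dimensional array and record each run of
    consecutive zeros as (start_index, run_length), keeping the
    nonzero values alongside as (index, value) pairs."""
    flat = grid.ravel()                      # multi-dim index -> 1-D index
    runs, values = [], []
    i = 0
    while i < flat.size:
        if flat[i] == 0:
            start = i
            while i < flat.size and flat[i] == 0:
                i += 1
            runs.append((start, i - start))  # zero run: start + length
        else:
            values.append((i, flat[i]))
            i += 1
    return runs, values

def decompress_zero_runs(runs, values, shape):
    """Rebuild the original array from the nonzero entries."""
    flat = np.zeros(int(np.prod(shape)))
    for i, v in values:
        flat[i] = v
    return flat.reshape(shape)
```

For a clustered sparse field, the `(start, length)` pairs for the zero runs are far smaller than the zeros themselves, which is the source of the compression.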

Optimization of the computing environment to improve the speed of the modeling (WRF and CMAQ) calculation of the National Air Quality Forecast System (국가 대기질 예보 시스템의 모델링(기상 및 대기질) 계산속도 향상을 위한 전산환경 최적화 방안)

  • Myoung, Jisu;Kim, Taehee;Lee, Yonghee;Suh, Insuk;Jang, Limsuk
    • Journal of Environmental Science International
    • /
    • v.27 no.8
    • /
    • pp.723-735
    • /
    • 2018
  • In this study, to investigate an optimal configuration for the modeling system, we performed optimization experiments controlling the types of compilers and libraries and the number of CPU cores, because providing reliable model data quickly is essential for the national air quality forecast. We composed twelve optimization experiments according to compiler (PGI and Intel), MPI library (mvapich-2.0, mvapich-2.2, and mpich-3.2), and NetCDF version (NetCDF-3.6.3 and NetCDF-4.1.3) and measured wall-clock times for the WRF and CMAQ models on the built computing resources. In the experiments on compiler and library type, the performance of WRF (30 min 30 s) and CMAQ (47 min 22 s) was best with the combination of the Intel compiler, mvapich-2.0, and NetCDF-3.6.3. In the optimization by the number of CPU cores, the WRF model performed best with 140 cores (five calculation servers) and the CMAQ model with 120 cores (five calculation servers). While the WRF model showed clear differences depending on the number of CPU cores rather than the types of compilers and libraries, the CMAQ model showed the biggest differences depending on the combination of compilers and libraries.

A study of ocean eddy simulation using real water temperature data (실 수온자료를 이용한 와동류 모의 및 활용에 대한 연구)

  • Kim, Chang-Jin;Na, Young-Nam
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2015.07a
    • /
    • pp.99-100
    • /
    • 2015
  • In this paper, we propose a method for simulating ocean eddies, a common ocean phenomenon, in a computing environment. To this end, we analyze the structural characteristics of eddies and introduce a way to simulate them accordingly. We then represent eddies using real water temperature data in the netCDF format, the most widely used format in the ocean field, and compare and analyze the results using a public analysis tool provided by NASA. We also introduce how the simulated eddies can be utilized in actual naval combat simulation experiments.
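
As a rough illustration of the kind of structure being simulated, a radially symmetric warm-core eddy can be imposed on a background temperature field. All names and parameter values below are illustrative assumptions, since the abstract does not specify the paper's eddy model:

```python
import numpy as np

def warm_core_eddy(nx=100, ny=100, cx=50.0, cy=50.0,
                   radius=15.0, anomaly=2.0, background=10.0):
    """Minimal sketch: a Gaussian warm-core temperature anomaly
    (degrees C) centred at (cx, cy) on a regular grid, decaying
    with distance from the eddy core."""
    y, x = np.mgrid[0:ny, 0:nx]
    r2 = (x - cx) ** 2 + (y - cy) ** 2
    return background + anomaly * np.exp(-r2 / (2.0 * radius ** 2))
```

A field like this can be compared cell-by-cell against gridded temperature observations read from a netCDF file to judge how realistic the simulated eddy structure is.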


Development of an earthquake-induced landslide risk assessment approach for nuclear power plants

  • Kwag, Shinyoung;Hahm, Daegi
    • Nuclear Engineering and Technology
    • /
    • v.50 no.8
    • /
    • pp.1372-1386
    • /
    • 2018
  • Despite recent advances in multi-hazard analysis, the complexity and inherent nature of such problems make quantification of the landslide effect in a probabilistic safety assessment (PSA) of NPPs challenging. Therefore, in this paper, a practical approach is presented for performing an earthquake-induced landslide PSA for NPPs subject to seismic hazard. To demonstrate its effectiveness, the proposed approach was applied to a typical Korean NPP as a numerical example. The assessment results revealed the quantitative probabilistic effects of peripheral slope failure and the subsequent run-out on the core damage frequency (CDF) risk of an NPP during an earthquake. Parametric studies were conducted to demonstrate how slope parameters and the physical relation between the slope and the NPP changed the CDF risk. Finally, based on these results, effective strategies were suggested to mitigate the CDF risk arising from vulnerabilities inherent in adjacent slopes. The proposed approach is expected to provide an effective framework for performing earthquake-induced landslide PSA and decision support to increase NPP safety.

Construction of NCAM-LAMP Precipitation and Soil Moisture Database to Support Landslide Prediction (산사태 예측을 위한 NCAM-LAMP 강수 및 토양수분 DB 구축)

  • So, Yun-Yeong;Lee, Su-Jung;Choi, Sung-Won;Lee, Seung-Jae
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.22 no.3
    • /
    • pp.152-163
    • /
    • 2020
  • The present study introduces a procedure to prepare and manage a high-resolution rainfall and soil moisture (SM) database in the LAMP prediction system, especially for landslide researchers. The procedure also includes converting the data into a spatial resolution suitable for the regions of interest, following proper map projection methods. The LAMP model precipitation and SM data are quantitatively and qualitatively evaluated against the ERA5 reanalysis precipitation and observed 10m depth SM data to identify the model's prediction characteristics. A detailed process for converting LAMP Weather Research and Forecasting (WRF) output data to 10m horizontal resolution is described step by step, providing technical convenience for users to easily convert NetCDF data from the WRF model into TIF data in ArcGIS. The converted data can be viewed and downloaded via the LAMP website (http://df.ncam.kr/lamp/index.do) of the National Center for AgroMeteorology. The constructed database will contribute to the monitoring and prediction of landslide risk prior to landslide response steps, and its quality should be further controlled with additional observation data.
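
The resolution-conversion step (resampling a coarse model field onto a finer regular grid) can be sketched with a nearest-neighbor lookup. The paper's actual workflow uses ArcGIS projection tools on WRF NetCDF output, so the NumPy version below is only a schematic of the resampling idea, with hypothetical names and grid spacings:

```python
import numpy as np

def regrid_nearest(field, src_res, dst_res):
    """Resample a 2-D model field from a coarse regular grid to a
    finer one by nearest-neighbor lookup.  src_res and dst_res are
    grid spacings in the same units (e.g. metres)."""
    ny, nx = field.shape
    factor = src_res / dst_res               # e.g. 100 m -> 50 m gives 2
    out_ny, out_nx = int(ny * factor), int(nx * factor)
    # Map each fine-grid cell centre back to its coarse-grid cell index
    rows = np.minimum((np.arange(out_ny) / factor).astype(int), ny - 1)
    cols = np.minimum((np.arange(out_nx) / factor).astype(int), nx - 1)
    return field[np.ix_(rows, cols)]
```

Nearest-neighbor resampling simply repeats each coarse cell's value over the finer cells it covers; smoother interpolation schemes would be applied the same way, only with a different index-to-value mapping.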

FLOODING PSA BY CONSIDERING THE OPERATING EXPERIENCE DATA OF KOREAN PWRs

  • Choi, Sun-Yeong;Yang, Joon-Eon
    • Nuclear Engineering and Technology
    • /
    • v.39 no.3
    • /
    • pp.215-220
    • /
    • 2007
  • The existing flooding Probabilistic Safety Assessment (PSA) was updated to reflect Korean plant-specific operating experience data in the flooding frequency and thereby improve PSA quality. Both the Nuclear Power Experience (NPE) database and the Korea Nuclear Pipe Failure Database (NuPIPE) were used in this study, and from these databases only the Pressurized Water Reactor (PWR) data were used for the flooding frequencies of the flooding areas in the primary auxiliary building. With these databases and a Bayesian method, the flooding frequencies for the flooding areas were estimated. Subsequently, the Core Damage Frequency (CDF) for the flooding PSA of the Ulchin (UCN) units 3 and 4, based on the Korean Standard Nuclear Power Plant (KSNP) internal full-power PSA model, was recalculated. The evaluation showed that sixteen flooding events are potentially significant according to the screening criterion, whereas only two flooding events exceeded the screening criterion in the existing UCN 3 and 4 flooding PSA. The result was compared with two cases: (1) the flooding frequency and CDF from the existing flooding PSA method with the PWR and Boiling Water Reactor (BWR) data of the NPE database and the Maximum Likelihood Estimate (MLE) method, and (2) the flooding frequency and CDF with the NPE database (PWR and BWR data), the NuPIPE database, and a Bayesian method. The comparison revealed a clearer difference in CDF between this study and case (2) than between case (1) and case (2); that is, the number of flooding events exceeding the screening criterion increased further when only the PWR data were used for the primary auxiliary building than when the Korean-specific data were used.
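
A conjugate gamma-Poisson update is the textbook way to combine a generic prior with plant-specific event counts when estimating an event frequency. The sketch below uses hypothetical prior parameters and counts; it illustrates the general Bayesian step, not the paper's specific priors or data:

```python
def gamma_poisson_update(alpha, beta, events, exposure_years):
    """Conjugate Bayesian update for an event frequency (per year).
    Prior: Gamma(alpha, beta) on the frequency.  Evidence: `events`
    occurrences observed over `exposure_years` reactor-years, modelled
    with a Poisson likelihood.  Returns the posterior parameters and
    the posterior mean frequency."""
    post_alpha = alpha + events
    post_beta = beta + exposure_years
    posterior_mean = post_alpha / post_beta
    return post_alpha, post_beta, posterior_mean

# Hypothetical example: generic prior Gamma(1, 100) updated with
# 2 flooding events over 300 plant-specific reactor-years.
a, b, mean_freq = gamma_poisson_update(1.0, 100.0, 2, 300.0)
```

The posterior mean pulls the generic prior estimate toward the plant-specific evidence, which is why the Bayesian frequencies can differ noticeably from MLE frequencies computed on the pooled data.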

Development of Radar Data Use Program (레이더자료 활용 프로그램 개발)

  • Han, Myoung Sun;Lee, Dong-Ryul
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2016.05a
    • /
    • pp.233-233
    • /
    • 2016
  • The Korea Institute of Civil Engineering and Building Technology (KICT) currently operates an X-band dual-polarization radar, which produces radar observation data in the NetCDF file format. Because the radar NetCDF data store the observation results in polar coordinates, they must be converted to grid coordinates before they can be analyzed or applied in other systems; in addition, the various data conversion and extraction tasks are text-based and require extensive preliminary work, making them difficult for general users. To make these tasks easier, a Windows-based program, KICTRadar4WIN, was developed in JAVA. The KICTRadar4WIN program provides four functions: radar data quality control, radar data management, radar data extraction, and radar data display.
    • Radar data quality control: generate quality-controlled radar data by applying QC criteria to the raw data.
    • Radar data management: CAPPI generation (produce CAPPI data from observed PPI and RHI data), QPE generation (produce QPE data from the CAPPI data), and QPE correction (generate corrected QPE data by estimating the G/R ratio from gauge rainfall).
    • Radar data extraction: grid data (convert PPI, CAPPI, and QPE data to text files), point data (save the mean value over a selected range around a given point coordinate to a text file), and areal data (extract the mean value over a given area and save it to a text file).
    • Radar data display: image output (generate image files of PPI, CAPPI, and QPE observation variables) and KMZ generation (create KMZ files from PPI, CAPPI, and QPE data).
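
The polar-to-grid conversion mentioned above can be sketched with a simple flat-earth transform from a radar gate's (range, azimuth, elevation) to local Cartesian coordinates. This is a generic radar-geometry illustration, not code from KICTRadar4WIN, and it ignores beam refraction and earth curvature:

```python
import math

def polar_to_cartesian(rng_m, azimuth_deg, elevation_deg=0.0):
    """Convert a radar gate at (range, azimuth, elevation) to local
    east/north/height coordinates in metres.  Azimuth is measured
    clockwise from north, following radar convention."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    ground = rng_m * math.cos(el)            # ground-projected range
    east = ground * math.sin(az)
    north = ground * math.cos(az)
    height = rng_m * math.sin(el)
    return east, north, height
```

Mapping every gate this way and then binning or interpolating the values onto a regular grid is the essence of producing CAPPI-style gridded products from PPI scans.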


Application of probabilistic safety assessment (PSA) to the power reactor innovative small module (PRISM)

  • Alrammah, Ibrahim
    • Nuclear Engineering and Technology
    • /
    • v.54 no.9
    • /
    • pp.3324-3335
    • /
    • 2022
  • Several countries have shown interest in the Generation-IV power reactor innovative small module (PRISM), including Canada, Japan, Korea, Saudi Arabia, and the United Kingdom. The Generation IV International Forum (GIF) has recommended utilizing probabilistic safety assessment (PSA) in evaluating the safety of Generation-IV reactors. This paper reviews the PSA performed for PRISM using the SAPHIRE 7.27 code. This work shows that the core damage frequency (CDF) of PRISM for a single module is estimated at 8.5E-8/year, which is lower than the Generation-IV target of 1E-6 core damage events per year. The social risk of PRISM (likelihood of latent cancer fatality) with evacuation is estimated at 9.0E-12/year, which is much lower than the basic safety objective (BSO) of 1E-7/year. The social risk without evacuation is estimated at 1.2E-11/year, which is also much lower than the BSO. The individual risk (likelihood of prompt fatality) can be considered negligible with evacuation (1.0E-13/year); assuming no evacuation, it is 2.7E-10/year, again much lower than the BSO. In comparison with other PSAs performed for similar sodium fast reactors (SFRs), the PRISM concept has the lowest CDF.

Development of logical structure for multi-unit probabilistic safety assessment

  • Lim, Ho-Gon;Kim, Dong-San;Han, Sang Hoon;Yang, Joon Eon
    • Nuclear Engineering and Technology
    • /
    • v.50 no.8
    • /
    • pp.1210-1216
    • /
    • 2018
  • Site or multi-unit (MU) risk assessment has been a major issue in the field of nuclear safety since the Fukushima accident in 2011. There have been few methods or experiences for MU risk assessment because the Fukushima accident was the first real MU accident and, before it, an MU accident was considered unlikely to occur. In addition to this lack of experience, because an MU nuclear power plant site is usually very complex to analyze as a whole, a systematic method such as probabilistic safety assessment (PSA) was considered difficult to apply to MU risk assessment. This paper proposes a new MU risk assessment methodology using the conventional PSA methodology widely applied in nuclear power plant risk assessment. The logical failure structure of a site with multiple units is derived from the definition of site risk, and a decomposition method is applied to identify specific MU failure scenarios.

How to incorporate human failure event recovery into minimal cut set generation stage for efficient probabilistic safety assessments of nuclear power plants

  • Jung, Woo Sik;Park, Seong Kyu;Weglian, John E.;Riley, Jeff
    • Nuclear Engineering and Technology
    • /
    • v.54 no.1
    • /
    • pp.110-116
    • /
    • 2022
  • Human failure event (HFE) dependency analysis is a part of human reliability analysis (HRA). For efficient HFE dependency analysis, a maximum number of minimal cut sets (MCSs) that have HFE combinations are generated from the fault trees for the probabilistic safety assessment (PSA) of nuclear power plants (NPPs). After collecting potential HFE combinations, dependency levels of subsequent HFEs on the preceding HFEs in each MCS are analyzed and assigned as conditional probabilities. Then, HFE recovery is performed to reflect these conditional probabilities in MCSs by modifying MCSs. Inappropriate HFE dependency analysis and HFE recovery might lead to an inaccurate core damage frequency (CDF). Using the above process, HFE recovery is performed on MCSs that are generated with a non-zero truncation limit, where many MCSs that have HFE combinations are truncated. As a result, the resultant CDF might be underestimated. In this paper, a new method is suggested to incorporate HFE recovery into the MCS generation stage. Compared to the current approach with a separate HFE recovery after MCS generation, this new method can (1) reduce the total time and burden for MCS generation and HFE recovery, (2) prevent the truncation of MCSs that have dependent HFEs, and (3) avoid CDF underestimation. This new method is a simple but very effective means of performing MCS generation and HFE recovery simultaneously and improving CDF accuracy. The effectiveness and strength of the new method are clearly demonstrated and discussed with fault trees and HFE combinations that have joint probabilities.
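
The effect of HFE dependency on a cut set's probability can be illustrated by replacing marginal HFE probabilities with assessed conditional probabilities. The function and numbers below are a hypothetical sketch of the recovery step, not the paper's algorithm:

```python
def cutset_probability(basic_events, hfe_conditionals):
    """Probability of one minimal cut set whose human failure events
    (HFEs) may be dependent.  `basic_events` maps event name to its
    marginal probability; `hfe_conditionals` maps an HFE to its
    conditional probability given that the preceding HFEs in the cut
    set have already occurred."""
    prob = 1.0
    for name, p in basic_events.items():
        # Use the assessed conditional probability when one is given
        prob *= hfe_conditionals.get(name, p)
    return prob

# Hypothetical cut set: a hardware failure plus two dependent HFEs.
events = {'HW': 1e-3, 'HFE1': 1e-2, 'HFE2': 1e-2}
independent = cutset_probability(events, {})            # 1e-7
dependent = cutset_probability(events, {'HFE2': 0.5})   # 5e-6, high dependence
```

The dependent probability is orders of magnitude larger than the independent product, which is why truncating such cut sets before the conditional probabilities are applied can underestimate the CDF.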