The management of algal blooms is essential for the proper operation of water supply systems and for maintaining the safety of drinking water. Chlorophyll-a (Chl-a) is a commonly used indicator of algal concentration. In recent years, advanced machine learning models have been increasingly used to predict Chl-a in freshwater systems. Machine learning models show good performance in various fields, but the model development process requires considerable labor and time from experts. Automated machine learning (auto ML) is an emerging field of machine learning research. Auto ML is used to develop machine learning models while minimizing the time and labor required in the model development process. This study developed an auto ML model to predict Chl-a using auto-sklearn, one of the most widely used open-source auto ML frameworks. The model performance was compared with that of two other popular ensemble machine learning models, random forest (RF) and XGBoost (XGB). The model performance was evaluated using three indices: the root mean squared error, the root mean squared error-observation standard deviation ratio (RSR), and the Nash-Sutcliffe coefficient of efficiency. The RSR of auto ML, RF, and XGB was 0.659, 0.684, and 0.638, respectively. The results show that auto ML outperforms RF, and XGB shows better prediction performance than auto ML, although the differences in model performance were not significant. Shapley value analysis, an explainable machine learning algorithm, was used to provide a quantitative interpretation of the predictions of the auto ML model developed in this study. The results of this study demonstrate the potential applicability of auto ML for the prediction of water quality.
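As an illustration only, the minimal sketch below shows how an auto-sklearn regressor for Chl-a could be fitted and scored with RMSE, RSR, and NSE; the file name, column names, and search-time budgets are assumptions, not details from the study.

```python
# Minimal sketch (not the authors' code) of an auto-sklearn regressor for Chl-a.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
import autosklearn.regression

df = pd.read_csv("water_quality.csv")          # hypothetical file
X = df.drop(columns=["chl_a"])                 # predictors (hypothetical column names)
y = df["chl_a"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

automl = autosklearn.regression.AutoSklearnRegressor(
    time_left_for_this_task=3600,   # total search budget in seconds (assumed value)
    per_run_time_limit=300,         # per-candidate budget in seconds (assumed value)
)
automl.fit(X_tr, y_tr)
pred = automl.predict(X_te)

rmse = np.sqrt(mean_squared_error(y_te, pred))
rsr = rmse / np.std(y_te)                                            # RMSE / std of observations
nse = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
print(f"RMSE={rmse:.3f}, RSR={rsr:.3f}, NSE={nse:.3f}")
```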
The prediction of algal blooms is an important field of study in algal bloom management, and the chlorophyll-a concentration (Chl-a) is commonly used to represent the status of algal blooms. In recent years, advanced machine learning algorithms have been increasingly used for the prediction of algal blooms. In this study, XGBoost (XGB), an ensemble machine learning algorithm, was used to develop a model to predict Chl-a in a reservoir. Daily observations of water quality and climate data were used for the training and testing of the model. In the first step of the study, the input variables were clustered into two groups (low- and high-value groups) based on the observed values of water temperature (TEMP), total organic carbon concentration (TOC), total nitrogen concentration (TN), and total phosphorus concentration (TP). For each of the four water quality items, two XGB models were developed using only the data in each clustered group (Model 1). The results were compared with the predictions of an XGB model developed using the entire dataset before clustering (Model 2). The model performance was evaluated using three indices, including the root mean squared error-observation standard deviation ratio (RSR). The model performance was improved with Model 1 for TEMP, TN, and TP, as the RSR of each model was 0.503, 0.477, and 0.493, respectively, while the RSR of Model 2 was 0.521. On the other hand, Model 2 showed better performance than Model 1 for TOC, where the RSR was 0.532. Explainable artificial intelligence (XAI) is an ongoing field of research in machine learning. Shapley value analysis, a novel XAI algorithm, was also used for the quantitative interpretation of the XGB models developed in this study.
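The following hedged sketch illustrates the cluster-then-model idea (Model 1) against a single model trained on all data (Model 2); the median split on water temperature, the file and column names, and the XGBoost settings are assumptions for illustration, not the clustering rule used in the study.

```python
# Hedged sketch of separate XGB models per data group (Model 1) vs. one model (Model 2).
import numpy as np
import pandas as pd
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

df = pd.read_csv("reservoir_daily.csv")        # hypothetical daily data
features = [c for c in df.columns if c != "chl_a"]

def fit_and_rsr(train, test):
    model = XGBRegressor(n_estimators=500, learning_rate=0.05)   # assumed settings
    model.fit(train[features], train["chl_a"])
    pred = model.predict(test[features])
    rmse = np.sqrt(mean_squared_error(test["chl_a"], pred))
    return rmse / test["chl_a"].std()

train, test = train_test_split(df, test_size=0.2, random_state=0)

# Model 2: one model trained on the entire dataset
print("Model 2 RSR:", fit_and_rsr(train, test))

# Model 1: separate models for low/high TEMP groups (median split assumed here)
thresh = train["temp"].median()
for name, mask_tr, mask_te in [
    ("low TEMP", train["temp"] <= thresh, test["temp"] <= thresh),
    ("high TEMP", train["temp"] > thresh, test["temp"] > thresh),
]:
    print(f"Model 1 ({name}) RSR:", fit_and_rsr(train[mask_tr], test[mask_te]))
```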
Starting with the permanent shutdown of Kori Unit 1, the first decommissioning waste treatment facility in Korea will be built on the Kori site. In this facility, major processes such as decontamination, cutting, radiation measurement, and volume reduction of decommissioning waste are performed, and radioactive liquid waste is generated by the waste treatment processes and by personnel decontamination. The generated liquid waste is finally discharged to the sea through a radioactivity monitoring system after sufficient treatment to meet the radiological effluent control standards. Whereas the treated liquid waste in operating nuclear power plants is additionally diluted in the circulating water discharge conduit before being discharged to the sea, there is no circulating water in the waste treatment facility. Therefore, a new discharge method that provides dilution after treatment should be considered. In this paper, the treatment concept and discharge method of the radioactive liquid waste system in the waste treatment facility are reviewed.
Algal bloom is an ongoing issue in the management of freshwater systems for drinking water supply, and the chlorophyll-a concentration is commonly used to represent the status of algal bloom. Thus, the prediction of the chlorophyll-a concentration is essential for the proper management of water quality. However, the chlorophyll-a concentration is affected by various water quality and environmental factors, so predicting its concentration is not an easy task. In recent years, many advanced machine learning algorithms have increasingly been used to develop surrogate models for predicting the chlorophyll-a concentration in freshwater systems such as rivers or reservoirs. This study used a light gradient boosting machine (LightGBM), a gradient boosting decision tree algorithm, to develop an ensemble machine learning model to predict the chlorophyll-a concentration. The field water quality data observed at Daecheong Lake, obtained from the real-time water information system in Korea, were used for the development of the model. The data include temperature, pH, electric conductivity, dissolved oxygen, total organic carbon, total nitrogen, total phosphorus, and chlorophyll-a. First, a LightGBM model was developed to predict the chlorophyll-a concentration using the other seven items as independent input variables. Second, the time-lagged values of all the input variables were added as input variables to understand the effect of the time lag of input variables on model performance. The time lag (i) ranged from 1 to 50 days. The model performance was evaluated using three indices: the root mean squared error-observation standard deviation ratio (RSR), the Nash-Sutcliffe coefficient of efficiency (NSE), and the mean absolute error (MAE). The model showed the best performance when a dataset with a one-day time lag (i=1) was added, where the RSR, NSE, and MAE were 0.359, 0.871, and 1.510, respectively. Improvement in model performance was observed when datasets with time lags of up to about 15 days (i=15) were added.
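A minimal sketch of the lagged-input setup is given below, assuming hypothetical file and column names and LightGBM settings; it adds i-day lagged copies of the seven predictors (here i = 1, the best case reported above) and evaluates RSR, NSE, and MAE on a chronological hold-out.

```python
# Sketch of adding i-day lagged predictors before fitting a LightGBM regressor.
import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error

df = pd.read_csv("daecheong_daily.csv", parse_dates=["date"]).set_index("date")
predictors = ["temp", "ph", "ec", "do", "toc", "tn", "tp"]    # hypothetical names

def add_lags(data, cols, lag):
    lagged = data[cols].shift(lag).add_suffix(f"_lag{lag}")   # i-day lagged copies
    return pd.concat([data, lagged], axis=1).dropna()

lag = 1                                            # best-performing case in the abstract
df_lag = add_lags(df, predictors, lag)
X = df_lag.drop(columns=["chl_a"])
y = df_lag["chl_a"]

split = int(len(df_lag) * 0.8)                     # chronological split to avoid leakage
model = LGBMRegressor(n_estimators=1000, learning_rate=0.05)  # assumed settings
model.fit(X.iloc[:split], y.iloc[:split])
pred = model.predict(X.iloc[split:])

obs = y.iloc[split:]
rsr = np.sqrt(mean_squared_error(obs, pred)) / obs.std()
nse = 1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)
mae = mean_absolute_error(obs, pred)
print(f"RSR={rsr:.3f}, NSE={nse:.3f}, MAE={mae:.3f}")
```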
The increased turbidity in rivers during flood events has various effects on water environmental management, including drinking water supply systems. Thus, the prediction of turbid water is essential for water environmental management. Recently, various advanced machine learning algorithms have been increasingly used in water environmental management. Ensemble machine learning algorithms such as random forest (RF) and gradient boosting decision tree (GBDT) are some of the most popular machine learning algorithms used for water environmental management, along with deep learning algorithms such as recurrent neural networks. In this study, GBDT, an ensemble machine learning algorithm, and the gated recurrent unit (GRU), a recurrent neural network algorithm, were used to develop models to predict turbidity in a river. The observation frequencies of the input data used for the models were 2, 4, 8, 24, 48, 120, and 168 h. The root mean squared error-observation standard deviation ratio (RSR) of GRU and GBDT ranged from 0.182 to 0.766 and from 0.400 to 0.683, respectively. Both models showed similar prediction accuracy, with an RSR of 0.682 for GRU and 0.683 for GBDT. GRU showed better prediction accuracy when the observation frequency was relatively short (i.e., 2, 4, and 8 h), whereas GBDT showed better prediction accuracy when the observation frequency was relatively long (i.e., 48, 120, and 168 h). The results suggest that the characteristics of the input data should be considered to develop an appropriate model to predict turbidity.
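As a rough illustration of the deep learning branch of this comparison, the sketch below builds a small GRU regressor on turbidity data resampled to a 2-hour frequency; the file name, variables, window length, and training settings are assumptions rather than the study configuration.

```python
# Illustrative sketch (not the study code) of a small GRU regressor for turbidity.
import numpy as np
import pandas as pd
import tensorflow as tf

df = pd.read_csv("river_turbidity.csv", parse_dates=["time"]).set_index("time")
df = df.resample("2H").mean().interpolate()        # 2-hour frequency; other runs used 4-168 h

window = 24                                        # assumed look-back length (in time steps)
values = df[["discharge", "turbidity"]].to_numpy(dtype="float32")
X = np.stack([values[i:i + window] for i in range(len(values) - window)])
y = values[window:, 1]                             # next-step turbidity as the target

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, values.shape[1])),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=64, validation_split=0.2)
```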
The quantitative analysis of damage to wastewater treatment plants caused by natural disasters is essential to maintain the stability of wastewater treatment systems. However, studies on the quantitative analysis of natural disaster effects on wastewater treatment systems are very rare. In this study, a total disaster index (DI) was developed to quantify the various damages to wastewater treatment systems from natural disasters using two statistical methods (i.e., AHP: analytic hierarchy process, and PCA: principal component analysis). Typhoons, heavy rain, and earthquakes were considered as the three major natural disasters for the development of the DI. A total of 15 input variables from public open-source data (e.g., the statistical yearbook of wastewater treatment systems, meteorological data, and the financial status of local governments) were used to develop the DI for 199 wastewater treatment plants in Korea. The total DI was calculated from the weighted sum of the disaster indices of the three natural disasters (i.e., TI for typhoon, RI for heavy rain, and EI for earthquake). The disaster index of each natural disaster was determined from four components, such as the possibility of occurrence and the expected damages. The relative weights of the four components used to calculate the disaster indices (TI, RI, and EI) for each of the three natural disasters were also determined using AHP. PCA was used to determine the relative weights of the input variables used to calculate the four components. The relative weights of TI, RI, and EI used to calculate the total DI were determined to be 0.547, 0.306, and 0.147, respectively.
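A tiny sketch of how the total DI could be combined from the three per-disaster indices using the AHP-derived weights reported above is shown below; the example index values for a plant are hypothetical.

```python
# Weighted sum of the three disaster indices into the total disaster index (DI).
WEIGHTS = {"TI": 0.547, "RI": 0.306, "EI": 0.147}   # typhoon, heavy rain, earthquake (from AHP)

def total_disaster_index(ti: float, ri: float, ei: float) -> float:
    """Total DI as the weighted sum of the three per-disaster indices."""
    return WEIGHTS["TI"] * ti + WEIGHTS["RI"] * ri + WEIGHTS["EI"] * ei

# example plant with hypothetical index values
print(total_disaster_index(ti=0.62, ri=0.40, ei=0.15))
```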
Turbidity has various effects on the water quality and ecosystem of a river. High turbidity during floods increases the operation cost of a drinking water supply system. Thus, the management of turbidity is essential for providing safe water to the public. There have been various efforts to estimate turbidity in river systems for the proper management and early warning of high turbidity in the water supply process. Advanced data analysis technology using machine learning has been increasingly used in water quality management processes. Artificial neural networks (ANNs) were among the first algorithms applied, but the overfitting of a model to observed data and the vanishing gradient in the backpropagation process have limited the wide application of ANNs in practice. In recent years, deep learning, which overcomes the limitations of ANNs, has been applied in water quality management. Long short-term memory (LSTM) is a novel deep learning algorithm that is widely used in the analysis of time series data. In this study, LSTM was used for the prediction of high turbidity (>30 NTU) in a river from the relationship of turbidity to discharge, which enables early warning of high turbidity in a drinking water supply system. The model showed a precision, recall, F1-score, and accuracy of 0.98, 0.99, 0.98, and 0.99, respectively, for the prediction of high turbidity in a river with 2-hour frequency data. The sensitivity of the model to the observation interval of the data was also compared for intervals of 2 hours, 8 hours, 1 day, and 2 days. The model showed higher precision with shorter observation intervals, which underscores the importance of collecting high-frequency data for better management of water resources in the future.
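A hedged sketch of an LSTM classifier for the high-turbidity (>30 NTU) warning task is given below; the input file, features, look-back window, and training settings are assumptions, not the configuration used in the study.

```python
# Sketch of an LSTM binary classifier flagging high turbidity (>30 NTU) at the next step.
import numpy as np
import pandas as pd
import tensorflow as tf

df = pd.read_csv("river_2h.csv", parse_dates=["time"]).set_index("time")  # hypothetical 2-hour data
features = df[["discharge", "turbidity"]].to_numpy(dtype="float32")
label = (df["turbidity"].shift(-1) > 30).astype(int).to_numpy()           # 1 if next step exceeds 30 NTU

window = 12                                    # assumed look-back (24 h of 2-hour observations)
X = np.stack([features[i:i + window] for i in range(len(features) - window)])
y = label[window - 1:-1]                       # label aligned with the end of each window

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, features.shape[1])),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.Precision(), tf.keras.metrics.Recall(), "accuracy"])
model.fit(X, y, epochs=20, batch_size=64, validation_split=0.2)
```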
The proper operation and safety management of water and wastewater treatment systems are essential for providing stable water services to the public. However, various natural disasters, including floods, large storms, volcanic eruptions, and earthquakes, threaten public water services by causing serious damage to water and wastewater treatment plants and pipeline systems. Korea is known as a country that is relatively safe from earthquakes, but the recent increase in the frequency of earthquakes has increased the need for a proper earthquake management system. Interest in research and the establishment of legal regulations has increased, especially since the large earthquake in Gyeongju in 2016. Currently, earthquakes in Korea are managed by legal regulations and guidelines integrated with those for other disasters such as floods and large storms. The legal system has long been in place and is relatively well managed, but technical research has made limited progress because Korea was considered in the past to be safe from earthquake damage. Various technologies, including seismic design and earthquake forecasting, are required to minimize possible damage from earthquakes, so proper research is essential. This paper reviews the current state of technology development and legal management systems for preventing damage to and restoring water and wastewater treatment systems after earthquakes in Korea and other countries. Advanced technologies such as unmanned aerial vehicles, wireless networks, and real-time monitoring systems are already being applied to water and wastewater treatment processes, and to further establish the optimal system for earthquake response in water and wastewater treatment facilities, continuous research in connection with the Fourth Industrial Revolution, including information and communications technologies, is essential.
In general, a seawater reverse osmosis (SWRO) plant consumes more than 3.5 kWh/m³ of energy. Of this, the RO train consumes 2.5–3.0 kWh/m³, accounting for more than 70% of the total energy consumption, so the optimal design of the RO train is important for reducing the energy consumption of the entire system. We therefore aim to reduce the amount of energy consumed by 10% by optimizing the design of various RO trains. Specifically, we intend to develop a low-energy reverse osmosis membrane system design technology through the optimal combination of 1) design optimization of the first pass applying ISD (Internally Staged Design), 2) capacity optimization of the second pass through the application of SPSP (Split Partial Second Pass), and 3) a single-pass design method applying ultra-high-performance membranes.
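For orientation only, the back-of-the-envelope sketch below works through the quoted energy figures: the RO train's share of the specific energy consumption and the effect of an assumed 10% saving applied to the RO train (the 2.8 kWh/m³ midpoint is an assumption).

```python
# Back-of-the-envelope arithmetic for the SWRO energy figures quoted above.
total_sec = 3.5          # kWh/m3, typical SWRO specific energy consumption (from the text)
ro_train_sec = 2.8       # kWh/m3, assumed midpoint of the quoted 2.5-3.0 range

share = ro_train_sec / total_sec                 # RO train share of total SEC
saving = 0.10 * ro_train_sec                     # assumed: 10% reduction applied to the RO train

print(f"RO train share of SEC: {share:.0%}")     # ~80%, consistent with "more than 70%"
print(f"Total SEC after saving: {total_sec - saving:.2f} kWh/m3")
```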
This study examines the barriers to innovation faced by firms, focusing on the lack of funding and the lack of information, and analyzes their effects on vertical and horizontal cooperation. For this purpose, data from the Korean Innovation Survey (KIS) conducted on the Korean manufacturing industry were used. The results show that the hypothesis that the lack of funding, as a barrier to innovation, has a positive (+) relationship with vertical cooperation was not supported. However, the hypothesis that the lack of information has a positive (+) relationship with vertical and horizontal cooperation was supported. Previous studies have discussed firms' internal activities as antecedents of R&D cooperation, whereas this study extends the discussion to barriers to innovation. In addition, as the structure of competition has shifted from competition between a firm and its rivals to competition between different cooperative groups, the scope of R&D cooperation types was extended to horizontal cooperation in the empirical analysis.