Recently, the importance of preventive maintenance has been growing because failures in complex systems can now be detected automatically thanks to advances in artificial intelligence techniques and sensor technology. Accordingly, prognostics and health management (PHM) is being actively studied, and predicting the remaining useful life (RUL) of a system has become one of its most important tasks. A great deal of research has been conducted on RUL prediction. Deep learning models have been developed to improve prediction performance, but studies identifying the importance of input features have rarely been carried out. While improving the predictive accuracy of RUL is important, it is also very meaningful to extract and interpret the features that affect failures. In this paper, six popular deep learning models were employed to predict the RUL, and the important variables for each model were identified through SHAP (SHapley Additive exPlanations), one of the explainable artificial intelligence (XAI) techniques. Moreover, the fluctuations and trends of prediction performance according to the number of variables were examined. This paper demonstrates the explainability of various deep learning models and the applicability of XAI, and the proposed method also suggests that SHAP can be utilized as a feature selection method.
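As an illustration of how SHAP attributes a prediction to input features, the sketch below computes exact Shapley values for a toy model by enumerating feature coalitions, with absent features replaced by baseline values; the `shap` library approximates this far more efficiently for deep networks. The linear `predict`, its weights, and the baseline are hypothetical stand-ins, not the paper's models.

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley values of `predict` at `x`: features outside a
    coalition are replaced by baseline values (SHAP's masking idea)."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(n):
            for S in combinations(others, r):
                # Shapley weight of a coalition of size |S|.
                wgt = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                with_i = [x[j] if j in S or j == i else baseline[j] for j in range(n)]
                without = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += wgt * (predict(with_i) - predict(without))
    return phi

# Hypothetical linear "model" standing in for a trained deep network.
weights = [2.0, -1.0, 0.5]
predict = lambda v: sum(w * f for w, f in zip(weights, v))
phi = shapley_values(predict, x=[1.0, 2.0, 3.0], baseline=[0.0, 0.0, 0.0])
```

Ranking features by |phi| yields the model-specific importance ordering that can then drive feature selection, as the abstract suggests.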
This article suggests a machine learning model, i.e., a classifier, for predicting the production quality of free-machining 303-series stainless steel (STS303) small rolling wire rods according to the operating conditions of the manufacturing process. For the development of the classifier, manufacturing data for 37 operating variables were collected from the manufacturing execution system (MES) of Company S, and 12 types of derived variables were generated based on a literature review and interviews with field experts. This research proceeded through data preprocessing, exploratory data analysis, feature selection, machine learning modeling, and the evaluation of alternative models. In the preprocessing stage, missing values and outliers were removed, and oversampling with SMOTE (Synthetic Minority Oversampling Technique) was applied to resolve the data imbalance. Features were selected based on the variable importance of LASSO (Least Absolute Shrinkage and Selection Operator) regression, extreme gradient boosting (XGBoost), and random forest models. Finally, logistic regression, support vector machine (SVM), random forest, and XGBoost classifiers were developed to predict whether products manufactured under new operating conditions would be adequate or defective. The optimal hyperparameters for each model were investigated by grid search and random search based on k-fold cross-validation. In the experiment, XGBoost showed relatively high predictive performance compared to the other models, with an accuracy of 0.9929, specificity of 0.9372, F1-score of 0.9963, and logarithmic loss of 0.0209. The classifier developed in this study is expected to improve productivity by enabling effective management of the manufacturing process for STS303 small rolling wire rods.
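A minimal sketch of the SMOTE step described above: each synthetic minority sample is interpolated between a minority point and one of its k nearest minority neighbors. The toy `minority` points are hypothetical; in practice this is done with `imblearn.over_sampling.SMOTE`.

```python
import random

def smote(minority, n_new, k=2, seed=0):
    """Minimal SMOTE: each synthetic sample lies on the segment between
    a random minority point and one of its k nearest minority neighbors."""
    rng = random.Random(seed)
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    out = []
    for _ in range(n_new):
        p = rng.choice(minority)
        # k nearest minority neighbors of p (excluding p itself).
        nbrs = sorted((q for q in minority if q is not p),
                      key=lambda q: dist(p, q))[:k]
        q = rng.choice(nbrs)
        gap = rng.random()  # random position along the p-q segment
        out.append([pi + gap * (qi - pi) for pi, qi in zip(p, q)])
    return out

# Hypothetical 2-D minority-class points (e.g. defective products).
minority = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
new_points = smote(minority, n_new=5)
```

Because every synthetic point is a convex combination of two real minority points, the oversampled class stays inside the region the minority data already occupies.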
Treatment and management of chronic low back pain (CLBP) should be tailored to each patient's individual context. However, there are limited resources available for identifying and managing the causes and mechanisms for each patient. In this study, we designed and developed a personalized context awareness system that uses machine learning techniques to understand the relationship between a patient's lower back pain and the surrounding environment. A pilot study was conducted to verify the context awareness model. The performance of the lower back pain prediction model was good enough to be practically usable, and the information from the model could be used to understand how the variables influence the occurrence of lower back pain.
The printing process must produce various colors with a limited printing facility capacity; in particular, the ink containers require cleaning whenever the color is changed. The cleaning time for assigning the corresponding inks to each container is a setup cost that reduces productivity. Because the existing manual method, based on workers' experience or intuition, cannot easily respond to the diversification of color requirements, we suggest mathematical modeling and algorithms for efficient scheduling. In this study, we propose a new type of scheduling problem for the printing process. First, we suggest a mathematical model that jointly optimizes color assignment and scheduling. Although the suggested model guarantees global optimality, it requires a long computational time to solve. Thus, we decompose the original problem into an order-sequencing problem and an ink-allocation problem. An approximate function is used to compute the job schedule, and a local search heuristic based on the 2-opt algorithm is suggested to reduce computational time. To verify the effectiveness of our method, we compared the algorithms' performance. The results show that the suggested decomposition structure can find acceptable solutions within a reasonable time. We also present schematized results for field application.
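The 2-opt local search mentioned above can be sketched as follows: a segment of the job sequence is reversed whenever doing so lowers the total sequence-dependent setup (cleaning) cost. The 4-job cost matrix is hypothetical, not the study's data.

```python
def tour_cost(seq, cost):
    """Total sequence-dependent setup cost of a job order."""
    return sum(cost[a][b] for a, b in zip(seq, seq[1:]))

def two_opt(seq, cost):
    """2-opt local search: reverse segments until no reversal improves
    the total cost (the first job is kept fixed)."""
    best = seq[:]
    improved = True
    while improved:
        improved = False
        for i in range(1, len(best) - 1):
            for j in range(i + 1, len(best)):
                cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
                if tour_cost(cand, cost) < tour_cost(best, cost):
                    best, improved = cand, True
    return best

# Hypothetical color-change (cleaning) costs between four print jobs.
cost = [[0, 1, 9, 9],
        [9, 0, 1, 9],
        [1, 9, 0, 1],
        [9, 9, 9, 0]]
order = two_opt([0, 2, 1, 3], cost)
```

Starting from the poor order [0, 2, 1, 3] (cost 27), a single segment reversal reaches [0, 1, 2, 3] (cost 3), illustrating how 2-opt escapes bad sequences cheaply.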
Previous weapon assignment studies for defending against threats such as ballistic missiles and long-range artillery have partially lacked analysis of the various threat attributes in their threat assessments. Considering the threat characteristics of warheads, which are difficult to judge in the early flight stages, it is very important to apply more reliable optimal solutions than the approximate solutions obtained from LP models or metaheuristics such as genetic algorithms, tabu search, and particle swarm optimization. Our study suggests a generic rule-based threat evaluation and weapon assignment algorithm based on the various attributes of threats. First, we analyze information on these attributes, such as the target type, the flight trajectory and flight time, and the range and intercept altitude of the interception system. Second, we propose a rule-based threat evaluation and weapon assignment algorithm that obtains a more reliable solution by reflecting the importance of each interception system. The algorithm assigns ballistic missiles and long-range artillery to multiple interception systems through real-time threat assessment reflecting various threat information. The results of this study provide a reliable solution to the weapon assignment problem and are considered applicable to establishing a missile and long-range artillery defense system.
Depression is one of the most important psychiatric disorders worldwide. Most depression-related data mining and machine learning studies have been conducted to predict the presence of depression or to derive individual risk factors. However, since depression is caused by a combination of various factors, it is necessary to identify the complex relationships among those factors in order to establish effective depression prevention and management measures. In this study, we propose a methodology for identifying and interpreting patterns of depression by deriving random forest rules, where each rule consists of the conditions under which a depressive pattern manifests and the predicted depression outcome when those conditions are met. The analysis was carried out on four subgroups, in consideration of the different depressive patterns by gender and age. The depression rules derived by the proposed methodology were validated by comparing them with the results of previous studies. Also, through an AUC comparison test, the diagnostic performance of the derived rules was evaluated and found not to differ from that of the existing PHQ-9 summing method. The significance of this study is that it enables the interpretation of the complex relationships among depressive factors, going beyond existing studies focused on prediction and the deduction of major factors.
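The rule-derivation idea can be illustrated on a single decision tree: every root-to-leaf path becomes a rule whose conditions describe a pattern and whose leaf gives the prediction. The tree, feature names (`sleep_problem`, `stress`), and thresholds below are hypothetical; the paper derives such rules from a random forest.

```python
# A tiny decision tree encoded as nested dicts; leaves hold the prediction.
tree = {"feature": "sleep_problem", "threshold": 0.5,
        "left": {"leaf": "not depressed"},
        "right": {"feature": "stress", "threshold": 0.5,
                  "left": {"leaf": "not depressed"},
                  "right": {"leaf": "depressed"}}}

def extract_rules(node, conds=()):
    """Walk every root-to-leaf path and emit (conditions, prediction) rules."""
    if "leaf" in node:
        return [(list(conds), node["leaf"])]
    f, t = node["feature"], node["threshold"]
    return (extract_rules(node["left"], conds + (f"{f} <= {t}",)) +
            extract_rules(node["right"], conds + (f"{f} > {t}",)))

rules = extract_rules(tree)
```

In a forest, the rules from all trees are pooled and filtered (e.g. by support and accuracy), giving interpretable condition-outcome pairs rather than a single importance score per factor.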
This study explores multiple variables of an OTT service to discover hidden relationships between the rating and the other variables of successful and failed content, respectively. To extract key variables that are strongly correlated with the rating, this work analyzes 170 Netflix original dramas and 419 movies. The contents are classified as successes or failures using the rating site IMDb. The correlations between the classified contents and variables such as violence, lewdness, and running time are analyzed to determine whether a certain variable appears in successful or failed content. This study employs regression analysis as its main method to discover correlations across the variables. Since the correlations among independent variables should be low, multicollinearity is checked before variables are selected. Cook's distance is used to detect and remove outliers. To improve the accuracy of the model, variable selection based on the AIC (Akaike Information Criterion) is performed. Finally, the basic assumptions of regression analysis are verified by residual diagnosis and the Durbin-Watson test. The analysis concludes that movies with more director awards and lower imitability tend to be successful, whereas movies with lower fear levels tend to fail. In the case of dramas, failed dramas are closely correlated with lower violence, higher fear, and higher drug content.
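The AIC-based variable selection step can be sketched as a forward selection: starting from an intercept-only model, the candidate that lowers the AIC the most is added, until no addition helps. The OLS solver, the candidate variables `x1`/`x2`, and the data below are illustrative assumptions, not the study's dataset.

```python
import math

def ols_fit(cols, y):
    """Least-squares fit (intercept + given columns) via the normal
    equations, solved with a tiny Gauss-Jordan elimination."""
    n = len(y)
    X = [[1.0] + [c[i] for c in cols] for i in range(n)]
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(k):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * c for a, c in zip(A[r], A[col])]
                b[r] -= f * b[col]
    beta = [b[i] / A[i][i] for i in range(k)]
    rss = sum((y[r] - sum(bi * xi for bi, xi in zip(beta, X[r]))) ** 2
              for r in range(n))
    return beta, rss

def aic(rss, n, k):
    # Gaussian log-likelihood form of AIC, up to an additive constant.
    return n * math.log(max(rss, 1e-12) / n) + 2 * k

def forward_select(candidates, y):
    """Add the variable that most lowers AIC until no addition helps."""
    n, ybar = len(y), sum(y) / len(y)
    chosen = []
    best = aic(sum((yi - ybar) ** 2 for yi in y), n, 1)  # intercept-only
    while len(chosen) < len(candidates):
        trials = []
        for name, col in candidates.items():
            if name not in chosen:
                _, rss = ols_fit([candidates[c] for c in chosen] + [col], y)
                trials.append((aic(rss, n, len(chosen) + 2), name))
        a, name = min(trials)
        if a >= best:
            break
        best, chosen = a, chosen + [name]
    return chosen

# Hypothetical data: y depends on x1; x2 is a weak dummy.
candidates = {"x1": [0, 1, 2, 3, 4, 5], "x2": [1, 1, 1, 0, 0, 0]}
y = [-0.1, 1.1, 1.9, 3.1, 3.9, 5.1]
chosen = forward_select(candidates, y)
```

The informative variable `x1` is selected first because it reduces the residual sum of squares enough to outweigh the AIC penalty for the extra parameter.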
Military aircraft R&D projects require large-scale investments of cost and time and involve a complex coordination process in decision-making. The R&D project manager should determine the development management priorities as accurately as possible and focus R&D capabilities accordingly, thereby reducing the risks of the aircraft R&D project. To this end, this study aims to reduce R&D risk by prioritizing cost, schedule, and performance, the basic management factors used in R&D project management under defense project management regulations. The Analytic Hierarchy Process (AHP) is applied using a questionnaire for managers in charge of aviation R&D under the Defense Acquisition Program Administration. As a primary result, the importance of the factors that an aircraft R&D project manager should consider was derived in the order of performance, cost, and schedule, and the priorities of the performance and cost sub-factors in the lower layer were also identified. In addition, to provide practical risk management measures to aircraft R&D project managers, 28 accident cases from the US National Transportation Safety Board were analyzed and compared with the AHP results, and management measures suitable for each situation were specified.
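A common way to compute AHP priorities is sketched below with hypothetical pairwise judgments for performance, cost, and schedule: row geometric means of the comparison matrix are normalized into weights, and a consistency ratio (CR) checks the judgments (CR < 0.1 is conventionally acceptable). The matrix values are illustrative, not the survey results.

```python
import math

def ahp_weights(M):
    """AHP priorities via row geometric means, plus the consistency
    ratio (CR) using Saaty's random index RI."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    w = [g / sum(gm) for g in gm]
    # Estimate the principal eigenvalue lambda_max from (M w)_i / w_i.
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    cr = ((lam - n) / (n - 1)) / ri if ri else 0.0
    return w, cr

# Hypothetical judgments: performance vs. cost vs. schedule
# (e.g. performance is moderately preferred over cost, 3:1).
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w, cr = ahp_weights(M)
```

With these sample judgments, performance receives the largest weight, matching the ordering reported in the abstract; inconsistent questionnaires would be flagged by a CR above 0.1.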
Online consumer activities have increased considerably since the COVID-19 outbreak. For products and services that have an impact on everyday life, online reviews and recommendations can play a significant role in consumer decision-making. Thus, to better serve their customers, online firms need to build online-centric marketing strategies. In particular, it is essential to define the core values of customers based on online customer reviews and to propose these values back to them. This study discovers the specific perceived values of customers with regard to a certain product and service using online customer reviews, and proposes a customer value proposition methodology that enables online firms to develop more effective marketing strategies. To discover customer value, the methodology employs a text-mining approach that combines sentiment analysis and topic modeling, through which customer emotions and value factors can be defined more clearly. It is expected that online firms can thereby better identify the value elements of their respective customers, provide appropriate value propositions, and gain a sustainable competitive advantage.
Using a frequency-based decomposition, I decompose consumption growth to explain well-known patterns of stock returns in the Korean market. To be more specific, consumption growth is decomposed by the half-life of its shocks. The component with a half-life of over four years is called the business-cycle consumption component, and the components with half-lives under four years are short-run components. I compute the long-run and short-run components of stock excess returns as well, and use component-by-component sensitivities to price stock portfolios. As a result, the business-cycle consumption risk with a half-life of over four years is useful in explaining the cross-section of size-book-to-market portfolios and size-momentum portfolios in the Korean stock market. The short-run components have their own pricing abilities with mixed signs, so the restricted single short-run-factor model is rejected. The explanatory power of the short- and long-run components together is comparable to that of the Fama-French three-factor model. The components with one- to four-year half-lives are also helpful in explaining the returns. These results emphasize the importance of the long-run component of consumption growth in explaining asset returns.
In the early 1960s, the US Air Force lost missile launch bases during ICBM development because of defects in missile design and operation plans. The US DoD realized the limitations of existing accident prevention methods. Therefore, weapon development came to require system safety activities, and US DoD procurement projects applied MIL-STD-882 (System Safety). US DoD development projects emphasized the importance of system safety even more after the space shuttle Challenger exploded in 1986. Currently, airworthiness certification for military aircraft uses system safety to minimize accidents. Domestic defense aviation R&D projects also use system safety for airworthiness certification. However, non-aviation weapon R&D projects have rarely applied system safety. This paper presents a system safety application method for domestic weapon R&D projects by studying US military standards and organizations as well as domestic defense aviation projects.
Topic modeling has been receiving much attention across academic disciplines in recent years. Topic modeling is an application of machine learning and natural language processing: a statistical modeling procedure for discovering the topics in a collection of documents. Recently, there have been many attempts to find topics in diverse fields of academic research. Although the first Department of Industrial Engineering (I.E.) was established at Hanyang University in 1958, the Korean Institute of Industrial Engineers (KIIE), the primary academic society for the field, was founded in 1974 to contribute to I.E. research and promote industrial techniques. The Korean Society of Industrial and Systems Engineering (KSIE) was established four years later. However, the research topics of the KSIE journal have not been deeply examined until now. Using topic modeling algorithms, we aim to detect the research topics of the KSIE journal for the first half of the society's history, from 1978 to 1999. We used the titles and abstracts of research papers to find topics in the KSIE journal by applying four algorithms: LSA, HDP, LDA, and LDA Mallet. The topic analysis results obtained by the algorithms were compared, and the whole procedure of topic analysis is shown in detail for further practical use. We also employed visualization techniques on the results obtained from LDA. As a result of a thorough topic modeling analysis, eight major research topics were discovered: Production/Logistics/Inventory, Reliability, Quality, Probability/Statistics, Management Engineering/Industry, Engineering Economy, Human Factors/Safety/Computer/Information Technology, and Heuristics/Optimization.
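A tiny collapsed Gibbs sampler for LDA illustrates the core of the topic models used here (the study itself applies LSA, HDP, LDA, and LDA Mallet via standard toolkits such as gensim). The two-document corpus below is a hypothetical stand-in for the KSIE titles and abstracts.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, iters=200, alpha=0.1, beta=0.01, seed=1):
    """Collapsed Gibbs sampling for LDA: resample each token's topic
    with probability proportional to
    (doc-topic count + alpha) * (topic-word count + beta) / (topic size + V*beta)."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})
    ndk = [[0] * n_topics for _ in docs]               # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                                # topic sizes
    z = []
    for d, doc in enumerate(docs):                     # random initialization
        zs = []
        for w in doc:
            t = rng.randrange(n_topics)
            zs.append(t); ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
        z.append(zs)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]                            # remove the token
                ndk[d][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                probs = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                         for t in range(n_topics)]
                t = rng.choices(range(n_topics), weights=probs)[0]
                z[d][i] = t                            # reassign it
                ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return z, nkw

# Hypothetical two-document corpus standing in for titles/abstracts.
docs = [["inventory", "logistics", "production"] * 3,
        ["quality", "control", "inspection"] * 3]
z, nkw = lda_gibbs(docs, n_topics=2)
```

After sampling, the topic-word counts `nkw` play the role of the topic-term distributions from which labeled topics such as "Production/Logistics/Inventory" are read off.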
WDM (Wavelength Division Multiplexing) is a next-generation optical transmission technology based on wavelength division multiplexing. Case company F has recently developed and sold PLC (Planar Lightwave Circuit), a key element for WDM system production. Although Chinese processing companies are used as a global outsourcing strategy to increase price competitiveness by lowering manufacturing unit prices, the average defect rate of the products they manufacture is more than 50%, causing many problems. Nevertheless, the Chinese processing companies try to avoid responsibility, claiming that the cause of the defects is the defective PLC wafers provided by Company F. Therefore, in this study, the responsibility for the PLC defects is clearly identified by estimating the PLC defect rate with a sampling inspection method, and improvement plans for each cause of PLC defects are proposed to improve the PLC yield. The results of this research will greatly contribute to resolving the controversy between global outsourcing companies and the head office over the cause of defects. In addition, it is expected to strengthen the partnership between Company F and the Chinese processing company, which will serve as a cornerstone for successful global outsourcing. In the future, the reliability of the PLC yield calculation should be increased by counting the number of defects more precisely.
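The sampling-inspection estimate can be sketched as a binomial point estimate with a normal-approximation confidence interval for the lot defect rate; the sample size and defect count below are hypothetical, not the study's data.

```python
import math

def defect_rate_ci(n, defects, z=1.96):
    """Point estimate and normal-approximation 95% CI for the lot
    defect rate from a random sample of n inspected units."""
    p = defects / n
    half = z * math.sqrt(p * (1 - p) / n)  # standard error times z
    return p, (max(0.0, p - half), min(1.0, p + half))

# Hypothetical sample: 200 PLC chips inspected, 104 found defective.
p, (lo, hi) = defect_rate_ci(200, 104)
```

An interval like this lets the two parties argue about the true lot defect rate (and hence responsibility) with a stated margin of error, rather than from a single observed percentage.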
This study focuses on the necessity of MOT (Management of Technology) methods in companies, and especially on their utilization level. Based on the analysis framework of a previous study (2012), this study compares current results with the previous ones. We surveyed MOT-related researchers in electronics companies (n=184) on the settlement level of MOT, the degree of necessity of MOT methods, the degree of actual use, and the Product Realization Process (PRP). It was confirmed that the higher the demand for an MOT method in the corporate field, the higher its utilization level (ratio). In particular, the need for and utilization of techniques such as Environmental Analysis, Business Opportunity Analysis, Project Feasibility Review, Roadmap, and Risk Management were high. These methods were beneficial along with cost management and quality management techniques. The most challenging aspects of using MOT methods were the lack of systematic use, the absence of experts, and the difficulty of selecting suitable techniques. Among the PRP subjects, the necessity of opening courses such as Creative Thinking, Communication, Teamwork, and Professional Ethics was high. Furthermore, the necessity of opening courses in Cost and Safety Design and Applied Statistics was higher than in the previous study.
Automated guided vehicles (AGVs) are commonly used in manufacturing plants, warehouses, distribution centers, and terminals. An AGV is a self-driven vehicle used to transport material between workstations on the shop floor without the help of an operator; it includes a material transfer system on top and a driving system at the bottom to move the vehicle as desired. For navigation, AGVs mostly use lane paths, signal paths, or signal beacons, along with various sensors. However, a conventional AGV may fail to turn in a narrow space or may damage nearby objects or other AGVs. In this paper, a new driving system is proposed to move the vehicle in narrow spaces. In the proposed driving system, two sets of a combined steering-drive unit are adopted to solve the above problem. A prototype AGV with the new driving system was developed for a comparative analysis with a conventional AGV. The experimental results show the improved performance of the new driving system in maximum speed, braking distance, and positioning precision tests.
COVID-19 has been spreading all around the world and threatening global health. In this situation, rapidly identifying and isolating infected individuals has been one of the most important measures to contain the epidemic. However, the standard diagnosis procedure with RT-PCR (Reverse Transcriptase Polymerase Chain Reaction) is costly and time-consuming. For this reason, pooled testing for COVID-19 has been proposed since the early stage of the pandemic to reduce the cost and time of identifying COVID-19 infections. In pooled testing, the number of samples tested as a group is the most significant factor in the performance of the test system. When the arrivals of test requirements and the test times are stochastic, batch-service queueing models have been utilized for the analysis of pooled-testing systems. However, most of them do not consider false-negative test results in their performance analysis. The COVID-19 RT-PCR test has a small but non-negligible possibility of false-negative results, and the group-test size affects not only the time and cost of pooled testing but also its false-negative rate, which is a significant concern for public health authorities. In this study, we analyze the performance of COVID-19 pooled-testing systems with false-negative test results. To do this, we first formulate the COVID-19 pooled-testing system with false negatives as a batch-service queueing model, and then use queueing analysis to obtain performance measures such as the expected number of test requirements in the system, the expected number of RT-PCR tests per sample, the false-negative group-test rate, and the total cost per unit time. We also present a numerical example to demonstrate the applicability of our analysis and draw a couple of implications for COVID-19 pooled testing.
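To illustrate the trade-off described above, the sketch below computes, under a simple two-stage (Dorfman) pooling scheme with independent infections, the expected number of RT-PCR tests per sample and the false-negative probability per infected sample. This is a static approximation, not the paper's batch-service queueing model, and the parameter values are hypothetical.

```python
def dorfman_metrics(group_size, prevalence, sensitivity):
    """Two-stage (Dorfman) pooled testing with imperfect sensitivity.
    Returns (expected RT-PCR tests per sample,
             false-negative probability per infected sample).
    Assumes independent infections and perfect specificity."""
    n, p, s = group_size, prevalence, sensitivity
    # The pool is flagged only if it holds >= 1 positive sample
    # and the pooled test does not miss it.
    pool_flagged = (1 - (1 - p) ** n) * s
    tests_per_sample = 1 / n + pool_flagged  # pool test + n follow-ups if flagged
    # An infected sample is missed if the pooled test is negative,
    # or the pool is flagged but the individual retest is negative.
    fn_per_infected = (1 - s) + s * (1 - s)
    return tests_per_sample, fn_per_infected

# Hypothetical parameters: pools of 5, 2% prevalence, 95% sensitivity.
tests_per_sample, fn_rate = dorfman_metrics(5, 0.02, 0.95)
```

Even this static view shows the tension the paper analyzes: pooling cuts the expected tests per sample well below one, but an infected sample must now survive two imperfect tests, raising its miss probability above the single-test rate of 1 - s.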
Maritime monitoring requirements have grown beyond human operators' capabilities due to the broadness of the coverage area and the variety of monitoring activities, e.g., illegal migration or security threats by foreign warships. Abnormal vessel movement can be defined as an unreasonable deviation from the usual trajectory, speed, or other traffic parameters. Detecting abnormal vessel movement requires operators not only to pay short-term attention but also to trace trajectories over the long term. Recent advances in deep learning have shown the potential of deep learning techniques to discover hidden and more complex relations that often lie in low-dimensional latent spaces. In this paper, we propose a deep autoencoder-based clustering model for automatic detection of vessel movement anomalies, which assists monitoring operators in selecting vessels for further investigation. We first generate gridded trajectory images by mapping the raw vessel trajectories into a two-dimensional matrix. Based on the gridded image input, we test the proposed model along with other deep autoencoder-based models on abnormal trajectory data generated from normal trajectories through rotation and speed variation. We show that the proposed model improves detection accuracy on the generated abnormal trajectories compared to the other models.
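The gridded-trajectory-image step can be sketched as rasterizing a sequence of (latitude, longitude) points into a fixed-size occupancy matrix; the coordinates and grid size below are hypothetical, chosen only for illustration.

```python
def grid_trajectory(points, lat_range, lon_range, size=8):
    """Rasterize a vessel trajectory (lat, lon points) into a size x size
    occupancy grid, the image-like input fed to the autoencoder."""
    (lat0, lat1), (lon0, lon1) = lat_range, lon_range
    grid = [[0] * size for _ in range(size)]
    for lat, lon in points:
        # Scale each coordinate into a cell index, clamped to the grid.
        r = min(size - 1, int((lat - lat0) / (lat1 - lat0) * size))
        c = min(size - 1, int((lon - lon0) / (lon1 - lon0) * size))
        grid[r][c] = 1
    return grid

# Hypothetical short trajectory crossing the covered area diagonally.
traj = [(34.0, 128.0), (34.5, 128.5), (35.0, 129.0)]
grid = grid_trajectory(traj, lat_range=(34.0, 35.0), lon_range=(128.0, 129.0))
```

Because every trajectory becomes a fixed-size binary matrix, trajectories of different lengths and sampling rates can share one autoencoder input layer, and rotated or speed-altered anomalies produce visibly different occupancy patterns.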
Over the past 40 years, Korea's defense industry has settled into a low-efficiency industrial structure as the government has directly controlled prices, quantities, and costs. By implementing the Defense Industry Building Act in 2021, the government is creating a healthy ecosystem for the defense industry and strengthening its global competitiveness. In this study, based on KPC's Productivity Management System (PMS), a diagnostic model for defense companies implemented since 2013, on-site diagnoses were performed for 4 to 28 days depending on company size, and data were collected from the results. The causal effect of innovation capability on productivity performance was analyzed through structural equation model path analysis. The results suggest which core processes defense materials suppliers should focus on innovating in order to strengthen and improve their innovation capabilities.
Recently, with the 4th industrial revolution, the demand for more precise, demand-oriented, customized spatial information has been increasing. In particular, the use of 3D spatial information and of digital twins based on spatial information, as well as research on solving urban social problems with such information, is continuously being conducted. Globally, non-face-to-face services are increasing due to COVID-19, and national policy is also rapidly progressing toward the digital transformation, digitization, and virtualization of the Korean version of the New Deal, which makes 3D spatial information an important supporting factor. In this study, among the physical objects for cities defined by international organizations such as ISO, OGC, and ITU, the target of the 3D object model was limited to buildings. Based on CityGML 2.0, data collected using a drone, which is suitable for building a 3D model of a small area, were selected to be updated through the road name address and building ledger, the related administrative information, and LoD2.5 data were constructed. The study thereby suggests an object update method for 3D buildings in urban spatial data.
As the Deepfake phenomenon spreads worldwide, mainly through videos on web platforms, it is urgent to address the issue in time. Recently, researchers have extensively discussed Deepfake video datasets. However, it has been pointed out that the existing Deepfake datasets do not properly reflect the potential threat and realism due to various limitations. Although there is a need for research that establishes an agreed-upon concept of a high-quality dataset or suggests evaluation criteria, only a handful of studies have examined this to date. Therefore, this study focused on developing evaluation criteria for Deepfake video datasets. The fitness of a Deepfake dataset was defined, and evaluation criteria were derived through a review of previous studies. AHP structuring and analysis were performed to refine the criteria. The results showed that Facial Expression, Validation, and Data Characteristics are important determinants of data quality. This is interpreted as reflecting the importance of minimizing defects and of presenting results based on scientific methods when evaluating quality. This study has implications in that it suggests the fitness and evaluation criteria for Deepfake datasets. Since the evaluation criteria presented in this study were derived from the items considered in previous studies, all of them are expected to be effective for quality improvement. They are also expected to be used as criteria for selecting an appropriate Deepfake dataset or as a reference for designing a Deepfake data benchmark. This study could not apply the presented evaluation criteria to existing Deepfake datasets. In future research, the proposed criteria will be applied to existing datasets to evaluate the strengths and weaknesses of each and to consider the implications for Deepfake research.