
Journal of Society of Korea Industrial and Systems Engineering (한국산업경영시스템학회지), KCI-indexed


Vol. 39 No. 1 (March 2016), 19 articles

1.
Patent management activities are considered to play a key role for technology-based firms in today's knowledge-based economy, because intellectual property, including patents, can act as a system for continuous profit generation by protecting a firm's products, processes, and services. In Korea, the healthcare industry is regarded as one of the promising next-generation industries. Despite this promise, Korean healthcare product manufacturers face turbulent business changes, such as market opening. Although there are various industrial studies on the effect of patent management activities on firm outcomes, previous studies have hardly paid attention to Korean healthcare product manufacturing firms. For this reason, this study identifies the effect of patent management activities, such as patenting activeness, technical excellence, and cooperation degree, on firm outcomes, including financial profitability and firm growth, for Korean healthcare product manufacturers. We located 86 Korean healthcare manufacturing firms through KORCHAMBIZ and DART, and collected data on their patenting activities and outcomes between 2001 and 2013. By applying factor analysis and regression analysis, our empirical study found that firms' patenting activeness has a significant positive relationship with financial profitability, and that patenting activeness and technical excellence have significant positive relationships with financial growth. Our study is an initial attempt to identify the effect of patent management activities on firm outcomes within the Korean healthcare product manufacturing industry, and its results can serve as a basis for formulating national policies for the Korean healthcare product industry.
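
A minimal sketch of the factor-analysis-plus-regression design described above, assuming pandas, statsmodels, and scikit-learn; the indicator columns and the synthetic values are illustrative stand-ins for the KORCHAMBIZ/DART firm data used in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 86  # number of firms in the study
df = pd.DataFrame({
    "patent_count":      rng.poisson(12, n),    # patenting activeness proxy
    "claims_per_patent": rng.normal(10, 2, n),  # technical excellence proxy
    "citations":         rng.poisson(5, n),
    "co_applications":   rng.poisson(2, n),     # cooperation degree proxy
})
df["roa"] = 0.02 * df["patent_count"] + rng.normal(0, 0.5, n)  # outcome

# Reduce the patent indicators to latent factors (labels follow the
# abstract's "patenting activeness" and "technical excellence").
fa = FactorAnalysis(n_components=2, random_state=0)
factors = pd.DataFrame(fa.fit_transform(df.iloc[:, :4]),
                       columns=["activeness", "excellence"])

# Regress a profitability outcome (here ROA) on the extracted factors.
fit = sm.OLS(df["roa"], sm.add_constant(factors)).fit()
print(fit.params.round(3))
```
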
2.
Schedule risk in the engineering and facility construction industry is increasingly considered an important management factor, because schedule or deadline risks may significantly affect project cost. In particular, project-based operating companies attempt to find the best estimate of project completion time for use in their proposals, and therefore are usually keenly interested in accurate estimation of project duration. In general, project schedule risk is managed by modeling the project schedule with PERT/CPM techniques and then performing risk assessment with simulation, such as the Monte Carlo method. However, these approaches require accumulated execution data, which are not usually available in project-based operating companies, and they cannot reflect the various schedule constraints usually encountered during project execution, so project managers have difficulty preparing for project risks before they occur. As these constraints may affect time and cost, which serve as the crucial evaluation factors for the quality of the project result, they must be identified and described before they occur. This paper proposes a Bayesian-net-based methodology for estimating project schedule risk by identifying the project risks, and their response plans, that may occur in a storage tank engineering and construction project environment. First, we translate the schedule network, together with the project risks and their response plans, into a Bayesian net. Second, we analyze the integrated Bayesian net and suggest an estimate of project schedule risk with a simulation approach. Finally, we apply our approach to a storage tank construction project to validate its feasibility.
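
A minimal sketch of the Bayesian-net step, assuming the pgmpy library; the nodes and probabilities are invented for illustration and are not the paper's storage-tank project data.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Risk -> Delay: a project risk raises the chance an activity slips.
net = BayesianNetwork([("Risk", "Delay")])
cpd_risk = TabularCPD("Risk", 2, [[0.7], [0.3]])   # P(no risk) = 0.7
cpd_delay = TabularCPD("Delay", 2,
                       [[0.9, 0.4],   # P(on time | no risk, risk)
                        [0.1, 0.6]],  # P(delayed | no risk, risk)
                       evidence=["Risk"], evidence_card=[2])
net.add_cpds(cpd_risk, cpd_delay)

# Probability the activity is delayed, marginalizing over the risk.
print(VariableElimination(net).query(["Delay"]))
```
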
3.
Many organizations have transformed their business in order to survive and compete in the future. They generate projects aligned with their vision, strategies, and objectives, and strive to complete them successfully, because project success leads to business success. All projects must be completed within the triple constraints of scope, time, and cost. Project cost performance is a key factor in achieving project goals and, among the various cost drivers, is most closely related to risks. Projects require a cost estimation method that allows them to be completed within budget and on time. An exact budget cannot be estimated because of uncertainties and risks, so additional funds should be provided beyond the base budget: a contingency reserve for identified risks and a management reserve for unidentified risks. While research on the contingency reserve for identified risks, included in the project budget baseline, has been presented, research on the management reserve for unidentified risks, included in the total project budget, is still scarce. The lack of research on estimation methods and on the role of the management reserve has left project managers with little confidence in estimating project budgets accurately on a reasonable basis. This study proposes a practical model for estimating budgets that include contingency and management reserves, not only for project cost management but also for keeping an organization's total funds balanced to maximize return on investment in project portfolio management. The advantages of the proposed model are demonstrated by its application to construction projects in Korea, and the processes for applying this model for verification are also provided.
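
A minimal sketch of the reserve arithmetic the abstract builds on, in the usual PMBOK style rather than the paper's own model; the risk register and the flat 5% management-reserve rate are illustrative assumptions.

```python
identified_risks = [  # (probability, cost impact) pairs
    (0.3, 50_000),
    (0.1, 200_000),
]

base_budget = 1_000_000
# Contingency reserve: expected monetary value of the identified risks.
contingency = sum(p * impact for p, impact in identified_risks)
cost_baseline = base_budget + contingency
# Management reserve for unidentified risks, here a flat 5% of baseline.
management_reserve = 0.05 * cost_baseline
total_budget = cost_baseline + management_reserve
print(f"baseline={cost_baseline:,.0f}, total={total_budget:,.0f}")
```
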
4.
It is critical to forecast the maximum daily and monthly demand for power with as little error as possible for our industry and national economy. In general, long-term forecasting of power demand has been studied from the consumer's perspective with econometric models, in the form of generalized linear models with predictors. Time series techniques are used for short-term forecasting without predictors, since predictors themselves must be forecast before the response variable, which inevitably introduces estimation error. In previous research, various approaches have been applied to short-term power demand forecasting: seasonal exponential smoothing, SARMA (Seasonal Auto-Regressive Moving Average) models reflecting the weekly pattern, neuro-fuzzy models, SVR (Support Vector Regression) models with predictors selected through machine learning, and K-means clustering. In this paper, SARMA and intervention models are fitted to forecast the maximum power load daily, weekly, and monthly, using empirical data from 2011 through 2013. Seasonal ARIMA(2,1,2)(1,1,1)_7 and ARIMA(0,1,1)(1,1,0)_12 models are fitted to the daily and monthly power demand, respectively, but the weekly power demand cannot be fitted by ARIMA because the series has a unit root. In our fitted intervention model, indicator variables for long holidays, summer, and winter are significant. The SARMA model, with a MAPE (Mean Absolute Percentage Error) of 2.45%, and the intervention model, with a MAPE of 2.44%, are more efficient than the current seasonal exponential smoothing, whose MAPE is about 4%. A dynamic regression model with humidity, temperature, and seasonal dummies as predictors was also applied to forecast the daily power demand, but it led to a higher MAPE of 3.5%, which also reflects the estimation error of the predictors.
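
A minimal sketch of the daily model reported above, fitted with statsmodels on a synthetic load series, since the 2011-2013 empirical data are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
idx = pd.date_range("2011-01-01", "2013-12-31", freq="D")
# Synthetic daily load: weekly cycle plus mild trend plus noise.
load = (60 + 0.005 * np.arange(len(idx))
        + 5 * np.sin(2 * np.pi * np.arange(len(idx)) / 7)
        + rng.normal(0, 1, len(idx)))
series = pd.Series(load, index=idx)

# ARIMA(2,1,2)(1,1,1)_7, the daily specification from the abstract.
fit = SARIMAX(series, order=(2, 1, 2),
              seasonal_order=(1, 1, 1, 7)).fit(disp=False)
forecast = fit.forecast(steps=7)          # one week ahead
mape = np.mean(np.abs(fit.resid / series)) * 100  # in-sample MAPE
print(forecast.round(1), f"MAPE ~ {mape:.2f}%")
```
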
5.
Recently, Live-Virtual-Constructive (L-V-C) integrated training systems have been proposed as a solution to problems such as limited training areas, increasing mission complexity, and rising oil prices. To integrate the individual training systems into one effectively, we should resolve the issue of pilot stress caused by the environmental differences between live and virtual simulation that can occur when the systems are connected. Although a previous study showed that environmental differences between actual and simulated flights produce psychological effects on pilots, it did not identify the causal factors behind those effects. The aim of this study is to examine which environmental factors cause pilots' psychological effects. We measured the biochemical stress hormone cortisol, using an enzyme-linked immunoassay (EIA), as the index of psychological effect. A total of 40 pilots participated in the experiment, which compared differences in pilots' cortisol responses among live simulation, virtual simulation, and virtual simulation with each of three environmental factors (gravity force, noise, and equipment) applied in turn. As a result, there were significant differences in cortisol level when the gravity force and equipment factors were applied to the virtual simulation, while there was no significant difference for the noise factor. The results of this study can serve as a basis for future research on how to build an L-V system with minimal linkage errors and how to design a virtual simulator that reduces differences in pilots' psychological effects.
6.
Quality has been a key issue for manufacturers. Many distinguished scholars have defined quality with profound insight. Korean firms struggle to make better products that fulfill requirements and satisfy customers. Korean industries adopted quality management from Japan in the early 1970s; statistical quality control, QCC (Quality Control Circle), and total quality management were subsequently introduced, and chief executive officers, managers, and field employees have been aware of the importance of quality since then. This quality movement forced workers to improve quality: they had to maintain product quality and compete with foreign products. Korean industries used to compete with foreign industries on price, but Korean firms now have to compete on quality as well as price. After ISO (International Organization for Standardization) was established, industries around the world started to implement standardized systems according to their needs. ISO 9000 has been continuously revised, and firms around the world began to register for ISO 9000 certification. Today quality competitiveness is an even deeper concern. KSA (Korean Standards Association) has operated QCAS (Quality Competitiveness Assessment System) since 1997. The recent status of QCAS has been reported, but the characteristics of QCAS results have not been analyzed. In this article we examine the QCAS results of 41 firms in 2014. QCAS consists of 13 subsections: strategy and management system, organization culture and development of human resources, information management, quality system, customer satisfaction, management achievement, TPM, logistics, product development and technology, PL, QCC, SQC/SPC, and reliability. We performed one-way ANOVA, using the total scores over the 13 subsections, to discover differences among levels of firm size, business type, and quality hall of fame. We also analyzed the scores of the 13 individual subsections to see whether any differences exist by firm size and business type. We interpret the results and implications of the analysis and finally draw conclusions.
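
A minimal sketch of the one-way ANOVA described above, using scipy on invented QCAS total scores grouped by firm size; the paper's 41-firm 2014 assessment data are not reproduced here.

```python
from scipy import stats

# Hypothetical QCAS total scores by firm-size group.
small = [620, 655, 640, 700, 610]
medium = [690, 720, 705, 680]
large = [750, 770, 735, 760, 745]

f_stat, p_value = stats.f_oneway(small, medium, large)
print(f"F={f_stat:.2f}, p={p_value:.4f}")  # group effect if p < 0.05
```
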
7.
As a recent issue in the information and communication industry, the Internet of Things has attracted attention for providing intelligent infrastructure services that connect and share data and information between the real and virtual worlds. As Internet of Things technologies develop, the variety of machines, telecommunication devices, and terminals is increasing tremendously. In this situation, connectivity and interoperability between Internet of Things components are important issues for building a hyper-connected society. To realize this society, it is important to establish and develop information and communication technology (ICT) standards among stakeholders. However, under limited budgets and human resources, it is essential to rank standardization work items so that standards are set efficiently. The purpose of this study is to provide a method for setting standardization strategies through group decision making. As a multi-criteria group decision making tool, the analytic hierarchy process (AHP) is adopted and applied to determine the priorities of work items. The proposed method first defines the decision making problem by objective, criteria, and alternatives, which produces a hierarchy of upper and lower criteria. Then, pairwise comparisons are performed by academic and public-sector experts with respect to the relative meaning and importance of the criteria. The individual expert surveys are collected and summarized to determine relative criteria importance measures. Furthermore, to deliver reliable importance measures, differences between the academic and public-sector expert groups are compared and tested using the non-parametric Mann-Whitney test. The results are illustrated as useful guidelines for practical group decision making in establishing standardization strategies.
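
A minimal sketch of AHP priority derivation for a single illustrative 3x3 pairwise-comparison matrix; the matrix is invented, not the paper's expert survey data.

```python
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.58 for n = 3).
ci = (eigvals.real[k] - 3) / (3 - 1)
print("weights:", w.round(3), "CR:", round(ci / 0.58, 3))
```
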
8.
We consider a satellite mission scheduling problem, an increasingly important problem in the satellite industry. The problem involves various considerations such as customer importance, due dates, limited energy and memory capacity, and the distances between mission locations. We also consider the objective of each satellite class, such as general-purpose, strategic-mission, and commercial satellites. If the objective is defined as maximizing the total score of performed missions, the problem can be modeled as a general knapsack problem, a well-known NP-hard problem. To solve this kind of problem, heuristics such as tabu search and genetic algorithms have been applied, and their performance is acceptable to some extent. To improve on previous research, we apply a particle swarm optimization (PSO) algorithm, one of the most promising recent optimization methods. Owing to the limitations of the current study in obtaining real information, and under several assumptions, we generated 200 satellite missions with the required information for each mission. Based on the generated information, we compared the results of our algorithm with those of CPLEX. The comparison shows that our approach yields nearly optimal results, with an error rate below 3%, and computation times small enough for real problems. The algorithm also scales well owing to the innate characteristics of PSO. We further applied it to mission scheduling problems for various satellite classes, and the results are reasonable enough to conclude that the proposed algorithm can work for satellite mission scheduling problems.
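
A minimal binary-PSO sketch for the 0/1 knapsack formulation described above; the mission scores, weights, capacity, and PSO parameters are illustrative stand-ins, not the paper's 200-mission instance.

```python
import numpy as np

rng = np.random.default_rng(1)
score = rng.integers(10, 100, size=20)   # mission scores
weight = rng.integers(1, 20, size=20)    # e.g., energy consumed
capacity = 80

def fitness(x):
    # Total score of selected missions; infeasible selections score 0.
    return score @ x if weight @ x <= capacity else 0

n_particles, n_iters = 30, 200
pos = (rng.random((n_particles, 20)) < 0.5).astype(int)
vel = rng.normal(0.0, 1.0, (n_particles, 20))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 20))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    # Sigmoid of velocity gives the probability of selecting each mission.
    pos = (rng.random((n_particles, 20)) < 1 / (1 + np.exp(-vel))).astype(int)
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best total score:", fitness(gbest))
```
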
9.
In this paper, we utilize a Gaussian process to predict power consumption in an air-conditioning system. Since the power consumption takes the form of a time series, and its prediction is very important for efficient energy management, it is worthwhile to investigate time-series models for this prediction. To this end, we apply a Gaussian process, which assigns a prior probability to every possible function, with higher probabilities given to functions that are more consistent with the empirical data. We also discuss how to estimate the hyper-parameters, i.e., the parameters of the covariance function of the Gaussian process model. We estimated the hyper-parameters with two different methods (marginal likelihood and leave-one-out cross-validation) and obtained a model that describes the data pertinently; the results are largely independent of the estimation method. We validated the predictions with error analyses of the mean relative error and the mean absolute error. The mean relative error analysis showed that the error amounts to about 3.4% of the predicted value, and the mean absolute error analysis confirmed that the error is within the standard deviation of the predicted value. We also adopted the non-parametric Wilcoxon signed-rank test to assess the fitness of the proposed model and found that the null hypothesis of uniformity could not be rejected at the 5% significance level. These results can be applied to more elaborate control of power consumption in air-conditioning systems.
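
A minimal sketch of Gaussian-process prediction, assuming scikit-learn and a synthetic consumption series in place of the paper's air-conditioning data; the RBF-plus-noise kernel's hyper-parameters are fitted by maximizing the marginal likelihood, one of the two methods the abstract compares.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 80)[:, None]                 # time index
y = np.sin(t).ravel() + 0.1 * rng.normal(size=80)   # synthetic load

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel).fit(t, y)

t_new = np.array([[10.5]])
mean, std = gp.predict(t_new, return_std=True)
print(f"prediction {mean[0]:.3f} +/- {std[0]:.3f}")
```
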
10.
This study focuses on the formation of input release lots in a semiconductor wafer fabrication facility. After the order-lot pegging process assigns lots in the fab to orders and calculates the required quantity of wafers of each product type to meet customers' orders, decisions on the formation of input release lots should be made so as to minimize the production costs of the release lots. Since the number of lots being processed in the wafer fab is directly related to its productivity, input lot formation is a crucial step for reducing production costs as well as improving the efficiency of the wafer fab. Input lot formation occurs before every shift begins. When the input quantities of wafers for the product types are given by the order-lot pegging results, the lots to be released into the wafer fab should be formed to satisfy the lot size requirements. The production cost of a homogeneous lot, containing a single product type, is less than that of a heterogeneous lot, which must be split into several lots according to product type after passing the branch point in the wafer fabrication process; production cost also increases as a lot becomes more heterogeneous. We developed a multi-dimensional dynamic programming algorithm for the input lot formation problem and showed how to apply it to solve an example problem instance optimally. For practical use, it is necessary to reduce the number of states at each stage of the DP algorithm. The proposed DP algorithm can also be applied together with lot release rules such as CONWIP and UNIFORM.
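
A much-simplified multi-dimensional DP sketch in the spirit of the problem above, under stated assumptions: two product types, a fixed lot size, and a unit penalty per extra product type mixed into a lot. The paper's actual cost structure and state-reduction scheme are richer.

```python
from functools import lru_cache

LOT_SIZE = 25
PENALTY = 1  # cost per additional product type mixed into a lot

@lru_cache(maxsize=None)
def min_cost(r1, r2):
    """Minimum mixing cost to release all remaining wafers (r1, r2)."""
    if r1 == 0 and r2 == 0:
        return 0
    size = min(LOT_SIZE, r1 + r2)  # the last lot may be smaller
    best = float("inf")
    for a1 in range(max(0, size - r2), min(r1, size) + 1):
        a2 = size - a1
        mix = PENALTY * ((a1 > 0) + (a2 > 0) - 1)
        best = min(best, mix + min_cost(r1 - a1, r2 - a2))
    return best

# 100 wafers of two types in lots of 25: one mixed lot is unavoidable.
print(min_cost(60, 40))  # prints 1
```
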
11.
Unlike the general operating policies applied to controllable queueing models, here two of the three well-known simple operating policies N, T, and D are applied alternately to single-server controllable queueing models, yielding the so-called alternating (NT), (ND), and (TD) policies. For example, under the alternating (ND) operating policy, one busy period is initiated by the simple N operating policy, the next busy period is initiated by the simple D operating policy, and the same sequence repeats continuously. Because of these newly designed operating policies, important system characteristics such as the expected busy and idle periods, the expected busy cycle, and the expected number of customers in the system must be redefined. That is, the expected busy and idle periods are redefined as the sums of the corresponding expected busy and idle periods initiated by each of the two simple operating policies in turn. The expected number of customers in the system is represented by the weighted, or pooled, average of the expected numbers of customers in the system when the two predetermined simple operating policies are applied in sequence repeatedly. In particular, the expected number of customers in the system can be used to derive the expected waiting time in the queue or in the system by applying the well-known Little's formula. Most of the system characteristics derived here play important roles in constructing the total cost function per unit time used to determine the optimal operating policy, once appropriate cost elements are defined for the desired queueing system.
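
A hedged sketch of the last two steps, not the paper's derivation: if L_N and L_D denote the expected numbers in system under the two constituent policies and E[C_N], E[C_D] their expected cycle lengths, one natural cycle-length-weighted pooling and the Little's-formula step are

\[
\bar{L} \;=\; \frac{E[C_N]\,L_N + E[C_D]\,L_D}{E[C_N] + E[C_D]},
\qquad
\bar{W} \;=\; \frac{\bar{L}}{\lambda},
\]

where lambda is the arrival rate; the cycle-length weighting is an assumption made here for illustration.
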
12.
This study is concerned with the process capability index for a single process. Previous process capability indices were developed for consistency with the nonconforming rate, accounting for the process target value and skewness. These indices calculate process capability from a single spot measured on each item, but a single datum per item reduces the item's representativeness. Beyond representativeness, there are many cases in which the uniformity of an item, such as the flatness of a panel, is critically important; in these cases, we have to measure several spots on each item. Moreover, the nonconformity judgment for an item is mainly based on the range, not on the standard deviation or on the shift from the specifications. To incorporate the uniformity concept into the process capability index, we should consider only the variation within an item, i.e., the within-subgroup variation. When the universe is composed of several subgroups, the overall sample variation combines the within-subgroup variation and the between-subgroup variation, so the range R, which reflects only the within-subgroup variation, is a much better measure than the sample standard deviation. In general, a subgroup contains several individual items; in our case, however, a subgroup is a single item, and R is the difference between the maximum and the minimum of the data measured on that item. Although our object is a single-process index, because of the subgroups its analytic structure resembles a system process capability index. In this paper we propose a new process capability index that considers both representativeness and uniformity.
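
A minimal sketch of a within-subgroup capability calculation in the spirit of the abstract, where each "subgroup" is one item measured at several spots and only the within-item range drives the estimate; the data are invented and the paper's proposed index itself is not reproduced here.

```python
import numpy as np

usl, lsl = 10.6, 9.4   # specification limits
d2 = 2.326             # unbiasing constant for subgroup size 5

items = np.array([     # 4 items x 5 measured spots per item
    [10.02,  9.98, 10.05,  9.97, 10.01],
    [10.10, 10.04, 10.08, 10.03, 10.06],
    [ 9.95,  9.90,  9.97,  9.93,  9.96],
    [10.00, 10.02,  9.99, 10.03, 10.01],
])

r_bar = np.mean(items.max(axis=1) - items.min(axis=1))
sigma_within = r_bar / d2   # within-item standard deviation estimate
cp = (usl - lsl) / (6 * sigma_within)
print(f"R-bar={r_bar:.3f}, Cp(within)={cp:.2f}")
```
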
13.
With the development of modern science and technology, weapon systems such as tanks, submarines, combat planes, and radar have also advanced dramatically. Among these weapon systems, the ballistic missile, one of the asymmetric forces, can be considered a very economical means of attacking the core facilities of another country in order to achieve strategic goals during war. Because of the current ballistic missile threat from North Korea, establishing a missile defense (MD) system has become a major national defense issue. This study focuses on optimizing the deployment of air defense artillery units for effective ballistic missile defense. To optimize the deployment, the study first examined defensibility, according to whether the trajectory coordinates of ballistic missiles fall within the limited defense range of the air defense artillery units; this range constraint originates from the characteristics of anti-ballistic missiles (ABMs) such as the PATRIOT. Second, the study proposes an optimization model, formulated as a total covering problem in binary integer programming, that deploys the fewest air defense artillery units needed to defend every core defense facility. Finally, numerical experiments were conducted to show how the suggested approach works. Assuming the current state of the Korean peninsula, the study arbitrarily set ballistic missile bases for North Korea and core defense facilities for South Korea, and the numerical experiments were executed using MATLAB R2010a from The MathWorks, Inc.
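
The paper's experiments ran in MATLAB; as a language-neutral illustration of the same covering formulation, here is a minimal sketch with PuLP on an invented coverage matrix.

```python
import pulp

# cover[i][j] = 1 if a battery at candidate site j can defend facility i.
cover = [
    [1, 0, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 1, 0, 0],
]
sites = range(4)
facilities = range(4)

prob = pulp.LpProblem("min_batteries", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{j}", cat="Binary") for j in sites]
prob += pulp.lpSum(x)                       # minimize number of units
for i in facilities:                        # every facility covered
    prob += pulp.lpSum(cover[i][j] * x[j] for j in sites) >= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("deploy at sites:", [j for j in sites if x[j].value() == 1])
```
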
14.
Ensemble classification combines individually trained classifiers to yield predictions more accurate than those of individual models, and ensemble techniques are very useful for improving the generalization ability of classifiers. The random subspace ensemble technique is a simple but effective method for constructing ensemble classifiers: each classifier in the ensemble is trained on a randomly drawn subset of the features. The instance selection technique selects critical instances while removing irrelevant and noisy instances from the original dataset. Both instance selection and random subspace methods are well known in the field of data mining and have proven very effective in many applications; however, few studies have focused on integrating them. Therefore, this study proposes a new hybrid ensemble model that integrates instance selection and random subspace techniques using genetic algorithms (GAs) to improve the performance of the random subspace ensemble model. GAs are used to select optimal (or near-optimal) instances, which are then used as input data for the random subspace ensemble model. The proposed model was applied to both Kaggle credit data and corporate credit data, and the results were compared with those of other models in terms of classification accuracy, level of diversity, and average classification rate of the base classifiers in the ensemble. The experimental results demonstrate that the proposed model outperforms the other models, including the single model, the instance selection model, and the original random subspace ensemble model.
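
A minimal sketch of the hybrid idea: a crude GA selects training instances, and the surviving instances feed a random-subspace ensemble (bagging over feature subsets). The data, GA settings, and fitness split are illustrative, not the paper's credit data or its exact GA.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask):
    if mask.sum() < 20:           # keep a minimum of instances
        return 0.0
    model = BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=10,
        max_features=0.5, bootstrap=False,    # feature-subspace ensemble
        bootstrap_features=True, random_state=0)
    return model.fit(X_tr[mask], y_tr[mask]).score(X_val, y_val)

# Crude GA: keep the best half, mutate copies of the survivors.
pop = rng.random((10, len(X_tr))) < 0.8
for _ in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[scores.argsort()[-5:]]
    children = parents[rng.integers(0, 5, size=5)].copy()
    children ^= rng.random(children.shape) < 0.02   # bit-flip mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected instances:", int(best.sum()), "accuracy:", fitness(best))
```
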
15.
Recently, production cycles in manufacturing processes have been getting shorter, and different types of products are produced on the same process line. In this case, a control chart based on the coefficient of variation is applicable to the process. The principle that random variables lie within three standard deviations of the mean, when the process data follow a normal distribution, underlies the control charts used to monitor processes on the manufacturing line, and it applies to the coefficient-of-variation control chart as well. However, the estimates x̄ and s used in the coefficient of variation are computed from all of the data, so the upper control limit, center line, and lower control limit are affected by abnormal values, and the chart may lose its ability to detect assignable causes. The purpose of this study is to present a control chart that is more robust than the coefficient-of-variation (CV) control chart for a normal process. For this research, the trimmed statistics x̄_α and s_α were used, and the resulting robust chart is named the Trim-CV control chart. The simulation results are summarized as follows. First, the P values, the probability of falling outside the control limits, of the Trim-CV control chart were larger than those of the CV control chart for the normal process. Second, the ARL values (average run length) of the Trim-CV control chart were smaller than those of the CV control chart. In particular, the performance difference between the two charts became clearer as the process change grew larger. Therefore, the Trim-CV control chart proposed in this paper would be a more efficient tool than the CV control chart in small-quantity batch production.
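
A minimal sketch of the trimmed coefficient of variation behind a Trim-CV chart: the extreme observations are trimmed before computing the mean and spread, so outliers do not distort the statistic. The trimming fraction and data are illustrative, not the paper's simulation design.

```python
import numpy as np

def trimmed_cv(sample, alpha=0.1):
    """CV from an alpha-trimmed mean and the matching trimmed spread."""
    x = np.sort(np.asarray(sample, dtype=float))
    k = int(alpha * len(x))
    core = x[k:len(x) - k] if k > 0 else x  # drop k smallest and largest
    return core.std(ddof=1) / core.mean()

subgroup = [9.8, 10.1, 10.0, 10.2, 9.9, 14.5]  # one outlier
print("plain CV:", round(np.std(subgroup, ddof=1) / np.mean(subgroup), 3))
print("trim CV :", round(trimmed_cv(subgroup, alpha=0.17), 3))
```
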
16.
In manufacturing, thousands of quality characteristics are measured each day, since process systems have been automated through the development of computers and the improvement of techniques, and the process is monitored in a database in real time. In particular, data from the design step of the process contribute to products that meet customer requirements, when useful information is extracted from the data and reflected in the product design. In this study, first, the characteristics and the variables affecting them in the design-step data were analyzed with a decision tree to find the relations between explanatory and target variables. Second, tolerances for the continuous variables that primarily influence the target variable were derived by applying the C4.5 decision tree algorithm. Finally, the target variable, loss, was calculated with a Taguchi loss function and analyzed. This paper compares the conventional method, in which the values of continuous explanatory variables are used intact without transformation to discrete values, with a new method in which each continuous explanatory variable is divided into three categories. As a result, the tolerance obtained from the new method was more effective in decreasing the target variable, loss, than the conventional method. In addition, tolerance levels were calculated for the major continuous explanatory variables. In further research, a systematic method using decision trees needs to be developed to categorize continuous variables under various loss-function scenarios.
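
A minimal sketch pairing a decision tree with Taguchi's quadratic loss L(y) = k(y - m)^2, as the abstract describes; the coefficient k, target m, and synthetic design-step data are illustrative, and scikit-learn's CART stands in for C4.5.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
X = rng.normal(0, 1, size=(200, 3))             # design-step variables
y = 10 + 2 * X[:, 0] - X[:, 1] + rng.normal(0, 0.5, 200)

m, k = 10.0, 3.0                  # target value and loss coefficient
loss = k * (y - m) ** 2           # Taguchi loss per item

# Tree splits expose variable ranges (tolerances) with low mean loss.
tree = DecisionTreeRegressor(max_depth=2).fit(X, loss)
print(export_text(tree, feature_names=["x1", "x2", "x3"]))
print("mean loss:", loss.mean().round(2))
```
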
17.
The purpose of this paper is to analyze the problems and the sources of defective products, and to draw up improvement plans, for a small plastic boat manufacturing process using TOC (Theory of Constraints) and statistical analysis. TOC is a methodology that optimizes the production process by finding the CCR (Capacity Constrained Resource) in the organization, or in the overall production process, through focused improvement activities. In this paper, we found and reformed constraints and bottlenecks in the target company's plastic boat manufacturing process, lowering the defect ratio and production cost by applying DBR (Drum-Buffer-Rope) scheduling, and we set threshold values for the critical process variables using statistical analysis. The results can be summarized as follows. First, CCRs in inventory control, material mix, and oven setting were found, and solutions were suggested by applying the DBR method. Second, the logical thinking process was utilized to find core conflict factors and derive solutions. Third, to make the solution plan specific, experimental data were statistically analyzed. Data were collected from the daily journal covering 96 products, with details such as temperature, humidity, heating duration and temperature, rotation speed, cooling time, and removal temperature. Basic statistics and a logistic regression analysis were conducted with defect occurrence as the dependent variable. Finally, critical values for the major processes were proposed based on the analysis. This paper has practical importance in its contribution to the quality level of the target company through a theoretical approach, TOC, and statistical analysis. However, the limited amount of data might weaken the significance of the analysis, so an interesting direction for further research is to specify the significant manufacturing conditions across different products and processes.
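
A minimal sketch of the logistic-regression step, assuming statsmodels and hypothetical process variables in place of the paper's 96-product journal data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 96
df = pd.DataFrame({
    "oven_temp": rng.normal(250, 10, n),  # heating temperature
    "rotation":  rng.normal(8, 1, n),     # rotation speed
})
# Synthetic defect indicator loosely tied to oven temperature.
logit = 0.08 * (df["oven_temp"] - 250) - 0.5
df["defect"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["oven_temp", "rotation"]])
fit = sm.Logit(df["defect"], X).fit(disp=False)
print(fit.params.round(3))  # thresholds can be read off the fitted odds
```
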
18.
Responding quickly and accurately to rapid changes in the managerial environment is very important for the competitiveness and sustainable management of enterprises. For example, a large-scale investment in a poor alternative, made by misreading the managerial environment, requires huge cost and effort to modify and improve. In firm management, product quality and productivity are influenced by changes in endogenous factors arising in the manufacturing process and in exogenous factors such as the market. These changes involve not only the 4Ms (man, machine, material, method) but also the market, competitors, and technologies in the process of commercialization. First, such disturbances make process dispersion large and irregular; a Shewhart chart can check whether the monitored process is in or out of control, and management stabilizes inputs by monitoring changes in the 4Ms, comparing them with the standards, and taking measures against any abnormality. Second, a TRM (technology road map) anticipates product deployment and technological trends by predicting technologies in the competitive environment, such as the market, and suggests future business directions. The TRM must therefore be modified and improved according to DR (design review) stages and changes in mass production, such as input material changes, so its role in input stabilization for reducing cost and man-hours is important. This study proposes classifying environmental changes into endogenous and exogenous factors in the production process and then stabilizing quality and productivity efficiently by connecting the TRM with input stabilization, and it shows that, for the display industry, connecting the TRM with input stabilization is more effective than using the TRM separately.
19.
Semiconductor manufacturing has suffered from the complex process behavior of technology-oriented control in the production line. While the technological processes are in charge of the quality and yield of the product, operational management is also critical for the productivity of the manufacturing line. The fabrication line is considered the most complex part of semiconductor manufacturing because of the variety of equipment, re-entrant process routing, and diverse product devices. The efficiency and productivity of the fabrication line have a significant impact on subsequent processes such as the probe line, the assembly line, and the final test line. In managing a re-entrant process such as semiconductor fabrication, it is important to keep the fabrication line balanced. Performance measures for the fabrication line include throughput, cycle time, inventory, and shortage; throughput and cycle time conflict, and it is very difficult to achieve two conflicting goals simultaneously in the manufacturing line. Equipment capacity is an important factor in production planning and scheduling, and production planning that considers capacity makes the scheduling more realistic. In this paper, an input rule and a scheduling rule that achieve balanced operation of the semiconductor fabrication line based on equipment capacity and workload are proposed and evaluated. A new backward projection and a scheduling rule that consider facility capacity are suggested. Scheduling wafers onto appropriate facilities is controlled by available capacity, which is determined by the workload required to meet the production target.
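
The abstract does not spell out the backward projection, so the following is a loose sketch under stated assumptions: given an output due date and per-step cycle times (in days), the projection walks the route backward to derive each step's latest start, with daily quantities capped by step capacity. Steps, times, and capacities are illustrative.

```python
steps = [  # (name, cycle_time_days, capacity_wafers_per_day)
    ("diffusion", 3.0, 500),
    ("litho", 2.0, 400),
    ("etch", 1.5, 450),
    ("final test", 1.0, 600),
]
target_out_day = 30.0   # wafers must exit the line by this day
target_qty = 350        # daily output target

due = target_out_day
for name, cycle_time, cap in reversed(steps):  # project backward
    start = due - cycle_time
    feasible = min(target_qty, cap)            # capacity-capped quota
    print(f"{name:>10}: start by day {start:4.1f}, "
          f"schedule {feasible} wafers/day (cap {cap})")
    due = start
```
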