This study selected two labor-intensive processes performed in harsh environments among domestic food production processes and analyzed the effectiveness of their improvement using three-dimensional (3D) simulation. The selected processes were the “frozen storage source transfer and dismantling process” (Case 1) and the “heavily loaded box transfer process” (Case 2). The layout, process sequence, man-hours, and output of each process were measured during a visit to an actual food manufacturing factory. Based on the measured data, a 3D simulation model was built and visually analyzed to evaluate the operational processes. The number of workers, work rate, and throughput were used as indicators for comparison and verification before and after the improvement. The throughput of Case 1 and Case 2 increased by 44.8% and 69.7%, respectively, compared with the pre-improvement processes, while the utilization rate remained high despite decreasing, confirming that the selected processes were indeed high-fatigue, high-risk processes for the workers. The results show that 3D simulation can provide a visual comparison for assessing whether a process improvement has been accurately designed and implemented, and that preliminary verification of a process improvement is achievable.
In this paper, we propose a new guidance-line extraction algorithm for an autonomous agricultural robot that uses a vision camera in a paddy field. Finding the central point or central area of a rice row is a key step in guidance-line extraction. To improve the accuracy of the guidance line, we exploit the fact that the rice leaves converge toward the central area of the rice row and use the crop's central-region data. The guidance line is extracted from the intersection points of extended virtual lines using a modified robust regression. The extended virtual lines are obtained by extending each straight segment fitted, via the Hough transform, to the edges of the rice plants in the image. The accuracy of the proposed algorithm was verified through experiments in a real wet paddy field.
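As a rough illustration of this pipeline, the Python sketch below fits straight segments with OpenCV's Hough transform, intersects their extensions, and fits a line through the intersection cloud with a simple RANSAC-style regression. The function name, thresholds, and the RANSAC step are illustrative assumptions standing in for the paper's modified robust regression, not the authors' implementation.

```python
import cv2
import numpy as np

def extract_guidance_line(image_bgr):
    """Sketch of a Hough-based guidance-line estimate (illustrative only)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                               minLineLength=30, maxLineGap=10)
    if segments is None:
        return None

    # Represent each segment as an infinite (extended virtual) line
    # in homogeneous coordinates.
    lines = [np.cross([x1, y1, 1.0], [x2, y2, 1.0])
             for x1, y1, x2, y2 in segments[:, 0]]

    # Pairwise intersections of the extended virtual lines.
    points = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            p = np.cross(lines[i], lines[j])
            if abs(p[2]) > 1e-6:                 # skip (near-)parallel pairs
                points.append(p[:2] / p[2])
    points = np.asarray(points)
    if len(points) < 2:
        return None

    # RANSAC-style robust fit of the line x = a*y + b through the intersections,
    # used here as a stand-in for the paper's modified robust regression.
    best_inliers, best_model = 0, None
    rng = np.random.default_rng(0)
    for _ in range(200):
        p1, p2 = points[rng.choice(len(points), 2, replace=False)]
        if abs(p2[1] - p1[1]) < 1e-6:
            continue
        a = (p2[0] - p1[0]) / (p2[1] - p1[1])
        b = p1[0] - a * p1[1]
        residuals = np.abs(points[:, 0] - (a * points[:, 1] + b))
        inliers = np.sum(residuals < 5.0)
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (a, b)
    return best_model                            # guidance line x = a*y + b
```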
The Global Positioning System (GPS) is widely used to measure the position of a vehicle. However, GPS accuracy can be severely affected by the surrounding environment. To deal with this problem, GPS and odometry data can be combined using an extended Kalman filter. For stable navigation of an outdoor mobile robot using GPS, this paper proposes two methods to evaluate the reliability of the GPS data. The first method calculates the standard deviation of the GPS data and uses it to account for the uncertainty of the GPS data. The second method matches the GPS data to a traversability map obtained by classifying outdoor terrain data; this matching determines whether the GPS data should be used. The experimental results show that the proposed methods can enhance the performance of GPS-based outdoor localization.
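A minimal sketch of the two reliability checks described above is shown next; the function names, the covariance-inflation rule, and the map representation are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def gps_measurement_noise(recent_fixes, base_var=1.0):
    """Inflate the EKF measurement covariance with the sample standard
    deviation of the last few GPS fixes, so noisy GPS is trusted less."""
    fixes = np.asarray(recent_fixes)             # shape (N, 2): x, y in meters
    std = fixes.std(axis=0)
    return np.diag(base_var + std ** 2)          # measurement noise R for the EKF

def gps_is_plausible(fix_xy, trav_map, resolution, origin_xy):
    """Accept a GPS fix only if it falls on a traversable cell of the
    terrain-classification (traversability) map."""
    col = int((fix_xy[0] - origin_xy[0]) / resolution)
    row = int((fix_xy[1] - origin_xy[1]) / resolution)
    if not (0 <= row < trav_map.shape[0] and 0 <= col < trav_map.shape[1]):
        return False
    return bool(trav_map[row, col])              # True = traversable cell
```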
This paper proposes a low-complexity indoor localization method for a mobile robot in a dynamic environment that fuses landmark image information from an ordinary camera with distance information from the sensor nodes of an indoor sensor network. The sensor network provides an effective way for the mobile robot to adapt to environmental changes and guides it across a geographical network area. To enhance localization performance, we use an ordinary CCD camera and artificial landmarks devised for self-localization. Experimental results show that the proposed method achieves robust and accurate real-time localization of the mobile robot.
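The distance information from the sensor nodes can, for example, be turned into a position estimate by least-squares trilateration, which a camera landmark observation can then refine; the sketch below shows only the trilateration step under that assumption and does not reproduce the paper's fusion scheme.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Estimate (x, y) from distances to >= 3 sensor nodes at known positions."""
    anchors = np.asarray(anchors, dtype=float)   # shape (N, 2)
    ranges = np.asarray(ranges, dtype=float)     # shape (N,)
    # Linearize by subtracting the first node's circle equation from the others.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos                                   # rough (x, y) before refinement
```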
Low-cost sensors have been widely used for mobile robot navigation in recent years. However, navigation performance based on low-cost sensors is not good enough for practical use. Among the many navigation techniques, building an accurate map is a fundamental task for service robots, and this research investigated mapping with low-cost IR sensors. The robot's orientation uncertainty was incorporated into the mapping by modifying the Bayesian update formula. The data association scheme was then investigated to improve the quality of the built map when the robot's pose uncertainty was large. The six low-cost IR sensors mounted on the robot could not provide data rich enough to align the range data by scan matching, so a new sample-based method was proposed for data association. Real-world experiments indicated that the proposed mapping method was able to generate a map useful for navigation.
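One way to picture the idea of folding orientation uncertainty into the map update is the log-odds sketch below, which spreads a single IR reading over orientation hypotheses sampled from the heading uncertainty; the sampling scheme and weights are illustrative assumptions, not the paper's exact modified Bayesian formula.

```python
import numpy as np

def update_with_orientation_uncertainty(log_odds, pose, sigma_theta, sensor_angle,
                                        range_m, resolution, l_occ=0.85,
                                        n_samples=20):
    """Spread one IR range reading over sampled heading hypotheses."""
    x, y, theta = pose
    rng = np.random.default_rng()
    w = l_occ / n_samples                        # share the evidence among samples
    for th in rng.normal(theta, sigma_theta, n_samples):
        a = th + sensor_angle                    # world-frame beam direction
        cx = int((x + range_m * np.cos(a)) / resolution)
        cy = int((y + range_m * np.sin(a)) / resolution)
        if 0 <= cy < log_odds.shape[0] and 0 <= cx < log_odds.shape[1]:
            log_odds[cy, cx] += w                # weaker update per hypothesis
    return log_odds
```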
UPnP is a middleware architecture that supports dynamic distributed computing environments and has many features that make it a good candidate middleware for robot system integration. Distributed robot software components often need to exchange bulky binary data; however, because UPnP uses the SOAP protocol for message transmission, it is inefficient at sending such data. To overcome this weakness, this paper proposes UPnP-MTOM, an implementation of MTOM (Message Transmission Optimization Mechanism) over UPnP, as an efficient way to transmit bulky binary data with UPnP messages. This paper presents our implementation method and the experimental results of the UPnP-MTOM implementation.
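The toy comparison below illustrates why inlining binary data as base64 in a SOAP body is wasteful compared with referencing the raw bytes as a separate MIME part, which is the general idea behind MTOM/XOP; the message strings and sizes are fabricated for illustration and are not the paper's UPnP-MTOM implementation.

```python
import base64

payload = bytes(range(256)) * 4096               # ~1 MiB of example binary data

# Plain SOAP: binary is base64-encoded inside the XML body (~33% size overhead).
inline_soap = (b"<Envelope><Body><image>"
               + base64.b64encode(payload)
               + b"</image></Body></Envelope>")

# MTOM-style: the body only references the binary, which travels as a raw MIME part.
mtom_like = (b"<Envelope><Body><image>"
             b"<xop:Include href='cid:part0'/>"
             b"</image></Body></Envelope>"
             b"\r\n--MIME-boundary\r\n" + payload)

print(len(inline_soap), len(mtom_like))          # inline message is markedly larger
```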
It is essential to estimate the positions of multiple robots in order to perform cooperative tasks in a common workspace. Accordingly, we propose a new cooperative localization approach for multiple robots that exploits the correlation among GPS errors in a common workspace. Assuming that the GPS errors of individual robots are strongly correlated when the robots are close to one another, it is confirmed that the proposed method improves localization accuracy. In addition, we define two operational parameters for applying the proposed method in a multi-robot system. With these two parameters, we present a practical solution to the position error accumulated during long-distance travel.
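A minimal sketch of how correlated GPS errors can be exploited is given below: if robot A's pose is well known, the difference between A's GPS fix and its pose estimates the error shared with a nearby robot B, which can then be partially removed from B's fix. The function and the scalar correlation weight are illustrative assumptions, not the paper's operational parameters.

```python
import numpy as np

def correct_with_shared_error(gps_b, gps_a, pose_a, correlation):
    """Correct robot B's GPS fix using the GPS error observed at robot A.

    correlation in [0, 1]: high when the two robots are close together,
    so their GPS errors are assumed to be strongly correlated.
    """
    shared_error = np.asarray(gps_a) - np.asarray(pose_a)   # error seen at A
    return np.asarray(gps_b) - correlation * shared_error   # corrected fix for B
```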