Search Results

        Search results: 5

        1.
        2021.09, KCI-indexed, free for subscribing institutions, paid for individual members
        Non-verbal communication is important in human interaction. It provides a layer of information that complements the message being transmitted. This type of information is not limited to human speakers. In human–robot communication, increasing the animacy of the robotic agent by using non-verbal cues can aid the expression of abstract concepts such as emotions. Considering the physical limitations of artificial agents, robots can use light and movement to express equivalent emotional feedback. This study analyzes the effects of the LED and motion animations of a spherical robot on the emotion being expressed by the robot. A within-subjects experiment was conducted at the University of Tsukuba in which participants were asked to rate 28 video samples of a robot interacting with a person. The robot displayed different motions with and without light animations. The results indicated that adding LED animations changes the emotional impression of the robot along the valence, arousal, and dominance dimensions. Furthermore, people associated various situations with the robot's behavior. These stimuli can be used to modulate the intensity of the emotion being expressed and to enhance the interaction experience. This paper supports the design of more affective robots in the future using simple feedback.
        4,000 KRW
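        The study in entry 1 rates each robot behavior on valence, arousal, and dominance (VAD). Below is a minimal sketch of how such stimuli and ratings could be represented and how the shift caused by adding an LED animation could be computed; the rating values and helper functions are hypothetical illustrations, not data or code from the paper.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass(frozen=True)
class VAD:
    """Emotional impression on valence, arousal, and dominance scales (e.g., 1-9 ratings)."""
    valence: float
    arousal: float
    dominance: float

def mean_rating(ratings: list[VAD]) -> VAD:
    """Average the participants' ratings for one stimulus."""
    return VAD(
        mean(r.valence for r in ratings),
        mean(r.arousal for r in ratings),
        mean(r.dominance for r in ratings),
    )

def led_effect(motion_only: VAD, with_led: VAD) -> VAD:
    """Shift in emotional impression caused by adding the LED animation to a motion."""
    return VAD(
        with_led.valence - motion_only.valence,
        with_led.arousal - motion_only.arousal,
        with_led.dominance - motion_only.dominance,
    )

# Hypothetical ratings for one motion shown with and without an LED animation.
bounce_only = mean_rating([VAD(6.5, 5.0, 5.5), VAD(7.0, 5.5, 5.0)])
bounce_with_led = mean_rating([VAD(5.0, 7.0, 6.5), VAD(5.5, 7.5, 6.0)])
print(led_effect(bounce_only, bounce_with_led))
```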
        2.
        2010.02, KCI-indexed, service discontinued (access restricted)
        This paper presents a sound-based emotion estimation method and a growing HRI (human-robot interaction) system for a Mon-E robot. The emotion estimation method uses musical elements based on the laws of harmony and counterpoint. The emotion is estimated from sound using information about musical elements, including chord, tempo, volume, harmonics, and compass. The estimated emotions cover 12 standard emotions: Ekman's six emotions (anger, disgust, fear, happiness, sadness, surprise) and their six opposites (calmness, love, confidence, unhappiness, gladness, comfortableness). The growing HRI system analyzes sensing information, the estimated emotion, and the service log of an edutainment robot, and commands the robot's behavior accordingly. The growing HRI system consists of an emotion client and an emotion server. The emotion client estimates the emotion from sound; it not only transmits the estimated emotion and sensing information to the emotion server but also delivers the responses coming from the emotion server to the robot's main program. The emotion server not only updates the HRI rule table using the information transmitted from the emotion client but also sends the HRI response back to the emotion client. The proposed system was applied to a Mon-E robot and can provide a friendly HRI service to users.
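        A rough sketch of the emotion client/server flow described in entry 2 is shown below, with an in-process call standing in for the actual communication between the two components; the feature thresholds, rule-table entries, and behavior names are assumptions for illustration, not the authors' implementation.

```python
# The 12 emotion labels listed in the abstract.
EMOTIONS = {"anger", "disgust", "fear", "happiness", "sadness", "surprise",
            "calmness", "love", "confidence", "unhappiness", "gladness", "comfortableness"}

class EmotionServer:
    """Keeps an HRI rule table mapping estimated emotions to robot behaviors."""
    def __init__(self):
        self.rule_table = {"happiness": "wag_tail", "sadness": "lower_head"}  # hypothetical rules

    def handle(self, emotion: str, sensing: dict) -> str:
        # Update the rule table from the latest interaction, then answer with a behavior.
        if sensing.get("user_nearby") and emotion == "calmness":
            self.rule_table["calmness"] = "approach_slowly"
        return self.rule_table.get(emotion, "idle")

class EmotionClient:
    """Estimates an emotion from musical features of sound and forwards it to the server."""
    def __init__(self, server: EmotionServer):
        self.server = server

    def estimate_emotion(self, features: dict) -> str:
        # Toy stand-in for the chord/tempo/volume/harmony/compass rules described in the paper.
        if features["tempo"] > 120 and features["volume"] > 0.7:
            emotion = "happiness" if features["chord"] == "major" else "anger"
        else:
            emotion = "calmness" if features["chord"] == "major" else "sadness"
        assert emotion in EMOTIONS
        return emotion

    def tick(self, features: dict, sensing: dict) -> str:
        emotion = self.estimate_emotion(features)
        behavior = self.server.handle(emotion, sensing)  # a network call in the real system
        return behavior                                  # delivered to the robot's main program

client = EmotionClient(EmotionServer())
print(client.tick({"tempo": 140, "volume": 0.8, "chord": "major"}, {"user_nearby": True}))
```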
        3.
        2008.05, KCI-indexed, service discontinued (access restricted)
        This paper presents the concept for the development of a pet-type robot with an emotion engine. The pet-type robot, named KOBIE (KOala roBot with Intelligent Emotion), is able to interact with a person through touch. KOBIE is equipped with tactile sensors on its body and interacts with a person by recognizing his/her touching behaviors such as “Stroke”, “Tickle”, and “Hit”. We covered KOBIE with synthetic fur fabric so that users can also feel affection for the robot. KOBIE is also able to express an emotional status that varies according to the circumstances in which it is placed. The emotion engine of KOBIE's emotion expression system generates an emotional status in an emotion vector space that is associated with predefined needs and mood models. In order to examine the feasibility of our emotion expression system, we verified that touching behaviors change the emotional status in this emotion vector space. In particular, we examined the reactions of children who interacted with three kinds of pet-type robots (KOBIE, PARO, and AIBO) for roughly 10 minutes to investigate the children's preferences for pet-type robots.
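        Below is a minimal sketch of how touch behaviors could move a point in an emotion vector space biased by a slowly changing mood, in the spirit of the KOBIE emotion engine; the axes, update weights, and touch effects are assumptions, not the published needs and mood models.

```python
import numpy as np

# Hypothetical 2-D emotion vector space: (pleasantness, arousal).
TOUCH_EFFECTS = {
    "Stroke": np.array([0.4, -0.1]),   # pleasant and calming
    "Tickle": np.array([0.2, 0.5]),    # pleasant and arousing
    "Hit":    np.array([-0.6, 0.4]),   # unpleasant and arousing
}

class EmotionEngine:
    def __init__(self):
        self.mood = np.zeros(2)      # long-term bias (stands in for needs/mood models)
        self.emotion = np.zeros(2)   # current emotional status

    def on_touch(self, behavior: str) -> np.ndarray:
        effect = TOUCH_EFFECTS[behavior]
        self.mood = 0.95 * self.mood + 0.05 * effect   # mood drifts slowly with experience
        self.emotion = np.clip(0.7 * self.emotion + effect + 0.3 * self.mood, -1.0, 1.0)
        return self.emotion

engine = EmotionEngine()
for touch in ("Stroke", "Stroke", "Hit"):
    print(touch, engine.on_touch(touch))
```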
        4.
        2007.09, KCI-indexed, service discontinued (access restricted)
        Humanoid and android robots are emerging as the trend shifts from industrial robots to personal robots, so human-robot interaction will increase. The ultimate objective of humanoids and androids is a robot that is like a human. In this respect, implementing robotic facial expressions is necessary for making a human-like robot. This paper proposes a dynamic emotion model for a mascot-type robot to display human-like and more recognizable facial expressions.
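        One simple reading of a "dynamic emotion model" is an emotion state that is excited by stimuli and decays back toward neutral over time, with the strongest emotion driving the displayed expression; the sketch below illustrates that idea under assumed parameters and is not the model proposed in the paper.

```python
# Hypothetical first-order dynamic emotion state driving a mascot robot's facial expression.
class DynamicEmotion:
    def __init__(self, decay: float = 0.9, threshold: float = 0.2):
        self.decay = decay
        self.threshold = threshold
        self.state = {"happiness": 0.0, "sadness": 0.0, "surprise": 0.0}

    def step(self, stimulus: dict[str, float] | None = None) -> str:
        """Advance one time step: decay toward neutral, add any stimulus, pick an expression."""
        for emotion in self.state:
            self.state[emotion] *= self.decay
            if stimulus:
                self.state[emotion] += stimulus.get(emotion, 0.0)
        dominant = max(self.state, key=self.state.get)
        return dominant if self.state[dominant] > self.threshold else "neutral"

face = DynamicEmotion()
print(face.step({"surprise": 1.0}))           # "surprise" right after the stimulus
print([face.step() for _ in range(20)][-1])   # decays back to "neutral" with no new stimuli
```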
        5.
        2007.06, KCI-indexed, service discontinued (access restricted)
        Emotional interaction between humans and robots is an important element of natural interaction, especially for service robots. We propose a hybrid emotion generation architecture and a detailed design of the reactive process in that architecture, based on insights into the human emotion system. Reactive emotion generation aims to increase the task performance and believability of the service robot. Experimental results show that the reactive process appears able to serve those purposes, and that reciprocal interaction between the different layers is important for the proper functioning of the robot's emotion generation system.
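        A coarse sketch of one way a hybrid architecture with a reactive emotion process might be organized is given below: a fast, rule-based reactive layer reacts first and a slower deliberative layer moderates the result, so the two layers interact reciprocally; the layer interfaces, rules, and behavior names are assumptions rather than the architecture from the paper.

```python
# Hypothetical hybrid emotion generation: a fast reactive layer and a slower deliberative layer.
class ReactiveLayer:
    RULES = {"sudden_noise": "startle", "user_smiles": "smile_back", "obstacle": "hesitate"}

    def react(self, stimulus: str) -> str | None:
        return self.RULES.get(stimulus)  # immediate, rule-based emotional response

class DeliberativeLayer:
    def __init__(self):
        self.task_context = {"task": "serve_drink", "urgency": 0.3}

    def moderate(self, reaction: str | None) -> str:
        # Suppress disruptive reactions when the current task is urgent; otherwise pass them on.
        if reaction in ("startle", "hesitate") and self.task_context["urgency"] > 0.7:
            return "stay_calm"
        return reaction or "continue_task"

class HybridEmotionSystem:
    def __init__(self):
        self.reactive = ReactiveLayer()
        self.deliberative = DeliberativeLayer()

    def handle(self, stimulus: str) -> str:
        reaction = self.reactive.react(stimulus)      # reactive process fires first
        return self.deliberative.moderate(reaction)   # deliberative layer moderates the result

robot = HybridEmotionSystem()
print(robot.handle("user_smiles"))   # "smile_back"
print(robot.handle("sudden_noise"))  # "startle", because urgency is low in this example
```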