Interaction between human and robot
Author: Jun-Ming Lu, Tzung-Cheng Tsai, Yeh-Liang Hsu (2010-10-05); recommended: Yeh-Liang Hsu (2010-10-05).
Note: This paper is a chapter in the book “Talking about interaction,” to be published.
Abstract
With the rapid advancement of robotics, robots have become smarter and have developed an increasingly close relationship with humans. To accommodate this strong growth, the interaction between human and robot plays a major role in the modern applications of robotics. This multidisciplinary field of research, namely human-robot interaction (HRI), requires contributions from a variety of research fields such as robotics, human-computer interaction, and artificial intelligence. This chapter first defines the two main categories of robots, namely industrial and service robots, and their applications. Subsequently, general HRI issues are discussed to explain how they affect the use of robots. Finally, key design elements for good HRI are summarized to reveal the enabling technologies and future trends in the development of HRI.
Keywords: robot; robotics; human-robot interaction (HRI); telepresence
1. Robots and robotics
Long before the term “robot” came into use, human beings had dreamed of human-like creations that could assist in performing tasks. For example, in 1495, Leonardo da Vinci designed a humanoid automaton intended to make several human-like motions (Figure 1). Due to technological limitations, however, most such designs remained at the conceptual stage. In the 18th century, miniature automatons appeared as toys for entertainment. With a programmed musical box embedded in a doll, a melody would start to play as if the doll were playing the instrument by itself. In 1920, the term “robot” was first introduced by Čapek in his play “Rossum's Universal Robots.” Based on
his idea, robots are artificial people created to work as servants. In the beginning, the robots are happy to work for the humans who invented them. As time goes by, however, the robots begin to revolt against humans and fight for their own rights. This play reflects the human desire to enrich daily life through the use of robots, as well as the consequences it may lead to. From then on, the term “robot” became widespread and was adopted in many domains to describe human-like machines that work to assist humans.
Figure 1. The humanoid automaton created by Da Vinci (Möller, 2005)
In 1927, Roy J. Wensley invented the humanoid “Televox,” which is likely the first actual robot and could be controlled by means of specific voice input (Figure 2). Later on, Elektro was exhibited at the 1939 New York World's Fair with his mechanical dog Sparko (Figure 3). He could walk by voice command, speak about 700 words, smoke cigarettes, and blow up balloons. These brilliant inventions immediately caught people's eyes and encouraged them to continue bringing their dreams to reality. Afterwards, the term “robotics” appeared in Asimov's science fiction “I, Robot” to describe this field of study. The three laws of robotics were also proposed to address the interaction between robots and human beings (Asimov, 1942).
Figure 2. Roy J. Wensley and his humanoid Televox (Hoggett, 2009)
Figure 3. Elektro (middle) and Sparko (left) (McKellar, 2006)
In the late 20th century, a robot was defined as “a reprogrammable and multifunctional manipulator designed to move materials, parts, tools, or specialized devices through various programmed motions for the performance of a variety of tasks” (Robot Institute of America, 1979). Nevertheless, this definition fails to cover the broad range of robotics in modern development. At present, robots are more than agents that help to perform repetitive work; they are expected to cooperate with human beings. Generally speaking, according to their application fields, robots can be categorized as industrial robots and service robots. The different purposes and characteristics of these two types of robots are discussed in the following sections.
1.1 Industrial Robots
ISO 8373 (International Organization for Standardization, 1994) defines an industrial robot as “an automatically controlled, reprogrammable, and multipurpose manipulator programmable in three or more axes, which may be either fixed in place or mobile for use in industrial automation applications.” On the basis of this concept, industrial robots are intended to assist humans in repetitive tasks, so that efficiency can be improved through the automation of manufacturing processes. For this purpose, industrial robots do not need to resemble real humans; instead, they are designed to imitate human body movements. Thus, an industrial robot generally comes in the form of an articulated robotic arm. Typical applications can be seen in assembly, packaging (Figure 4a), painting (Figure 4b),
and so on. In addition, industrial robots can be seen in the food (Figure 5a) and agricultural (Figure 5b) industries as well.
Figure 4. (a) An industrial robot for packaging (Gromyko, 2009); (b) an industrial robot for painting (Schaefer, 2008)
Figure 5. (a) An industrial robot for cookie manufacturing (Garcia et al., 2007); (b) an agricultural robot for apple grading (Billingsley et al., 2009)
1.2 Service Robots
According to the International Federation of Robotics (IFR), a service robot is a robot which operates semi- or fully autonomously to perform services useful to the well-being of humans and equipment, excluding manufacturing operations. Generally, service robots include cleaning robots, assistive robots, robotic wheelchairs, guide robots, entertainment robots, and educational robots. For example, Figure 6 depicts a robot suit which helps to enhance the strength of a caregiver (Satoh et al., 2009). In addition, as shown in Figure 7, robotic wheelchairs with navigation and motion-planning functions allow people with limited mobility, such as the elderly and the disabled, to move freely and easily (Prassler et al., 2001; Pineau and Atrash, 2007).
Figure 6. Robot suit HAL for bathing care assistance (Satoh et al., 2009)
Figure 7. The robotic wheelchairs: (a) MAid (Prassler et al., 2001); (b) SmartWheeler (Pineau and Atrash, 2007)
In addition to the basic requirements of service robots, the need for robots that facilitate healthcare for the elderly, both physiologically and psychologically, is becoming an urgent issue in the aging society. Interactive autonomous robots behave autonomously using various kinds of sensors and actuators, and can react to stimulation from their environment, including interaction with a human. The seal robot Paro is an example of robot-assisted therapy for improving patients' mental well-being at hospitals or institutions (Wada et al., 2008), as shown in Figure 8. In addition, Lytle (2002) reported that Matsushita Electric had developed a robotic care bear whose purpose was to watch over elderly residents in a high-tech retirement center. These modern applications of service robots significantly improve the quality of life for the elderly.
Figure 8. Paro interacting with the elderly in a nursing home (Wada et al., 2008)
In health care applications, telepresence robots, which let a person be in two places at once, are also of great interest. The remote-controlled robot “Rosie” stands 65 inches tall and has a computer-screen head which serves as a physician's eyes and ears, as shown in Figure 9. Its two-way audio and video capabilities enable individuals to be physically
located in one location and virtually present in another at the same time. The robot allows medical assessments and diagnoses to take place in real time. Patient-specific medical data, such as ultrasound images, can be transmitted over the wireless Internet. Medical personnel can discuss treatment plans and interact with patients remotely. By serving as an extension of physician-patient contact, this makes patients feel more satisfied, because physicians seem to spend more time with them (Gerrard et al., 2010).
Figure 9. Physicians operate the robot to visit patients (Gerrard et al., 2010)
2. Human-robot interaction
As introduced in section 1, robots are becoming more common and have changed the way we live. In such an environment, humans need to interact with robots to perform tasks or access the services they provide. Thus, the interaction between human and robot is of great concern in the development of robotics. Human-robot interaction (HRI) is a multidisciplinary study that requires contributions from robotics, human-computer interaction, human factors engineering, artificial intelligence, and other research fields. The origin of HRI as a discrete problem can be traced back to Asimov's three laws of robotics (Asimov, 1942):
(1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
(2) A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
(3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These three laws mainly point out the HRI issue of safety. Under this concept, robot and human should be regarded as separate individuals that do not conflict with each other. In addition, for a robot to obey the orders given by human beings, a control mechanism enabling the robot to perceive humans and make responses is required.
Moreover, as humans wish to have human-like robots as assistants or servants, anthropomorphic characteristics help to meet this expectation. Considering these issues with respect to human-robot interaction, the associated research topics are discussed in the following paragraphs.
2.1 Safety
Safety is the primary issue in human-robot interaction. Since robots are designed to assist humans, they should not disturb or, worse, harm humans during operation. In order to expand safety awareness throughout the robotics industry, the Robotic Industries Association (RIA) released ANSI/RIA R15.06 in 1992, an American national standard that provides information and guidance on safeguarding personnel from injury in robotic production applications. Internationally, ISO 10218 (2006) describes the basic hazards associated with robots and provides requirements to reduce the resulting risks. On the basis of these standards, researchers and practitioners are striving to provide safety assessment in the use of robots.
In human-robot interaction, especially with industrial robots, hazards may come from impact or collision, crushing or trapping, and other accidents. In order to prevent possible accidents and injuries, special attention should be given to the workplace layout, the sensors, and the emergency-off devices of the robots.
Through careful workplace layout, humans and robots can be separated into different blocks to avoid direct contact. The work envelope of a robot defines the space that it can reach. Thus, while designing the layout, it is critical to prevent personnel from entering this dangerous area where a collision could happen. In addition, with the support of computer technologies, it is possible to simulate the physical interaction between humans and robots. For example, Oberer et al. (2006) used CAD models of the human operator and the robot to conduct impact simulations assessing the injury severity for a human operator (Figure 10).
Figure 10. Impact simulation (Oberer et al., 2006)
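For illustration, the geometric core of such a layout check can be sketched in a few lines. The cylindrical envelope, its dimensions, and the safety margin below are illustrative assumptions, not values taken from any standard or from the cited study.

```python
import math

def in_work_envelope(x, y, z, reach=2.6, z_min=0.0, z_max=3.0, margin=0.5):
    """Return True if the point (x, y, z), in metres from the robot base,
    lies inside the robot's work envelope plus a safety margin.
    The envelope is idealized here as a cylinder of radius `reach`."""
    horizontal = math.hypot(x, y)
    return horizontal <= reach + margin and z_min <= z <= z_max + margin

# Layout check: personnel walkway points must fall outside the envelope.
walkway_points = [(3.5, 0.0, 0.0), (3.5, 1.0, 0.0), (2.0, 0.0, 0.0)]
violations = [p for p in walkway_points if in_work_envelope(*p)]
```

A real assessment would use the manufacturer's published envelope geometry and the clearances required by standards such as ANSI/RIA R15.06 or ISO 10218.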
Ideally, careful workplace layout helps to eliminate the risk of impact or collision. However, in most cases, the space is too limited for such practices. It then falls to both human and robot to be aware of possible impacts. For human operators, warning signs and sounds are usually used to alert them whenever a collision is about to happen. But when the human operator concentrates too hard on the task and fails to notice them, these warning messages will not work. In such a condition, equipping the robots with sensors offers a feasible solution. For example, if the robot can “see” the human operator by means of a CCD camera and computer vision techniques, it can stop moving in time to avoid a collision with the human. Moreover, when a robot is approaching a human and neither of them notices, an emergency-off (EMO) device allows a third person to avert the accident. By pushing the EMO button, the power supply of the robot can be disconnected immediately to ensure the safety of human beings.
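A minimal sketch of this interlock logic, assuming invented distance thresholds and a single proximity reading per control cycle, might look as follows; it is illustrative only, not an implementation of any cited system.

```python
class SafetyMonitor:
    """Sketch of a safety interlock: stop the robot when a person is
    detected inside the danger zone, and honour an emergency-off (EMO)
    button at any time. Thresholds in metres are illustrative."""

    DANGER_DISTANCE = 1.2   # assumed, not taken from any standard
    WARNING_DISTANCE = 2.0

    def __init__(self):
        self.moving = True
        self.power_on = True

    def update(self, person_distance, emo_pressed):
        if emo_pressed:
            # A third person pushed the EMO button: cut the power supply.
            self.power_on = False
            self.moving = False
        elif person_distance < self.DANGER_DISTANCE:
            # The robot "sees" the human too close: stop moving in time.
            self.moving = False
        elif person_distance < self.WARNING_DISTANCE:
            # Alert the operator first with a warning sign or sound.
            self.sound_warning()
        return self.moving

    def sound_warning(self):
        pass  # placeholder for the warning sign / sound hardware
```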
2.2 Control
Effective control methods are essential for guiding robots to follow the orders given by human beings. Technically, the means of control depend on the robot's application field. Autonomous robots are driven by preprogrammed commands: when the power is turned on, the robot starts to execute the commands and performs a series of actions. In such applications, computer programming is of great concern. However, the robot itself does not really interact with humans unless it is equipped with sensors to detect humans and make real-time responses. Neves and Oliveira (1997) divide the control system of autonomous robots into three levels: the reflexive level, the reactive level, and the cognitive level. At the reflexive level, robot behaviors are produced in a pure stimulus-response way, which involves only the sensors. At the reactive level, actions are made quickly for pre-defined problems based on a database of robot behaviors. As the complexity increases and results in heavier computation, control moves to the cognitive level, which requires decision making. Combining these features, autonomous robots can be intelligent and interact well with the environment and humans. Takahashi et al. (2010)
developed a mobile transport robot called MKR. MKR is able to identify obstacles and perform real-time path planning, so that it can avoid collisions with humans or objects, as shown in Figure 11.
Figure 11. The autonomous robot MKR (Takahashi et al., 2010)
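The three-level division of Neves and Oliveira can be illustrated with a simple dispatcher; the stimuli, the behavior database, and the planner stub below are invented for the example, not taken from their paper.

```python
def handle_stimulus(stimulus, behaviour_db, plan):
    """Dispatch a stimulus through the three control levels of an
    autonomous robot: reflexive first, then reactive lookup, then
    cognitive deliberation. All names and rules are illustrative."""
    # Reflexive level: hard-wired stimulus-response, sensors only.
    if stimulus == "bumper_hit":
        return "stop"
    # Reactive level: quick answers to pre-defined problems, drawn
    # from a database of robot behaviours.
    if stimulus in behaviour_db:
        return behaviour_db[stimulus]
    # Cognitive level: heavier computation and decision making.
    return plan(stimulus)

behaviours = {"obstacle_left": "turn_right", "obstacle_right": "turn_left"}
deliberate = lambda s: "replan_path"   # stand-in for a real planner
```

Usage: `handle_stimulus("obstacle_left", behaviours, deliberate)` resolves at the reactive level, while an unknown stimulus falls through to the cognitive planner.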
For manually controlled robots, the control panel is either connected to the robot or located remotely. Whichever approach is adopted, it is necessary to provide an appropriate user interface for control. The key elements of a good interface design generally depend on the nature of the task and the user. Concerning the task, the interface needs to be simple and intuitive to avoid possible errors or mistakes. From the user's point of view, an operational process that meets the human's expectations helps to enhance the efficiency of control; this relates to the user's mental model. Given the diversity of human beings, relevant knowledge from engineering psychology and human factors engineering should be applied to ensure the usability of the interface design. In addition to these mental factors, the physical characteristics of humans are also important to the success of an interface design. For example, button sizes must fit the finger sizes of the target users, so that they can perform operations smoothly and with ease.
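One standard human-factors model behind such sizing decisions, though not named in this chapter, is Fitts' law, which predicts how target distance and width affect acquisition time. A sketch with illustrative coefficients:

```python
import math

def fitts_movement_time(distance_mm, width_mm, a=0.05, b=0.1):
    """Estimate the time (seconds) to acquire a control of width
    `width_mm` at distance `distance_mm`, using Fitts' law in the
    Shannon form: MT = a + b * log2(D/W + 1). The coefficients a and
    b are illustrative; real values must be fitted per user group."""
    index_of_difficulty = math.log2(distance_mm / width_mm + 1.0)
    return a + b * index_of_difficulty

# A larger button at the same distance is faster to hit:
small = fitts_movement_time(distance_mm=200, width_mm=10)
large = fitts_movement_time(distance_mm=200, width_mm=30)
```

Because movement time grows with log2(D/W + 1), enlarging a button or moving it closer measurably speeds up operation, which is one quantitative reason button size should match the finger size of the target users.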
Figure 12. The robot controlled by a waistcoat (Suomela and Halme, 2001)
Remotely controlled robots, also referred to as teleoperated robots, involve HRI issues more complicated than interface design. Since the user is not beside the robot, cameras and microphones are needed to allow the operator to see and hear exactly what the robot does. In other words, the humans should be able to feel as if they were present during remote operation. This is known as “telepresence.” In the development of telepresence robots, advanced devices that provide sensory stimuli are critical: the more immersed the user is in the remote environment, the better the performance of control and interaction will be. Adalgeirsson and Breazeal (2010) presented the design and implementation of MeBot, a robotic platform for socially embodied telepresence (Figure 13). This telerobot communicates with humans not simply through audio and video but also through expressive gestures and body pose, and it was found that a more engaging and enjoyable interaction is achieved through this practice.
Figure 13. The telepresence robot MeBot (Adalgeirsson and Breazeal, 2010)
2.3 Anthropomorphism
Since the early development of robotics, there has been a significant trend toward anthropomorphizing robots to exhibit human-like appearance and behavior. In this way, people can interact with robots in ways that they are familiar with. As the level of anthropomorphism goes higher, the interaction performance can be further improved (Li et al., 2010). A simple example can be seen in an articulated robotic arm, which is based on the structure of the human arm and serves as a third arm that enhances human productivity. From this point of view, techniques contributing to a higher level of anthropomorphism play an important role in the study of HRI.
One approach to making robots more anthropomorphic concentrates on their appearance, usually the robot head or face, because people generally recognize a person by the face. If the robot head produces human-like expressions, people may find it friendlier to interact with. A typical application is the robotic creature Kismet, an expressive anthropomorphic robot shown in Figure 14. It is the first autonomous robot
explicitly designed to explore face-to-face interactions with people (Breazeal, 2002). In addition, Michaud et al. (2005) designed Roball, a ball-shaped robot which can effectively draw the attention of young children and interact with them in simple ways. Figure 15 shows Roball and the interaction between a child and Roball.
Figure 14. The sociable robot Kismet (Coradeschi et al., 2006)
Figure 15. Roball and its interaction with a child (Michaud et al., 2005)
In addition to a human-like face, a humanoid body makes a robot still more anthropomorphic. Combining the robot head with a torso, arms, and legs brings it closer to the realization of an artificial human. Nevertheless, human-like arms and legs are not sufficient for a humanoid robot; the robot also needs the ability to mimic human motions, so that it can move as a human does. This is enabled by collaborative efforts among a number of research fields, such as anatomy, kinematics, and biomechanics. Furthermore, if the robot is intended to make decisions and react to the environment as a human does, human behavior modeling should be taken into consideration as well. As this further involves psychology and sociology, the complexity increases considerably. Figure 16(a) illustrates the humanoid robot ASIMO developed by Honda, which was intended to act as a human servant (Garcia et al., 2007). In addition, as shown in Figure 16(b), Aldebaran Robotics has developed the humanoid robot Nao, which can interact with both humans and robots.
Figure 16. Humanoid robots: (a) ASIMO (Garcia et al., 2007); (b) Nao (Aldebaran Robotics, 2010)
3. Design elements for good human-robot interaction
Section 2 reviewed the major domains of HRI issues, including safety, control, and anthropomorphism, along with some related applications. On the basis of these topics, the design elements contributing to good human-robot interaction need to be further highlighted for successful implementation. This section discusses these design elements along with the associated technologies and future trends.
Equipped with the capabilities to detect humans or objects in the environment and to react accordingly, a robot can perform autonomous behaviors for safe use and stronger interaction. To enhance the performance of control, the interface needs to follow the principles of user-centered design. Further, for a more immersive telepresence, sensory-enhancing elements, including stereoscopic and stereophonic perception as well as supersensory capabilities, can contribute greatly to stronger human-robot interaction. Moreover, through the realization of anthropomorphism, human-robot interaction will become as natural as interpersonal communication; this can be achieved by providing humanoid elements and enabling eye contact between human and robot. Finally, in order to enable seamless dataflow, a robust system for data transmission should be adopted. Table 1 summarizes these elements and the associated technologies.
TABLE 1. Design elements and associated technologies for good HRI
Design elements            | Associated technologies
Autonomous behaviors       | sensors and actuators, path planning
User interface             | human-computer interaction, teleoperation, virtual reality
Sensory-enhancing elements | telepresence, multi-sensory stimulation, binocular and panoramic vision, stereo audio, virtual reality
Anthropomorphism           | humanoid appearance, expression, and motion
Eye contact                | camera and screen with specific placement
Data transmission          | RF and Internet transmission, time-delay improved algorithm
3.1 Autonomous behaviors
In pure human-robot interaction, the autonomous behaviors of the robot are generally<br />
designed to ensure safe use. Collision prevention requires actively identifying possible<br />
obstacles within a reasonable distance, which involves three design elements: the<br />
sensors, an intelligent system for path planning, and the actuators. Yasuda et al. (2009)<br />
applied fuzzy logic to develop collision-prevention strategies for a powered wheelchair<br />
equipped with a laser range sensor and a position sensitive diode sensor to<br />
observe the front and both sides (Figure 17). Combining these elements, the robotic<br />
wheelchair can either slow to a stop or directly modify its path to avoid<br />
obstacles. In addition, Candido et al. (2008) proposed a hierarchical motion planner for an<br />
autonomous humanoid robot, with which the robot can generate a<br />
feasible path and complete its walk without collisions or falls, as shown in Figure 18.<br />
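The slow-down-or-replan behavior described above can be sketched in a few lines. The thresholds and sensor interface below are illustrative assumptions, not the actual design of Yasuda et al. (2009):<br />

```python
# Minimal sketch of a reactive collision-avoidance policy: the robot
# slows, stops, or replans depending on the nearest obstacle distance.
# Threshold values and the three-sensor layout are assumed for
# illustration only.

SLOW_DIST = 1.5   # metres: begin decelerating
STOP_DIST = 0.5   # metres: halt and replan

def avoidance_action(front_m, left_m, right_m):
    """Map range-sensor readings (metres) to a motion command."""
    nearest = min(front_m, left_m, right_m)
    if nearest <= STOP_DIST:
        # Obstacle too close: stop, then steer toward the freer side.
        turn = "left" if left_m > right_m else "right"
        return ("stop_and_replan", turn)
    if nearest <= SLOW_DIST:
        return ("slow_down", None)
    return ("proceed", None)

print(avoidance_action(0.4, 2.0, 1.0))  # → ('stop_and_replan', 'left')
print(avoidance_action(1.2, 2.0, 2.0))  # → ('slow_down', None)
```

In a real wheelchair controller these crisp thresholds would be replaced by fuzzy membership functions, but the decision structure is the same.<br />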
Figure 17. The <strong>robot</strong>ic wheelchair <strong>and</strong> its structure of operation (Yasuda et al., 2009)<br />
http://grc.yzu.edu.tw/
Figure 18. The humanoid robot that is capable of path planning (Candido et al., 2008)<br />
As for human-robot-human interaction, in which the robot serves as an interface for<br />
communication between people situated in two places, autonomous behaviors become<br />
even more important for successful interaction. In such telepresence robotics applications,<br />
the person who operates the robot remotely is called the user, whereas the other person,<br />
who interacts directly with the robot, is called the participant. From the user's perspective,<br />
the robot's autonomous behaviors extend the user's ability to project himself or herself into<br />
the remote environment and to operate the robot reliably in a dynamic setting. From the<br />
participant's view, autonomous behaviors make the robot a more capable partner in<br />
dialogue. For example, a telepresence robot that can autonomously identify the direction of<br />
the participant who is speaking helps the remote user respond more quickly and<br />
appropriately. This is achieved by combining cameras, microphones, and a dedicated<br />
software system for recognition.<br />
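As an illustration of how speaker direction can be recovered from microphone signals, the sketch below estimates the bearing of a sound source from the time delay between two microphones; the microphone spacing, sampling rate, and signal model are assumptions for illustration, not the method of any system cited here:<br />

```python
import math

# Illustrative speaker-direction estimation with two microphones: find
# the sample lag that maximizes the cross-correlation of the two
# signals, then convert the implied time delay into a bearing angle.
# Constants below are assumed values.

SOUND_SPEED = 343.0  # m/s, speed of sound in air
MIC_SPACING = 0.2    # metres between the two microphones (assumed)

def best_lag(left, right, max_lag):
    """Lag of `right` relative to `left` (samples) with maximal correlation."""
    def corr(lag):
        return sum(left[i] * right[i + lag]
                   for i in range(len(left))
                   if 0 <= i + lag < len(right))
    return max(range(-max_lag, max_lag + 1), key=corr)

def direction_deg(left, right, sample_rate, max_lag=10):
    """Approximate bearing in degrees; 0 is straight ahead of the mic pair."""
    delay_s = best_lag(left, right, max_lag) / sample_rate
    # Clamp before asin in case noise implies an impossible delay.
    s = max(-1.0, min(1.0, delay_s * SOUND_SPEED / MIC_SPACING))
    return math.degrees(math.asin(s))
```

A positive lag here means the sound reached the left microphone first, so the estimated bearing points toward the left side.<br />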
An interactive museum tour-guide robot, as shown in Figure 19, was developed within<br />
two research projects, TOURBOT and WebFAIR, funded by the European Union (Burgard<br />
et al., 1999; Schulz et al., 2000; Trahanias et al., 2005). Thousands of users around the<br />
world have controlled this robot through the web to visit the museum remotely.<br />
The developers built a modular and distributed software architecture that integrates<br />
localization, mapping, collision avoidance, planning, and various modules for<br />
user interaction and web-based telepresence. With these autonomous features, the user can<br />
operate the robot to move quickly and safely in a museum crowded with visitors.<br />
Figure 19. An interactive museum tour-guide <strong>robot</strong> <strong>and</strong> its GUI (Trahanias et al., 2005)<br />
3.2 User interface<br />
The performance of control depends mainly on the usability of the user interface. The<br />
first step toward a usable interface design is to acquire a detailed understanding of the<br />
relationship between the user and the task. This is closely related to human<br />
factors engineering, which aims to develop user-centered designs based on scientific<br />
evidence. Since the control interface of a robot is usually built on a computer<br />
system, most of the problems fall within the domain of human-computer interaction (HCI).<br />
Numerous ongoing HCI studies have endeavored to formulate universal<br />
principles of interface design, typically focusing on how users can perform tasks<br />
efficiently without committing errors. Putting these design principles into practice also<br />
requires efforts from computer science and mechanical design. For instance,<br />
Baker et al. (2004) designed the user interface of a search-and-rescue robot<br />
for easy and intuitive use. As Figure 20 illustrates, the interface helps the user<br />
concentrate on the video window without being distracted by additional information.<br />
Figure 20. The easy <strong>and</strong> intuitive control interface of the <strong>robot</strong> (Baker et al., 2004)<br />
In addition to these basic requirements for the user interface, further issues must be<br />
considered in the modern development of robotics, especially regarding<br />
teleoperators. A teleoperator is a machine that extends the user's sensing and manipulating<br />
capability to a location remote from that user, while teleoperation refers to the direct and<br />
continuous human control of the teleoperator. Many studies emphasize enabling the user<br />
to modify the remote environment (Stoker et al., 1995; Engelberger, 2001; Spudis, 2001),<br />
that is, projecting the user onto the teleoperator. To provide the user with better remote<br />
interaction, virtual reality can be applied to create an environment with more realistic<br />
immersion. With a head-mounted display, the user can really feel present at<br />
the remote location, and wired gloves can offer tactile feedback, as if the user were really<br />
touching what the robot touches.<br />
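The idea of direct and continuous control can be summarized as a sample-transmit-feedback loop. The message format and the device stubs below are hypothetical, not the interface of any system discussed in this chapter:<br />

```python
import json
import time

# Conceptual sketch of a teleoperation loop: each cycle samples the
# operator's input, encodes it as a command for the remote teleoperator,
# and collects sensory feedback for display. `read_input`, `send`, and
# `receive_feedback` are stand-in function arguments, not a real robot API.

def make_command(pose, grip_closed, seq):
    """Encode one control message for transmission to the remote robot."""
    return json.dumps({"seq": seq, "pose": pose, "grip": grip_closed})

def teleoperate(read_input, send, receive_feedback, steps, period_s=0.0):
    """Run `steps` cycles of sample -> transmit -> feedback; return feedback."""
    frames = []
    for seq in range(steps):
        pose, grip = read_input()            # e.g. tracked hand pose
        send(make_command(pose, grip, seq))  # command channel to the robot
        frames.append(receive_feedback())    # e.g. video frame, force data
        time.sleep(period_s)                 # fixed control period
    return frames
```

The sequence number lets the remote side discard stale or out-of-order commands, a common concern over real networks.<br />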
The Full-Immersion Telepresence Testbed (FITT) developed by NASA, shown in Figure<br />
21, combines a wearable interface, which captures the human's perception, cognition, and<br />
eye-hand coordination skills, with a robot's physical abilities, and is a recent example of<br />
advances in teleoperation (Rehnmark et al., 2005). The teleoperated master-slave<br />
system Robonaut allows an intuitive, one-to-one mapping between master and slave<br />
motions. The operator uses the FITT wearable interface to remotely control Robonaut,<br />
which follows the operator's motion simultaneously to perform complex tasks<br />
on the International Space Station.<br />
Figure 21. FITT and Robonaut (Rehnmark et al., 2005)<br />
3.3 Sensory enhancing elements<br />
In telepresence, stereoscopic and stereophonic elements are often emphasized to<br />
create the illusion of the remote environment, which increases the user's feeling of<br />
immersion. For example, the user can judge the distance between an object and the<br />
telepresence robot through binocular vision (Brooker et al., 1999). In addition, Boutteau<br />
et al. (2008) developed an omnidirectional stereoscopic system for mobile robot navigation.<br />
As shown in Figure 22, the 360-degree field of view gives the remote operator a<br />
more detailed understanding of the environment. Moreover, the head-related transfer<br />
function (HRTF) used for stereophonic effects further enables the user to identify the<br />
location and direction of a sound (Hawksford, 2002).<br />
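The binocular distance estimation mentioned above rests on stereo triangulation: depth Z = f·B/d for focal length f (in pixels), camera baseline B, and horizontal disparity d (in pixels). A minimal sketch with illustrative numbers:<br />

```python
# Stereo triangulation as used in binocular telepresence vision.
# The focal length, baseline, and disparity values below are
# illustrative assumptions, not parameters of any cited system.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance (metres) to a point seen in both cameras: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return focal_px * baseline_m / disparity_px

# A point with 40 px of disparity, f = 800 px, B = 0.1 m:
print(depth_from_disparity(800, 0.1, 40))  # → 2.0 (metres)
```

Nearby objects produce large disparities and distant ones small disparities, which is why depth resolution degrades with range.<br />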
Figure 22. The robot with a stereoscopic system (Boutteau et al., 2008)<br />
For teleoperated robots, stereoscopic and stereophonic elements also enhance the<br />
feeling of presence during operation. In the design of teleoperators, these elements can<br />
provide stronger interaction by adopting the technologies of telepresence<br />
videoconferencing, which, as many practical systems show, enables users<br />
and participants to communicate more efficiently. For example, Lei et al. (2004)<br />
proposed a representation and reconstruction module for an image-based telepresence<br />
system, using a viewpoint-adaptation scheme and an image-based rendering technique.<br />
This system provides life-size views and 3D perception of participants and viewers in real<br />
time and hence improves the interaction.<br />
Supersensory capability refers to an advanced ability to modify the remote<br />
environment, provided by a dexterous robot or a precise telepresence system. When<br />
projected onto a telepresence robot with such capabilities, the user's manipulative<br />
efficiency in specialized tasks is enhanced. Green et al. (1995) developed a telepresence<br />
surgery system integrating vision, hearing, and manipulation. It consists of two main<br />
modules: a surgeon's console and a remote surgical unit located at the surgical table. The<br />
remote unit provides scaled motion, force reflection, and minimized friction for the surgeon<br />
to carry out complex tasks with quick, precise motions. Similar applications of<br />
supersensory capabilities in telepresence surgery can also be seen in the studies of Satava<br />
(1999), Schurr et al. (2000), and Ballantyne (2002).<br />
Supersensory elements can also give the user a novel feeling of immersion in a<br />
remote environment. For example, the user can control the zoom function of the camera on<br />
a telepresence robot to observe small details of the remote environment that the user<br />
could not normally see with the naked eye. Intuitive Surgical (2010) developed the da<br />
Vinci® Surgical System through the use of supersensory elements in telepresence. As<br />
Figure 23 shows, the da Vinci Surgical System consists of an ergonomically designed<br />
surgeon's console, a patient-side cart with four interactive robotic arms, and a<br />
high-performance vision system. Powered by state-of-the-art robotic technology, the<br />
surgeon's hand movements are scaled, filtered, and seamlessly translated into precise<br />
instrument movements.<br />
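The "scaled and filtered" mapping of hand movements can be illustrated with a simple low-pass filter followed by a scale factor. The smoothing factor and 5:1 scale below are assumptions for illustration, not Intuitive Surgical's actual parameters:<br />

```python
# Sketch of scaled-and-filtered master-slave motion mapping in the
# spirit of telesurgery consoles: raw hand displacement is low-pass
# filtered (suppressing tremor) and scaled down for precision.
# scale=0.2 and alpha=0.3 are illustrative assumptions.

def make_motion_mapper(scale=0.2, alpha=0.3):
    """Return a function mapping raw hand displacement to tool motion."""
    state = {"filtered": 0.0}

    def map_motion(raw_displacement_mm):
        # Exponential moving average acts as a simple low-pass filter.
        state["filtered"] += alpha * (raw_displacement_mm - state["filtered"])
        return state["filtered"] * scale

    return map_motion

mapper = make_motion_mapper()
# A sudden 10 mm hand movement becomes a sub-millimetre tool motion:
print(round(mapper(10.0), 2))  # → 0.6
```

The filter deliberately trades a small amount of lag for smoothness, which is acceptable when the goal is steadier, finer motion than the unaided hand.<br />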
Figure 23. The da Vinci® Surgical System (Intuitive Surgical, 2010)<br />
3.4 Anthropomorphism<br />
As a robot resembles a human more closely, human-robot interaction comes closer to<br />
interpersonal communication. Thus, the anthropomorphism of robots helps to enhance the<br />
performance of human-robot interaction by creating an environment that humans are<br />
more familiar with. Generally, this is achieved through humanoid appearance,<br />
expression, and motion. Coradeschi et al. (2006) noted that the appearance and behaviors<br />
of a robot are essential in human-robot interaction. A robot's appearance influences<br />
people's impressions and is an important factor in evaluating the interaction. Humanlike<br />
appearance can even be deceiving, convincing users that robots can understand and do<br />
much more than they actually can. Observable behaviors include gaze, posture, movement<br />
patterns, and linguistic interactions.<br />
Ishiguro created a humanoid robot by copying his own appearance. As Figure 24<br />
shows, he constructed the robot with silicone rubber, pneumatic actuators, powerful<br />
electronics, and hair from his own scalp. Although it is not able to move, the robot<br />
nevertheless meets the expectation of mimicking a real person's appearance (Guizzo,<br />
2010). An alternative approach to providing a humanoid appearance is to display the face<br />
of the remote user on the telepresence robot. Many telepresence robots therefore<br />
incorporate an LCD screen showing the user's face for interacting with the participants.<br />
Dr. Robot and the telepresence system PEBBLES both use an LCD screen to display the<br />
user's face, which allows the participants to recognize whom the telepresence robot<br />
represents. The commercial product "Giraffe" (2007), a remote-controlled mobile video<br />
conferencing platform, is also a telepresence robot application. It is composed of two<br />
subsystems: the client application and the Giraffe robot itself. The Giraffe robot carries a<br />
video screen and camera mounted on a height-adjustable robotic base. The user can move<br />
the Giraffe robot from afar using the client application. Software running on a standard PC<br />
with a webcam<br />
enables the user to connect to the distant Giraffe robot through the Internet for<br />
telepresence interaction.<br />
Figure 24. Ishiguro <strong>and</strong> the <strong>human</strong>oid <strong>robot</strong> (Guizzo, 2010)<br />
There are many other ways to present anthropomorphic elements, such as<br />
humanoid expressions. For example, as depicted in Figure 14, the humanoid robot Kismet<br />
is equipped with mechanically actuated facial features for face-to-face interaction with<br />
humans (Breazeal, 2002). In addition, Berns and Hirth (2006) developed a humanoid<br />
robot face, ROMAN. As Figure 25 shows, its mechanical structure allows ROMAN to<br />
make facial expressions such as anger, disgust, fear, happiness, sadness, and surprise.<br />
Facial expressiveness in humanoid robots has received much attention because, from a<br />
psychological point of view, it is a key component in building personal attachment when<br />
communicating with a human user.<br />
Figure 25. The expressive <strong>robot</strong> head ROMAN (Berns <strong>and</strong> Hirth, 2006)<br />
Moreover, human-like motions raise the anthropomorphism of robots to a higher<br />
level. This involves efforts in motion capture, biomechanics, kinematics, and<br />
statistical methods. For example, Chen (2010) employed a high-speed video camera to<br />
capture the human jumping sequence and then conducted a kinematic analysis, which<br />
helped to develop a human-like jumping robot. In addition, Kim et al. (2006) adapted<br />
human motion capture data by formulating an inverse kinematics problem as an<br />
optimization; by solving it, the robot is able to imitate human arm motion.<br />
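Retargeting captured human motion to a robot arm typically passes through inverse kinematics. The two-link planar solver below is a textbook building block illustrating the idea, not the optimization formulation of Kim et al. (2006); the link lengths and target are arbitrary:<br />

```python
import math

# Two-link planar inverse kinematics: given a wrist target (x, y) and
# link lengths l1, l2, recover shoulder and elbow angles. This is the
# standard law-of-cosines solution (elbow-up branch).

def two_link_ik(x, y, l1, l2):
    """Return (theta1, theta2) in radians placing the wrist at (x, y)."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)                          # elbow angle
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)  # shoulder angle
    return theta1, theta2
```

A quick forward-kinematics check (x = l1·cosθ1 + l2·cos(θ1+θ2), and similarly for y) confirms the recovered angles reproduce the target.<br />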
3.5 Eye contact<br />
Eye contact is an important element in human-to-human communication and a<br />
well-known cue for gaining attention and attracting interest. In human-robot interaction, a<br />
robot that makes eye contact feels more familiar and comfortable to interact with.<br />
Yamato et al. (2003) focused on the effect that recommendations made by an agent or<br />
robot have on user decisions, and designed a "color name selection task" to determine the<br />
key factors in designing interactively communicating robots, comparing two robots in the<br />
robot/agent role. Based on the experimental results, eye contact and<br />
attention-sharing are considered important features of communication that display and<br />
recognize the attention of participants.<br />
In social psychology, joint attention occurs when people who are communicating with<br />
each other focus on the same object; it is a mental state in which two people not only pay<br />
attention to the same information but also notice each other's attention to it.<br />
Imai et al. (2003) investigated situated utterance generation in human-robot interaction. In<br />
their study, a person establishes joint attention with a robot to identify the object indicated<br />
by a situated utterance generated by the robot, named Robovie. A psychological<br />
experiment was conducted to verify the effect of eye contact on achieving joint attention.<br />
According to the experimental results, a relationship developed through eye contact has a<br />
more fundamental effect on communication than logical reasoning or<br />
knowledge processing.<br />
In telepresence applications, eye contact can increase the user's feeling of immersion<br />
and make the participant a more capable partner in dialogue. However, when the face of<br />
the user is displayed on an LCD screen, it is very difficult to achieve eye contact between<br />
the user and the participant through a telepresence robot, because the camera is usually<br />
placed on top of the LCD screen, which prevents the two from looking directly into each<br />
other's eyes. DVE Telepresence (2005) developed a novel LCD screen with the internal<br />
camera set just behind the monitor. It provides natural face-to-face communication with<br />
eye contact and without causing eyestrain. By adopting advanced devices like this, it is<br />
possible to ensure high-quality eye contact in robotics, which contributes to stronger<br />
interaction and enhanced performance.<br />
3.6 Data transmission<br />
The transmission of control commands and sensory feedback is a basic design<br />
element connecting human and robot. Without this underlying support, real-time<br />
teleoperation and telepresence are impossible. Thus, the related development<br />
in communication engineering also plays an important role in robotics. Generally, wireless<br />
radio frequency and the Internet are used in most telepresence applications, while<br />
dedicated links are used in specific applications such as operation in space or the deep sea.<br />
For example, Winfield and Holland (2000) proposed a communication and control<br />
infrastructure for distributed mobile robotics using wireless local area network (WLAN)<br />
technology and Internet Protocols (IP), resulting in a powerful platform for collective or<br />
cooperative robotics; the infrastructure is equally applicable to teleoperated mobile robots.<br />
In addition, considering cost efficiency and ease of use, Lister and Wunderlich (2002)<br />
used radio frequency (RF) links for mobile robot control and explored software methods<br />
to correct errors that may arise in RF communication.<br />
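One common software safeguard on a noisy RF link is to frame each command with a length byte and a checksum so that corrupted packets are detected and dropped (or retransmitted) rather than executed. The frame layout below is an illustrative assumption, not the scheme of Lister and Wunderlich (2002):<br />

```python
# Illustrative command framing for a noisy link:
# [length][payload bytes...][checksum], checksum = sum(payload) mod 256.

def frame(command):
    """Wrap a command payload in a length/checksum frame."""
    checksum = sum(command) % 256
    return bytes([len(command)]) + command + bytes([checksum])

def parse(packet):
    """Return the command payload, or None if the packet fails its checks."""
    if len(packet) < 2 or packet[0] != len(packet) - 2:
        return None  # truncated or wrong length byte
    body, checksum = packet[1:-1], packet[-1]
    return body if sum(body) % 256 == checksum else None
```

A single modulo-256 checksum catches most single-byte corruption but not all multi-byte errors; a production link would use a CRC, and the principle is the same.<br />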
To realize real-time communication, the speed of transmission must be considered as<br />
well. This relates to effective techniques for data compression and decompression, error<br />
control, and so on. Combined with adequate algorithms for reactive functions, robots can<br />
respond to the human user in a reasonable time. Nevertheless, the response time should<br />
not simply be as short as possible. Shiwa et al. (2009) conducted experiments and found<br />
that people prefer one-second-delayed responses from the robot over immediate<br />
responses. Thus, a delaying strategy can be adopted by giving the robot conversational<br />
fillers, so that the robot seems to pause for thought before communicating with the<br />
human. This example shows that the issues in data transmission concern not only the<br />
speed but also the modality of stimulus presentation.<br />
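The delaying strategy can be sketched as follows; the filler word, the speak() stub, and the exact timing are illustrative assumptions in the spirit of Shiwa et al. (2009), not their implementation:<br />

```python
import time

# Sketch of a delayed-response strategy: instead of answering instantly,
# the robot emits a conversational filler and pauses briefly before the
# real reply, signaling that it is "thinking". speak() is a stand-in for
# a text-to-speech call; the one-second default reflects the finding
# discussed above.

def respond(answer, speak, delay_s=1.0, filler="Well..."):
    speak(filler)        # immediate acknowledgment that input was heard
    time.sleep(delay_s)  # the preferred pause before the real reply
    speak(answer)

spoken = []
respond("The meeting is at three.", spoken.append, delay_s=0.01)
print(spoken)  # → ['Well...', 'The meeting is at three.']
```

The filler keeps the channel alive during the pause, so the delay reads as deliberation rather than as a dropped connection.<br />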
4. Concluding remarks<br />
Human-robot interaction is a growing field of research and application that<br />
encompasses many topics and associated challenges. Through multidisciplinary efforts,<br />
there is a global trend toward more natural interaction and higher performance. In this<br />
chapter, we discussed the key HRI topics and related practices to give a conceptual picture<br />
of how interaction affects the development of robotics. In addition, the corresponding<br />
design elements for good human-robot interaction were presented to serve as a further<br />
reference.<br />
In the future development of human-robot interaction, people look forward to<br />
intelligent robots that can interact with users as human beings do. However, although<br />
anthropomorphic characteristics make robots more similar to real humans and thus<br />
appealing to many users, a number of barriers and challenges remain. As the theory of the<br />
"uncanny valley" describes, when robots look and act almost, but not exactly, like real<br />
humans, they provoke a response of revulsion in human users and participants (Mori,<br />
1970). That is to say, the human likeness of a robot is not always positively correlated<br />
with perceived familiarity. If the details of its behaviors do not match the high realism of<br />
its appearance, the robot makes a negative impression on humans. As a result, related<br />
technologies are required to cross or avoid the uncanny valley.<br />
One possibility is to develop fully human-like appearance and behaviors for the<br />
robot simultaneously. Nevertheless, there seems to be a long way to go before overcoming<br />
the difficulties in human modeling and other related technologies. An alternative is to<br />
make the robot an agent of the distant user by implementing telepresence and<br />
teleoperation. Enabled by telepresence, the human users on both sides appear to<br />
communicate with each other through life-size video in real time, and the robot<br />
reproduces the actions that the distant user intends to perform via teleoperation. In this<br />
way, the result also resembles real human-to-human interaction, even though the<br />
anthropomorphism of the robot itself is not especially high.<br />
Last but not least, no matter how closely a robot resembles a real human or how<br />
powerful it is, safety will always be the most essential issue in human-robot interaction. As<br />
Asimov's three laws of robotics indicate, human and robot must cooperate on the<br />
principle of not harming or conflicting with each other. Ultimately, the question comes<br />
back to ethics and morality in human interaction, just as in the relationships among<br />
human beings that we are used to.<br />
References<br />
Adalgeirsson, SO <strong>and</strong> Breazeal, C: 2010, MeBot: A <strong>robot</strong>ic platform for socially<br />
embodied telepresence, Proceedings of the 5th ACM/IEEE International Conference on<br />
Human-Robot <strong>Interaction</strong>: 15-22, Osaka, Japan.<br />
Aldebaran Robotics: 2010, retrieved from http://www.aldebaran-<strong>robot</strong>ics.com/<br />
Asimov, I: 1942, Runaround, Street & Smith Press, United States.<br />
Baker, M, Casey, R, Keyes, B <strong>and</strong> Yanco, HA: 2004, Improved Interfaces for<br />
Human-Robot <strong>Interaction</strong> in Urban Search <strong>and</strong> Rescue, Proceedings of the IEEE<br />
International Conference on Systems, Man <strong>and</strong> Cybernetics, Hague, the Netherl<strong>and</strong>s.<br />
Ballantyne, GH: 2002, Robotic surgery, telerobotic surgery, telepresence, and<br />
telementoring - Review of early clinical results, Surgical Endoscopy and Other<br />
Interventional Techniques 16(10): 1389-1402.<br />
Berns, K <strong>and</strong> Hirth, J: 2006, Control of facial expressions of the <strong>human</strong>oid <strong>robot</strong> head<br />
ROMAN, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots <strong>and</strong><br />
Systems: 3119-3124, Beijing, China.<br />
Billingsley, J, Oetomo, D <strong>and</strong> Reid, J: 2009, Agricultural <strong>robot</strong>ics, IEEE Robotics &<br />
Automation Magazine 16(4): 16-16, 19.<br />
Boutteau, R, Savatier, X, Ertaud, JY <strong>and</strong> Mazari, B: 2008, An omnidirectional<br />
stereoscopic system for mobile <strong>robot</strong> navigation, Proceedings of the International<br />
Workshop on Robotic <strong>and</strong> Sensors Environments (ROSE): 138-143, Ottawa, Canada.<br />
Breazeal, CL: 2002, Designing Sociable Robots, The MIT Press.<br />
Brooker, JP, Sharkey, PM, Wann, JP, <strong>and</strong> Plooy, AM: 1999, A helmet mounted display<br />
system with active gaze control for visual telepresence, Mechatronics 9(7): 703-716.<br />
Burgard, W, Cremers, AB, Fox, D, Hahnel, D, Lakemeyer, G, Schulz, D, Steiner, W,<br />
<strong>and</strong> Thrun, S: 1999, Experiences with an interactive museum tour-guide <strong>robot</strong>, Artificial<br />
Intelligence 114(1-2): 3-55.<br />
C<strong>and</strong>ido, S, Kim, YT <strong>and</strong> Hutchinson, S: 2008, An Improved Hierarchical Motion<br />
Planner for Humanoid Robots, Proceedings of the IEEE-RAS International Conference on<br />
Humanoid Robots (Humanoids), Daejeon, Korea.<br />
Chen, Y: 2010, Motion mechanism and simulation of the human jumping robot,<br />
Proceedings of the International Conference on Computer Design <strong>and</strong> Applications<br />
(ICCDA) 3: 361-364, Qinhuangdao, China.<br />
Coradeschi, S, Ishiguro, H, Asada, M, Shapiro, SC, Thielscher, M, Breazeal, C,<br />
Mataric, MJ, <strong>and</strong> Ishida, H: 2006, Human-Inspired Robots, IEEE Intelligent Systems 21(4):<br />
74-85.<br />
Dr. Robot, http://www.intouch-health.com/<br />
DVE Telepresence, http://www.dvetelepresence.com/<br />
Engelberger, G: 2001, NASA's Robonaut, Industrial Robot 28(1): 35-39.<br />
Garcia, E, Jimenez, MA, De Santos, PG <strong>and</strong> Armada, M: 2007, The evolution of<br />
<strong>robot</strong>ics research, IEEE Robotics & Automation Magazine 14(1): 90-103.<br />
Gerrard, A, Tusia, J, Pomeroy, B, Dower, A <strong>and</strong> Gillis, J: 2010, On-call<br />
physician-<strong>robot</strong> dispatched to remote Labrador, News in Health, Media Centre, Dalhousie<br />
University, Canada.<br />
Giraffe, http://www.headthere.com/products.html/<br />
Green, PS, Hill, JW, Jensen, JF, <strong>and</strong> Shah, A: 1995, Telepresence surgery, IEEE<br />
Engineering in Medicine <strong>and</strong> Biology Magazine 14(3): 324-329.<br />
Gromyko, A: 2009, Palletising_robot.jpg, retrieved from:<br />
http://commons.wikimedia.org/wiki/File:Palletising_robot.jpg/<br />
Guizzo, E: 2010, The man who made a copy of himself, IEEE Spectrum 47(4): 44-56.<br />
Hawksford, MOJ: 2002, Scalable multichannel coding with HRTF enhancement for<br />
DVD <strong>and</strong> virtual sound systems, Journal of the Audio Engineering Society 50(11):<br />
894-913.<br />
Hoggett, R: 2009, TelevoxWensley21Feb1928.jpg, retrieved from:<br />
http://cyberneticzoo.com/?p=656/<br />
Imai, M, Ono, T, <strong>and</strong> Ishiguro, H: 2003, Physical relation <strong>and</strong> expression: joint<br />
attention for <strong>human</strong>-<strong>robot</strong> interaction, IEEE Transactions on Industrial Electronics 50(4):<br />
636-643.<br />
International Organization for Standardization: 1994, ISO 8373: Manipulating industrial<br />
robots – Vocabulary.<br />
International Organization for Standardization: 2006, ISO 10218-1: Robots for industrial<br />
environments - Safety requirements - Part 1: Robot.<br />
Intuitive Surgical: 2010, da Vinci® Surgical System, http://www.intuitivesurgical.com/index.aspx<br />
Kim, CH, Kim, D and Oh, YH: 2006, Adaptation of human motion capture data to humanoid robots for motion imitation using optimization, Integrated Computer-Aided Engineering 13(4): 377-389.<br />
Lei, BJ, Chang, C, and Hendriks, EA: 2004, An efficient image-based telepresence system for videoconferencing, IEEE Transactions on Circuits and Systems for Video Technology 14(3): 335-347.<br />
Li, D, Rau, PL, and Li, Y: 2010, A cross-cultural study: effect of robot appearance and task, International Journal of Social Robotics 2(2): 175-186.<br />
Lister, MB and Wunderlich, JT: 2002, Development of software for mobile robot control over a radio frequency communications link, Proceedings of IEEE Southeast Conference: 414-417, Columbia, United States.<br />
Lytle, JM: 2002, Robot care bears for the elderly, BBC News, 21 February 2002.<br />
McKellar, I: 2006, Elektro and Sparko, retrieved from: http://www.flickr.com/photos/76722295@N00/263022490/<br />
Möller, E: 2005, Leonardo-Robot3.jpg, retrieved from: http://en.wikipedia.org/wiki/File:Leonardo-Robot3.jpg<br />
Mori, M: 1970, The uncanny valley, Energy 7(4): 33-35.<br />
Neves, M and Oliveira, E: 1997, A control architecture for an autonomous mobile robot, Proceedings of the First International Conference on Autonomous Agents, California, USA.<br />
Oberer, S, Malosio, M, and Schraft, RD: 2006, Investigation of robot-human impact, Proceedings of the Joint Conference on Robotics: 87-103.<br />
PEBBLES, http://www.ryerson.ca/pebbles/<br />
Pineau, J and Atrash, A: 2007, SmartWheeler: a robotic wheelchair test-bed for investigating new models of human-robot interaction, AAAI Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robotics.<br />
Prassler, E, Scholz, J, and Fiorini, P: 2001, A robotics wheelchair for crowded public environment, IEEE Robotics & Automation Magazine 8(1): 38-45.<br />
Rehnmark, F, Bluethmann, W, Mehling, J, Ambrose, RO, Diftler, M, Chu, M, and Necessary, R: 2005, Robonaut: the 'short list' of technology hurdles, Computer 38(1): 28-37.<br />
Robot Institute of America: 1979, RIA Worldwide Robotics Survey and Directory, Robot Institute of America, P.O. Box 1366, Dearborn, Michigan, USA.<br />
Satava, RM: 1999, Emerging technologies for surgery in the 21st century, Archives of Surgery 134(11): 1197-1202.<br />
Satoh, H, Kawabata, T and Sankai, Y: 2009, Bathing care assistance with robot suit HAL, Proceedings of IEEE International Conference on Robotics and Biomimetics: 498-503, Guilin, China.<br />
Schaefer, MT: 2008, Bios_robotlab_writing_robot.jpg, retrieved from: http://commons.wikimedia.org/wiki/File:Bios_robotlab_writing_robot.jpg<br />
Schulz, D, Burgard, W, Fox, D, Thrun, S, and Cremers, AB: 2000, Web interfaces for mobile robots in public places, IEEE Robotics & Automation Magazine 7(1): 48-56.<br />
Schurr, MO, Buess, G, Neisius, B, and Voges, U: 2000, Robotics and telemanipulation technologies for endoscopic surgery - a review of the ARTEMIS project, Surgical Endoscopy and Other Interventional Techniques 14(4): 375-381.<br />
Spudis, PD: 2001, The case for renewed human exploration of the Moon, Earth, Moon, and Planets 87(3): 159-169.<br />
Stoker, CR, Burch, DR, Hine, BP III, and Barry, J: 1995, Antarctic undersea exploration using a robotic submarine with a telepresence user interface, IEEE Expert 10(6): 14-23.<br />
Suomela, J and Halme, A: 2001, Cognitive human machine interface of WorkPartner robot, Proceedings of the Intelligent Autonomous Vehicles 2001 Conference (IAV2001), Sapporo, Japan.<br />
Takahashi, M, Suzuki, T, Shitamoto, H, Moriguchi, T and Yoshida, K: 2010, Developing a mobile robot for transport applications in the hospital domain, Robotics and Autonomous Systems 58(7): 889-899.<br />
The Robotic Industries Association: 1992, ANSI/RIA R15.06: Industrial Robots and Robot Systems - Safety Requirements, ANSI Standard.<br />
Trahanias, P, Burgard, W, Argyros, A, Hahnel, D, Baltzakis, H, Pfaff, P, and Stachniss, C: 2005, TOURBOT and WebFAIR: web-operated mobile robots for tele-presence in populated exhibitions, IEEE Robotics & Automation Magazine 12(2): 77-89.<br />
Wada, K, Shibata, T, Musha, T and Kimura, S: 2008, Robot therapy for elders affected by dementia, IEEE Engineering in Medicine and Biology Magazine 27(4): 53-60.<br />
Winfield, AFT and Holland, OE: 2000, The application of wireless local area network technology to the control of mobile robots, Microprocessors and Microsystems 23: 597-607.<br />
Yamato, J, Brooks, R, Shinozawa, K, and Naya, F: 2003, Human-robot dynamic social interaction, NTT Technical Review 1(6): 37-43.<br />
Yasuda, T, Suehiro, N and Tanaka, K: 2009, Strategies for collision prevention of a compact powered wheelchair using SOKUIKI sensor and applying fuzzy theory, Proceedings of IEEE International Conference on Robotics and Biomimetics (ROBIO): 202-208, Guilin, China.<br />
Index<br />
Robot<br />
Robotics<br />
Human-robot interaction<br />
Teleoperation<br />
Telepresence<br />
Anthropomorphism<br />
Biography<br />
Jun-Ming Lu received his Ph.D. degree in industrial engineering and engineering management from National Tsing Hua University, Taiwan, in 2009. He is currently a postdoctoral researcher at the Gerontechnology Research Center, Yuan Ze University. His research interests are ergonomics, digital human modeling, and gerontechnology.<br />
Tzung-Cheng Tsai received his Ph.D. degree in mechanical engineering from Yuan Ze University, Taiwan, in 2007. He is currently a researcher at the Green Energy & Environment Research Laboratories, Industrial Technology Research Institute, Taiwan. His research interests are telepresence, teleoperation, and green energy.<br />
Yeh-Liang Hsu received his Ph.D. degree in mechanical engineering from Stanford University, United States, in 1992. He is currently a professor in the Department of Mechanical Engineering, the director of the Gerontechnology Research Center, and the secretary general of Yuan Ze University, Taiwan. His research interests are mechanical design, design optimization, and gerontechnology.<br />