
International Journal of Advanced Robotic Systems

Hand Motion-Based Remote Control Interface with Vibrotactile Feedback for Home Robots

Regular Paper

Juan Wu 1, Guifang Qiao 1, Jun Zhang 1, Ying Zhang 1 and Guangming Song 1,*

1 School of Instrument Science and Engineering, Southeast University, Nanjing, China
* Corresponding author E-mail: [email protected]

Received 9 May 2012; Accepted 3 May 2013

DOI: 10.5772/56617

© 2013 Wu et al.; licensee InTech. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract This paper presents the design and implementation of a handheld interface system for the locomotion control of home robots. A handheld controller is proposed to implement hand motion recognition and hand motion-based robot control. The handheld controller provides a 'connect-and-play' service that lets users control the home robot with visual and vibrotactile feedback. Six natural hand gestures are defined for navigating the home robots. A three-axis accelerometer is used to detect the hand motions of the user. The recorded acceleration data are analysed and classified into the corresponding control commands according to their characteristic curves. A vibration motor provides vibrotactile feedback to the user when an improper operation is performed. The performance of the proposed hand motion-based interface and that of a traditional keyboard-and-mouse interface have been compared in robot navigation experiments. The experimental results of home robot navigation show that the success rate of the handheld controller is 13.33% higher than that of the PC-based controller, its precision is 15.4% higher and its execution time is 24.7% shorter. This means that the proposed hand motion-based interface is more efficient and flexible.

Keywords Hand Motion Recognition, Home Robot, Control Interface, Handheld Controller

1. Introduction

In recent years, more and more mobile robots have moved out of industry and into home environments. As their size and cost have decreased significantly, home robots are now available as one of the most popular consumer electronic products [1]. More and more home robots are working around us and helping us in our daily lives. A wide variety of home robots have been proposed to do housework such as cooking, cleaning, houseplant watering and pet feeding. They are also widely used in home security, entertainment, rehabilitation training and home care for the elderly [2-5].

As home robots get closer to our daily lives, the question arises: how should we interact with them? Complicated control interfaces designed for skilled workers and experts are not suitable for ordinary home users, who prefer simple and natural interaction with home robots through voice and gestures. This requires a user-friendly interface that allows the robot to understand voice and gesture commands. A voice interface is suitable for a simple call-and-come service for home robots [6], but it is not suitable for continuous remote control and often fails to work due to the interference of ambient noise. Therefore, hand gesture or hand motion-based interfaces are more suitable for the control of home robots.

Several methods have been proposed for hand gesture recognition, such as marker-based gesture recognition, vision-based motion recognition, haptic-based motion recognition and EMG-based hand motion recognition [7]. In [8], a real-time hand gesture recognition system based on difference image entropy using a stereo camera is introduced. The proposed method shows an average recognition rate of 85%. Other hand gesture recognition methods use wearable sensors, accelerometers, angular rate sensors and data gloves to detect hand gestures. In [9], a data glove is used for 3D hand motion tracking and gesture recognition. In [10], the authors proposed a set of recognition algorithms (TC, FAcaGMM and FEC) and evaluated them on a data glove with 13 different types of grasps and ten in-hand manipulations. In [11-12], the authors used wearable sensors to recognize hand gestures and daily activities in a smart assisted-living system to help elderly people, patients and the disabled. In [13], a wearable wristwatch-type controller is introduced to offer a unified way to control various devices using simple and effective hand motion gestures. In [14], an accelerometer combined with a visual tracker is used to detect hand movements in a system for human-computer interaction.

Compared with wearable sensor-based control interfaces, a handheld interface is more suitable for controlling home robots. Since people are used to this kind of control mode, they can learn to use it quickly and easily, whereas wearable sensors fixed on the body of a person are neither convenient nor flexible to use when he or she wants to control a robot. In [15], the authors introduce a handheld interface system for 3D interaction with digital media content. The system can track the full six-degrees-of-freedom position and orientation of a handheld controller, and the gesture recognition depends on acceleration and position measurements. A hand-gesture-based control interface for navigating a car robot is introduced in [16], where a three-axis accelerometer is adopted to record the hand trajectories of the user. In [17-18], the authors proposed a handheld system for recognizing hand motions that contains three MEMS accelerometers and a Bluetooth wireless module.

Hand gestures can usually be described by the tilt angles of the hand, so tilt sensors can be used to detect the angles of a hand gesture. In [19], a tilt sensor was designed using standard accelerometers; its accuracy is 0.3° over the full measurement range of pitch and roll. In [20], the authors used a Kalman filter to estimate inclination from the signals of a three-axis accelerometer for measuring the inclination of body segments and activities of daily living (ADL). This method is nearly twice as accurate as methods based on low-pass filtering of accelerometer signals. In [21], a three-axis accelerometer and a dual-axis angular rate sensor are utilized for orientation sensing in mobile virtual environments. The paper presents a technical and theoretical description of an orientation-aware device, which is used for navigating large images spread out on a virtual hemispheric space in front of the user through a mobile display. In [22], a novel approach for hand gesture recognition is introduced. In [23], a hand gesture recognition system is implemented to detect hand gestures in any orientation. The system is integrated on an interactive robot, allowing real-time hand gesture interaction with the robot: gestures are translated into goals for the robot, telling it where to go.

Vibration motors are often used in handheld interfaces to provide vibrotactile feedback. In [15], a vibration motor and a voice-coil actuator are adopted to provide vibrotactile feedback. In [24], vibrotactile actuators are used in a handheld input device to provide spatial and directional information. In [25], a vibrotactile feedback approach for posture guidance is introduced, and in [26], multi-day training with vibrotactile feedback for virtual object manipulation is studied. The experimental results show that participants are able to use the vibrotactile feedback to improve their performance in virtual object manipulation.

In this paper, we present a hand motion-based remote control interface with vibrotactile feedback for home robots. A handheld controller is proposed to implement hand motion recognition and hand motion-based robot control. The handheld controller provides a 'connect-and-play' service for the users to control the home robot, and it implements visual and vibrotactile feedback. Six simple hand gestures, easy to use for untrained people, are defined for the locomotion control of the robot. A three-axis accelerometer is used to detect the hand motions of the user. The recorded acceleration data are analysed and classified into the corresponding control commands according to their characteristic curves. A vibration motor provides vibrotactile feedback to the user when an improper operation is performed.

The remainder of this paper is organized as follows. Section 2 introduces the overall system architecture, which includes the handheld controller and the home robot. The hardware design of the handheld controller is presented in Section 3. The methods of hand motion recognition and hand motion-based robot control are presented in Section 4. The experimental results on the performance of the prototype system are given in Section 5. Concluding remarks are given in Section 6.

2. System description

The conceptual architecture of the hand motion-based control system is shown in Figure 1.

Figure 1. Conceptual system architecture. The user controls the robot by rotating the handheld controller. The hand motions are recognized by the handheld controller and converted to motion commands.

The system consists of a handheld controller and a home surveillance robot. The handheld controller has a three-axis accelerometer and a vibration motor for user interaction. It can sense the accelerations of the hand motions of the user and convert those acceleration values to robot control commands. The commands are sent to the robot to implement various motion control applications. There are three alternative schemas for establishing the wireless communication links between the handheld controller and the robot, i.e., Wi-Fi, Bluetooth and TCP/IP. The home surveillance robot used in our system is shown in Figure 2. It is a palm-sized mobile robot with an on-board camera for home security applications. The detailed design work of this robot has been presented in [5].

Figure 2. The home surveillance robot.

The software structure of the control system is shown in Figure 3. The embedded programs in the handheld controller include a hand motion recognizer, a graphical user interface (GUI) and a vibrotactile feedback generator. The embedded program in the robot implements basic locomotion behaviours and other high-level behaviours such as video transmission, cruising, docking and recharging. The obstacle information and the current status of the robot are also useful when no wall-mounted or ceiling-mounted cameras are available.

Figure 3. Software structure of the hand motion-based control system.

3. Handheld controller

A prototype of the handheld controller is shown in Figure 5. The size of the controller is 137mm×90mm×31mm and its weight is about 250g. The handheld controller is a portable device designed for controlling home robots by hand motions. It includes a core board, a microcontroller unit (MCU), a three-axis accelerometer, a Wi-Fi adapter, a touch screen, two USB ports and a vibration motor, as shown in Figure 4.

Figure 4. Hardware structure of the handheld controller.

The core board is a small embedded computer system that provides most of the communication interfaces needed here. A standalone MCU is adopted for acceleration acquisition and processing in order to ensure good real-time performance. It communicates with the core board through a serial port. Users can access the Internet via the Wi-Fi adapter for remote robot control. Therefore, the controller can provide the 'connect-and-play' service for the users.

It can also provide sufficient bandwidth for video transmission. A touchscreen LCD module is used to give the necessary visual feedback for the users to implement remote control. It also allows touch input for the users to manage and configure the control system.

Figure 5. Implemented prototype of the handheld controller. The size of the handheld controller is 137mm×90mm×31mm (L×W×H). Its weight is 252g.

4. Hand motion-based control

4.1 Coordinate System

The first step to implementing hand motion recognition by the handheld controller is the definition of reference frames. The three-dimensional Cartesian coordinate system of the handheld controller is shown in Figure 6. The three-axis accelerometer is assumed to be fixed to the origin of the coordinate system, and its sensing axes coincide with the axes depicted in Figure 6. The positive direction of the Z-axis is defined as the direction going vertically out from the screen of the device. The positive direction of the X-axis is defined as the direction to the right side when a user holds the device and sees the screen. The positive direction of the Y-axis will then be the front direction in which the device is pointing.

Figure 6. 3-dimensional coordinate system of the handheld controller.
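With this frame fixed, the gravity vector measured by the accelerometer already encodes how far the controller is tilted. As an illustration only (the paper classifies acceleration trend curves rather than computing explicit angles, although Section 4.3 does derive the Distance/Angle command field from the tilt angle), the following C sketch shows the standard static-tilt computation under the axis convention above; the function and type names are ours, not the authors'.

```c
#include <math.h>

static const double RAD2DEG = 57.29577951308232;

/* One accelerometer sample in the controller frame of Figure 6:
 * X to the user's right, Y pointing forward, Z out of the screen.
 * Units: m/s^2. At rest face-up, (ax, ay, az) is roughly (0, 0, 9.8). */
typedef struct { double ax, ay, az; } accel_sample_t;

/* Textbook static-tilt estimate from gravity alone, valid only while the
 * controller is held still: roll is rotation about the Y-axis (the R/L/U/S
 * tilts), pitch is rotation about the X-axis (the F/B tilts). */
static void tilt_from_gravity(const accel_sample_t *s,
                              double *roll_deg, double *pitch_deg)
{
    *roll_deg  = atan2(s->ax, s->az) * RAD2DEG;
    *pitch_deg = atan2(s->ay, s->az) * RAD2DEG;
}
```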
4.2 Hand Motion Recognition

The handheld controller is a 3D rigid body that can be rotated about the three orthogonal axes. According to aviation terminology, these rotations will be referred to as yaw, pitch and roll. The orientation of the rigid body is often represented by using Euler angles, which are composed of those three elemental rotation angles. The first rotation, about the Z-axis, is called yaw; the next, about the X-axis, is called pitch; and the last, about the Y-axis, is called roll. Any orientation can be achieved by composing those three elemental rotations.

In our work, all of the planned hand motions for robot control are simple gestures, each of which contains only one of the three elemental rotations. Gestures composed of more than one elemental rotation are too complicated for this kind of application: the real-time control performance is not acceptable when too many complicated gestures are used, and they would also be too difficult for the user to learn and use. So we directly use the three measured acceleration values to describe the rotation of the handheld controller.

Figure 7. Hand motion set for controlling the home robot. The controller is in the right hand of the user in this example. R (L) represents the command to turn the robot right (left) when the user tilts the controller to the right (left). F (B) represents the command to move the robot forward (backward) when the user tilts the controller forward (backward). U represents the command to make a U-turn when the user tilts the controller 180° to the right. S represents the command to stop when the user tilts the controller 180° to the left.

We have designed six simple hand motions for home robot navigation and control, as shown in Figure 7.

The hand motion patterns are all commonly used in daily life. A user can easily repeat all these actions well without any special training. There is a significant difference between any two of the hand motion patterns, so it is easy for the handheld controller to accurately distinguish among the different hand motion patterns. We can then translate these hand motions into the corresponding robot control commands. This kind of user interface design allows us to ensure that a user-independent and user-friendly control system is implemented.

R (L) represents the command to turn the robot right (left) when the user tilts the controller to the right (left). F (B) represents the command to move the robot forward (backward) when the user tilts the controller forward (backward). U represents the command to make a U-turn when the user tilts the controller 180° to the right with the back of the controller facing up. S represents the command to stop when the user tilts the controller 180° to the left with the back of the controller facing up.

Group | X-axis | Y-axis | Hand Motion | Position of MAX | Position of MIN
------|--------|--------|-------------|-----------------|----------------
1     | ×      | ↗      | F           | Finishing       | Beginning
1     | ×      | ↘      | B           | Beginning       | Finishing
2     | ↗      | ×      | R           | Finishing       | Beginning
2     | ↘      | ×      | L           | Beginning       | Finishing
2     | ↗, ↘  | ×      | U           | Middle          | ×
2     | ↘, ↗  | ×      | S           | ×               | Middle

Table 1. Hand motion characteristics for controlling the home robot. '↗' represents that the N-axis (N = X or Y) acceleration increases monotonically and '↘' that it decreases monotonically. '↗, ↘' represents increasing at first and decreasing later; '↘, ↗' is the opposite of '↗, ↘'. '×' marks an axis or position with no significant trend.
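Table 1 is essentially a small decision table, so the classification step can be written as a direct lookup. A minimal C sketch of that transcription follows; the enum and function names are ours, not code from the paper.

```c
/* Trend labels for one axis over the detection window, following Table 1. */
typedef enum {
    TREND_NONE,      /* axis stays within the +/-2 m/s^2 band       */
    TREND_UP,        /* monotonic increase                           */
    TREND_DOWN,      /* monotonic decrease                           */
    TREND_UP_DOWN,   /* increases at first, then decreases           */
    TREND_DOWN_UP    /* decreases at first, then increases           */
} trend_t;

typedef enum { MOTION_NONE, MOTION_R, MOTION_L, MOTION_F,
               MOTION_B, MOTION_U, MOTION_S } motion_t;

/* Direct transcription of Table 1: group-1 motions (F/B) show a Y-axis
 * trend with a quiet X-axis; group-2 motions (R/L/U/S) show an X-axis
 * trend with a quiet Y-axis. Anything else is rejected as no detection. */
static motion_t classify_motion(trend_t x, trend_t y)
{
    if (x == TREND_NONE) {                      /* group 1: pitch gestures */
        if (y == TREND_UP)   return MOTION_F;
        if (y == TREND_DOWN) return MOTION_B;
    } else if (y == TREND_NONE) {               /* group 2: roll gestures  */
        if (x == TREND_UP)      return MOTION_R;
        if (x == TREND_DOWN)    return MOTION_L;
        if (x == TREND_UP_DOWN) return MOTION_U;
        if (x == TREND_DOWN_UP) return MOTION_S;
    }
    return MOTION_NONE;
}
```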

continuously ng the accelera

The acceleraly during the h

motion, the acle that in the oompared to t

hand motions ion or a single, we can cs of every ht hand motion

Robot CommanMotion D/A S

‘R’ * ‘L’ * ‘F’ * ‘B’ * ‘U’ * ‘S’ *

n hand motion a

y used in dailys well withoutdifference betwo it is easy foistinguish amwe can tran

ng robot condesign allows uer-friendly con

he robot right e right (left). Fhe robot forwcontroller forwd to make a U-o the right withS represents e controller 18acing up.

of Position MIN

g Beginninng Finishing Beginnin

ng Finishin ×

Middle

ontrolling the hX or Y) acceleresents decreasing at first

recognizes thation values otions in the thand motionscceleration in

other two axesthe previous are designed t

gle pitch rotaget the un

hand motion ns.

nd Speed Tail C

* 0x55* 0x55* 0x55* 0x55* 0x55* 0x55

and robot contr

y life. t any ween r the

mong nslate ntrol us to ntrol

(left) F (B) ward ward -turn h the

the 80° to

of

ng ng ng ng

e

home ration easing

and

he six of the three s, but n one s will

one, to be

ation. nique

and

CRC* * * * * *

rol

The7, hdatadiffacceshowmotacce

Figu

Figu

Figu

e six hand mohave been cara to assess erent hand eleration chawn in Figurtions are reelerations in t

ure 8.Acceleratio

ure 9. Accelerati

ure 10.Accelerat

otion trials, wrried out to g

the accelermotions.

anges duringres 8-13. Theeflected by the three axes

on characteristic

on characteristi

ion characterist

which are shogather typicalration charaExample pl

g the hand e directions the variatio

s.

cs of the R moti

ics of the L moti

tics of the F mot

wn in Figurel accelerationacteristics oflots of themotions areof the handn trends of

on.

ion.

tion.

e n f e e d f

5Juan Wu, Guifang Qiao, Jun Zhang, Ying Zhang and Guangming Song: Hand Motion-Based Remote Control Interface with Vibrotactile Feedback for Home Robots

www.intechopen.com

Figure 11. Acceleration characteristics of the B motion.

Figure 12. Acceleration characteristics of the U motion.

Figure 13. Acceleration characteristics of the S motion.

The hand motions can be divided into two groups. One group is where the handheld controller is revolved about the X-axis, while the other one is where the controller is revolved about the Y-axis. The acceleration of the rotation axis almost remains within a threshold range of [-2m/s², +2m/s²]. Therefore, the first step of the hand motion recognition is to find the axis around which the handheld controller is revolved. For example, as shown in Figure 8 and Figure 9, the Y-axis acceleration remains almost unchanged during the R/L roll motion while the X-axis and the Z-axis accelerations increase or decrease significantly.

The second step is to compare the variation trends of the other two axes to determine which of the hand motions has been made. The variation trends of the six hand motions are shown in Table 1, and the algorithm of the variation trends recognition is presented in Figure 14. The acceleration values of each axis are sampled at a fixed 10ms interval, and each hand motion is detected within a 1.5s time span, so we can get a 150-sample acceleration data sequence for each hand motion. The data from the acceleration sensor first need to pass through a moving-average filter to make the data curve change smoothly. By comparing the two continuous values, Sample_NEW and Sample_OLD, MAX(i) and MIN(j) (where i and j are their positions in the sequence, respectively) can be obtained and the variation trends can be examined. Finally, we obtain the style of the hand motion, the robot speed and the target distance/angle, which are needed for the robot control commands.

Figure 14. The process of the variation trends recognition. Motion represents the hand motion type. Speed is used to control the robot speed. D/A represents the distance or angle which the robot needs to adjust.
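Putting the two steps together, the per-window processing described above (10ms sampling, a 1.5s window, moving-average smoothing, then locating the quiet axis and the MAX/MIN positions) can be sketched as follows. This is our illustrative reconstruction of the procedure in Figure 14, not the authors' code: the filter length is an assumption, the names are ours, and the output would feed the classify_motion() lookup shown after Table 1.

```c
#include <math.h>
#include <stddef.h>

#define N_SAMPLES  150   /* 1.5 s window sampled every 10 ms            */
#define QUIET_BAND 2.0   /* +/-2 m/s^2 threshold on the rotation axis   */
#define FILTER_TAPS 5    /* moving-average length (our assumption)      */

/* Causal moving-average filter smoothing one axis of the window. */
static void moving_average(const double in[N_SAMPLES], double out[N_SAMPLES])
{
    for (size_t i = 0; i < N_SAMPLES; i++) {
        size_t start = (i < FILTER_TAPS - 1) ? 0 : i - (FILTER_TAPS - 1);
        double sum = 0.0;
        size_t n = 0;
        for (size_t k = start; k <= i; k++, n++)
            sum += in[k];
        out[i] = sum / (double)n;
    }
}

/* Step 1: an axis is "quiet" (i.e., it is the rotation axis) if its
 * acceleration stays inside the [-2, +2] m/s^2 band for the whole window. */
static int axis_is_quiet(const double a[N_SAMPLES])
{
    for (size_t i = 0; i < N_SAMPLES; i++)
        if (fabs(a[i]) > QUIET_BAND)
            return 0;
    return 1;
}

/* Step 2: locate the positions of the maximum and minimum samples.
 * Per Table 1, whether MAX/MIN fall at the beginning, middle or finishing
 * part of the window distinguishes the motions within a group. */
static void find_extrema(const double a[N_SAMPLES], size_t *imax, size_t *imin)
{
    *imax = *imin = 0;
    for (size_t i = 1; i < N_SAMPLES; i++) {
        if (a[i] > a[*imax]) *imax = i;
        if (a[i] < a[*imin]) *imin = i;
    }
}
```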

4.3 Robot Control Commands

Once one of the hand motions shown in Figure 7 is identified, it is converted to the corresponding robot control command by the control algorithm running in the controller. The process of the control algorithm is shown in Figure 15. If the controller does not receive error feedback from the robot, a new hand motion will be detected. The controller then identifies whether the hand motion is correct or not. If the hand motion is recognized correctly, it will be mapped into the robot control command. All operating status messages will be updated on the GUI, and the users can also get feedback from the vibration motor.

Figure 15. Control algorithm of the handheld controller.

The seven-byte command packet contains the Header, Length, Motion, Distance/Angle, Speed, Tail and CRC fields. The Header, Length and Tail fields are predefined. The Motion field is consistent with the hand motions. The Distance/Angle field is decided by the tilt angle of the handheld controller, and the Speed field is determined by the velocity of the hand motions. The CRC field ensures that command decoding errors can be detected. The details of the mapping relation are listed in Table 2.
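Given Table 2, assembling a command is mostly a matter of filling the predefined bytes around the three runtime fields. A sketch in C follows; note that the check byte is a placeholder, since the paper does not specify which CRC variant is used, so we show a simple XOR over the preceding bytes purely as an assumption.

```c
#include <stdint.h>

#define PKT_HEADER 0xAA   /* predefined, per Table 2        */
#define PKT_LENGTH 0x07   /* packet length in bytes         */
#define PKT_TAIL   0x55   /* predefined, per Table 2        */

/* Build the seven-byte command packet of Section 4.3 into out[7]:
 * Header | Length | Motion | Distance/Angle | Speed | Tail | CRC.
 * 'motion' is the ASCII code 'R','L','F','B','U' or 'S'; dist_angle and
 * speed are the runtime fields marked '*' in Table 2. */
static void build_command(uint8_t out[7], uint8_t motion,
                          uint8_t dist_angle, uint8_t speed)
{
    out[0] = PKT_HEADER;
    out[1] = PKT_LENGTH;
    out[2] = motion;
    out[3] = dist_angle;
    out[4] = speed;
    out[5] = PKT_TAIL;

    /* Placeholder check byte: XOR of the first six bytes. The actual
     * CRC polynomial used by the authors is not given in the paper. */
    uint8_t crc = 0;
    for (int i = 0; i < 6; i++)
        crc ^= out[i];
    out[6] = crc;
}
```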
4.4 Graphical User Interface

A graphical user interface (GUI) is not needed for the hand motion-based control system when the robot works locally and stays in sight. However, in most cases the robot is at a remote site and cannot be seen by the user with only his or her naked eyes. Therefore, a GUI has to be designed to provide the necessary visual feedback for the users to implement remote control. The GUI of the handheld controller is shown in Figure 16. A cross-shaped virtual track is used to display the current pose of the handheld controller, so the user immediately knows what command is sent to the robot. For example, if the user tilts the controller to the right to perform a standard R motion, the right branch of the track will light up to remind the user that an R motion has been generated. If the circle in the centre of the track gives out light, it indicates that the handheld controller is in its default level posture with no motion commands generated. However, we do not suggest using this method to stop the robot, since this level posture is hard to reach and hold by the user.

Figure 16. Graphical user interface of the handheld controller.

4.5 Vibrotactile Feedback Generator

A vibration motor is used to alert the user with vibrotactile feedback when an improper operation is performed. As shown in Figure 15, there are many operating statuses that need to be fed back to the users. The vibrotactile feedback generator determines the vibration signals based on these statuses. The vibrotactile feedback varies in vibration frequencies and vibration counts. The frequency of the motor's sinusoidal vibration can be altered by changing the voltage level applied to the motor. The characteristics of the vibration motor output are shown in Figure 17.

Figure 17. Characteristics of the vibration motor output.
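The generator therefore only has to map each status from Figure 15 to a (frequency, count) pair and replay it on the motor. A sketch of such a mapping follows, with hypothetical status names and drive values: the paper does not list the exact frequencies or burst counts, and the low-level drive (e.g., a PWM duty cycle setting the voltage level) is hardware-specific.

```c
/* Operating statuses fed back to the user. The names are ours; Figure 15
 * defines the actual set (e.g., unrecognized motion, robot error). */
typedef enum {
    STATUS_MOTION_UNRECOGNIZED,
    STATUS_ROBOT_ERROR,
    STATUS_LINK_LOST
} status_t;

/* One vibrotactile cue: per Section 4.5 the feedback varies in vibration
 * frequency and in the number of vibration bursts. */
typedef struct {
    unsigned freq_hz;  /* set via the voltage level applied to the motor */
    unsigned count;    /* number of bursts */
} vibe_pattern_t;

/* Example assignment only; real values would be tuned on the motor whose
 * output characteristics are plotted in Figure 17. */
static vibe_pattern_t pattern_for(status_t s)
{
    switch (s) {
    case STATUS_MOTION_UNRECOGNIZED: return (vibe_pattern_t){120, 1};
    case STATUS_ROBOT_ERROR:         return (vibe_pattern_t){180, 3};
    case STATUS_LINK_LOST:           return (vibe_pattern_t){ 80, 2};
    }
    return (vibe_pattern_t){0, 0};
}
```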

5. Experiments

Some experiments have been carried out to evaluate the performance of the proposed hand motion-based control system. The robot command execution rate, the hand motion recognition rate of the handheld controller and the performance of the hand motion-based control interface in robot navigation applications have been investigated.

5.1 Robot Commands Execution

The control flexibility of the proposed hand motion-based control interface is directly influenced by the control response sensitivity of the robot. If the control interface generates and sends the control commands to the robot too fast, the robot will not be able to process and execute every command in time. Some commands will be discarded, which will cause the control process to be unsmooth and unstable. If the control interface generates and sends the control commands too slowly, the user will become aware of the processing delay, and the control efficiency will then be unacceptable.

The response sensitivity of the robot can be quantified by its command execution rate (CER). The CER changes when we adjust the command detection interval (CDI) of the control interface. For most control applications, a CER of 100% is needed. Therefore, we have carried out characterization tests to determine the shortest CDI that enables the highest CER. Five tests with different CDIs have been done, and 30 control commands are sent to the robot in each test. The test results are shown in Figure 18. It is clear that when the CDI is less than 100ms, the CER increases with the CDI. When the CDI equals 100ms, the CER is 97%. If the CDI increases further, the CER remains almost unchanged. In consideration of real-time control, a CDI of 100ms is chosen for our control system.

Figure 18. Command execution rate (CER) changes with the command detection interval (CDI).
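In other words, the controller deliberately rate-limits outgoing commands so that the robot can keep up. A minimal sketch of such a throttle, assuming a hypothetical millisecond tick source and the 100ms CDI chosen above:

```c
#include <stdbool.h>
#include <stdint.h>

#define CDI_MS 100  /* command detection interval chosen in Section 5.1 */

/* millis() is a placeholder for whatever monotonic millisecond counter
 * the platform provides (e.g., a hardware timer on the MCU). */
extern uint32_t millis(void);

/* Returns true at most once per CDI window; the caller only samples a
 * new hand motion and emits a command when this gate opens. */
static bool cdi_gate_open(void)
{
    static uint32_t last_ms;
    uint32_t now = millis();
    if ((uint32_t)(now - last_ms) >= CDI_MS) {
        last_ms = now;
        return true;
    }
    return false;
}
```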
5.2 Hand Motions Recognition

Several factors can affect the results of hand motion recognition. Firstly, all the 3D hand motions are made in free space, so it is impossible to define a fixed trajectory for any motion type. Secondly, the same hand motion will be made differently each time due to individual user differences. Even for the same person, the same hand motion will be made with a different range and speed each time.

We therefore selected five subjects to participate in the hand motion recognition tests without special training beforehand. Each person was asked to repeat the same hand motion 20 times, so each hand motion was tested 100 times. Finally, 600 test results of the six hand motions were recorded in the database, as shown in Table 3. Both the single hand motion recognition rates and the average recognition rate were calculated. The average recognition rate of the six hand motions is 88.5%.

Recognized Motion | R   | L   | F   | B   | U   | S
------------------|-----|-----|-----|-----|-----|-----
Run Count         | 100 | 100 | 100 | 100 | 100 | 100
R                 | 95  | 15  | 4   | 1   | 2   | 0
L                 | 4   | 81  | 0   | 1   | 0   | 8
F                 | 0   | 0   | 89  | 0   | 0   | 0
B                 | 0   | 1   | 6   | 91  | 0   | 0
U                 | 0   | 0   | 0   | 3   | 90  | 6
S                 | 1   | 0   | 0   | 3   | 8   | 85
No Detection      | 0   | 3   | 1   | 1   | 0   | 1
Single motion recognition rate | 95% | 81% | 89% | 91% | 90% | 85%
Average recognition rate | 88.5% | | | | |

Table 3. Recognition results of the hand motions. Columns give the actual motion performed; rows give the recognized result.

5.3 Performance of Handheld Controller

In order to test the performance of the handheld controller in robot navigation applications, a testbed was built in our laboratory. The testbed setup is shown in Figure 19. The side length of each white square on the testbed surface is 30cm. The home robot is ordered to complete the navigation task by following the pink lines in the figure, from position A to position B. The users are asked to use the handheld controller and a PC to remotely control the robot, respectively. Both the handheld controller and the PC run the same high-level control program, but the control program on the PC only allows the users to use the traditional keyboard and mouse interface, which only has several basic buttons to control the robot. The handheld controller, the PC and the home robot connect to a wireless local area network through Wi-Fi.

Figure 19. Testbed setup for the robot navigation tests.

The navigation test is repeated 15 times for each type of control interface. A test is considered to be successful if the deviation between the actual destination and the ideal destination is less than 10cm. The destination deviations of the successful tests are shown in Figure 20. The navigation attempt by the handheld controller succeeds 14 times, while the navigation attempt by the PC succeeds 12 times. The success rate of the navigation attempts by the handheld controller is thus 93.33%, while the success rate of the PC control is only 80%. The precision of the handheld controller is 15.4% higher than that of the PC. The execution times of the navigation tests are shown in Figure 21. The average time to complete the navigation task by the handheld controller is 24.7% less than that by the PC.

Figure 20. Destination deviations of the navigation tests.

Figure 21. Execution times of the navigation tests.

6. Conclusion

We have presented the design and implementation of a hand motion-based interface system for the locomotion control of home robots. A handheld controller with a 3-axis accelerometer and a vibration motor is proposed to implement hand motion recognition and hand motion-based robot control. The handheld controller can provide the 'connect-and-play' service for the users to control the home robot, and it implements the function of visual and vibrotactile feedback. Six simple and natural hand gestures are defined for navigating the home robots. They are easily used by people without special training. The different hand motions are identified by analysing and comparing the recorded acceleration data with the unique acceleration characteristics of every hand motion. The hand motion recognition test results show that the proposed handheld controller achieved an average recognition rate of 88.5%. The experimental results of home robot navigation show that the success rate of the handheld controller is 13.33% higher than that of the PC-based controller, the precision of the handheld controller is 15.4% higher than that of the PC and the execution time is 24.7% less than that of the PC-based controller. This verifies the efficiency and flexibility of the proposed hand motion-based control interface.

In the future, we plan to enrich the hand motions, which will enable the users to control the robot more flexibly, and to improve the hand motion recognition rate.

7. Acknowledgements

The research reported in this paper was carried out at the Robotic Sensor and Control Lab, School of Instrument Science and Engineering, Southeast University, Nanjing, Jiangsu, China.

This work was supported in part by the Natural Science Foundation of China under Grants 60875070 and 60905045, the Natural Science Foundation of Jiangsu Province under Grants BK2009103 and BK2011254 and the Program for New Century Excellent Talents in Universities under Grant NCET-10-0330.

8. References

[1] C. D. Nugent, D. D. Finlay, P. Fiorini, Y. Tsumaki and E. Prassler, "Home automation as a means of independent living", IEEE Trans. Autom. Sci. Eng., vol. 5, no. 1, pp. 1-9, 2008.

[2] R.C. Luo, T.Y. Hsu, T.Y. Lin and K.L. Su, "The development of intelligent home security robot", IEEE Int. Conf. on Mechatronics, Taipei, Taiwan, pp. 422-427, 2005.

[3] G. Song, Y. Zhou, Z. Wei and A. Song, "A smart node architecture for adding mobility to wireless sensor networks", Sens Actuators A Phys, vol. 147, no. 1, pp. 216–221, 2008.


[4] G. Song, K. Yin, Y. Zhou and X. Cheng, “A surveillance robot with hopping capabilities for home security”, IEEE Trans Consum Electron, vol. 55, no. 4, pp. 2034-2039, 2009.

[5] G. Song, H. Wang, J. Zhang and T. Meng, “Automatic docking system for recharging home surveillance robots”, IEEE Trans Consum Electron, vol. 57, no. 2, pp. 428-435, 2011.

[6] Y. Oh, J. Yoon, J. Park, M. Kim and H. Kim, “A name recognition based call-and-come service for home robots”, IEEE Trans Consum Electron, vol. 54, no. 2, pp. 247-253, 2008.

[7] H. Liu, “Exploring Human Hand Capabilities into Embedded Multifingered Object Manipulation”, IEEE Transactions on Industrial Informatics, vol. 7, no. 3, pp. 389-398, 2011.

[8] D. Lee and K. Hong, "Game interface using hand gesture recognition", Proc. Int. Conf. Comput. Sci. Convergence Inf. Technol. (ICCIT), Seoul, Korea, pp. 1092-1097, 2010.

[9] J. Kim, N. D. Thang and T. Kim, “3D hand motion tracking and gesture recognition using a data glove”, IEEE Intl Symp on Industrial Electronics, Seoul, pp. 1013-1018, 2009.

[10] Z. Ju, H. Liu, "A Unified Fuzzy Framework for Human-Hand Motion Recognition", IEEE Transactions on Fuzzy Systems, vol. 19, no. 5, pp. 901-913, 2011.

[11] C. Zhu, W. Sun and W. Sheng, “Wearable sensors based human intention recognition in smart assisted living systems”, IEEE Intl Conf on Information and Automation, Zhangjiajie, China, pp. 954-959, 2008.

[12] C. Zhu and W. Sheng, “Wearable sensor-based hand gesture and daily activity recognition for robot-assisted living”, IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, vol. 41, no. 3, pp. 569-573, 2011.

[13] D. Lee, J. Lim, J. Sunwoo, I. Cho and C. Lee, “Actual remote control: a universal remote control using hand motions on a virtual menu”, IEEE Trans Consum Electron, vol. 55, no. 3, pp. 1439-1446, 2009.

[14] V. A. Prisacariu and I. Reid, “Robust 3d hand tracking for human computer interaction”, IEEE Intl Conf on Automatic Face & Gesture Recognition and Workshops, pp. 368-375, 2011.

[15] S. Kim, G. Park, S. Yim, S. Choi and S. Choi, “Gesture-Recognizing hand-held interface with vibrotactile feedback for 3d interaction”, IEEE Trans Consum Electron, vol. 55, no. 3, pp. 1169-1177, 2009.

[16] X. Wu, M. Su and P. Wang, “A hand-gesture-based control interface for a car-robot”, IEEE/RSJ Intl Conf on Intelligent Robots and Systems, Taipei, Taiwan, pp. 4644-4648, 2010.

[17] R. Xu, S. Zhou and W.J. Li, “MEMS Accelerometer Based Nonspecific-User Hand Gesture Recognition”, IEEE Sensors Journal, vol.12, no. 5, pp. 1166-1173, 2012.

[18] S. Zhou, Q. Shan, F. Fei, W.J. Li, C. Kwong, P. Wu, B. Meng, C. Chan and J. Liou, "Gesture Recognition for Interactive Controllers Using MEMS Motion Sensors", IEEE International Conf on Nano/Micro Engineered and Molecular Systems, Shenzhen, China, pp. 935-940, 2009.

[19] S. Luczak, W. Oleksiuk and M. Bodnicki, “Sensing tilt with MEMS accelerometers”, IEEE Sensors Journal, vol. 6, no. 6, pp. 1669-1675, 2006.

[20] H. J. Luinge and P. H. Veltink, “Inclination measurement of human movement using a 3D accelerometer with autocalibration”, IEEE Trans. Neural Syst. Rehabil. Eng., vol. 12, no. 1, pp. 112-121, 2004.

[21] B. Lee, W. Bang, J. D. K. Kim and C. Y. Kim, “Orientation estimation in mobile virtual environments with inertial sensors”, IEEE Trans Consum Electron, vol. 57, no. 2, pp. 802-810, 2011.

[22] M. Sigalas, H. Baltzakis and P. Trahanias, “Gesture recognition based on arm tracking for human-robot interaction”, IEEE/RSJ Intl Conf on Intelligent Robots and Systems, Taipei, Taiwan, pp. 5424-5429, 2010.

[23] M. Bergh, D. Carton, R. Nijs, N. Mitsou, C. Landsiedel, K. Kuehnlenz, D. Wollherr, L. Gool and M. Buss, “Real-Time 3D hand gesture interaction with a robot for understanding directions from humans”, in IEEE Intl Symp on Robot and Human Interactive Communication, Atlanta, GA, USA, pp. 357-362, 2011.

[24] G. Yang, D. Ryu and S. Kang, "Vibrotactile display for hand-held input device providing spatial and directional information", Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Salt Lake City, UT, USA, pp. 79-84, 2009.

[25] Y. Zheng and J.B. Morrell, “A vibrotactile feedback approach to posture guidance”, IEEE Haptics Symp, Waltham, MA, USA, 2010, pp. 351-358.

[26] Q. An, Y. Matsuoka and C.E. Stepp, “Multi-day training with vibrotactile feedback for virtual object manipulation”, IEEE Intl Conf on Rehabilitation Robotics, Zurich, Switzerland, pp. 1-5, 2011.
