

Hindawi Publishing Corporation, Advances in Human-Computer Interaction, Volume 2016, Article ID 7921295, 10 pages. http://dx.doi.org/10.1155/2016/7921295

Research Article
Kinect-Based Sliding Mode Control for Lynxmotion Robotic Arm

Ismail Ben Abdallah,1,2 Yassine Bouteraa,1,2 and Chokri Rekik1

1 Control and Energy Management Laboratory (CEM-Lab), National School of Engineers of Sfax, Sfax, Tunisia
2 Digital Research Center of Sfax, Technopark of Sfax, BP 275, Sakiet Ezzit, 3021 Sfax, Tunisia

Correspondence should be addressed to Yassine Bouteraa; yassinebouteraa@gmail.com

Received 30 November 2015; Revised 24 May 2016; Accepted 7 June 2016

Academic Editor: Alessandra Agostini

Copyright © 2016 Ismail Ben Abdallah et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Recently, the technological development of manipulator robots has advanced very quickly and provides a positive impact on human life. The implementation of manipulator robot technology offers more efficiency and higher performance for several human tasks. In reality, published efforts in this context focus on implementing control algorithms with already preprogrammed desired trajectories (passive robots case) or on trajectory generation based on feedback sensors (active robots case). However, gesture-based robot control can be considered as another channel of system control which is not widely discussed. This paper focuses on a Kinect-based real-time interactive control system implementation. Based on the LabVIEW integrated development environment (IDE), a developed human-machine interface (HMI) allows the user to control a Lynxmotion robotic arm in real time. The Kinect software development kit (SDK) provides a tool to track the human body skeleton and abstract it into 3-dimensional coordinates. Therefore, the Kinect sensor is integrated into our control system to detect the coordinates of the different user joints. The Lynxmotion dynamics have been implemented in a real-time sliding mode control algorithm. The experimental results are carried out to test the effectiveness of the system, and the results verify the tracking ability, stability, and robustness.

1. Introduction

The use of a manipulator robot controlled by natural human-computer interaction offers new possibilities in the execution of complex tasks in dynamic workspaces. Human-robot interaction (HRI) is a key feature which differentiates the new generation of robots from conventional robots [1, 2].

Many manipulator robots have behavior similar to the human arm. Thanks to gesture recognition using the Kinect sensor, motion planning is easier when the robotic arm has the same degrees of freedom as the human arm. Gesture recognition using the Kinect sensor is based on skeleton detection, and current development packages and toolkits provide a base to identify the joint positions of the detected skeleton.

Kinect is a motion detection and recognition smart sensor which allows human/computer interaction without the need for any physical controllers [3]. Indeed, the Kinect sensor is a motion sensor that provides a natural user interface available for several applications in different fields, including game-based learning systems [4, 5], stroke rehabilitation [6, 7], helping visually impaired people [8, 9], navigation systems [10–12], and other fields [13, 14]. Based on its internal processor, the Kinect sensor can recognize human movement patterns to generate corresponding skeleton coordinates that can be provided in a computing environment such as Matlab [15, 16] or LabVIEW [17, 18]. Therefore, dynamic gesture recognition technology has gained increased attention.

Several approaches have been developed in order to improve recognition or to take advantage of existing algorithms. Reference [19] proposes an easy gesture recognition approach in order to reduce the effort involved in implementing gesture recognizers with Kinect; the practical results with these developed packages are acceptable. Furthermore, an eigenspace-based method was developed in [20] for identifying and recognizing human gestures by using Kinect 3D data. A user adaptation scheme for this method was added to enhance Eigen3Dgesture and the performance of the constructed eigenspace of Kinect 3D data. Based on Microsoft's "Kinect for Windows SDK," [21] uses the API for measuring movement of people with Parkinson's disease.

Gesture control is mainly used for telemanipulation in several modes. In [22], a client/server structured robot teleoperation application system is developed in networked robot mode. Reference [23] describes gesture-based telemanipulation of an industrial robotic arm in master-slave mode for unstructured and hazardous environments; a maximum velocity drift control approach is applied, allowing the amateur user to control the robot by simple gestures, and force control was combined with the gesture-based control to perform safe manipulation. However, this solution is costly and depends on the reliability of the force-torque sensor. In addition, a method of human-robot interaction using markerless Kinect-based tracking of the human hand for teleoperation of a dual robot manipulator is presented in [24]. Based on hand motion, the user can control the robot manipulators to perform pick-and-place tasks; the work was applied in a virtual environment.

For robot teleoperation, much research has been done providing the possibility of controlling robotic systems dedicated to complex tasks [25]. Several techniques are used to control robot motion: a human-robot interface based on hand-gesture recognition is developed in [26–30]. Others are based on hand-arm tracking; [31] presents a method of real-time robot manipulator teleoperation using markerless image-based hand-arm tracking.

Many problems occur during human-robot interaction when using the Kinect sensor. After the acquisition of Kinect 3D data, these data are processed to control the robot. Several approaches have been used to achieve this control. A Cartesian impedance control is used to control a dual robot arm [32]; this approach allows the robot to follow the human arm motion without solving inverse kinematics problems and while avoiding self-collision problems. Likewise, a proportional-derivative control is implemented with an inverse kinematics algorithm in [33]. In [34], a PID controller is also used to control a 2-degree-of-freedom Lego Mindstorms NXT robotic arm.

Based on online HIL (Hardware-in-the-Loop) experimental data, the acquired input data from the Kinect sensor are processed in a closed-loop PID controller with feedback from motor encoders. Although these results are satisfactory from the position control side, they remain imperfect from the velocity control standpoint due to the simple controller used. Moreover, compared to the conventional computed torque control used in [35], the Lynxmotion dynamics have been implemented in a real-time sliding mode control algorithm.

To combine the real and virtual objects and the implemented algorithms, we need to design a user interface that manages the overall system. Indeed, a human-machine interface (HMI) is highly recommended in robotics fields, and many human-machine interfaces have been developed to control robotic systems [36, 37]. In [38], a human-machine interface is designed to control the SCARA robot using the Kinect sensor. This interface is implemented in C using the Kinect library OpenNI.

In this paper, we have used the package provided by the Virtual Instrument Package Manager for LabVIEW. The recognition algorithm calculates the angle of each human arm joint dedicated to control. This algorithm is detailed in Section 2.

Compared to Shirwalkar et al. [23], firstly, the proposed control strategy takes into account the system dynamics. Also, our algorithm detects all user joints, not only the hand, while on the security side we propose a security algorithm which avoids any type of parasite that may occur in the Kinect sensing field. Moreover, the proposed algorithm avoids any singularity position.

Likewise, a proportional-derivative control is implemented with an inverse kinematics algorithm in [33]. However, on the one hand, this classic controller ignores the robot dynamics. On the other hand, this controller is limited to position control and does not consider motion control.

Regarding the work in [38], we develop a generic interface that encompasses the different components of the global system: virtual 3D manipulator, Kinect sensor, feedback values, skeleton image, implemented dynamic model, implemented sliding mode control, implemented recognition algorithm for calculating the joint positions, and implemented security algorithm. This LabVIEW-based user interface is described in Section 5.

The present paper is organized as follows: in Section 2, we present the recognition method. The dynamic model of the Lynxmotion robotic arm is detailed in Section 3. In Section 4, the control design based on the sliding mode control approach is described. The human-machine interface is presented, together with experimental results, in Section 5. Finally, we conclude with a work summary.

2. Recognition Method

Several methods are used to detect human arm movements; the Kinect sensor is the most popular thanks to skeleton detection and depth computing. The Kinect is a motion sensor that can measure the three-dimensional motion of a person. In fact, the Kinect sensor generates a depth image and the (x, y, z) coordinates of the human body. In addition, the Kinect sensor can provide space coordinates for each joint.

To enjoy these benefits, we need an Application Programmer's Interface (API) to the Kinect hardware and its skeletal tracking software. Microsoft's "Kinect for Windows SDK" was used to provide an API to the Kinect hardware. This API provides an estimate of the position of 20 anatomical landmarks in 3D at a frequency of 30 Hz and a spatial and depth resolution of 640×480 pixels, as shown in Figure 1. The default smoothing parameters are as follows:

(i) Correction factor = 0.5
(ii) Smoothing factor = 0.5
(iii) Jitter radius = 0.05 m
(iv) Maximum deviation radius = 0.04 m
(v) Future prediction = 0 frames
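The SDK's skeletal smoothing is a double-exponential (Holt) filter parameterized by exactly these values. As a rough illustration only, the following Python sketch applies a simplified version of such a filter to a stream of 3D joint positions; the update rules approximate the idea and are not the SDK's exact implementation:

```python
import numpy as np

def smooth_joint(raw, smoothing=0.5, correction=0.5,
                 jitter_radius=0.05, max_dev_radius=0.04):
    """Simplified double-exponential joint smoother (illustrative, not the
    SDK's exact algorithm). raw: (N, 3) array of joint positions in meters."""
    filtered = raw[0].copy()
    trend = np.zeros(3)
    out = [filtered.copy()]
    for x in raw[1:]:
        # Jitter reduction: scale down frame-to-frame moves below jitter_radius
        delta = x - filtered
        dist = np.linalg.norm(delta)
        if 0.0 < dist <= jitter_radius:
            x = filtered + delta * (dist / jitter_radius)
        prev = filtered
        filtered = (1 - smoothing) * x + smoothing * (filtered + trend)
        trend = correction * (filtered - prev) + (1 - correction) * trend
        # Keep the estimate within max_dev_radius of the raw measurement
        dev = filtered - x
        dev_len = np.linalg.norm(dev)
        if dev_len > max_dev_radius:
            filtered = x + dev * (max_dev_radius / dev_len)
        out.append(filtered.copy())
    return np.array(out)
```

The jitter-radius clamp suppresses sub-centimeter sensor noise, while the maximum-deviation clamp bounds the lag the smoother can introduce.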


Figure 1: Kinect for Windows SDK detected joints.

Figure 2: Operational layout of the Kinect LabVIEW toolkit (Start → Config → Read → Render → Stop).

Based on the Microsoft Kinect SDK, the Kinesthesia Toolkit for Microsoft Kinect was developed at the University of Leeds, initially for medical rehabilitation and surgical tools. In addition, the toolkit helps NI LabVIEW programmers access and use the popular functions of the Microsoft Kinect camera, such as RGB video, depth camera, and skeletal tracking. The toolkit comes fully packaged using JKI's VI Package Manager.

In addition, the toolkit allows the user to initialize and close the Kinect's different components through polymorphic VIs (RGB camera, depth camera, and skeletal tracking), depending on the functionalities they require. The operational layout of the Kinect LabVIEW toolkit is described in Figure 2.

The first step is to initialize the Kinect; this act opens the device and returns a reference. The second step is to configure the Kinect. In this step, we can choose the RGB stream format, enable skeleton smoothing, and select the appropriate events (video feedback from the RGB camera, 3D skeleton information, or depth information). In the next step, we can get skeleton, depth, and video information from the Kinect in a main loop, such as a while loop. Additional implementations can be put in the same loop. Finally, we stop the Kinect and close the communication.
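The lifecycle above (initialize → configure → read loop → stop) can be sketched in textual form; the class and method names below are illustrative stand-ins, not the actual Kinesthesia VI names, and the returned joint data is a hard-coded placeholder:

```python
# Hypothetical wrapper mirroring the toolkit's initialize -> configure ->
# read-loop -> close lifecycle. Names and data are illustrative only.
class KinectSession:
    def initialize(self):
        """'Start': open the device and return a reference to it."""
        self.open = True
        return self

    def configure(self, rgb=True, skeleton=True, smoothing=True):
        """'Config': choose stream format and enable the desired events."""
        self.events = {"rgb": rgb, "skeleton": skeleton, "smoothing": smoothing}

    def read_skeleton(self):
        """'Read'/'Render': one frame of joint data (placeholder values)."""
        return {"right_shoulder": (0.1, 0.4, 2.0), "right_elbow": (0.2, 0.1, 2.0)}

    def close(self):
        """'Stop': release the device and close communication."""
        self.open = False

kinect = KinectSession().initialize()
kinect.configure()
frames = [kinect.read_skeleton() for _ in range(3)]   # stands in for the while loop
kinect.close()
```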

In this paper, a user interface containing a skeleton display screen has been developed, as shown in Figure 3. The screen has been reserved for the skeleton display and the user video recording.

Figure 3: Skeleton and user display screen.

Figure 4: Human arm and manipulator robot model (joint angles θ1, θ2, θ3 at the shoulder and elbow).

In this work, we focus on the human arm portion of the skeleton data. In fact, the human arm can be considered as a 4-DOF manipulator arm similar to the Lynxmotion robotic arm. In our case, a robot manipulator model has been developed with 3 DOF. Therefore, we assume that the user has the opportunity to control the robot based on his right arm gestures. The three considered movements are as follows:

(i) Flexion-extension for the right shoulder joint

(ii) Abduction-adduction for the right shoulder joint

(iii) Flexion-extension for the right elbow joint

The similarity between the human arm and the robot manipulator model is shown in Figure 4.

After the acquisition of each elementary joint position, an additional calculation is performed to determine the desired speed and acceleration. All these parameters will be considered as the desired trajectories for the manipulator robot control algorithm.
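Per frame, this reduces to computing angles between limb vectors from the joint coordinates and then differentiating the angle sequences numerically. A minimal Python sketch follows; the plane conventions and the use of a hip joint as trunk reference are hypothetical (the paper does not give its exact formulas), and the 30 Hz rate matches the SDK frame rate:

```python
import numpy as np

def vec_angle(u, v):
    """Angle between two 3D vectors, in radians."""
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))

def arm_angles(shoulder, elbow, wrist, hip):
    """Rough joint-angle extraction from Kinect joint coordinates.
    Conventions here are illustrative assumptions, not the paper's."""
    upper = elbow - shoulder          # upper-arm vector
    fore = wrist - elbow              # forearm vector
    trunk = hip - shoulder            # downward trunk reference vector
    theta_elbow = vec_angle(upper, fore)    # elbow flexion-extension
    theta_flex = vec_angle(upper, trunk)    # shoulder flexion-extension
    # Shoulder abduction-adduction: angle of the x-y plane projections
    theta_abd = vec_angle(upper * np.array([1, 1, 0]),
                          trunk * np.array([1, 1, 0]))
    return theta_flex, theta_abd, theta_elbow

def derivatives(q, dt=1.0 / 30.0):
    """Desired velocity and acceleration by finite differences at 30 Hz."""
    qd = np.gradient(q, dt)
    qdd = np.gradient(qd, dt)
    return qd, qdd
```

A straight arm hanging along the trunk gives zero flexion and elbow angles; bending the forearm forward by 90 degrees gives an elbow angle of π/2.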

3. Dynamic Model of Lynxmotion Robotic Arm

The dynamic model deals with the torques and forces causing the motion of the mechanical structure. In this section, the robotic arm dynamic model has been computed based on the Euler-Lagrange formulation. The potential and kinetic energies of each link have been computed by the two following equations:

$$u_i = -m_i\, g^{T}\, {}^{i}P_i, \qquad K_i = \frac{1}{2}\, m_i\, v_{ci}^{T} v_{ci} + \frac{1}{2}\, {}^{i}\omega_i^{T}\, {}^{i}I_i\, {}^{i}\omega_i \quad (1)$$

where $m_i$ is the $i$th link mass, $v_{ci}$ is the $i$th link linear velocity, ${}^{i}\omega_i$ is the $i$th link angular velocity, ${}^{i}I_i$ is the $i$th inertia tensor, and ${}^{i}P_i$ is the $i$th link position.

After further calculation, the dynamic model of a fully actuated robot manipulator is expressed as follows:

$$M(q)\,\ddot{q} + N(q,\dot{q}) = \tau \quad (2)$$

where $\tau$ is the $n \times 1$ vector of applied generalized torques, $M(q)$ is the $n \times n$ symmetric, positive definite inertia matrix, $N(q,\dot{q})$ is the vector of nonlinear terms, and $q \in \mathbb{R}^{n}$.

The vector of nonlinear terms comprises the centrifugal, Coriolis, and gravity terms described in the following equations:

$$N(q,\dot{q}) = V(q,\dot{q}) + G(q), \qquad V(q,\dot{q}) = B(q)\,[\dot{q}\dot{q}] + C(q)\,[\dot{q}^{2}] \quad (3)$$

Hence, the robot manipulator dynamic equation can be written as

$$\tau = M(q)\,\ddot{q} + B(q)\,[\dot{q}\dot{q}] + C(q)\,[\dot{q}^{2}] + G(q) \quad (4)$$

where $B(q)$ is the matrix of Coriolis torques, $C(q)$ is the matrix of centrifugal torques, $[\dot{q}\dot{q}]$ is the vector of joint velocity products $[\dot{q}_1\dot{q}_2,\ \dot{q}_1\dot{q}_3,\ \ldots,\ \dot{q}_1\dot{q}_n,\ \dot{q}_2\dot{q}_3,\ \ldots]^{T}$, and $[\dot{q}^{2}]$ is the vector $[\dot{q}_1^{2},\ \dot{q}_2^{2},\ \dot{q}_3^{2},\ \ldots]^{T}$.

In our work, a 3-DOF manipulator model has been developed using the Lagrange formulation. The goal was to determine the Coriolis, centrifugal, and gravitational matrices for real implementation reasons.
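To give a concrete sense of what the matrices in equations (2)-(4) look like once derived, here is an illustrative closed-form inverse dynamics for a 2-link planar arm (a stand-in for the paper's 3-DOF Lynxmotion model, which is not given explicitly; all numeric parameters are made up):

```python
import numpy as np

def two_link_dynamics(q, qd, m=(1.0, 0.8), l=(0.15, 0.12),
                      lc=(0.075, 0.06), I=(2e-3, 1.5e-3), g=9.81):
    """Closed-form M(q), C(q, qd), G(q) for a 2-link planar arm.
    Illustrative stand-in; masses, lengths, and inertias are invented."""
    m1, m2 = m
    l1, _ = l
    lc1, lc2 = lc
    I1, I2 = I
    c2, s2 = np.cos(q[1]), np.sin(q[1])
    # Inertia matrix (symmetric, positive definite)
    M = np.array([
        [m1 * lc1**2 + I1 + m2 * (l1**2 + lc2**2 + 2 * l1 * lc2 * c2) + I2,
         m2 * (lc2**2 + l1 * lc2 * c2) + I2],
        [m2 * (lc2**2 + l1 * lc2 * c2) + I2, m2 * lc2**2 + I2]])
    # Coriolis/centrifugal matrix written so that C @ qd gives those torques
    h = -m2 * l1 * lc2 * s2
    C = np.array([[h * qd[1], h * (qd[0] + qd[1])],
                  [-h * qd[0], 0.0]])
    # Gravity vector
    G = np.array([(m1 * lc1 + m2 * l1) * g * np.cos(q[0])
                  + m2 * lc2 * g * np.cos(q[0] + q[1]),
                  m2 * lc2 * g * np.cos(q[0] + q[1])])
    return M, C, G

def inverse_dynamics(q, qd, qdd):
    """tau = M(q) qdd + C(q, qd) qd + G(q), the shape of equation (4)."""
    M, C, G = two_link_dynamics(q, qd)
    return M @ qdd + C @ qd + G
```

A useful sanity check on any such derivation is the structural property that $\dot{M} - 2C$ is skew-symmetric, which the stability proof in Section 4 relies on.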

4. Control Design

Robot manipulators have highly nonlinear dynamic parameters. For this reason, the design of a robust controller is required. In this section, we study the motion controller "sliding mode control."

The dynamic model of the robot manipulator is described as follows:

$$M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q) = \tau \quad (5)$$

where $q = [q_1\ q_2\ q_3]^{T}$ and $\tau = [\tau_1\ \tau_2\ \tau_3]^{T}$.

Let $q_d(t)$ denote the desired trajectory. The tracking error is defined as

$$e = q_d(t) - q(t) \quad (6)$$

Define $\dot{q}_r = \dot{q}_d + \mu\,(q_d - q)$, where $\mu$ is a positive definite matrix.

According to the linearity of the robot dynamics in their parameters, we obtain

$$\hat{M}(q)\,\ddot{q}_r + \hat{C}(q,\dot{q})\,\dot{q}_r + \hat{g}(q) = \gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r)\,\hat{a} \quad (7)$$

where $\hat{a}$ is the vector of estimated robot parameters and $\gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r)$ is a regressor matrix.

Define

$$\tilde{M}(q) = M(q) - \hat{M}(q), \qquad \tilde{C}(q,\dot{q}) = C(q,\dot{q}) - \hat{C}(q,\dot{q}), \qquad \tilde{g}(q) = g(q) - \hat{g}(q) \quad (8)$$

The sliding vector is selected as

$$s = \dot{e} + \mu\, e \quad (9)$$

We propose the sliding mode control law as follows:

$$\tau = \hat{M}(q)\,\ddot{q}_r + \hat{C}(q,\dot{q})\,\dot{q}_r + \hat{g}(q) + s + k\,\mathrm{sgn}(s) \quad (10)$$

where $k_i = \sum_j \bar{\gamma}_{ij}\,\bar{a}_j$, with the bounds $\bar{a}_i$ and $\bar{\gamma}_{ij}$ defined such that

$$\forall i,\ |\tilde{a}_i| \le \bar{a}_i, \qquad \forall i,j,\ |\gamma_{ij}| \le \bar{\gamma}_{ij} \quad (11)$$

where $\tilde{a} = a - \hat{a}$ is the parameter estimation error.

Theorem 1. The proposed controller guarantees the asymptotic stability of the system.

Proof. Select the Lyapunov function candidate as

$$V(t) = \frac{1}{2}\, s^{T} M(q)\, s$$

so that

$$\dot{V}(t) = s^{T} M(q)\,\dot{s} + \frac{1}{2}\, s^{T} \dot{M}(q)\, s = s^{T} M(q)\,\dot{s} + s^{T} C(q,\dot{q})\, s = s^{T}\left[ M(q)\,\ddot{q}_r + C(q,\dot{q})\,\dot{q}_r + g(q) - \tau \right] \quad (12)$$

Replacing $\tau$ by its expression (10) yields

$$\dot{V}(t) = s^{T}\left[ \tilde{M}(q)\,\ddot{q}_r + \tilde{C}(q,\dot{q})\,\dot{q}_r + \tilde{g}(q) - k\,\mathrm{sgn}(s) - s \right] = s^{T}\left[ \gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r)\,\tilde{a} - k\,\mathrm{sgn}(s) - s \right] \quad (13)$$

where $\gamma = [\gamma_{ij}]$ such that $|\gamma_{ij}| \le \bar{\gamma}_{ij}$ and $\tilde{a} = [\tilde{a}_i]$ such that $|\tilde{a}_i| \le \bar{a}_i$.

This yields

$$\dot{V}(t) = \sum_i \sum_j s_i\,\gamma_{ij}\,\tilde{a}_j - \sum_i s_i\, k_i\, \mathrm{sgn}(s_i) - \sum_i s_i^{2} = \sum_i \sum_j s_i\,\gamma_{ij}\,\tilde{a}_j - \sum_i \sum_j |s_i|\,\bar{\gamma}_{ij}\,\bar{a}_j - \sum_i s_i^{2} \le -\sum_i s_i^{2} \le 0 \quad (14)$$

This proves the stability.
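The behavior that the proof guarantees can be exercised on a toy single-link plant: even with deliberately wrong parameter estimates, the law (10) with sliding vector (9) keeps the tracking error small. The following Python sketch is a simplified, scalar stand-in for the paper's 3-DOF controller; all numeric values (plant parameters, estimates, gains, rate) are illustrative:

```python
import numpy as np

# Single-link plant: J*qddot + b*qdot + mgl*cos(q) = tau.
# Controller uses wrong estimates plus the switching term of law (10).
J, b, mgl = 0.05, 0.10, 0.60          # "true" plant parameters (invented)
Jh, bh, mglh = 0.04, 0.08, 0.50       # deliberately wrong estimates
mu, k = 20.0, 2.0                     # sliding-surface slope, switching gain
dt, steps = 1e-3, 4000

q, qdot = 0.0, 0.0
errs = []
for i in range(steps):
    t = i * dt
    q_d, qd_d, qdd_d = np.sin(t), np.cos(t), -np.sin(t)   # desired trajectory
    e, edot = q_d - q, qd_d - qdot
    s = edot + mu * e                                     # sliding vector (9)
    qr_dot = qd_d + mu * e                                # reference velocity
    qr_ddot = qdd_d + mu * edot                           # reference acceleration
    # Control law (10): estimated feedforward + s + k*sgn(s)
    tau = Jh * qr_ddot + bh * qr_dot + mglh * np.cos(q) + s + k * np.sign(s)
    qddot = (tau - b * qdot - mgl * np.cos(q)) / J        # true plant response
    qdot += qddot * dt
    q += qdot * dt
    errs.append(abs(e))
```

The switching gain `k` must dominate the bound on the parameter mismatch, which is exactly the role of the bounds in (11); in practice the sign function is often replaced by a saturation to reduce chattering.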


Figure 5: System overview. The user's gesture is captured by the Kinect sensor over USB 2.0; the Kinect toolkit extracts the skeleton data, from which joint positions, velocities, and accelerations are computed as desired positions. The sliding mode controller, built on the dynamic model $M_i(q_i)\ddot{q}_i + C_i(q_i,\dot{q}_i)\dot{q}_i + g_i(q_i) = \tau_i$ within the control design and simulation module, converts the computed torque into the desired pulse width sent via USB 2.0 to the Arduino board driving the Lynxmotion AL5D, while the real joint positions (real pulse width) are fed back; the human-machine interface also drives the virtual robot.

Figure 6: First screen of the user interface.

Figure 7: Third screen of the user interface.

5. Experiment Results

In the beginning, we had some packages to install. Indeed, the packages provided through JKI's VI Package Manager, namely the Kinesthesia Toolkit for Microsoft Kinect, the LabVIEW Interface for Arduino (LIFA), and the Control Design and Simulation Module, are strictly required for the implementation and then for the proper functioning of the system.

Figure 8: Fourth screen of the user interface.

In this section, we present a LabVIEW-based human-machine interface. The virtual robot has been designed in a 3D LabVIEW virtual environment. The virtual robot has the same kinematic parameters as the real robot, such as joint axes and link distances. A user interface screen has been reserved to display the virtual robot movement versus the real robot.

The control law based on sliding mode control has been successfully implemented with the parameters discussed in the previous section.

Going from the gesture applied by the human arm, the Kinect sensor captures the human arm motion and abstracts it into 3D coordinates. Using the Kinect for Windows SDK, the skeleton data are provided and transmitted to the recognition algorithm. Based on the Kinesthesia Toolkit for Microsoft Kinect, the human arm joint positions, velocities, and accelerations are computed and considered as the desired trajectories of the robotic arm. With the help of the LabVIEW control design and simulation module, the developed dynamic model of the 3-DOF Lynxmotion robotic arm is implemented. Also, a sliding mode control algorithm is implemented, ensuring high performance in trajectory tracking. Furthermore, the computed torque is transmitted to the servomotors moving the mechanical structure of the robotic arm. Moreover, the desired joint positions are applied in the virtual environment to move the virtual robot, in order to prove the compatibility of the desired motion in both the real and virtual environments. Hence, an augmented reality approach is proposed to validate the control system.

Figure 9: User position checking process.

Figure 10: Butterworth-based designed filter (inputs: signal, order, low cutoff frequency, sampling frequency, enable smoothing; output: filtered signal).

Figure 11: Avoid-invalid-position sub-VI (angle in → avoid NaN values → angle out).

The system overview is presented in Figure 5. In our experiment, we use only the first three joints; the remaining joints are not considered. Note that the user interface contains four screens.

The first screen is designed for settings and hardware configuration. The screen displays the tracking process state, such as invalid positions, joint positions and 3D coordinates, and user position checking. A security algorithm has been developed to secure the tracking process. Note that when the security program indicates a skeleton detection error for one reason or another, the last calculated (already validated) angle of each joint will be sent to the robot until the operator solves the problem. Based on the feedback node, which returns the last valid result when given a nonnumeric input, the security algorithm is developed and abstracted in a sub-VI for import into the user interface. Figure 11 illustrates the imported sub-VI. In addition, the user can also set the Arduino board communication, such as connection type, baud rate, and Arduino board type, and select the Arduino resource.
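The feedback-node behavior can be mimicked in a few lines. A sketch in Python, assuming (as in the sub-VI of Figure 11) that NaN marks an invalid angle from a lost skeleton:

```python
import math

class AngleGuard:
    """Security filter mimicking a LabVIEW feedback node: whenever tracking
    returns an invalid (NaN or missing) joint angle, resend the last
    validated angle instead."""

    def __init__(self, initial=0.0):
        self.last_valid = initial

    def __call__(self, angle):
        if angle is None or math.isnan(angle):
            return self.last_valid        # skeleton lost: hold last good value
        self.last_valid = angle           # valid reading: update and pass through
        return angle
```

One guard instance per controlled joint reproduces the "hold until the operator solves the problem" behavior described above.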

In the tracking process, the operator's arm can twitch, leading to unwanted random movement. In order to avoid this undesired movement, we propose a Butterworth low-pass filter-based signal smoothing algorithm. In the same screen, the user can enable the smoothing algorithm by pressing the button control. The designed filter is abstracted in a sub-VI, as shown in Figure 10. The filtered trajectories, presented as the desired trajectories in Figures 15, 16, and 17, demonstrate the effectiveness of the filtering algorithm.
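An equivalent smoothing step can be sketched with SciPy; the cutoff frequency and filter order below are illustrative choices (the paper does not state its values), and the 30 Hz sampling rate matches the Kinect frame rate:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_trajectory(x, cutoff_hz=2.0, fs=30.0, order=2):
    """Low-pass Butterworth smoothing of a joint-angle trajectory.
    cutoff_hz and order are illustrative, not the paper's values."""
    b, a = butter(order, cutoff_hz / (fs / 2.0))   # normalized cutoff
    return filtfilt(b, a, x)                       # zero-phase filtering
```

`filtfilt` applies the filter forward and backward, so the smoothed trajectory has no phase lag, which matters when the output is used as a real-time-like desired trajectory.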

Finally, the stop button is set in the developed interface to stop the tracking process and close all connected hardware. This screen is developed as shown in Figure 6.

The second screen, presented in Figure 3, is designed to record the user's video and display the skeleton. Figure 7 shows the third screen, which proves the augmented reality. On this page, we show the virtual and real environments. The user and the virtual robot will be displayed in the user video portion, whereas the virtual robot will be presented in the virtual environment software.

The remaining screen describes the joint graphs and the operator's video recording during the experimentation, as shown in Figure 8.

After running, the operator must select the first screen, called Kinect Feedback, to adjust his position. A user position check is triggered to secure the operator's position in the Kinect sensing field. A green color of the signaling LED indicates the good state of the recognition process. Otherwise, the LED color changes to red to indicate a tracking problem. Figure 9 describes the executed screen in the user position checking process.

Figure 12: Augmented reality process.

Figure 13: User and skeleton display.

Based on the Kinect toolkit for LabVIEW, the operator's skeleton data will be read by the LabVIEW interface. Then a gesture recognition method is executed to determine each joint angle. The calculated joint positions are considered as desired trajectories for the sliding mode control implemented in the developed software. Ultimately, the computed torque will be sent to the Atmel-based Arduino card via USB protocol. Therefore, the servomotors of the first three joints will be actuated to move the mechanical structure of the Lynxmotion robot.

A three-degree-of-freedom virtual robot is used to perform this experiment. Essentially, the augmented reality evaluated both the ability of the robot manipulator to copy human arm motions and the ability of the user to use the human-robot manipulator interface. The proposed controller is validated by the augmented reality. In fact, by building a virtual robot in LabVIEW alongside the real robot, as shown in Figure 12, the virtual robot movements are similar to those of the real robot. Therefore, the proposed control method is validated.

Figure 14: Gesture control process and real-time graphs.

Figure 13 shows how the user can control the manipulator robot based on his detected skeleton.

A series of photographs depicting the gesture control process execution every five seconds, together with real-time graphs, is presented in Figure 14.

In the teleoperation process, the tracking error convergence is very important, and the performance of our system is based on the tracking error optimization. Indeed, in the case where an elementary joint cannot track its corresponding human arm joint, the robotic arm will not be able to perform the desired task because of the accumulation of each joint position error. In this case, the end-effector localization differs from the desired placement. Therefore, the user must adapt his gesture to the current end-effector position to reduce the end-effector placement error. Hence, the adaptive gestures estimated by the user require particular intelligence and expertise. In another case, when the user applies the desired gesture with a variable velocity, the control algorithm must respect the applied velocities and accelerations to preserve the convergence of the tracking error. In order to overcome this constraint, the use of a high-performance sliding mode controller is highly required.

Figures 15, 16, and 17 show the trajectory of each joint together with its desired trajectory, both as functions of time. The illustrated results show the effectiveness of the proposed control approach. The illustrated response time is only about one second, which can be regarded as excellent under experimental conditions. This control strategy should be improved if we hope to exploit this system in telemedicine applications. Moreover, the developed teleoperation approach can be used for industrial applications, in particular for pick-and-place tasks in hazardous or unstructured environments. The real-time teleoperation system offers a beneficial platform for managing unexpected scenarios.

6 Conclusion

This paper applies the gesture control to the robot manip-ulators Indeed in this work we use the Kinect technol-ogy to establish a human-machine interaction (HMI) Thedeveloped control approach can be used in all applications

Advances in Human-Computer Interaction 9

Desired joint 1 positionMeasured joint 1 position

200100 15050 250 300 350 40000Time (s)

040608

11214

Am

plitu

de (r

ads)

Figure 15 Desired and measured joint 1 position

50 100 150 200 250 300 350 40000Time (s)

002040608

112

Am

plitu

de (r

ads) Desired joint 2 position

Measured joint 2 position

Figure 16 Desired and measured joint 2 position

50 100 150 200 250 300 350 40000Time (s)

012345

Am

plitu

de (r

ads)

Desired joint 3 positionMeasured joint 3 position

Figure 17 Desired and measured joint 3 position

which require real-time human-robot cooperation. Trajectory tracking and system stability have been ensured by sliding mode control (SMC). Control and supervision have both been provided by a user-friendly human-machine interface developed in LabVIEW. Experiments have shown the high efficiency of the proposed approach. Furthermore, the Kinect-based teleoperation system is validated by augmented reality. In conclusion, the proposed control scheme provides a feasible teleoperation system for many applications, such as manipulation in hazardous and unstructured environments. However, there are some constraints when applying the proposed method, such as the sensitivity of the desired trajectory generated by the human arm, even in the case of random and unwanted movements. Indeed, the workspace and the regional constraints must be respected in the control scheme by filtering out unacceptable trajectories. Moreover, for pick-and-place applications, the grasping task is not considered in the proposed telemanipulation system. In perspective, this drawback can be overcome by using the electromyography biosignal for grasping task control.

Competing Interests

The authors declare that they have no competing interests.

References

[1] Y. Bouteraa, J. Ghommam, N. Derbel, and G. Poisson, "Nonlinear control and synchronization with time delays of multiagent robotic systems," Journal of Control Science and Engineering, vol. 2011, Article ID 632374, 9 pages, 2011.

[2] Y. Bouteraa, J. Ghommam, G. Poisson, and N. Derbel, "Distributed synchronization control to trajectory tracking of multiple robot manipulators," Journal of Robotics, vol. 2011, Article ID 652785, 10 pages, 2011.

[3] G. A. M. Vasiljevic, L. C. de Miranda, and E. E. C. de Miranda, "A case study of MasterMind chess: comparing mouse/keyboard interaction with Kinect-based gestural interface," Advances in Human-Computer Interaction, vol. 2016, Article ID 4602471, 10 pages, 2016.

[4] C.-H. Tsai, Y.-H. Kuo, K.-C. Chu, and J.-C. Yen, "Development and evaluation of game-based learning system using the Microsoft Kinect sensor," International Journal of Distributed Sensor Networks, vol. 2015, Article ID 498560, 10 pages, 2015.

[5] C.-H. Chuang, Y.-N. Chen, L.-W. Tsai, C.-C. Lee, and H.-C. Tsai, "Improving learning performance with happiness by interactive scenarios," The Scientific World Journal, vol. 2014, Article ID 807347, 12 pages, 2014.

[6] K. J. Bower, J. Louie, Y. Landesrocha, P. Seedy, A. Gorelik, and J. Bernhardt, "Clinical feasibility of interactive motion-controlled games for stroke rehabilitation," Journal of NeuroEngineering and Rehabilitation, vol. 12, no. 1, article 63, 2015.

[7] M. Strbac, S. Kocovic, M. Markovic, and D. B. Popovic, "Microsoft Kinect-based artificial perception system for control of functional electrical stimulation assisted grasping," BioMed Research International, vol. 2014, Article ID 740469, 12 pages, 2014.

[8] N. Kanwal, E. Bostanci, K. Currie, and A. F. Clark, "A navigation system for the visually impaired: a fusion of vision and depth sensor," Applied Bionics and Biomechanics, vol. 2015, Article ID 479857, 16 pages, 2015.

[9] H.-H. Pham, T.-L. Le, and N. Vuillerme, "Real-time obstacle detection system in indoor environment for the visually impaired using Microsoft Kinect sensor," Journal of Sensors, vol. 2016, Article ID 3754918, 13 pages, 2016.

[10] D. Tuvshinjargal, B. Dorj, and D. J. Lee, "Hybrid motion planning method for autonomous robots using Kinect based sensor fusion and virtual plane approach in dynamic environments," Journal of Sensors, vol. 2015, Article ID 471052, 13 pages, 2015.

[11] T. Xu, S. Jia, Z. Dong, and X. Li, "Obstacles regions 3D-perception method for mobile robots based on visual saliency," Journal of Robotics, vol. 2015, Article ID 720174, 10 pages, 2015.

[12] W. Shang, X. Cao, H. Ma, H. Zang, and P. Wei, "Kinect-based vision system of mine rescue robot for low illuminous environment," Journal of Sensors, vol. 2016, Article ID 8252015, 9 pages, 2016.

[13] H. Kim and I. Kim, "Dynamic arm gesture recognition using spherical angle features and hidden Markov models," Advances in Human-Computer Interaction, vol. 2015, Article ID 785349, 7 pages, 2015.

[14] A. Chavez-Aragon, R. Macknojia, P. Payeur, and R. Laganiere, "Rapid 3D modeling and parts recognition on automotive vehicles using a network of RGB-D sensors for robot guidance," Journal of Sensors, vol. 2013, Article ID 832963, 16 pages, 2013.

[15] A. Prochazka, O. Vysata, M. Valis, and R. Yadollahi, "The MS Kinect use for 3D modelling and gait analysis in the Matlab environment," in Proceedings of the 21st Annual Conference Technical Computing 2013, Prague, Czech Republic, 2013.

[16] B. Y. L. Li, A. S. Mian, W. Liu, and A. Krishna, "Using Kinect for face recognition under varying poses, expressions, illumination and disguise," in Proceedings of the IEEE Workshop on Applications of Computer Vision (WACV '13), pp. 186–192, Tampa, Fla, USA, January 2013.

[17] C. Muhiddin, D. B. Phillips, M. J. Miles, L. Picco, and D. M. Carberry, "Kinect 4 holographic optical tweezers," Journal of Optics, vol. 15, no. 7, Article ID 075302, 2013.

[18] P. Qiu, C. Ni, J. Zhang, and H. Cao, "Research of virtual simulation of traditional Chinese bone setting techniques based on Kinect skeletal information," Journal of Shandong University of Traditional Chinese Medicine, no. 3, pp. 202–204, 2014.

[19] R. Ibanez, A. Soria, A. Teyseyre, and M. Campo, "Easy gesture recognition for Kinect," Advances in Engineering Software, vol. 76, pp. 171–180, 2014.

[20] I.-J. Ding and C.-W. Chang, "An eigenspace-based method with a user adaptation scheme for human gesture recognition by using Kinect 3D data," Applied Mathematical Modelling, vol. 39, no. 19, pp. 5769–5777, 2015.

[21] B. Galna, G. Barry, D. Jackson, D. Mhiripiri, P. Olivier, and L. Rochester, "Accuracy of the Microsoft Kinect sensor for measuring movement in people with Parkinson's disease," Gait & Posture, vol. 39, no. 4, pp. 1062–1068, 2014.

[22] K. Qian, H. Yang, and J. Niu, "Developing a gesture based remote human-robot interaction system using Kinect," International Journal of Smart Home, vol. 7, no. 4, pp. 203–208, 2013.

[23] S. Shirwalkar, A. Singh, K. Sharma, and N. Singh, "Telemanipulation of an industrial robotic arm using gesture recognition with Kinect," in Proceedings of the IEEE International Conference on Control, Automation, Robotics and Embedded Systems (CARE '13), Jabalpur, India, December 2013.

[24] G. Du and P. Zhang, "Markerless human–robot interface for dual robot manipulators using Kinect sensor," Robotics and Computer-Integrated Manufacturing, vol. 30, no. 2, pp. 150–159, 2014.

[25] A. Hernansanz, A. Casals, and J. Amat, "A multi-robot cooperation strategy for dexterous task oriented teleoperation," Robotics and Autonomous Systems, vol. 68, pp. 156–172, 2015.

[26] T. Fong, F. Conti, S. Grange, and C. Baur, "Novel interfaces for remote driving: gesture, haptic and PDA," in Telemanipulator and Telepresence Technologies VII, vol. 4195 of Proceedings of SPIE, pp. 300–311, 2000.

[27] E. Ueda, Y. Matsumoto, M. Imai, and T. Ogasawara, "A hand-pose estimation for vision-based human interfaces," IEEE Transactions on Industrial Electronics, vol. 50, no. 4, pp. 676–684, 2003.

[28] J. Zhang and A. Knoll, "A two-arm situated artificial communicator for human–robot cooperative assembly," IEEE Transactions on Industrial Electronics, vol. 50, no. 4, pp. 651–658, 2003.

[29] B. Ionescu, D. Coquin, P. Lambert, and V. Buzuloiu, "Dynamic hand gesture recognition using the skeleton of the hand," EURASIP Journal on Applied Signal Processing, vol. 13, pp. 2101–2109, 2005.

[30] Y. Kim, S. Leonard, A. Shademan, A. Krieger, and P. C. W. Kim, "Kinect technology for hand tracking control of surgical robots: technical and surgical skill comparison to current robotic masters," Surgical Endoscopy, vol. 28, no. 6, pp. 1993–2000, 2014.

[31] J. Kofman, S. Verma, and X. Wu, "Robot-manipulator teleoperation by markerless vision-based hand-arm tracking," International Journal of Optomechatronics, vol. 1, no. 3, pp. 331–357, 2007.

[32] R. C. Luo, B.-H. Shih, and T.-W. Lin, "Real time human motion imitation of anthropomorphic dual arm robot based on Cartesian impedance control," in Proceedings of the 11th IEEE International Symposium on Robotic and Sensors Environments (ROSE '13), pp. 25–30, Washington, DC, USA, October 2013.

[33] R. Afthoni, A. Rizal, and E. Susanto, "Proportional derivative control based robot arm system using Microsoft Kinect," in Proceedings of the International Conference on Robotics, Biomimetics, Intelligent Computational Systems (ROBIONETICS '13), pp. 24–29, IEEE, Jogjakarta, Indonesia, November 2013.

[34] M. Al-Shabi, "Simulation and implementation of real-time vision-based control system for 2-DoF robotic arm using PID with hardware-in-the-loop," Intelligent Control and Automation, vol. 6, no. 2, pp. 147–157, 2015.

[35] I. Benabdallah, Y. Bouteraa, R. Boucetta, and C. Rekik, "Kinect-based Computed Torque Control for Lynxmotion robotic arm," in Proceedings of the 7th International Conference on Modelling, Identification and Control (ICMIC '15), pp. 1–6, Sousse, Tunisia, December 2015.

[36] H. Mehdi and O. Boubaker, "Robot-assisted therapy: design, control and optimization," International Journal on Smart Sensing and Intelligent Systems, vol. 5, no. 4, pp. 1044–1062, 2012.

[37] I. Ben Abdallah, Y. Bouteraa, and C. Rekik, "Design of smart robot for wrist rehabilitation," International Journal on Smart Sensing and Intelligent Systems, vol. 9, no. 2, 2016.

[38] C. Pillajo and J. E. Sierra, "Human machine interface HMI using Kinect sensor to control a SCARA robot," in Proceedings of the IEEE Colombian Conference on Communications and Computing (COLCOM '13), pp. 1–5, Medellin, Colombia, May 2013.




3D data. A user adaptation scheme for this method was added to enhance Eigen3Dgesture and the performance of the constructed eigenspace of Kinect 3D data. Based on Microsoft's "Kinect for Windows SDK", [21] uses the API for measuring movement of people with Parkinson's disease.

Gesture control is mainly used for telemanipulation in several modes. In [22], a client/server structured robot teleoperation application system is developed in networked robot mode. Reference [23] describes gesture based telemanipulation of an industrial robotic arm in master-slave mode for unstructured and hazardous environments: a maximum velocity drift control approach is applied, allowing an amateur user to control the robot by simple gestures, and force control is combined with the gesture based control to perform safe manipulation. However, this solution is costly and depends on the reliability of the force-torque sensor. In addition, a method of human-robot interaction using markerless Kinect-based tracking of the human hand for teleoperation of a dual robot manipulator is presented in [24]. Based on hand motion, the user can control the robot manipulators to perform picking and placing tasks; the work was applied in a virtual environment.

For robot teleoperation, much research has been done providing the possibility of controlling robotic systems dedicated to complex tasks [25]. Several techniques are used to control robot motion: a human-robot interface based on hand-gesture recognition is developed in [26–30], while others are based on hand-arm tracking; [31] presents a method of real-time robot manipulator teleoperation using markerless image-based hand-arm tracking.

Many problems occur during human-robot interaction when using the Kinect sensor. After the acquisition of the Kinect 3D data, these data are processed to control the robot, and several approaches have been used to implement the control. A Cartesian impedance control is used to control a dual robot arm in [32]; this approach allows the robot to follow the human arm motion without solving inverse kinematic problems and avoids self-collision problems. Likewise, a proportional derivative control is implemented with an inverse kinematic algorithm in [33]. In [34], a PID controller is used to control a 2-degree-of-freedom Lego Mindstorms NXT robotic arm.

Based on online HIL (Hardware-in-the-Loop) experimental data, the acquired input data from the Kinect sensor are processed in a closed loop PID controller with feedback from motor encoders. Although these results are satisfactory from the position control side, they remain imperfect from the velocity control standpoint due to the simple controller used. Moreover, compared to the conventional computed torque control used in [35], the Lynxmotion dynamics has been implemented here in a real-time sliding mode control algorithm.

To combine the real and virtual objects and the implemented algorithms, we need to design a user interface that manages the overall system. Indeed, a human-machine interface (HMI) is highly recommended in robotics fields, and many human-machine interfaces have been developed to control robotic systems [36, 37]. In [38], a human-machine interface is designed to control a SCARA robot using a Kinect sensor. This interface is implemented in C using the Kinect library OpenNI.

In this paper, we have used the package provided by the JKI VI Package Manager for LabVIEW. The recognition algorithm calculates the angle of each human arm joint dedicated to control. This algorithm is detailed in Section 2.

Compared to Shirwalkar et al. [23], the proposed control strategy firstly takes the system dynamics into account. Also, our algorithm detects all user joints and not only the hand, while on the security side we propose a security algorithm which avoids any type of disturbance that may occur in the Kinect sensing field. Moreover, the proposed algorithm avoids any singularity position.

Likewise, a proportional derivative control is implemented with an inverse kinematic algorithm in [33]. However, on the one hand, this classic controller ignores the robot dynamics; on the other hand, it is limited to position control and does not consider motion control.

Regarding the work in [38], we develop a generic interface that encompasses the different components of the global system: virtual 3D manipulator, Kinect sensor, feedback values, skeleton image, implemented dynamic model, implemented sliding mode control, implemented recognition algorithm for calculating the joint positions, and implemented security algorithm. This LabVIEW-based user interface is described in Section 5.

The present paper is organized as follows: in Section 2 we present the recognition method. The dynamic model of the Lynxmotion robotic arm is detailed in Section 3. In Section 4, the control design based on the sliding mode control approach is described. The human-machine interface is presented, as well as the experimental results, in Section 5. Finally, we conclude with a work summary.

2 Recognition Method

Several methods are used to detect human arm movements; the Kinect sensor is the most popular, thanks to skeleton detection and depth computing. The Kinect is a motion sensor that can measure the three-dimensional motion of a person. In fact, the Kinect sensor generates a depth image (x, y, and z) of the human body. In addition, the Kinect sensor can provide space coordinates for each joint.

To enjoy these benefits, we need an Application Programmer's Interface (API) to the Kinect hardware and its skeletal tracking software. Microsoft's "Kinect for Windows SDK" was used to provide an API to the Kinect hardware. This API provides an estimate of the position of 20 anatomical landmarks in 3D at a frequency of 30 Hz and a spatial and depth resolution of 640 × 480 pixels, as shown in Figure 1. The default smoothing parameters are as follows:

(i) Correction factor = 0.5
(ii) Smoothing factor = 0.5
(iii) Jitter radius = 0.05 m
(iv) Maximum deviation radius = 0.04 m
(v) Future prediction = 0 frames
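To give an intuition of how such parameters interact, the sketch below shows one per-axis smoothing step. This is a simplified illustration inspired by the listed parameters, not the SDK's actual implementation (the SDK uses a Holt double exponential smoothing filter); the function name and blending rules are ours.

```python
def smooth_step(raw, prev_filtered, smoothing=0.5, jitter_radius=0.05,
                max_deviation=0.04):
    """One smoothing step for a single joint coordinate (metres).

    Illustrative sketch only: attenuates jitter inside jitter_radius,
    blends with the previous output, and bounds the lag from the raw data.
    """
    delta = raw - prev_filtered
    # Jitter rejection: movements smaller than the radius are attenuated.
    if abs(delta) <= jitter_radius:
        raw = prev_filtered + delta * (abs(delta) / jitter_radius)
    # Exponential smoothing: blend the new sample with the previous output.
    filtered = smoothing * prev_filtered + (1.0 - smoothing) * raw
    # Never let the output deviate from the raw data by more than the radius.
    if abs(filtered - raw) > max_deviation:
        filtered = raw + max_deviation * (1.0 if filtered > raw else -1.0)
    return filtered
```

A stationary joint passes through unchanged, while a sudden large step is tracked to within the maximum deviation radius.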


Figure 1: Kinect for Windows SDK detected joints.

Figure 2: Operational layout for the Kinect LabVIEW toolkit (Start → Config → Read → Render → Stop).

Based on the Microsoft Kinect SDK, the Kinesthesia Toolkit for Microsoft Kinect was developed at the University of Leeds, initially for medical rehabilitation and surgical tools. The toolkit helps NI LabVIEW programmers to access and use the popular functions of the Microsoft Kinect camera, such as RGB video, depth camera, and skeletal tracking. The toolkit comes fully packaged using JKI's VI Package Manager.

In addition, the toolkit allows the user to initialize and close the Kinect's different components through polymorphic VIs (RGB camera, depth camera, and skeletal tracking), depending on the functionalities they require. The operational layout for the Kinect LabVIEW toolkit is described in Figure 2.

The first step is to initialize the Kinect; this act opens the device and returns a reference. The second step is to configure the Kinect: in this step we can choose the RGB stream format, enable the skeleton smoothing, and select the appropriate event (video feedback from the RGB camera, 3D skeleton information, or depth information). In the next step, we get skeleton, depth, and video information from the Kinect in a main loop, such as a while loop; additional implementations can be put in the same loop. Finally, we stop the Kinect and close the communication.

In this paper, a user interface containing a skeleton display screen has been developed, as shown in Figure 3. The screen

Figure 3: Skeleton and user display screen.


Figure 4: Human arm and manipulator robot model.

has been reserved for the skeleton display and the user video recording.

In this work, we focus on the human arm portion of the skeleton data. In fact, the human arm can be considered as a 4-DOF manipulator arm similar to the Lynxmotion robotic arm. In our case, a robot manipulator model has been developed with 3 DOF. Therefore, we assume that the user has the opportunity to control the robot based on his right arm gestures. The three considered movements are as follows:

(i) Flexion-extension for the right shoulder joint

(ii) Abduction-adduction for the right shoulder joint

(iii) Flexion-extension for the right elbow joint

The similarity between the human arm and the robot manipulator model is shown in Figure 4.

After the acquisition of each elementary joint position, an additional calculation is performed to determine the desired speed and acceleration. All these parameters will be considered as the desired trajectories for the manipulator robot control algorithm.
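As an illustration of this step, the elbow flexion angle can be computed from three skeleton points, and the desired velocity and acceleration obtained by finite differences over successive frames. The sketch below assumes a fixed 30 Hz frame rate; the function names are ours, not the toolkit's.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (rad) formed by segments b->a and b->c,
    e.g. a = shoulder, b = elbow, c = wrist for elbow flexion."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def derivatives(q_prev, q_curr, q_next, dt=1.0 / 30.0):
    """Desired velocity and acceleration by central finite differences
    over three consecutive joint position samples."""
    qd = (q_next - q_prev) / (2.0 * dt)
    qdd = (q_next - 2.0 * q_curr + q_prev) / (dt * dt)
    return qd, qdd
```

For example, a shoulder at the origin, an elbow along x, and a wrist along y yield an elbow angle of 90 degrees.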

3 Dynamic Model of Lynxmotion Robotic Arm

The dynamic model deals with the torques and forces causing the motion of the mechanical structure. In this section, the robotic arm dynamic model has been computed based on the Euler-Lagrange formulation. The potential and kinetic energies of each link have been computed by the two following equations:

$$u_i = -m_i g^T \, {}^{i}P_i, \qquad K_i = \frac{1}{2} m_i v_{ci}^T v_{ci} + \frac{1}{2} \, {}^{i}w_i^T \, {}^{i}I_i \, {}^{i}w_i, \tag{1}$$

where $m_i$ is the $i$th link mass, $v_{ci}$ is the $i$th link linear velocity, ${}^{i}w_i$ is the $i$th link angular velocity, ${}^{i}I_i$ is the $i$th inertia tensor, and ${}^{i}P_i$ is the $i$th link position.

After further calculation, the dynamic model of a fully actuated robot manipulator is expressed as follows:

$$M(q)\ddot{q} + N(q, \dot{q}) = \tau, \tag{2}$$

where $\tau$ is the $n \times 1$ vector of applied generalized torques, $M(q)$ is the $n \times n$ symmetric and positive definite inertia matrix, $N(q, \dot{q})$ is the vector of nonlinearity terms, and $q \in R^{n}$.

The vector of nonlinearity terms arises from the centrifugal, Coriolis, and gravity terms described in the following equations:

$$N(q, \dot{q}) = V(q, \dot{q}) + G(q), \qquad V(q, \dot{q}) = B(q)[\dot{q}\dot{q}] + C(q)[\dot{q}^2]. \tag{3}$$

Hence, the robot manipulator dynamic equation can be written as

$$\tau = M(q)\ddot{q} + B(q)[\dot{q}\dot{q}] + C(q)[\dot{q}^2] + G(q), \tag{4}$$

where $B(q)$ is the matrix of Coriolis torques, $C(q)$ is the matrix of centrifugal torques, $[\dot{q}\dot{q}]$ is the vector of joint velocity products $[\dot{q}_1\dot{q}_2,\ \dot{q}_1\dot{q}_3,\ \ldots,\ \dot{q}_1\dot{q}_n,\ \dot{q}_2\dot{q}_3,\ \ldots]^T$, and $[\dot{q}^2]$ is the vector $[\dot{q}_1^2,\ \dot{q}_2^2,\ \dot{q}_3^2,\ \ldots]^T$.

In our work, a 3-DOF manipulator model has been developed using the Lagrange formulation. The goal was to determine the Coriolis, centrifugal, and gravitational matrices for real implementation reasons.
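Once these matrices are known, equation (4) can be evaluated numerically at each control step. The sketch below shows the computation for a 3-DOF arm; the matrices passed in are placeholders, since the real M, B, C, and G are configuration-dependent expressions derived from the Lagrange formulation.

```python
def matvec(A, x):
    """Multiply an n-by-n matrix (list of rows) by a vector."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def inverse_dynamics(M, B, C, G, qdd, qd):
    """tau = M(q)*qdd + B(q)*[qd.qd] + C(q)*[qd^2] + G(q)  (eq. (4), n = 3)."""
    qd_cross = [qd[0] * qd[1], qd[0] * qd[2], qd[1] * qd[2]]  # [q1q2, q1q3, q2q3]
    qd_sq = [v * v for v in qd]                               # [q1^2, q2^2, q3^2]
    tau = matvec(M, qdd)
    for i in range(3):
        tau[i] += sum(B[i][j] * qd_cross[j] for j in range(3))
        tau[i] += sum(C[i][j] * qd_sq[j] for j in range(3))
        tau[i] += G[i]
    return tau
```

With an identity inertia matrix and zero Coriolis, centrifugal, and gravity terms, the torque reduces to the joint accelerations, which is a quick sanity check of the implementation.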

4 Control Design

Robot manipulators have highly nonlinear dynamic parameters. For this reason, the design of a robust controller is required. In this section, we study the motion controller "sliding mode control".

The dynamic model of the robot manipulator is described as follows:

$$M(q)\ddot{q} + C(q, \dot{q})\dot{q} + g(q) = \tau, \tag{5}$$

where $q = [q_1\ q_2\ q_3]^T$ and $\tau = [\tau_1\ \tau_2\ \tau_3]^T$.

Let $q_d(t)$ denote the desired trajectory. The tracking error is defined as

$$e = q_d(t) - q(t). \tag{6}$$

Define $\dot{q}_r = \dot{q}_d + s(q_d - q)$, where $s$ is a positive definite matrix.

According to the linearity-in-parameters property of robot dynamics, we obtain

$$\hat{M}(q)\ddot{q}_r + \hat{C}(q, \dot{q})\dot{q}_r + \hat{g}(q) = \gamma(q, \dot{q}, \dot{q}_r, \ddot{q}_r)\,\hat{\theta}, \tag{7}$$

where $\hat{\theta}$ is the vector of estimated robot parameters and $\gamma(q, \dot{q}, \dot{q}_r, \ddot{q}_r)$ is a regressor matrix.

Define

$$\tilde{M}(q) = M(q) - \hat{M}(q), \qquad \tilde{C}(q, \dot{q}) = C(q, \dot{q}) - \hat{C}(q, \dot{q}), \qquad \tilde{g}(q) = g(q) - \hat{g}(q). \tag{8}$$

The sliding vector is selected as

$$s = \dot{e} + \mu e. \tag{9}$$

We propose the sliding mode control law as follows:

$$\tau = \hat{M}(q)\ddot{q}_r + \hat{C}(q, \dot{q})\dot{q}_r + \hat{g}(q) + s + k \cdot \mathrm{sgn}(s), \tag{10}$$

where $k_i = \sum_j \bar{\gamma}_{ij}\bar{\theta}_j$.

Define the bounds $\bar{\theta}_i$ and $\bar{\gamma}_{ij}$ such that, for all $i$ and $j$,

$$|\tilde{\theta}_i| \le \bar{\theta}_i, \qquad |\gamma_{ij}| \le \bar{\gamma}_{ij}. \tag{11}$$

Theorem 1. The proposed controller guarantees the asymptotic stability of the system.

Proof. Select the Lyapunov function candidate as

$$V(t) = \frac{1}{2} s^T M(q) s \Longrightarrow \dot{V}(t) = s^T M(q)\dot{s} + \frac{1}{2} s^T \dot{M}(q) s = s^T M(q)\dot{s} + s^T C(q, \dot{q}) s = s^T \left[ M(q)\ddot{q}_r + C(q, \dot{q})\dot{q}_r + g(q) - \tau \right]. \tag{12}$$

Replacing $\tau$ by its expression (10) yields

$$\dot{V}(t) = s^T \left[ \tilde{M}(q)\ddot{q}_r + \tilde{C}(q, \dot{q})\dot{q}_r + \tilde{g}(q) - k \cdot \mathrm{sgn}(s) - s \right] = s^T \left[ \gamma(q, \dot{q}, \dot{q}_r, \ddot{q}_r)\tilde{\theta} - k \cdot \mathrm{sgn}(s) - s \right], \tag{13}$$

where $\gamma(q, \dot{q}, \dot{q}_r, \ddot{q}_r) = [\gamma_{ij}]$ such that $|\gamma_{ij}| \le \bar{\gamma}_{ij}$, and $\tilde{\theta} = [\tilde{\theta}_i]$ such that $|\tilde{\theta}_i| \le \bar{\theta}_i$.

This yields

$$\dot{V}(t) = \sum_i \sum_j s_i \gamma_{ij}\tilde{\theta}_j - \sum_i s_i k_i \,\mathrm{sgn}(s_i) - \sum_i s_i^2 = \sum_i \sum_j s_i \gamma_{ij}\tilde{\theta}_j - \sum_i \sum_j |s_i| \bar{\gamma}_{ij}\bar{\theta}_j - \sum_i s_i^2 \le -\sum_i s_i^2 \le 0. \tag{14}$$

This proves the stability.
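For a single joint with scalar model estimates, the control law (10) reduces to a few lines of code. The sketch below is a minimal numerical illustration; the gains and model estimates are arbitrary placeholders, not the values used in the experiments, and the matrix $s$ in the reference-velocity definition is taken as $\mu I$ for simplicity.

```python
def smc_torque(q, qd, q_des, qd_des, qdd_des,
               M_hat=1.0, C_hat=0.1, g_hat=0.0, mu=5.0, k=2.0):
    """Scalar sliding mode control law:
    tau = M^ * qr_dd + C^ * qr_d + g^ + s + k * sgn(s)   (eq. (10))."""
    e = q_des - q                      # tracking error, eq. (6)
    e_dot = qd_des - qd
    s = e_dot + mu * e                 # sliding vector, eq. (9)
    qr_dot = qd_des + mu * e           # reference velocity (matrix s = mu*I)
    qr_ddot = qdd_des + mu * e_dot     # reference acceleration
    sgn = 0.0 if s == 0 else (1.0 if s > 0 else -1.0)
    return M_hat * qr_ddot + C_hat * qr_dot + g_hat + s + k * sgn
```

Under perfect tracking the sliding vector vanishes and the torque reduces to the model-based feedforward term, while a positive tracking error produces a positive corrective torque.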



Figure 5: System overview.

Figure 6: First screen of the user interface.

Figure 7: Third screen of the user interface.

5 Experiment Results

In the beginning, we had some packages to install. Indeed, the packages provided through JKI's VI Package Manager, such as the Kinesthesia Toolkit for Microsoft Kinect, the LabVIEW Interface for Arduino (LIFA), and the control design and simulation module, are strictly required for the implementation and for proper system functioning.

Figure 8: Fourth screen of the user interface.

In this section, we present a LabVIEW-based human-machine interface. The virtual robot has been designed in a 3D LabVIEW virtual environment. The virtual robot has the same kinematic parameters as the real robot, such as joint axes and link distances. A user interface screen has been reserved to display the virtual robot movement versus the real robot.

The control law based on sliding mode control has been successfully implemented with the parameters discussed in the previous section.

Starting from the gesture applied by the human arm, the Kinect sensor captures the human arm motion and abstracts it into 3D coordinates. Using the Kinect for Windows SDK, the skeleton data are provided and transmitted to the recognition algorithm. Based on the Kinesthesia Toolkit for Microsoft Kinect, the human arm joint positions, velocities, and accelerations are computed and considered as the desired trajectories of the robotic arm. With the help of the LabVIEW control design and simulation module, the developed dynamic model of the 3-DOF Lynxmotion robotic arm is implemented. Also, a sliding mode control algorithm is implemented, ensuring high performance in trajectory tracking. Furthermore, the computed torque

6 Advances in Human-Computer Interaction

Figure 9: User position checking process.


Figure 10: Butterworth-based designed filter.


Figure 11: Avoid invalid position sub-VI.

is transmitted to the servomotors, moving the mechanical structure of the robotic arm. Moreover, the desired joint positions are applied in the virtual environment to move the virtual robot, in order to prove the compatibility of the desired motion in both the real and virtual environments. Hence, an augmented reality approach is proposed to validate the control system.

The system overview is presented in Figure 5.

In our experiment, we only use the first three joints; the remaining joints are not considered. Note that the user interface contains four screens.

The first screen is dedicated to the settings and hardware configuration. The screen displays the tracking process state, such as invalid positions, joint positions and 3D coordinates, and user position checking. A security algorithm has been developed to secure the tracking process. Note that when the security program indicates a skeleton detection error, for one reason or another, the last calculated (already validated) angle of each joint will be sent to the robot until the operator solves the problem. Based on the feedback node, which returns the last valid result when given a nonnumeric input, the security algorithm is developed and abstracted in a sub-VI for import into the user interface. Figure 11 illustrates the imported sub-VI. In addition, the user can also set the Arduino board communication, such as connection type, baud rate, and Arduino board type, and select the Arduino resource.
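The behavior of this security sub-VI (hold the last validated value whenever the input is not a number) can be sketched as follows; the function name is ours, chosen to mirror the sub-VI's label.

```python
import math

def avoid_invalid_position(angle_in, last_valid):
    """Return angle_in when it is a valid number; otherwise return the last
    validated angle, mimicking the LabVIEW feedback-node security sub-VI."""
    if angle_in is None or math.isnan(angle_in):
        return last_valid
    return angle_in
```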

In the tracking process, the operator's arm can twitch, leading to unwanted random movements. In order to avoid these undesired movements, we propose a Butterworth low pass filter-based signal smoothing algorithm. In the same screen, the user can enable the smoothing algorithm by pressing the button control. The designed filter is abstracted in a sub-VI, as shown in Figure 10. The filtered trajectories, presented as the desired trajectories in Figures 15, 16, and 17, demonstrate the effectiveness of the filtering algorithm.
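A second-order digital Butterworth low-pass filter of this kind can be sketched as below. The cutoff and sampling frequencies are illustrative placeholders (the paper does not state the values used); the coefficients follow the standard bilinear-transform design.

```python
import math

def butter2_lowpass(signal, fc=2.0, fs=30.0):
    """Second-order Butterworth low-pass filter (bilinear transform design).
    fc: cutoff frequency (Hz); fs: sampling frequency (Hz, Kinect is ~30 Hz)."""
    k = math.tan(math.pi * fc / fs)               # pre-warped analog frequency
    norm = 1.0 / (1.0 + math.sqrt(2.0) * k + k * k)
    b0 = k * k * norm                              # feedforward coefficients
    b1, b2 = 2.0 * b0, b0
    a1 = 2.0 * (k * k - 1.0) * norm                # feedback coefficients
    a2 = (1.0 - math.sqrt(2.0) * k + k * k) * norm
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for x in signal:
        out = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x1, x2, y1, y2 = x, x1, out, y1            # shift the delay line
        y.append(out)
    return y
```

The design has unit DC gain, so slow arm gestures pass through while high-frequency twitching is attenuated.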

Finally, a stop button is provided in the developed interface to stop the tracking process and close all connected hardware. This screen is developed as shown in Figure 6.

The second screen, presented in Figure 3, is designed to record the user's video and display the skeleton. Figure 7 shows the third screen, used to demonstrate the augmented reality. In this page, we show the virtual and real environments: the user and the virtual robot are displayed in the user video portion, whereas the virtual robot is presented in the virtual environment software.

The remaining screen shows the joint graphs and the operator's video recording during the experimentation, as shown in Figure 8.

After running the application, the operator must select the first screen, called Kinect Feedback, to adjust his position. A user position check is triggered to secure the operator's position in the Kinect sensing field. A green color of the signaling LED indicates the good state of the recognition process. Otherwise, the LED color changes to red to indicate the


Figure 12: Augmented reality process.

Figure 13: User and skeleton display.

problem of tracking. Figure 9 describes the executed screen in the user position checking process.

Based on the Kinect toolkit for LabVIEW, the operator's skeleton data are read by the LabVIEW interface. Then, a gesture recognition method is executed to determine each joint angle. The calculated joint positions are considered as desired trajectories for the sliding mode control implemented in the developed software. Ultimately, the computed torque is sent to the Atmel-based Arduino card via the USB protocol. Therefore, the servomotors of the first three joints

will be actuated to move the mechanical structure of the Lynxmotion robot.

A three-degree-of-freedom virtual robot is used to perform this experiment. Essentially, the augmented reality evaluated both the ability of the robot manipulator to copy human arm motions and the ability of the user to use the human-robot manipulator interface. The proposed controller is validated by the augmented reality. In fact, by building a virtual robot in LabVIEW alongside the real robot, as shown in Figure 12, virtual robot movements


Figure 14 Gesture control process and real-time graphs

are similar to those of the real robot; therefore, the proposed control method is validated.

Figure 13 shows how the user can control the manipulator robot based on his detected skeleton.

A series of photographs depicting the gesture control process, captured every five seconds together with the real-time graphs, is presented in Figure 14.

In the teleoperation process, tracking error convergence is essential, and the performance of our system rests on minimizing the tracking error. Indeed, if an elementary joint cannot track its corresponding human arm joint, the robotic arm will not be able to perform the desired task, because the position errors of the individual joints accumulate. In that case the end-effector ends up away from the desired placement, and the user must adapt his gesture to the current end-effector position to reduce the placement error; such adaptive gestures demand particular skill and expertise from the user. In another case, when the user applies the desired gesture with a variable velocity, the control algorithm must respect the applied velocities and accelerations to preserve the convergence of the tracking error. To overcome this constraint, a high-performance sliding mode controller is required.

Figures 15, 16, and 17 show the measured and desired trajectories of each joint as functions of time. The illustrated results show the effectiveness of the proposed control approach: the response time is only about one second, which can be regarded as excellent under experimental conditions. This control strategy should nevertheless be improved if the system is to be exploited in telemedicine applications. Moreover, the developed teleoperation approach can be used in industrial applications, in particular for pick-and-place tasks in hazardous or unstructured environments. The real-time teleoperation system offers a convenient platform for managing unexpected scenarios.

6 Conclusion

This paper applies gesture control to robot manipulators. In this work we use Kinect technology to establish a human-machine interface (HMI). The developed control approach can be used in all applications


Figure 15 Desired and measured joint 1 position (amplitude in rad versus time in s)

Figure 16 Desired and measured joint 2 position (amplitude in rad versus time in s)

Figure 17 Desired and measured joint 3 position (amplitude in rad versus time in s)

which require real-time human-robot cooperation. Trajectory tracking and system stability have been ensured by sliding mode control (SMC). Control and supervision are both provided by a user-friendly human-machine interface developed in LabVIEW. Experiments have shown the high efficiency of the proposed approach. Furthermore, the Kinect-based teleoperation system is validated by augmented reality. In conclusion, the proposed control scheme provides a feasible teleoperation system for many applications, such as manipulation in hazardous and unstructured environments. However, the proposed method has some limitations, such as the sensitivity of the desired trajectory generated by the human arm, even in the case of random and unwanted movements. Indeed, the workspace and regional constraints must be respected in the control scheme by filtering out unacceptable trajectories. Moreover, the grasping task required for pick-and-place applications is not considered in the proposed telemanipulation system. In future work this drawback can be overcome by using electromyography biosignals to control the grasping task.

Competing Interests

The authors declare that they have no competing interests.





Figure 1 Kinect for Windows SDK detected joints

Start → Config → Read → Render → Stop

Figure 2 Operational Layout for the Kinect LabVIEW toolkit

Based on the Microsoft Kinect SDK, the Kinesthesia Toolkit for Microsoft Kinect was developed at the University of Leeds, initially for medical rehabilitation and surgical tools. The toolkit helps NI LabVIEW programmers access and use the popular functions of the Microsoft Kinect camera, such as RGB video, depth camera, and skeletal tracking. The toolkit comes fully packaged using JKI's VI Package Manager.

In addition, the toolkit allows the user to initialize and close the Kinect's different components through polymorphic VIs (RGB camera, depth camera, and skeletal tracking), depending on the functionality required. The operational layout of the Kinect LabVIEW toolkit is described in Figure 2.

The first step is to initialize the Kinect; this opens the device and returns a reference. The second step is to configure the Kinect: here we choose the RGB stream format, enable skeleton smoothing, and select the appropriate events (video feedback from the RGB camera, 3D skeleton information, or depth information). In the next step we read skeleton, depth, and video information from the Kinect inside a main loop, such as a while loop; additional processing can be placed in the same loop. Finally, we stop the Kinect and close the communication.
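In textual form, the initialize → configure → read-loop → stop sequence described above can be sketched as follows. `KinectStub` is a stand-in for the toolkit's polymorphic VIs (the real toolkit is graphical LabVIEW code), so every name here is illustrative:

```python
class KinectStub:
    """Stand-in for the Kinect wrapper; mimics the toolkit's call sequence."""
    def initialize(self):
        self.open = True                    # open the device, keep a reference
    def configure(self, rgb_format="640x480", smooth_skeleton=True):
        self.rgb_format = rgb_format        # RGB stream format
        self.smooth = smooth_skeleton       # skeleton smoothing flag
    def read_skeleton(self):
        # A real read would return the tracked joints; here, a dummy frame.
        return {"right_elbow": (0.1, 0.2, 1.5)}
    def close(self):
        self.open = False                   # stop and release the device

kinect = KinectStub()
kinect.initialize()                         # step 1: initialize
kinect.configure(smooth_skeleton=True)      # step 2: configure streams/events
frames = []
for _ in range(3):                          # step 3: main (while-style) loop
    frames.append(kinect.read_skeleton())   # read skeleton/depth/video data
kinect.close()                              # step 4: stop and close
```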

In this paper a user interface containing a skeleton display screen has been developed, as shown in Figure 3. The screen

Figure 3 Skeleton and user display screen

Figure 4 Human arm and manipulator robot model (axes x, y, z; joint angles θ1, θ2, θ3 at the shoulder)

is reserved for the skeleton display and the user video recording.

In this work we focus on the human arm portion of the skeleton data. In fact, the human arm can be considered a 4-DOF manipulator arm similar to the Lynxmotion robotic arm. In our case a 3-DOF robot manipulator model has been developed; we therefore assume that the user controls the robot through his right arm gestures. The three considered movements are as follows:

(i) Flexion-extension for the right shoulder joint

(ii) Abduction-adduction for the right shoulder joint

(iii) Flexion-extension for the right elbow joint

The similarity of the human arm and the robot manipulator model is shown in Figure 4.

After the acquisition of each elementary joint position, an additional calculation is performed to determine the desired speed and acceleration. All these parameters are considered as the desired trajectories for the manipulator robot control algorithm.
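This additional calculation amounts to differentiating the sampled joint positions; a finite-difference sketch of the idea (the helper name and the 20 ms sampling period are our assumptions, not the paper's values):

```python
def derivatives(positions, dt):
    """Finite-difference velocities and accelerations from position samples."""
    vel = [(positions[k + 1] - positions[k]) / dt
           for k in range(len(positions) - 1)]
    acc = [(vel[k + 1] - vel[k]) / dt for k in range(len(vel) - 1)]
    return vel, acc

# Joint position growing by 0.1 rad per 20 ms sample: constant 5 rad/s.
q_samples = [0.0, 0.1, 0.2, 0.3]
vel, acc = derivatives(q_samples, dt=0.02)
```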

3 Dynamic Model of Lynxmotion Robotic Arm

The dynamic model deals with the torques and forces causing the motion of the mechanical structure. In this section the robotic arm dynamic model is computed based on the


Euler-Lagrange formulation. The potential and kinetic energies of each link are computed by the two following equations:

$$u_i = -m_i\, g^T\, {}^i P_i, \qquad K_i = \tfrac{1}{2}\, m_i v_{ci}^T v_{ci} + \tfrac{1}{2}\, {}^i w_i^T\, {}^i I_i\, {}^i w_i, \quad (1)$$

where $m_i$ is the $i$th link mass, $v_{ci}$ is the $i$th link linear velocity, ${}^i w_i$ is the $i$th link angular velocity, ${}^i I_i$ is the $i$th link inertia tensor, and ${}^i P_i$ is the $i$th link position.

After further calculation, the dynamic model of a fully actuated robot manipulator is expressed as follows:

$$M(q)\ddot q + N(q, \dot q) = \tau, \quad (2)$$

where $\tau$ is the $n \times 1$ vector of applied generalized torques, $M(q)$ is the $n \times n$ symmetric and positive definite inertia matrix, $N(q, \dot q)$ is the vector of nonlinearity terms, and $q \in \mathbb{R}^n$.

The vector of nonlinearity terms collects the centrifugal, Coriolis, and gravity contributions described in the following equations:

$$N(q, \dot q) = V(q, \dot q) + G(q), \qquad V(q, \dot q) = B(q)\,[\dot q \dot q] + C(q)\,[\dot q^2]. \quad (3)$$

Hence the robot manipulator dynamic equation can be written as

$$\tau = M(q)\ddot q + B(q)\,[\dot q \dot q] + C(q)\,[\dot q^2] + G(q), \quad (4)$$

where $B(q)$ is the matrix of Coriolis torques, $C(q)$ is the matrix of centrifugal torques, $[\dot q \dot q]$ is the vector of joint velocity products $[\dot q_1 \dot q_2,\ \dot q_1 \dot q_3,\ \ldots,\ \dot q_2 \dot q_3,\ \ldots]^T$, and $[\dot q^2]$ is the vector $[\dot q_1^2,\ \dot q_2^2,\ \dot q_3^2]^T$.

In our work a 3-DOF manipulator model has been developed using the Lagrange formulation. The goal was to determine the Coriolis, centrifugal, and gravitational matrices for real implementation.
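To see equation (4) on the simplest possible case: for a single point-mass link the Coriolis and centrifugal terms vanish, leaving only inertia and gravity. The 1-DOF pendulum sketch below illustrates the structure of (4) only; it is not the 3-DOF Lynxmotion model, and the mass and length values are placeholders:

```python
import math

def pendulum_torque(q, qdd, m=0.2, l=0.15, g=9.81):
    """tau = M*qdd + G(q) for a point-mass link; B and C vanish for 1 DOF."""
    M = m * l * l                  # inertia about the joint axis
    G = m * g * l * math.sin(q)    # gravity torque
    return M * qdd + G

# Holding the link horizontal at rest costs exactly m*g*l of torque.
tau = pendulum_torque(q=math.pi / 2, qdd=0.0)
```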

4 Control Design

Robot manipulators have highly nonlinear dynamics. For this reason the design of a robust controller is required. In this section we study the sliding mode motion controller.

The dynamic model of the robot manipulator is described as follows:

$$M(q)\ddot q + C(q, \dot q)\dot q + g(q) = \tau, \quad (5)$$

where $q = [q_1\ q_2\ q_3]^T$ and $\tau = [\tau_1\ \tau_2\ \tau_3]^T$.

Let $q_d(t)$ denote the desired trajectory. The tracking error is defined as

$$e = q_d(t) - q(t). \quad (6)$$

Define $\dot q_r = \dot q_d + s(q_d - q)$, where $s$ is a positive definite matrix.

According to the linearity of the robot dynamics in their parameters, we obtain

$$\hat M(q)\ddot q_r + \hat C(q, \dot q)\dot q_r + \hat g(q) = \gamma(q, \dot q, \dot q_r, \ddot q_r)\,\hat a, \quad (7)$$

where $\hat a$ is the vector of estimated robot parameters and $\gamma(q, \dot q, \dot q_r, \ddot q_r)$ is a regressor matrix.

Define

$$\tilde M(q) = M(q) - \hat M(q), \quad \tilde C(q, \dot q) = C(q, \dot q) - \hat C(q, \dot q), \quad \tilde g(q) = g(q) - \hat g(q). \quad (8)$$

The sliding vector is selected as

$$s = \dot e + \mu e. \quad (9)$$

We propose the sliding mode control law as follows:

$$\tau = \hat M(q)\ddot q_r + \hat C(q, \dot q)\dot q_r + \hat g(q) + s + k\,\mathrm{sgn}(s), \quad (10)$$

where $k_i = \sum_j \bar\gamma_{ij}\,\bar a_j$, and the bounds $\bar a_i$ and $\bar\gamma_{ij}$ are defined such that

$$|\tilde a_i| \le \bar a_i \quad \forall i, \qquad |\gamma_{ij}| \le \bar\gamma_{ij} \quad \forall i, j. \quad (11)$$

Theorem 1. The proposed controller guarantees the asymptotic stability of the system.

Proof. Select the Lyapunov function candidate

$$V(t) = \frac{1}{2}\, s^T M(q)\, s,$$

whose time derivative, using the skew-symmetry of $\dot M(q) - 2C(q, \dot q)$ and the dynamics (5) with $s = \dot q_r - \dot q$, is

$$\dot V(t) = s^T M(q)\dot s + \frac{1}{2}\, s^T \dot M(q)\, s = s^T M(q)\dot s + s^T C(q, \dot q)\, s = s^T \left[ M(q)\ddot q_r + C(q, \dot q)\dot q_r + g(q) - \tau \right]. \quad (12)$$

Replacing $\tau$ by its expression (10) yields

$$\dot V(t) = s^T \left[ \tilde M(q)\ddot q_r + \tilde C(q, \dot q)\dot q_r + \tilde g(q) - k\,\mathrm{sgn}(s) - s \right] = s^T \left[ \gamma(q, \dot q, \dot q_r, \ddot q_r)\,\tilde a - k\,\mathrm{sgn}(s) - s \right], \quad (13)$$

where $\gamma = [\gamma_{ij}]$ with $|\gamma_{ij}| \le \bar\gamma_{ij}$ and $\tilde a = [\tilde a_i]$ with $|\tilde a_i| \le \bar a_i$.

Expanding componentwise and using $k_i = \sum_j \bar\gamma_{ij}\,\bar a_j$ yields

$$\dot V(t) = \sum_i \sum_j s_i \gamma_{ij} \tilde a_j - \sum_i s_i k_i\,\mathrm{sgn}(s_i) - \sum_i s_i^2 = \sum_i \sum_j s_i \gamma_{ij} \tilde a_j - \sum_i \sum_j |s_i|\,\bar\gamma_{ij}\,\bar a_j - \sum_i s_i^2 \le -\sum_i s_i^2 \le 0. \quad (14)$$

This proves the stability.


Figure 5 System overview: the user's gesture is captured by the Kinect sensor and transferred over USB 2.0; the Kinect toolkit extracts the skeleton data, from which joint positions, velocities, and accelerations are computed as desired positions; the sliding mode controller, built on the dynamic model $M_i(q_i)\ddot q_i + C_i(q_i, \dot q_i)\dot q_i + g_i(q_i) = \tau_i$ in the Control Design and Simulation module, produces the computed torque, which is sent over USB 2.0 to the Arduino board driving the Lynxmotion AL5D; the real joint positions (real versus desired pulse widths) are fed back, and the desired positions also drive the virtual robot in the human-machine interface.

Figure 6 First screen of the user interface

Figure 7 Third screen of the user interface

5 Experiment Results

At the beginning, some packages had to be installed. Indeed, JKI's VI Package Manager along with the Kinesthesia Toolkit for Microsoft Kinect, the LabVIEW Interface for Arduino (LIFA), and the Control Design and Simulation Module are strictly required for the implementation and for proper system functioning.

Figure 8 Fourth screen of the user interface

In this section we present a LabVIEW-based human-machine interface. The virtual robot has been designed in the LabVIEW 3D virtual environment and has the same kinematic parameters as the real robot, such as joint axes and link distances. A user interface screen has been reserved to display the virtual robot movement versus the real robot.

The sliding mode control law has been successfully implemented with the parameters discussed in the previous section.

Starting from the gesture applied by the human arm, the Kinect sensor captures the human arm motion and abstracts it into 3D coordinates. Using the Kinect for Windows SDK, the skeleton data are provided and transmitted to the recognition algorithm. With the Kinesthesia Toolkit for Microsoft Kinect, the human arm joint positions, velocities, and accelerations are computed and considered as desired trajectories for the robotic arm. With the LabVIEW Control Design and Simulation Module, the developed dynamic model of the 3-DOF Lynxmotion robotic arm is implemented, together with a sliding mode control algorithm ensuring high performance in trajectory tracking. Furthermore, the computed torque


Figure 9 User position checking process

Figure 10 Butterworth-based designed filter (inputs: signal X with its size, filter order, low cutoff frequency, sampling frequency, and an enable-smoothing switch; output: filtered signal Xf)

Figure 11 Avoid invalid position sub-VI (the input angle is passed through while NaN values from invalid positions are avoided, yielding the output angle)

is transmitted to the servomotors moving the mechanical structure of the robotic arm. Moreover, the desired joint positions are applied in the virtual environment to move the virtual robot, in order to prove the compatibility of the desired motion in both the real and virtual environments. Hence an augmented reality approach is proposed to validate the control system.

The system overview is presented in Figure 5. In our experiment only the first three joints are used; the remaining joints are not considered. Note that the user interface contains four screens.

The first screen is devoted to settings and hardware configuration. It displays the tracking process state, such as invalid positions, joint positions with their 3D coordinates, and the user position check. A security algorithm has been developed to secure the tracking process: whenever the security program indicates a skeleton detection error, for whatever reason, the last calculated (already validated) angle of each joint is sent to the robot until the operator solves the problem. Based on the feedback node, which returns the last valid result when given a nonnumeric input, the security algorithm is abstracted into a sub-VI for import into the user interface; Figure 11 illustrates this sub-VI. In addition, the user can set the Arduino board communication parameters, such as connection type, baud rate, and board type, and select the Arduino resource.
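The behaviour of this feedback-node-based safeguard (repeat the last validated angle whenever the current sample is nonnumeric) can be mirrored in text form; a minimal sketch with illustrative names:

```python
import math

class AngleHold:
    """Pass angles through, but repeat the last valid one on NaN input."""
    def __init__(self, initial=0.0):
        self.last_valid = initial
    def __call__(self, angle):
        if not math.isnan(angle):
            self.last_valid = angle   # update the validated value
        return self.last_valid        # NaN inputs fall back to it

hold = AngleHold()
out = [hold(a) for a in [0.4, 0.5, float("nan"), 0.6]]
# The NaN sample is replaced by the previously validated angle, 0.5.
```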

During the tracking process the operator's arm can twitch, leading to unwanted random movement. To avoid such undesired movement we propose a smoothing algorithm based on a Butterworth low-pass filter. In the same screen the user can enable the smoothing algorithm by pressing the button control. The designed filter is abstracted into a sub-VI, as shown in Figure 10. The filtered trajectories, presented as the desired trajectories in Figures 15, 16, and 17, demonstrate the effectiveness of the filtering algorithm.
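A discrete second-order Butterworth low-pass of the kind described can be realized as a biquad whose coefficients come from the bilinear transform. The sketch below is ours; the 2 Hz cutoff and the 30 Hz sampling rate (roughly the Kinect skeleton frame rate) are placeholder settings, not the paper's:

```python
import math

def butter2_lowpass(fc, fs):
    """Second-order Butterworth low-pass coefficients via bilinear transform."""
    K = math.tan(math.pi * fc / fs)          # prewarped analog cutoff
    norm = 1.0 + math.sqrt(2.0) * K + K * K
    b = [K * K / norm, 2.0 * K * K / norm, K * K / norm]
    a = [2.0 * (K * K - 1.0) / norm,
         (1.0 - math.sqrt(2.0) * K + K * K) / norm]
    return b, a

def filt(signal, b, a):
    """Direct form I: y[n] = b0*x[n]+b1*x[n-1]+b2*x[n-2]-a1*y[n-1]-a2*y[n-2]."""
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for x in signal:
        yn = b[0] * x + b[1] * x1 + b[2] * x2 - a[0] * y1 - a[1] * y2
        y.append(yn)
        x2, x1, y2, y1 = x1, x, y1, yn
    return y

b, a = butter2_lowpass(fc=2.0, fs=30.0)
smoothed = filt([0.0] * 5 + [1.0] * 60, b, a)  # a step, with twitches rounded off
```

The DC gain of this design is exactly 1, so slow arm motion passes unchanged while fast twitches are attenuated.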

Finally, a stop button is provided in the developed interface to stop the tracking process and close all connected hardware. This screen is shown in Figure 6.


[10] D Tuvshinjargal B Dorj and D J Lee ldquoHybrid motion plan-ning method for autonomous robots using kinect based sensorfusion and virtual plane approach in dynamic environmentsrdquoJournal of Sensors vol 2015 Article ID 471052 13 pages 2015

[11] T Xu S Jia Z Dong and X Li ldquoObstacles regions 3D-perception method for mobile robots based on visual saliencyrdquoJournal of Robotics vol 2015 Article ID 720174 10 pages 2015

[12] W Shang X Cao H Ma H Zang and P Wei ldquoKinect-based vision system of mine rescue robot for low illuminousenvironmentrdquo Journal of Sensors vol 2016 Article ID 82520159 pages 2016

[13] H Kim and I Kim ldquoDynamic arm gesture recognition usingspherical angle features and hidden markov modelsrdquo Advancesin Human-Computer Interaction vol 2015 Article ID 785349 7pages 2015

[14] A Chavez-Aragon R Macknojia P Payeur and R LaganiereldquoRapid 3D modeling and parts recognition on automotivevehicles using a network of RGB-D sensors for robot guidancerdquoJournal of Sensors vol 2013 Article ID 832963 16 pages 2013

[15] A Prochazka O Vysata M Valis and R Yadollahi ldquoThe MSkinect use for 3d modelling and gait analysis in the Matlab

10 Advances in Human-Computer Interaction

environmentrdquo in Proceedings of the 21th Annual ConferenceTechnical Computing 2013 Prague Czech Republic 2013

[16] B Y L Li A S Mian W Liu and A Krishna ldquoUsingKinect for face recognition under varying poses expressionsillumination and disguiserdquo in Proceedings of the IEEEWorkshopon Applications of Computer Vision (WACV rsquo13) pp 186ndash192Tampa Fla USA January 2013

[17] C Muhiddin D B Phillips M J Miles L Picco and D MCarberry ldquoKinect 4 holographic optical tweezersrdquo Journalof Optics vol 15 no 7 Article ID 075302 2013

[18] P Qiu C Ni J Zhang and H Cao ldquoResearch of virtualsimulation of traditional Chinese bone setting techniques basedon kinect skeletal informationrdquo Journal of Shandong Universityof Traditional Chinese Medicine no 3 pp 202ndash204 2014

[19] R Ibanez A Soria A Teyseyre and M Campo ldquoEasy gesturerecognition for Kinectrdquo Advances in Engineering Software vol76 pp 171ndash180 2014

[20] I-J Ding and C-W Chang ldquoAn eigenspace-based method witha user adaptation scheme for human gesture recognition byusing Kinect 3D datardquo Applied Mathematical Modelling vol 39no 19 pp 5769ndash5777 2015

[21] B Galna G Barry D Jackson D Mhiripiri P Olivier andL Rochester ldquoAccuracy of the Microsoft Kinect sensor formeasuring movement in people with Parkinsonrsquos diseaserdquo Gaitamp Posture vol 39 no 4 pp 1062ndash1068 2014

[22] K Qian H Yang and J Niu ldquoDeveloping a gesture basedremote human-robot interaction system using kinectrdquo Interna-tional Journal of Smart Home vol 7 no 4 pp 203ndash208 2013

[23] S Shirwalkar A Singh K Sharma and N Singh ldquoTelemanip-ulation of an industrial robotic arm using gesture recognitionwithKinectrdquo inProceedings of the IEEE International ConferenceonControl Automation Robotics and Embedded Systems (CARErsquo13) Jabalpur India December 2013

[24] G Du and P Zhang ldquoMarkerless humanndashrobot interface fordual robot manipulators using Kinect sensorrdquo Robotics andComputer-Integrated Manufacturing vol 30 no 2 pp 150ndash1592014

[25] A Hernansanz A Casals and J Amat ldquoAmulti-robot coopera-tion strategy for dexterous task oriented teleoperationrdquoRoboticsand Autonomous Systems vol 68 pp 156ndash172 2015

[26] F Terrence F Conti G Sebastien and B Charles ldquonovelinterfaces for remote driving gesture haptic and PDArdquo in SPIETelemanipulator and Telepresence Technologies VII vol 4195 pp300ndash311 2000

[27] E Ueda Y Matsumoto M Imai and T Ogasawara ldquoA hand-pose estimation for vision-based human interfacesrdquo IEEETransactions on Industrial Electronics vol 50 no 4 pp 676ndash684 2003

[28] J Zhang and A Knoll ldquoA two-arm situated artificial commu-nicator for humanndashrobot cooperative assemblyrdquo IEEE Transac-tions on Industrial Electronics vol 50 no 4 pp 651ndash658 2003

[29] B Ionescu D Coquin P Lambert and V Buzuloiu ldquoDynamichand gesture recognition using the skeleton of the handrdquoEURASIP Journal on Applied Signal Processing vol 13 pp 2101ndash2109 2005

[30] Y Kim S Leonard A Shademan A Krieger and P C W KimldquoKinect technology for hand tracking control of surgical robotstechnical and surgical skill comparison to current roboticmastersrdquo Surgical Endoscopy vol 28 no 6 pp 1993ndash2000 2014

[31] J Kofman S Verma and X Wu ldquoRobot-manipulator teleop-eration by markerless vision-based hand-arm trackingrdquo Inter-national Journal of Optomechatronics vol 1 no 3 pp 331ndash3572007

[32] R C Luo B-H Shih and T-W Lin ldquoReal time humanmotion imitation of anthropomorphic dual arm robot based onCartesian impedance controlrdquo in Proceedings of the 11th IEEEInternational Symposium on Robotic and Sensors Environments(ROSE rsquo13) pp 25ndash30 Washington DC USA October 2013

[33] R Afthoni A Rizal and E Susanto ldquoProportional deriva-tive control based robot arm system using microsoft kinectrdquoin Proceedings of the International Conference on RoboticsBiomimetics Intelligent Computational Systems (ROBIONETICSrsquo13) pp 24ndash29 IEEE Jogjakarta Indonesia November 2013

[34] M Al-Shabi ldquoSimulation and implementation of real-timevision-based control system for 2-DoF robotic arm using PIDwith hardware-in-the-looprdquo Intelligent Control andAutomationvol 6 no 2 pp 147ndash157 2015

[35] I Benabdallah Y Bouteraa R Boucetta and C Rekik ldquoKinect-based Computed Torque Control for lynxmotion robotic armrdquoin Proceedings of the 7th International Conference on ModellingIdentification and Control (ICMIC rsquo15) pp 1ndash6 Sousse TunisiaDecember 2015

[36] H Mehdi and O Boubaker ldquoRobot-assisted therapy designcontrol and optimizationrdquo International Journal on Smart Sens-ing and Intelligent Systems vol 5 no 4 pp 1044ndash1062 2012

[37] I Ben Aabdallah Y Bouteraa and C Rekik ldquoDesign of smartrobot for wrist rehabilitationrdquo International journal of smartsensing and intelligent systems vol 9 no 2 2016

[38] C Pillajo and J E Sierra ldquoHuman machine interface HMIusing Kinect sensor to control a SCARA robotrdquo in Proceedingsof the IEEE Colombian Conference on Communications andComputing (COLCOM rsquo13) pp 1ndash5 Medellin Colo USA May2013


4 Advances in Human-Computer Interaction

Euler-Lagrange formulation. The potential and kinetic energies of each link have been computed by the two following equations:

$$u_i = -m_i\, g^T\, {}^iP_i, \qquad K_i = \frac{1}{2}\, m_i\, v_{ci}^T v_{ci} + \frac{1}{2}\, {}^iw_i^T\, {}^iI_i\, {}^iw_i, \quad (1)$$

where $m_i$ is the $i$th link mass, $v_{ci}$ is the $i$th link linear velocity, ${}^iw_i$ is the $i$th link angular velocity, ${}^iI_i$ is the $i$th inertia tensor, and ${}^iP_i$ is the $i$th link position.

After further calculation, the dynamic model of a fully actuated robot manipulator is expressed as follows:

$$M(q)\ddot{q} + N(q,\dot{q}) = \tau, \quad (2)$$

where $\tau$ is the $n \times 1$ vector of applied generalized torques, $M(q)$ is the $n \times n$ symmetric and positive definite inertia matrix, $N(q,\dot{q})$ is the vector of nonlinear terms, and $q \in \mathbb{R}^n$.

The vector of nonlinear terms collects the centrifugal, Coriolis, and gravity contributions described in the following equations:

$$N(q,\dot{q}) = V(q,\dot{q}) + G(q), \qquad V(q,\dot{q}) = B(q)\,[\dot{q}\dot{q}] + C(q)\,[\dot{q}^2]. \quad (3)$$

Hence, the robot manipulator dynamic equation can be written as

$$\tau = M(q)\ddot{q} + B(q)\,[\dot{q}\dot{q}] + C(q)\,[\dot{q}^2] + G(q), \quad (4)$$

where $B(q)$ is the matrix of Coriolis torques, $C(q)$ is the matrix of centrifugal torques, $[\dot{q}\dot{q}]$ is the vector of joint velocity products $[\dot{q}_1\dot{q}_2,\ \dot{q}_1\dot{q}_3,\ \ldots,\ \dot{q}_{n-1}\dot{q}_n]^T$, and $[\dot{q}^2]$ is the vector $[\dot{q}_1^2,\ \dot{q}_2^2,\ \ldots,\ \dot{q}_n^2]^T$.

In our work, a 3-DOF manipulator model has been developed using the Lagrange formulation. The goal was to determine the Coriolis, centrifugal, and gravitational matrices for real implementation reasons.
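The Lynxmotion matrices themselves are not reproduced here. As a minimal illustration of how (4) is evaluated, the sketch below computes the torque vector for a planar two-link arm with hypothetical masses, lengths, and inertias (standard textbook dynamics, not the identified Lynxmotion parameters):

```python
import numpy as np

# Hypothetical planar 2-link arm parameters (NOT the Lynxmotion AL5D values).
m1, m2 = 1.0, 0.8              # link masses (kg)
l1, lc1, lc2 = 0.3, 0.15, 0.12 # link 1 length, centre-of-mass distances (m)
I1, I2 = 0.02, 0.015           # link inertias about centres of mass (kg m^2)
grav = 9.81

def dynamics_torque(q, qd, qdd):
    """Evaluate tau = M(q) qdd + V(q, qd) + G(q) for a planar 2-link arm."""
    c2 = np.cos(q[1])
    # Symmetric, positive definite inertia matrix M(q)
    M = np.array([
        [m1*lc1**2 + I1 + m2*(l1**2 + lc2**2 + 2*l1*lc2*c2) + I2,
         m2*(lc2**2 + l1*lc2*c2) + I2],
        [m2*(lc2**2 + l1*lc2*c2) + I2,
         m2*lc2**2 + I2],
    ])
    # Coriolis/centrifugal vector V(q, qd) and gravity vector G(q)
    h = m2*l1*lc2*np.sin(q[1])
    V = np.array([-h*qd[1]**2 - 2*h*qd[0]*qd[1], h*qd[0]**2])
    G = np.array([(m1*lc1 + m2*l1)*grav*np.cos(q[0])
                  + m2*lc2*grav*np.cos(q[0] + q[1]),
                  m2*lc2*grav*np.cos(q[0] + q[1])])
    return M @ qdd + V + G, M

tau, M = dynamics_torque(np.array([0.5, 0.3]),
                         np.array([0.1, -0.2]),
                         np.array([0.0, 0.0]))
```

Note that $M(q)$ comes out symmetric and positive definite, matching the properties assumed for (2).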

4. Control Design

Robot manipulators have highly nonlinear dynamic parameters. For this reason, the design of a robust controller is required. In this section we study the motion controller, namely, sliding mode control.

The dynamic model of the robot manipulator is described as follows:

$$M(q)\ddot{q} + C(q,\dot{q})\dot{q} + g(q) = \tau, \quad (5)$$

where $q = [q_1\ q_2\ q_3]^T$ and $\tau = [\tau_1\ \tau_2\ \tau_3]^T$.

Let $q_d(t)$ denote the desired trajectory. The tracking error is defined as

$$e = q_d(t) - q(t). \quad (6)$$

Define $\dot{q}_r = \dot{q}_d + \Lambda(q_d - q)$, where $\Lambda$ is a positive definite matrix.

According to the linear-in-parameters property of robot dynamics, we obtain

$$\hat{M}(q)\ddot{q}_r + \hat{C}(q,\dot{q})\dot{q}_r + \hat{g}(q) = \gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r)\,\hat{\theta}, \quad (7)$$

where $\hat{\theta}$ is the vector of estimated robot parameters and $\gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r)$ is a regressor matrix.

Define

$$\tilde{M}(q) = M(q) - \hat{M}(q), \qquad \tilde{C}(q,\dot{q}) = C(q,\dot{q}) - \hat{C}(q,\dot{q}), \qquad \tilde{g}(q) = g(q) - \hat{g}(q). \quad (8)$$

The sliding vector is selected as

$$s = \dot{e} + \mu e. \quad (9)$$

We propose the sliding mode control law as follows:

$$\tau = \hat{M}(q)\ddot{q}_r + \hat{C}(q,\dot{q})\dot{q}_r + \hat{g}(q) + s + k\,\mathrm{sgn}(s), \quad (10)$$

where $k_i = \sum_j \bar{\gamma}_{ij}\bar{\theta}_j$.

Define the bounds $\bar{\theta}_i$ and $\bar{\gamma}_{ij}$ such that

$$\forall i,\ |\tilde{\theta}_i| \le \bar{\theta}_i, \qquad \forall i, j,\ |\gamma_{ij}| \le \bar{\gamma}_{ij}. \quad (11)$$

Theorem 1. The proposed controller guarantees the asymptotic stability of the system.

Proof. Select the Lyapunov function candidate as

$$V(t) = \frac{1}{2}\, s^T M(q)\, s,$$

so that, using the skew-symmetry of $\dot{M}(q) - 2C(q,\dot{q})$,

$$\dot{V}(t) = s^T M(q)\dot{s} + \frac{1}{2}\, s^T \dot{M}(q)\, s = s^T M(q)\dot{s} + s^T C(q,\dot{q})\, s = s^T\!\left[M(q)\ddot{q}_r + C(q,\dot{q})\dot{q}_r + g(q) - \tau\right]. \quad (12)$$

Replacing $\tau$ by its expression (10) yields

$$\dot{V}(t) = s^T\!\left[\tilde{M}(q)\ddot{q}_r + \tilde{C}(q,\dot{q})\dot{q}_r + \tilde{g}(q) - k\,\mathrm{sgn}(s) - s\right] = s^T\!\left[\gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r)\,\tilde{\theta} - k\,\mathrm{sgn}(s) - s\right], \quad (13)$$

where $\gamma(q,\dot{q},\dot{q}_r,\ddot{q}_r) = [\gamma_{ij}]$ such that $|\gamma_{ij}| \le \bar{\gamma}_{ij}$, and $\tilde{\theta} = [\tilde{\theta}_i]$ such that $|\tilde{\theta}_i| \le \bar{\theta}_i$.

This yields

$$\dot{V}(t) = \sum_i \sum_j s_i \gamma_{ij} \tilde{\theta}_j - \sum_i s_i k_i\, \mathrm{sgn}(s_i) - \sum_i s_i^2 = \sum_i \sum_j s_i \gamma_{ij} \tilde{\theta}_j - \sum_i \sum_j |s_i|\, \bar{\gamma}_{ij} \bar{\theta}_j - \sum_i s_i^2 \le -\sum_i s_i^2 \le 0. \quad (14)$$

This proves the stability.
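As a minimal numerical illustration of the control law (10), the sketch below simulates a single-degree-of-freedom arm with deliberately inaccurate parameter estimates; all numerical values (masses, gains, trajectory) are hypothetical, not the paper's identified values. The sliding variable (9) and the bound-based switching gain drive the tracking error toward zero despite the model mismatch:

```python
import numpy as np

# True 1-DOF plant: m*qdd + c*qd + g = tau (hypothetical values)
m, c, g = 1.0, 0.5, 2.0
# Deliberately inaccurate estimates used by the controller
m_hat, c_hat, g_hat = 0.8, 0.4, 1.5

mu = lam = 5.0   # sliding-surface slope and reference-velocity gain
k = 5.0          # switching gain, chosen to dominate the parameter error

dt, T = 1e-3, 10.0
q, q_vel = 0.5, 0.0          # initial state (q(0) != qd(0))
errors = []
for step in range(int(T / dt)):
    t = step * dt
    # Desired trajectory and its derivatives
    qd, qd_dot, qd_ddot = np.sin(t), np.cos(t), -np.sin(t)
    # Tracking error, reference velocity/acceleration, sliding variable
    e, e_dot = qd - q, qd_dot - q_vel
    qr_dot = qd_dot + lam * e
    qr_ddot = qd_ddot + lam * e_dot
    s = e_dot + mu * e           # sliding vector (9), scalar case
    # Sliding mode control law (10), scalar case
    tau = m_hat * qr_ddot + c_hat * qr_dot + g_hat + s + k * np.sign(s)
    # Integrate the true plant (explicit Euler)
    qdd = (tau - c * q_vel - g) / m
    q_vel += qdd * dt
    q += q_vel * dt
    errors.append(abs(e))

final_error = max(errors[-1000:])  # worst tracking error over the last second
```

The sign function produces the well-known chattering of SMC; in practice a boundary layer or smoothed switching term is often substituted.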


Figure 5: System overview. The user's gesture is captured by the Kinect sensor (USB 2.0); the Kinect toolkit extracts skeleton data, from which joint positions, velocities, and accelerations are computed as desired positions for the human-machine interface. The interface combines the dynamic model $M_i(q_i)\ddot{q}_i + C_i(q_i,\dot{q}_i)\dot{q}_i + g_i(q_i) = \tau_i$ with the sliding mode controller (control design and simulation), compares the desired and real pulse widths, and sends the computed torque through the Arduino board (USB 2.0) to the Lynxmotion AL5D, which returns the real joint positions; a virtual robot mirrors the motion.

Figure 6: First screen of the user interface.

Figure 7: Third screen of the user interface.

5. Experiment Results

In the beginning, some packages had to be installed. Indeed, JKI's VI Package Manager, together with the Kinesthesia Toolkit for Microsoft Kinect, the LabVIEW Interface for Arduino (LIFA), and the Control Design and Simulation Module, is strictly required for the implementation and then for the proper functioning of the system.

Figure 8: Fourth screen of the user interface.

In this section we present a LabVIEW-based human-machine interface. The virtual robot has been designed in the 3D LabVIEW virtual environment. The virtual robot has the same kinematic parameters as the real robot, such as joint axes and link distances. A user interface screen has been reserved to display the virtual robot movement alongside the real robot.

The control law based on sliding mode control has been successfully implemented with the parameters discussed in the previous section.

Starting from the gesture applied by the human arm, the Kinect sensor captures the human arm motion and abstracts it into 3D coordinates. Using the Kinect for Windows SDK, the skeleton data are provided and transmitted to the recognition algorithm. Based on the Kinesthesia Toolkit for Microsoft Kinect, the human arm joint positions, velocities, and accelerations are computed and considered as desired trajectories for the robotic arm. With the help of the LabVIEW Control Design and Simulation Module, the developed dynamic model of the 3-DOF Lynxmotion robotic arm is implemented. A sliding mode control algorithm is also implemented, ensuring high performance in trajectory tracking. Furthermore, the computed torque


Figure 9: User position checking process.

Figure 10: Butterworth-based designed filter (inputs: signal X, size, order, low cutoff frequency, sampling frequency, enable smoothing; output: filtered signal Xf).

Figure 11: Avoid-invalid-position sub-VI (angle in, invalid position, angle out; avoids NaN values).

is transmitted to the servomotors moving the mechanical structure of the robotic arm. Moreover, the desired joint positions are applied in the virtual environment to move the virtual robot, in order to prove the compatibility of the desired motion in both the real and virtual environments. Hence, an augmented reality approach is proposed to validate the control system.
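The "desired pulse width" sent to the AL5D hobby servos is obtained from a joint angle. The exact calibration used in this work is not given, so the sketch below assumes a generic linear mapping from a 0 to pi rad joint range onto a hypothetical 500-2500 microsecond pulse range:

```python
import math

def angle_to_pulse_us(theta, theta_min=0.0, theta_max=math.pi,
                      pulse_min=500.0, pulse_max=2500.0):
    """Linearly map a joint angle (rad) to a servo pulse width (microseconds).
    The 500-2500 us span and 0-pi range are assumptions for illustration,
    not the calibration used in the paper."""
    theta = min(max(theta, theta_min), theta_max)  # clamp to joint limits
    ratio = (theta - theta_min) / (theta_max - theta_min)
    return pulse_min + ratio * (pulse_max - pulse_min)
```

For example, `angle_to_pulse_us(math.pi / 2)` gives the mid-range pulse of 1500 microseconds; clamping keeps out-of-range angles from commanding the servo beyond its mechanical stops.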

The system overview is presented in Figure 5. In our experiment we only use the first three joints; the remaining joints are not considered. Note that the user interface contains four screens.

The first screen is dedicated to the settings and hardware configuration. The screen displays the tracking process state, such as invalid positions, joint positions and 3D coordinates, and user position checking. A security algorithm has been developed to secure the tracking process. Note that whenever the security program indicates a skeleton detection error for one reason or another, the last calculated (already validated) angle of each joint will be sent to the robot until the operator solves the problem. Based on the feedback node, which returns the last valid result when given a nonnumeric input, the security algorithm is developed and abstracted in a sub-VI for importing into the user interface. Figure 11 illustrates the imported sub-VI. In addition, the user can also set the Arduino board communication, such as the connection type, baud rate, and Arduino board type, and select the Arduino resource.
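The behavior of the feedback-node-based security sub-VI (Figure 11) can be sketched as a small stateful filter, here in illustrative Python rather than LabVIEW, that forwards only valid angles and repeats the last validated one whenever the skeleton data are nonnumeric:

```python
import math

class LastValidAngleFilter:
    """Mimics the LabVIEW feedback-node behavior: return the last validated
    angle whenever the incoming value is NaN (skeleton detection error)."""
    def __init__(self, initial=0.0):
        self.last_valid = initial

    def __call__(self, angle):
        if not math.isnan(angle):
            self.last_valid = angle   # validate and store the new angle
        return self.last_valid        # otherwise repeat the stored one

filt = LastValidAngleFilter()
stream = [0.3, float("nan"), float("nan"), 0.5]
safe = [filt(a) for a in stream]   # -> [0.3, 0.3, 0.3, 0.5]
```

This way the robot holds its last validated posture instead of receiving an invalid command while the operator resolves the tracking problem.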

In the tracking process, the operator's arm can twitch, leading to unwanted random movement. In order to avoid this undesired movement, we propose a Butterworth low-pass-filter-based signal smoothing algorithm. In the same screen, the user can enable the smoothing algorithm by pressing the button control. The designed filter is abstracted in a sub-VI, as shown in Figure 10. The filtered trajectories, presented as the desired trajectories in Figures 15, 16, and 17, demonstrate the effectiveness of the filtering algorithm.
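The smoothing stage can be sketched with SciPy's Butterworth design; the filter order, cutoff frequency, and 30 Hz skeleton rate below are illustrative assumptions, not the values configured in Figure 10:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30.0        # assumed Kinect skeleton sampling rate (Hz)
cutoff = 2.0     # assumed low cutoff frequency (Hz)

# Second-order low-pass Butterworth filter
b, a = butter(2, cutoff, btype="low", fs=fs)

# Simulated joint trajectory: slow arm motion plus twitch-like noise
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
raw = np.sin(0.5 * t) + 0.15 * rng.standard_normal(t.size)

# Zero-phase filtering so the smoothed trajectory is not delayed
smoothed = filtfilt(b, a, raw)
```

Zero-phase filtering (`filtfilt`) suits offline analysis; a real-time implementation would instead apply the filter causally, accepting a small phase lag.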

Finally, a stop button is provided in the developed interface to stop the tracking process and close all connected hardware. This screen is shown in Figure 6.

The second screen, presented in Figure 3, is designed to record the user's video and display the skeleton. Figure 7 shows the third screen, which demonstrates the augmented reality. In this page we show the virtual and real environments: the user and the virtual robot are displayed in the user video portion, whereas the virtual robot is presented in the virtual environment software.

The remaining screen describes the joint graphs and the operator's video recording during the experimentation, as shown in Figure 8.

After launching the application, the operator must select the first screen, called Kinect Feedback, to adjust his position. A user position check is triggered to secure the operator's position in the Kinect sensing field. A green color of the signaling LED indicates the good state of the recognition process. Otherwise, the LED color changes to red to indicate the


Figure 12: Augmented reality process.

Figure 13: User and skeleton display.

problem of tracking. Figure 9 describes the executed screen in the user position checking process.

Based on the Kinect toolkit for LabVIEW, the operator's skeleton data are read by the LabVIEW interface. Then a gesture recognition method is executed to determine each joint angle. The calculated joint positions are considered as desired trajectories for the sliding mode control implemented in the developed software. Ultimately, the computed torque is sent to the Atmel-based Arduino card via the USB protocol. Therefore, the servomotors of the first three joints

will be actuated to move the mechanical structure of the Lynxmotion robot.
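A joint angle can be recovered from three of the 3D skeleton points returned by the Kinect. The sketch below computes an elbow angle from hypothetical shoulder, elbow, and wrist coordinates; it is a generic vector-geometry illustration, not the paper's exact recognition method:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (rad) at point b formed by segments b->a and b->c,
    e.g. the elbow angle from shoulder (a), elbow (b), wrist (c)."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to guard against rounding outside [-1, 1] before arccos
    return float(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Hypothetical 3D skeleton coordinates (metres, Kinect camera frame)
shoulder, elbow, wrist = (0.0, 0.4, 2.0), (0.0, 0.1, 2.0), (0.3, 0.1, 2.0)
angle = joint_angle(shoulder, elbow, wrist)   # a right angle here
```

A fully extended arm (collinear points) gives an angle of pi, so the measure is independent of where the user stands in the sensing field.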

A three-degree-of-freedom virtual robot is used to perform this experiment. Essentially, the augmented reality evaluated both the ability of the robot manipulator to copy human arm motions and the ability of the user to use the human-robot manipulator interface. The proposed controller is validated by the augmented reality. In fact, by building a virtual robot in LabVIEW alongside the real robot, as shown in Figure 12, the virtual robot movements


Figure 14: Gesture control process and real-time graphs.

are similar to those of the real robot. Therefore, the proposed control method is validated.

Figure 13 shows how the user can control the manipulator robot based on his detected skeleton.

A series of photographs depicting the gesture control process execution every five seconds, together with real-time graphs, is presented in Figure 14.

In the teleoperation process, the tracking error convergence is very important, and the performance of our system relies on minimizing the tracking error. Indeed, if an elementary joint cannot track its corresponding human arm joint, the robotic arm will not be able to perform the desired task because of the accumulation of each joint position error. In this case, the end-effector localization differs from the desired placement. Therefore, the user must adapt his gesture to the current end-effector position to reduce the end-effector placement error. Hence, the adaptive gestures estimated by the user require particular intelligence and expertise. In another case, when the user applies the desired gesture with a variable velocity, the control algorithm must respect the applied velocities and accelerations to preserve the convergence of the tracking error. In order to overcome this constraint, the use of a high-performance sliding mode controller is highly required.

Figures 15, 16, and 17 show the measured trajectory of each joint and the desired trajectory for each, both as a function of time. The illustrated results show the effectiveness of the proposed control approach, with a response time of only about one second. In experimental conditions, this time can be regarded as excellent. This control strategy should be improved if we hope to exploit this system in telemedicine applications. Moreover, the developed teleoperation approach can be used for industrial applications, in particular for pick-and-place tasks in hazardous or unstructured environments. The real-time teleoperation system offers a beneficial platform to manage unexpected scenarios.

6. Conclusion

This paper applies gesture control to robot manipulators. Indeed, in this work we use the Kinect technology to establish a human-machine interaction (HMI). The developed control approach can be used in all applications


Figure 15: Desired and measured joint 1 position (amplitude in rads versus time in s).

Figure 16: Desired and measured joint 2 position (amplitude in rads versus time in s).

Figure 17: Desired and measured joint 3 position (amplitude in rads versus time in s).

which require real-time human-robot cooperation. Trajectory tracking and system stability have been ensured by sliding mode control (SMC). Control and supervision have both been provided by a user-friendly human-machine interface developed in LabVIEW. Experiments have shown the high efficiency of the proposed approach. Furthermore, the Kinect-based teleoperation system is validated by augmented reality. In conclusion, the proposed control scheme provides a feasible teleoperation system for many applications, such as manipulating in hazardous and unstructured environments. However, there are some constraints when applying the proposed method, such as the sensitivity of the desired trajectory generated by the human arm, even in the case of random and unwanted movements. Indeed, the workspace and the regional constraints must be respected in the control schemes by filtering out the unaccepted trajectories. Moreover, for pick-and-place applications, the grasping task is not considered in the proposed telemanipulation system. As a perspective, this drawback can be overcome by using the electromyography biosignal for grasping task control.

Competing Interests

The authors declare that they have no competing interests.


[37] I Ben Aabdallah Y Bouteraa and C Rekik ldquoDesign of smartrobot for wrist rehabilitationrdquo International journal of smartsensing and intelligent systems vol 9 no 2 2016

[38] C Pillajo and J E Sierra ldquoHuman machine interface HMIusing Kinect sensor to control a SCARA robotrdquo in Proceedingsof the IEEE Colombian Conference on Communications andComputing (COLCOM rsquo13) pp 1ndash5 Medellin Colo USA May2013



Advances in Human-Computer Interaction 5

Figure 5: System overview. The user's gesture is captured by the Kinect sensor over USB 2.0; the Kinect toolkit extracts the skeleton data and computes the joint positions, velocities, and accelerations, which serve as desired positions both for the virtual robot of the human-machine interface and for the sliding mode controller built on the dynamic model M_i(q_i)q̈_i + C_i(q_i, q̇_i)q̇_i + g_i(q_i) = τ_i. The computed torque is converted to a desired pulse width and sent via USB 2.0 to the Arduino board driving the Lynxmotion AL5D, while the real pulse width and real joint positions are fed back to the control design and simulation loop.

Figure 6: First screen of the user interface.

Figure 7: Third screen of the user interface.

5. Experimental Results

In the beginning, some packages had to be installed. Indeed, via JKI's VI Package Manager, the Kinesthesia Toolkit for Microsoft Kinect, the LabVIEW Interface for Arduino (LIFA), and the Control Design and Simulation Module are strictly required for the implementation and for the proper functioning of the system.

Figure 8: Fourth screen of the user interface.

In this section we present the LabVIEW-based human-machine interface. The virtual robot has been designed in a 3D LabVIEW virtual environment and has the same kinematic parameters as the real robot, such as joint axes and link distances. A screen of the user interface is reserved to display the virtual robot's movement alongside that of the real robot.

The control law based on sliding mode control has been successfully implemented with the parameters discussed in the previous section.

Starting from the gesture applied by the human arm, the Kinect sensor captures the arm motion and abstracts it into 3D coordinates. Using the Kinect for Windows SDK, the skeleton data are provided and transmitted to the recognition algorithm. Based on the Kinesthesia Toolkit for Microsoft Kinect, the human arm joint positions, velocities, and accelerations are computed and taken as the desired trajectories of the robotic arm. With the help of the LabVIEW Control Design and Simulation Module, the developed dynamic model of the 3-DOF Lynxmotion robotic arm is implemented, together with a sliding mode control algorithm ensuring high performance in trajectory tracking. Furthermore, the computed torque


Figure 9: User position checking process.

Figure 10: Butterworth-based designed filter (sub-VI inputs: the signal X and its size, filter order, low cutoff frequency, sampling frequency, and an enable-smoothing switch; output: the filtered signal X_f).

Figure 11: Avoid-invalid-position sub-VI (inputs: angle in and an invalid-position flag; output: angle out; NaN values are avoided).

is transmitted to the servomotors, moving the mechanical structure of the robotic arm. Moreover, the desired joint positions are applied in the virtual environment to move the virtual robot, in order to prove the compatibility of the desired motion in both the real and virtual environments. Hence, an augmented reality approach is proposed to validate the control system.
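The joint velocities and accelerations mentioned above are derived from the sampled joint positions. A minimal sketch of how this can be done with central finite differences (the 30 Hz sample rate is an assumption based on the Kinect's nominal skeleton frame rate; the paper does not state its exact numerical scheme):

```python
DT = 1.0 / 30.0  # assumed Kinect skeleton sample period (s)

def derivatives(samples, dt=DT):
    """Estimate velocity and acceleration from sampled joint angles (rad).

    Uses central finite differences, so results cover only the interior
    samples: len(samples) - 2 values each.
    """
    vel = [(samples[i + 1] - samples[i - 1]) / (2.0 * dt)
           for i in range(1, len(samples) - 1)]
    acc = [(samples[i + 1] - 2.0 * samples[i] + samples[i - 1]) / dt ** 2
           for i in range(1, len(samples) - 1)]
    return vel, acc
```

Central differences are exact for quadratic trajectories; for noisy Kinect data they would follow the smoothing stage described below rather than the raw samples.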

The system overview is presented in Figure 5.

In our experiment we only use the first three joints; the remaining joints are not considered. Note that the user interface contains four screens.

The first screen is dedicated to settings and hardware configuration. It displays the tracking process state, such as invalid positions, joint positions and 3D coordinates, and the user position check. A security algorithm has been developed to secure the tracking process: whenever the security program indicates a skeleton detection error, for one reason or another, the last calculated (already validated) angle of each joint is sent to the robot until the operator solves the problem. Based on the feedback node, which returns the last valid result when given a nonnumeric input, the security algorithm is abstracted into a sub-VI for import into the user interface; Figure 11 illustrates the imported sub-VI. In addition, the user can configure the Arduino board communication, such as the connection type, baud rate, and board type, and select the Arduino resource.
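The hold-last-valid behavior of this security algorithm can be sketched as follows (a hypothetical Python helper mirroring the LabVIEW feedback node, not the actual sub-VI):

```python
import math

def make_angle_guard(initial=0.0):
    """Build a per-joint guard that holds the last validated angle."""
    last_valid = [initial]

    def guard(angle):
        # Pass valid angles through and remember them; on a NaN or missing
        # value (skeleton detection error), return the last validated angle
        # so the robot holds its pose until the operator solves the problem.
        if angle is not None and not math.isnan(angle):
            last_valid[0] = angle
        return last_valid[0]

    return guard
```

One guard instance per joint reproduces the described behavior: the robot freezes at the last validated configuration instead of jumping to an invalid one.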

In the tracking process the operator's arm can twitch, leading to unwanted random movements. To avoid such undesired movements, we propose a smoothing algorithm based on a Butterworth low-pass filter. In the same screen, the user can enable the smoothing algorithm by pressing the corresponding button control. The designed filter is abstracted in a sub-VI, as shown in Figure 10. The filtered trajectories, presented as the desired trajectories in Figures 15, 16, and 17, demonstrate the effectiveness of the filtering algorithm.
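As a rough sketch of this smoothing stage, a second-order Butterworth low-pass filter can be designed with the bilinear transform and applied sample by sample. The 2 Hz cutoff and 30 Hz sampling rate below are assumptions (the paper does not state its filter parameters); the LabVIEW sub-VI exposes the same knobs (order, low cutoff frequency, sampling frequency):

```python
import math

def butterworth_lowpass(signal, cutoff_hz=2.0, fs_hz=30.0):
    """Filter a list of samples with a 2nd-order Butterworth low-pass."""
    # Bilinear-transform design of the biquad coefficients
    k = math.tan(math.pi * cutoff_hz / fs_hz)
    norm = 1.0 / (1.0 + math.sqrt(2.0) * k + k * k)
    b0 = k * k * norm
    b1, b2 = 2.0 * b0, b0
    a1 = 2.0 * (k * k - 1.0) * norm
    a2 = (1.0 - math.sqrt(2.0) * k + k * k) * norm

    # Direct-form difference equation, applied sample by sample
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in signal:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out
```

By construction the filter has unit gain at DC (slow, intentional gestures pass through) and zero gain at the Nyquist frequency (frame-to-frame jitter is suppressed).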

Finally, a stop button is provided in the developed interface to stop the tracking process and close all connected hardware. This screen is shown in Figure 6.

The second screen, presented in Figure 3, is designed to record the user's video and display the skeleton. Figure 7 shows the third screen, which demonstrates the augmented reality: it presents the virtual and real environments together, with the user and the real robot displayed in the user video portion, whereas the virtual robot is presented in the virtual environment software.

The remaining screen shows the joint graphs and the operator's video recording during the experiment, as shown in Figure 8.

After launching the application, the operator must select the first screen, called Kinect Feedback, to adjust his position. A user position check is triggered to keep the operator within the Kinect sensing field. A green signaling LED indicates that the recognition process is in a good state; otherwise, the LED color changes to red to indicate the


Figure 12: Augmented reality process.

Figure 13: User and skeleton display.

problem of tracking. Figure 9 shows the screen executed during the user position checking process.

Based on the Kinect toolkit for LabVIEW, the operator's skeleton data are read by the LabVIEW interface. Then a gesture recognition method is executed to determine each joint angle. The calculated joint positions are taken as the desired trajectories for the sliding mode controller implemented in the developed software. Ultimately, the computed torque is sent to the Atmel-based Arduino board via the USB protocol. Therefore, the servomotors of the first three joints

will be actuated to move the mechanical structure of the Lynxmotion robot.
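One common way to obtain a joint angle from the skeleton's 3D coordinates is the angle between the two limb-segment vectors meeting at the joint (e.g. shoulder-elbow-wrist for the elbow). Since the paper does not detail its recognition formulas, the following is an illustrative sketch rather than a reproduction of its method:

```python
import math

def joint_angle(a, b, c):
    """Angle (rad) at joint b formed by 3D points a-b-c."""
    u = [a[i] - b[i] for i in range(3)]  # segment b -> a
    v = [c[i] - b[i] for i in range(3)]  # segment b -> c
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    # Clamp to guard against rounding drift outside [-1, 1]
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))
```

A fully extended arm gives an angle of π, a right-angled elbow π/2; applying this to consecutive skeleton frames yields the desired joint trajectories.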

A three-degree-of-freedom virtual robot is used to perform this experiment. Essentially, the augmented reality evaluates both the ability of the robot manipulator to copy human arm motions and the ability of the user to operate the human-robot manipulator interface. The proposed controller is validated through this augmented reality: by building a virtual robot in LabVIEW in addition to the real robot, as shown in Figure 12, virtual robot movements


Figure 14: Gesture control process and real-time graphs.

are shown to be similar to those of the real robot. Therefore, the proposed control method is validated.

Figure 13 shows how the user can control the manipulator robot based on his detected skeleton.

A series of photographs depicting the gesture control process, captured every five seconds together with the real-time graphs, is presented in Figure 14.

In the teleoperation process the convergence of the tracking error is very important, and the performance of our system rests on minimizing this error. Indeed, if an elementary joint cannot track its corresponding human arm joint, the robotic arm will not be able to perform the desired task, because the position errors of the individual joints accumulate; the end-effector pose then differs from the desired placement. In this case the user must adapt his gesture to the current end-effector position to reduce the placement error, and such adaptive gestures require particular intelligence and expertise from the user. In another case, when the user applies the desired gesture with a variable velocity, the control algorithm must respect the applied velocities and accelerations to preserve the convergence of the tracking error. To overcome these constraints, a high-performance sliding mode controller is required.
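To make the role of the sliding mode controller concrete, the sketch below simulates a single revolute joint (a gravity-loaded pendulum standing in for one Lynxmotion joint) tracking a desired trajectory with the classic surface s = ė + λe and a boundary-layer switching term. The model parameters, gains, and trajectory are illustrative assumptions, not the authors' values:

```python
import math

M, B, G = 1.0, 0.1, 9.81       # assumed inertia, viscous friction, gravity term
LAM, K, PHI = 5.0, 10.0, 0.05  # surface slope, switching gain, boundary layer
DT = 0.001                     # integration step (s)

def sat(x):
    """Boundary-layer saturation replacing sign(s) to limit chattering."""
    return max(-1.0, min(1.0, x))

def simulate(t_end=5.0, q0=1.0):
    """Track q_des(t) = 0.5*sin(t); return the final tracking error."""
    q, v, t = q0, 0.0, 0.0
    while t < t_end:
        q_des = 0.5 * math.sin(t)
        v_des = 0.5 * math.cos(t)
        a_des = -0.5 * math.sin(t)
        e, e_dot = q - q_des, v - v_des
        s = e_dot + LAM * e
        # Model-based feedforward plus sliding mode switching term
        tau = M * (a_des - LAM * e_dot) + B * v + G * math.sin(q) - K * sat(s / PHI)
        a = (tau - B * v - G * math.sin(q)) / M  # pendulum dynamics
        v += a * DT
        q += v * DT
        t += DT
    return abs(q - 0.5 * math.sin(t))
```

With an exact model the closed loop reduces to ṡ = -(K/M) sat(s/φ), so s reaches the boundary layer in finite time and the error then decays at rate λ, which is the robustness property the teleoperation loop relies on.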

Figures 15, 16, and 17 show the measured trajectory of each joint together with its desired trajectory, both as functions of time. The illustrated results show the effectiveness of the proposed control approach: the response time is only about one second, which can be regarded as excellent under experimental conditions. This control strategy should nevertheless be improved if we hope to exploit the system in telemedicine applications. Moreover, the developed teleoperation approach can be used in industrial applications, in particular for pick-and-place tasks in hazardous or unstructured environments. The real-time teleoperation system offers a beneficial platform for managing unexpected scenarios.

6. Conclusion

This paper applies gesture control to robot manipulators. In this work we use Kinect technology to establish a human-machine interaction (HMI). The developed control approach can be used in all applications


Figure 15: Desired and measured joint 1 position (amplitude in rad versus time in s).

Figure 16: Desired and measured joint 2 position (amplitude in rad versus time in s).

Figure 17: Desired and measured joint 3 position (amplitude in rad versus time in s).

which require real-time human-robot cooperation. Trajectory tracking and system stability have been ensured by sliding mode control (SMC). Control and supervision are both provided by a user-friendly human-machine interface developed in LabVIEW. Experiments have shown the high efficiency of the proposed approach. Furthermore, the Kinect-based teleoperation system has been validated by augmented reality. In conclusion, the proposed control scheme provides a feasible teleoperation system for many applications, such as manipulation in hazardous and unstructured environments. However, some constraints remain when applying the proposed method, such as the sensitivity of the desired trajectory generated by the human arm to random and unwanted movements. Indeed, the workspace and regional constraints must be respected in the control scheme by filtering out unacceptable trajectories. Moreover, for pick-and-place applications, the grasping task is not considered in the proposed telemanipulation system. In perspective, this drawback can be overcome by using electromyography biosignals to control the grasping task.

Competing Interests

The authors declare that they have no competing interests


Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 6: Research Article Kinect-Based Sliding Mode Control for Lynxmotion Robotic …downloads.hindawi.com/journals/ahci/2016/7921295.pdf · 2019-07-30 · is LabVIEW-based user interface

6 Advances in Human-Computer Interaction

Figure 9 User position checking process

Size (x)

Filtered X

(XXf)Smoothing

Signal

Order

Low cutoff freq

X

Samplingfrequency

Enablesmoothing

Figure 10 Butterworth-based designed filter

Angle in Invalid position

Angle out

AvoidNaN

values

Figure 11 Avoid invalid position sub-VI

is transmitted to the servomotors moving the mechanicalstructure of the robotic arm Moreover the desired jointpositions are applied in the virtual environment to movethe virtual robot in order to prove the compatibility of thedesired motion in both real and virtual environment Hencean augmented reality approach is proposed to validate thecontrol system

The system overview can be presented by Figure 5In our experiment we only use the first three joints and

the remaining joints are not considered Note that the userinterface contains four screens

The first screen is designed to the setting and hardwareconfiguration The screen displays the tracking process statesuch us invalid positions joint positions and 3D coordinatesand user position checking A security algorithm has beendeveloped to secure the tracking process Note that wherethe security program indicates a skeleton detection errorfor one reason or another the last calculated angle (already

validated) of each joint will be sent to the robot until theoperator solves the problem Based on the feedback nodewhich returns the last valid result with a nonnumeric inputthe security algorithm is developed and abstracted in sub-VI for importing in the user interface Figure 11 illustratesthe imported sub-VI In addition the user can also set theArduino board communication such us connection type andbaud rate and Arduino board type and select the Arduinoresource

In the tracking process the operatorrsquos arm can twitchleading to unwanted random movement In order to avoidthis undesired movement we propose a Butterworth lowpass filter-based smoothing signal algorithm In the samescreen the user can enable the smoothing signal algorithmby pressing on the bouton control The designed filter isabstracted in a sub-VI as shown in Figure 10 The filteredtrajectories presented as the desired trajectories in Figures15 16 and 17 demonstrate the effectiveness of the filteringalgorithm

Finally the stop button is set in the developed interface tostop the tracking process and close all connected hardwareThis screen is developed as shown in Figure 6

The second screen presented in Figure 3 is designed torecord the userrsquos video and skeleton display Figure 7 showsthe third screen to prove the augmented reality In this pagewe show the virtual and real environment The user andthe virtual robot will be displayed in the user video portionwhereas the virtual robot will be presented in the virtualenvironment software

The remaining screen describes the joints graphs andoperatorrsquos video recording during the experimentation asshown in Figure 8

After running the operator must select the first screencalled Kinect Feedback to adjust its position A user positionchecking is triggered to secure the operatorrsquos position inthe Kinect sensing field A green color of the signalingLED indicates the good state of the recognition processOtherwise the LED color changes to red to indicate the

Advances in Human-Computer Interaction 7

Figure 12 Augmented reality process

Figure 13 User and skeleton display

problem of tracking Figure 9 describes the executed screenin the user position checking process

Based on Kinect toolkit for LabVIEW the operatorrsquosskeleton data will be read by the LabVIEW interface Thena gesture recognition method is executed to determine eachjoint angle The calculated joint positions are considered asdesired trajectories for the slidingmode control implementedin the developed software Ultimately the computed torquewill be sent to the Atmel-based Arduino card via USBprotocol Therefore the servomotors of the three first joints

will be actuated to move the mechanical structure of theLynxmotion robot

A three-degree of freedom virtual robot is used toperform this experiment Essentially the augmented realityevaluated both the ability of the robot manipulator to copyhuman hand arm motions and the ability of the user touse the human-robot manipulator interface The proposedcontroller is validated by the augmented reality In fact bybuilding a virtual robot by using LabVIEW other than thereal robot as shown in Figure 12 virtual robot movements

8 Advances in Human-Computer Interaction

Figure 14 Gesture control process and real-time graphs

are similar to the real robot Therefore the proposed controlmethod is validated

Figure 13 proves how the user can control themanipulatorrobot based in his detected skeleton

Photographs series depicting gesture control processexecution each five seconds with real-time graphs have beenpresented in Figure 14

In the teleoperation process the tracking error conver-gence is very important and the performance of our system isbased on the tracking error optimization Indeed in the casethat the elementary joint cannot track his similar human armjoint the robotic arm will not be able to perform the desiredtask because of the accumulation of each joint position errorIn this case the end-effector localization is different regardingthe desired placement Therefore the user must adapt thegesture with the current end-effector position to reduce theend-effector placement error Hence the adaptive gesturesestimated by the user require a particular intelligence andsuch expertise In another case when the user applies thedesired gesture with a variable velocity in this situation thecontrol algorithm must respect the applied velocities andaccelerations to conserve the convergence of the tracking

error In order to overcome this constraint the use of highperformance sliding mode controller is highly required

Figures 15 16 and 17 show trajectories of each joint andthe desired trajectory for each Both trajectories are as afunction of timeThe illustrated results show the effectivenessof the proposed control approachThe illustrations have onlyone second as response time In experimental conditionthis time can be regarded as excellent This control strategyshould be improved if we hope to exploit this system intelemedicine applications Moreover the developed teleoper-ation approach can be used for industrial applications in par-ticular for pic and place tasks in hazardous or unstructuredenvironments The real-time teleoperation system offers abenefit platform to manage the unexpected scenarios

Figure 15: Desired and measured joint 1 position (amplitude in rad versus time in s).

Figure 16: Desired and measured joint 2 position (amplitude in rad versus time in s).

Figure 17: Desired and measured joint 3 position (amplitude in rad versus time in s).

6. Conclusion

This paper applies gesture control to robot manipulators. In this work, Kinect technology is used to establish a human-machine interaction (HMI). The developed control approach can be used in any application that requires real-time human-robot cooperation. Trajectory tracking and system stability are ensured by sliding mode control (SMC), while control and supervision are provided by a user-friendly human-machine interface developed in LabVIEW. Experiments have shown the high efficiency of the proposed approach, and the Kinect-based teleoperation system has been validated by augmented reality. In conclusion, the proposed control scheme provides a feasible teleoperation system for many applications, such as manipulation in hazardous and unstructured environments. However, the proposed method has some limitations, notably the sensitivity of the desired trajectory generated by the human arm to random and unwanted movements. The workspace and regional constraints must therefore be respected in the control scheme by filtering out unacceptable trajectories. Moreover, the grasping required by pick-and-place applications is not considered in the proposed telemanipulation system. In future work, this drawback can be overcome by using the electromyography biosignal to control the grasping task.

Competing Interests

The authors declare that they have no competing interests.



Figure 12: Augmented reality process.

Figure 13: User and skeleton display.

problem of tracking. Figure 9 describes the screen displayed during the user position checking process.

Based on the Kinect toolkit for LabVIEW, the operator's skeleton data are read by the LabVIEW interface. A gesture recognition method is then executed to determine each joint angle. The calculated joint positions serve as desired trajectories for the sliding mode controller implemented in the developed software. Finally, the computed torque is sent to the Atmel-based Arduino board via USB, so that the servomotors of the first three joints are actuated to move the mechanical structure of the Lynxmotion robot.
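The gesture recognition step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the skeleton point names, the angle-from-three-points construction, and the toy frame are assumptions, and the serial transmission of the computed torques to the Arduino (e.g. via pyserial) is omitted.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (rad) at skeleton point b between segments b->a and b->c,
    e.g. the elbow angle from shoulder-elbow-wrist coordinates."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cosang, -1.0, 1.0)))

def frame_to_setpoints(skel):
    """Map one skeleton frame (dict of 3-D points) to three desired joint
    angles used as controller references (point naming is illustrative)."""
    return [
        joint_angle(skel["spine"], skel["shoulder"], skel["elbow"]),
        joint_angle(skel["shoulder"], skel["elbow"], skel["wrist"]),
        joint_angle(skel["elbow"], skel["wrist"], skel["hand"]),
    ]

# toy frame: right angles at shoulder and elbow, straight wrist
skel = {"spine": (0, -1, 0), "shoulder": (0, 0, 0), "elbow": (1, 0, 0),
        "wrist": (1, -1, 0), "hand": (1, -2, 0)}
setpoints = frame_to_setpoints(skel)
print(setpoints)  # approximately [pi/2, pi/2, pi]
```

In the running system these setpoints would be recomputed at the Kinect frame rate and fed to the sliding mode control loop as the desired trajectory.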

A three-degree-of-freedom virtual robot is used to perform this experiment. Essentially, the augmented reality evaluates both the ability of the robot manipulator to copy human arm motions and the ability of the user to operate the human-robot interface. In fact, by building a virtual robot in LabVIEW alongside the real robot, as shown in Figure 12, the virtual robot's movements can be compared directly with those of the real robot.


Figure 14: Gesture control process and real-time graphs.

are similar to the real robot Therefore the proposed controlmethod is validated

Figure 13 proves how the user can control themanipulatorrobot based in his detected skeleton

Photographs series depicting gesture control processexecution each five seconds with real-time graphs have beenpresented in Figure 14

In the teleoperation process the tracking error conver-gence is very important and the performance of our system isbased on the tracking error optimization Indeed in the casethat the elementary joint cannot track his similar human armjoint the robotic arm will not be able to perform the desiredtask because of the accumulation of each joint position errorIn this case the end-effector localization is different regardingthe desired placement Therefore the user must adapt thegesture with the current end-effector position to reduce theend-effector placement error Hence the adaptive gesturesestimated by the user require a particular intelligence andsuch expertise In another case when the user applies thedesired gesture with a variable velocity in this situation thecontrol algorithm must respect the applied velocities andaccelerations to conserve the convergence of the tracking

error In order to overcome this constraint the use of highperformance sliding mode controller is highly required

Figures 15 16 and 17 show trajectories of each joint andthe desired trajectory for each Both trajectories are as afunction of timeThe illustrated results show the effectivenessof the proposed control approachThe illustrations have onlyone second as response time In experimental conditionthis time can be regarded as excellent This control strategyshould be improved if we hope to exploit this system intelemedicine applications Moreover the developed teleoper-ation approach can be used for industrial applications in par-ticular for pic and place tasks in hazardous or unstructuredenvironments The real-time teleoperation system offers abenefit platform to manage the unexpected scenarios

6 Conclusion

This paper applies the gesture control to the robot manip-ulators Indeed in this work we use the Kinect technol-ogy to establish a human-machine interaction (HMI) Thedeveloped control approach can be used in all applications

Advances in Human-Computer Interaction 9

Desired joint 1 positionMeasured joint 1 position

200100 15050 250 300 350 40000Time (s)

040608

11214

Am

plitu

de (r

ads)

Figure 15 Desired and measured joint 1 position

50 100 150 200 250 300 350 40000Time (s)

002040608

112

Am

plitu

de (r

ads) Desired joint 2 position

Measured joint 2 position

Figure 16 Desired and measured joint 2 position

50 100 150 200 250 300 350 40000Time (s)

012345

Am

plitu

de (r

ads)

Desired joint 3 positionMeasured joint 3 position

Figure 17 Desired and measured joint 3 position

which require real-time human-robot cooperation Trajec-tory tracking and system stability have been ensured bysliding mode control (SMC) Control and supervision bothhave been provided by a high-friendly human-machine inter-face developed in LabVIEW Experiments have shown thehigh efficiency of the proposed approach Furthermore theKinect-based teleoperation system is validated by augmentedreality In conclusion the proposed control scheme providesa feasible teleoperation system for many applications such asmanipulating in hazardous and unstructured environmentHowever there are some constraints when applying theproposed method such as the sensibility of the desiredtrajectory generated by the human arm even in case ofrandom and unwanted movements Indeed the work spaceand the regional constraints must be respected in the controlschemes by filtering the unaccepted trajectories Moreoverfor pic and place applications grasping task is not consideredin the proposed telemanipulation system In perspective thisdrawback can be overcome by using the electromyographybiosignal for grasping task control

Competing Interests

The authors declare that they have no competing interests

References

[1] Y Bouteraa J Ghommam N Derbel and G Poisson ldquoNonlin-ear control and synchronization with time delays of multiagentrobotic systemsrdquo Journal of Control Science and Engineering vol2011 Article ID 632374 9 pages 2011

[2] Y Bouteraa J Ghommam G Poisson and N Derbel ldquoDis-tributed synchronization control to trajectory tracking of mul-tiple robot manipulatorsrdquo Journal of Robotics vol 2011 ArticleID 652785 10 pages 2011

[3] G A M Vasiljevic L C de Miranda and E E C de MirandaldquoA case study ofmastermind chess comparingmousekeyboardinteraction with kinect-based gestural interfacerdquo Advances inHuman-Computer Interaction vol 2016 Article ID 4602471 10pages 2016

[4] C-H Tsai Y-H Kuo K-C Chu and J-C Yen ldquoDevelop-ment and evaluation of game-based learning system using themicrosoft kinect sensorrdquo International Journal of DistributedSensor Networks vol 2015 Article ID 498560 10 pages 2015

[5] C-H Chuang Y-N Chen L-W Tsai C-C Lee andH-C TsaildquoImproving learning performance with happiness by interactivescenariosrdquo The Scientific World Journal vol 2014 Article ID807347 12 pages 2014

[6] K J Bower J Louie Y Landesrocha P Seedy A Gorelik and JBernhardt ldquoClinical feasibility of interactive motion-controlledgames for stroke rehabilitationrdquo Journal of NeuroEngineeringand Rehabilitation vol 12 no 1 article 63 2015

[7] M Strbac S Kocovic M Markovic and D B PopovicldquoMicrosoft kinect-based artificial perception system for controlof functional electrical stimulation assisted graspingrdquo BioMedResearch International vol 2014 Article ID 740469 12 pages2014

[8] N Kanwal E Bostanci K Currie andA F Clark ldquoA navigationsystem for the visually impaired a fusion of vision and depthsensorrdquo Applied Bionics and Biomechanics vol 2015 Article ID479857 16 pages 2015

[9] H-H Pham T-L Le and N Vuillerme ldquoReal-time obsta-cle detection system in indoor environment for the visuallyimpaired using microsoft kinect sensorrdquo Journal of Sensors vol2016 Article ID 3754918 13 pages 2016

[10] D Tuvshinjargal B Dorj and D J Lee ldquoHybrid motion plan-ning method for autonomous robots using kinect based sensorfusion and virtual plane approach in dynamic environmentsrdquoJournal of Sensors vol 2015 Article ID 471052 13 pages 2015

[11] T Xu S Jia Z Dong and X Li ldquoObstacles regions 3D-perception method for mobile robots based on visual saliencyrdquoJournal of Robotics vol 2015 Article ID 720174 10 pages 2015

[12] W Shang X Cao H Ma H Zang and P Wei ldquoKinect-based vision system of mine rescue robot for low illuminousenvironmentrdquo Journal of Sensors vol 2016 Article ID 82520159 pages 2016

[13] H Kim and I Kim ldquoDynamic arm gesture recognition usingspherical angle features and hidden markov modelsrdquo Advancesin Human-Computer Interaction vol 2015 Article ID 785349 7pages 2015

[14] A Chavez-Aragon R Macknojia P Payeur and R LaganiereldquoRapid 3D modeling and parts recognition on automotivevehicles using a network of RGB-D sensors for robot guidancerdquoJournal of Sensors vol 2013 Article ID 832963 16 pages 2013

[15] A Prochazka O Vysata M Valis and R Yadollahi ldquoThe MSkinect use for 3d modelling and gait analysis in the Matlab

10 Advances in Human-Computer Interaction

environmentrdquo in Proceedings of the 21th Annual ConferenceTechnical Computing 2013 Prague Czech Republic 2013

[16] B Y L Li A S Mian W Liu and A Krishna ldquoUsingKinect for face recognition under varying poses expressionsillumination and disguiserdquo in Proceedings of the IEEEWorkshopon Applications of Computer Vision (WACV rsquo13) pp 186ndash192Tampa Fla USA January 2013

[17] C Muhiddin D B Phillips M J Miles L Picco and D MCarberry ldquoKinect 4 holographic optical tweezersrdquo Journalof Optics vol 15 no 7 Article ID 075302 2013

[18] P Qiu C Ni J Zhang and H Cao ldquoResearch of virtualsimulation of traditional Chinese bone setting techniques basedon kinect skeletal informationrdquo Journal of Shandong Universityof Traditional Chinese Medicine no 3 pp 202ndash204 2014

[19] R Ibanez A Soria A Teyseyre and M Campo ldquoEasy gesturerecognition for Kinectrdquo Advances in Engineering Software vol76 pp 171ndash180 2014

[20] I-J Ding and C-W Chang ldquoAn eigenspace-based method witha user adaptation scheme for human gesture recognition byusing Kinect 3D datardquo Applied Mathematical Modelling vol 39no 19 pp 5769ndash5777 2015

[21] B Galna G Barry D Jackson D Mhiripiri P Olivier andL Rochester ldquoAccuracy of the Microsoft Kinect sensor formeasuring movement in people with Parkinsonrsquos diseaserdquo Gaitamp Posture vol 39 no 4 pp 1062ndash1068 2014

[22] K Qian H Yang and J Niu ldquoDeveloping a gesture basedremote human-robot interaction system using kinectrdquo Interna-tional Journal of Smart Home vol 7 no 4 pp 203ndash208 2013

[23] S Shirwalkar A Singh K Sharma and N Singh ldquoTelemanip-ulation of an industrial robotic arm using gesture recognitionwithKinectrdquo inProceedings of the IEEE International ConferenceonControl Automation Robotics and Embedded Systems (CARErsquo13) Jabalpur India December 2013

[24] G Du and P Zhang ldquoMarkerless humanndashrobot interface fordual robot manipulators using Kinect sensorrdquo Robotics andComputer-Integrated Manufacturing vol 30 no 2 pp 150ndash1592014

[25] A Hernansanz A Casals and J Amat ldquoAmulti-robot coopera-tion strategy for dexterous task oriented teleoperationrdquoRoboticsand Autonomous Systems vol 68 pp 156ndash172 2015

[26] F Terrence F Conti G Sebastien and B Charles ldquonovelinterfaces for remote driving gesture haptic and PDArdquo in SPIETelemanipulator and Telepresence Technologies VII vol 4195 pp300ndash311 2000

[27] E Ueda Y Matsumoto M Imai and T Ogasawara ldquoA hand-pose estimation for vision-based human interfacesrdquo IEEETransactions on Industrial Electronics vol 50 no 4 pp 676ndash684 2003

[28] J Zhang and A Knoll ldquoA two-arm situated artificial commu-nicator for humanndashrobot cooperative assemblyrdquo IEEE Transac-tions on Industrial Electronics vol 50 no 4 pp 651ndash658 2003

[29] B Ionescu D Coquin P Lambert and V Buzuloiu ldquoDynamichand gesture recognition using the skeleton of the handrdquoEURASIP Journal on Applied Signal Processing vol 13 pp 2101ndash2109 2005

[30] Y Kim S Leonard A Shademan A Krieger and P C W KimldquoKinect technology for hand tracking control of surgical robotstechnical and surgical skill comparison to current roboticmastersrdquo Surgical Endoscopy vol 28 no 6 pp 1993ndash2000 2014

[31] J Kofman S Verma and X Wu ldquoRobot-manipulator teleop-eration by markerless vision-based hand-arm trackingrdquo Inter-national Journal of Optomechatronics vol 1 no 3 pp 331ndash3572007

[32] R C Luo B-H Shih and T-W Lin ldquoReal time humanmotion imitation of anthropomorphic dual arm robot based onCartesian impedance controlrdquo in Proceedings of the 11th IEEEInternational Symposium on Robotic and Sensors Environments(ROSE rsquo13) pp 25ndash30 Washington DC USA October 2013

[33] R Afthoni A Rizal and E Susanto ldquoProportional deriva-tive control based robot arm system using microsoft kinectrdquoin Proceedings of the International Conference on RoboticsBiomimetics Intelligent Computational Systems (ROBIONETICSrsquo13) pp 24ndash29 IEEE Jogjakarta Indonesia November 2013

[34] M Al-Shabi ldquoSimulation and implementation of real-timevision-based control system for 2-DoF robotic arm using PIDwith hardware-in-the-looprdquo Intelligent Control andAutomationvol 6 no 2 pp 147ndash157 2015

[35] I Benabdallah Y Bouteraa R Boucetta and C Rekik ldquoKinect-based Computed Torque Control for lynxmotion robotic armrdquoin Proceedings of the 7th International Conference on ModellingIdentification and Control (ICMIC rsquo15) pp 1ndash6 Sousse TunisiaDecember 2015

[36] H Mehdi and O Boubaker ldquoRobot-assisted therapy designcontrol and optimizationrdquo International Journal on Smart Sens-ing and Intelligent Systems vol 5 no 4 pp 1044ndash1062 2012

[37] I Ben Aabdallah Y Bouteraa and C Rekik ldquoDesign of smartrobot for wrist rehabilitationrdquo International journal of smartsensing and intelligent systems vol 9 no 2 2016

[38] C Pillajo and J E Sierra ldquoHuman machine interface HMIusing Kinect sensor to control a SCARA robotrdquo in Proceedingsof the IEEE Colombian Conference on Communications andComputing (COLCOM rsquo13) pp 1ndash5 Medellin Colo USA May2013

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 8: Research Article Kinect-Based Sliding Mode Control for Lynxmotion Robotic …downloads.hindawi.com/journals/ahci/2016/7921295.pdf · 2019-07-30 · is LabVIEW-based user interface

8 Advances in Human-Computer Interaction

Figure 14 Gesture control process and real-time graphs

are similar to the real robot Therefore the proposed controlmethod is validated

Figure 13 proves how the user can control themanipulatorrobot based in his detected skeleton

Photographs series depicting gesture control processexecution each five seconds with real-time graphs have beenpresented in Figure 14

In the teleoperation process the tracking error conver-gence is very important and the performance of our system isbased on the tracking error optimization Indeed in the casethat the elementary joint cannot track his similar human armjoint the robotic arm will not be able to perform the desiredtask because of the accumulation of each joint position errorIn this case the end-effector localization is different regardingthe desired placement Therefore the user must adapt thegesture with the current end-effector position to reduce theend-effector placement error Hence the adaptive gesturesestimated by the user require a particular intelligence andsuch expertise In another case when the user applies thedesired gesture with a variable velocity in this situation thecontrol algorithm must respect the applied velocities andaccelerations to conserve the convergence of the tracking

error In order to overcome this constraint the use of highperformance sliding mode controller is highly required

Figures 15 16 and 17 show trajectories of each joint andthe desired trajectory for each Both trajectories are as afunction of timeThe illustrated results show the effectivenessof the proposed control approachThe illustrations have onlyone second as response time In experimental conditionthis time can be regarded as excellent This control strategyshould be improved if we hope to exploit this system intelemedicine applications Moreover the developed teleoper-ation approach can be used for industrial applications in par-ticular for pic and place tasks in hazardous or unstructuredenvironments The real-time teleoperation system offers abenefit platform to manage the unexpected scenarios

6 Conclusion

This paper applies the gesture control to the robot manip-ulators Indeed in this work we use the Kinect technol-ogy to establish a human-machine interaction (HMI) Thedeveloped control approach can be used in all applications

Advances in Human-Computer Interaction 9

Desired joint 1 positionMeasured joint 1 position

200100 15050 250 300 350 40000Time (s)

040608

11214

Am

plitu

de (r

ads)

Figure 15 Desired and measured joint 1 position

50 100 150 200 250 300 350 40000Time (s)

002040608

112

Am

plitu

de (r

ads) Desired joint 2 position

Measured joint 2 position

Figure 16 Desired and measured joint 2 position

50 100 150 200 250 300 350 40000Time (s)

012345

Am

plitu

de (r

ads)

Desired joint 3 positionMeasured joint 3 position

Figure 17 Desired and measured joint 3 position

which require real-time human-robot cooperation Trajec-tory tracking and system stability have been ensured bysliding mode control (SMC) Control and supervision bothhave been provided by a high-friendly human-machine inter-face developed in LabVIEW Experiments have shown thehigh efficiency of the proposed approach Furthermore theKinect-based teleoperation system is validated by augmentedreality In conclusion the proposed control scheme providesa feasible teleoperation system for many applications such asmanipulating in hazardous and unstructured environmentHowever there are some constraints when applying theproposed method such as the sensibility of the desiredtrajectory generated by the human arm even in case ofrandom and unwanted movements Indeed the work spaceand the regional constraints must be respected in the controlschemes by filtering the unaccepted trajectories Moreoverfor pic and place applications grasping task is not consideredin the proposed telemanipulation system In perspective thisdrawback can be overcome by using the electromyographybiosignal for grasping task control

Competing Interests

The authors declare that they have no competing interests.

