
Computers and Electrical Engineering xxx (2012) xxx–xxx

Contents lists available at SciVerse ScienceDirect

Computers and Electrical Engineering

journal homepage: www.elsevier.com/locate/compeleceng

Bent fingers’ angle calculation using supervised ANN to control electro-mechanical robotic hand

Ankit Chaudhary a,*, J.L. Raheja b

a Computer Vision Research Group, Dept. of Computer Science, BITS, Pilani, India
b Machine Vision Lab, Digital Systems Group, Central Electronics Engineering Research Institute (CEERI)/Council of Scientific and Industrial Research (CSIR), Pilani, India


Article history: Available online xxxx

0045-7906/$ - see front matter © 2012 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.compeleceng.2012.07.012

Reviews processed and recommended for publication to Editor-in-Chief by Guest Editor Dr. Sabu M. Thampi. A preliminary part of this work has been published in [1].

* Corresponding author. E-mail addresses: [email protected] (A. Chaudhary), [email protected] (J.L. Raheja).

Please cite this article in press as: Chaudhary A, Raheja JL. Bent fingers’ angle calculation using supervised ANN to control electro-mechanical robotic hand. Comput Electr Eng (2012), http://dx.doi.org/10.1016/j.compeleceng.2012.07.012

The shape of the human hand is such that it can perform many tedious tasks easily. It can reach narrow places and can perform difficult operations. It can bend its fingers at different angles to pick or hold different objects, and apply force via the fingers or palm area. It is very helpful in many difficult applications. However, there is risk of injury to the human hand, or even to life, in dangerous operations. It is not advisable to gamble with human body parts in applications like land mine removal. Hence, there is a need for a robotic hand which can perform the same operations as a human hand does, in real time. This paper discusses a vision-based technique for controlling a robotic hand which has human-hand-like joints in its fingers. The user has to show a gesture to the system with a bare hand, without any limitation on hand direction, and the robotic hand mimics that gesture. The positions of the human hand’s fingers were calculated using a supervised Artificial Neural Network. Pre-processing made the whole algorithm faster by cropping the region of interest from the input image frame. The gesture was extracted from the input image, and the fingertips and centre of palm were detected. The animated simulation of the robotic hand is done in Blender® software.

© 2012 Elsevier Ltd. All rights reserved.

1. Introduction

Human hands play an important role in the execution of many important tasks in everyday life as well as for special purposes. There are many tasks that human hands can perform far more efficiently than a machine shaft because of their degrees of freedom and their ability to bend fingers at different angles. However, in dangerous applications the life of the operator is always at risk. In situations like bomb detection and defusal, land mine removal and many others, humans currently perform tasks with only a few safeguards, and casualties occur occasionally. To save these lives, there is an urgent need for a robotic hand which can operate the same way as the human hand. The shape of a robotic hand should match the human hand so that it can be controlled by human hand gestures. It should have joints in the fingers which it can bend, like a human, in interaction mode. In addition, the robotic hand is required to perform these operations in real time. This paper discusses a novel method to find the angle values of bent fingers from a hand gesture in real time. The gesture was captured from live human actions using a simple webcam.

In the last few decades researchers have done a significant amount of research in the area of hand gesture recognition. Most of them used a wired glove in which sensors were planted, or they used colours on the fingers to recognise the gesture



clearly in the image. This work requires no sensors, colours or paper clips to detect the fingers in the captured image frame. The obvious expectation from such a system is that the detection of moving fingers should be fast, robust, and able to operate in real time. Processing time is a very critical factor in real-time applications and depends on several factors. This vision-based system, which can be used to control a remotely located robotic hand, sends signals to the robotic hand to mimic human hand positions in real time to perform critical operations. A user has to bend his fingers to hold objects (virtually) and the robotic hand will do the same to hold the actual objects at its location. The presented system captures image frames in 2D and applies a skin filter and segmentation method to them. This yields the region of interest (ROI) even if there are skin-colour-like objects in the gesture background. It detects the fingertips and centre of palm in the ROI, calculates the distances between each fingertip and the centre of palm, and classifies them with a supervised ANN. The block diagram of the proposed system is presented in Fig. 1.

2. Related work

The current state of the art in hand gesture recognition techniques and fingers’ angle calculation can be found in [2,3]. It is clear from the existing scientific literature that very little work has been done in the area of bent fingers’ angle calculation. Nolker and Ritter [4] presented GREFIT, where they focus on a large number of 3D hand postures. They used the fingertips in hands as a natural determinant of hand posture to reconstruct the image. Their technique processes grayscale images of 192 × 144 resolution. They used an ANN-based layered approach to detect fingertips. After obtaining the fingertip vectors, these are transformed into finger joint angles for an articulated hand model. A separate network was trained for each finger on the same feature vectors, with a 35-dimensional input space, while the output dimension was only 2. Lee and Park [5] used a Hidden Markov Model (HMM) for gesture recognition using shape features. The gesture state was determined after stabilizing the image component as open fingers in consecutive frames. They also used a maxima–minima approach and an FSM for constructing the hand image. Vizireanu and Udrea [6] used a reference frame for MST-based morphology on grayscale and binary images. They [7] also presented shape decomposition using mathematical morphology. Raheja et al. [8] used orientation feature vectors to identify gestures in varying light conditions.

Wang and Mori [9] proposed a powerful optical-flow-based approach for human action recognition using learning models. Their technique also labels hidden parts in the image. The max-margin-based algorithm can be applied to gesture recognition. Kim and Fellner [10] used a learning model in their system for dynamic gesture recognition. Sigalas et al. [11] used multi-layered perceptron neural network classifiers to recognise arm gestures, which would help in controlling robots placed in public places. Delden and Umrysh [12] have shown in their experimental setup a technique through which a robot can identify the pointing direction of the index finger. This was a small setup, and its constraints allow it to work only in a laboratory space. Bergh et al. [13] used a KINECT® to pass commands to their interactive robot using hand gestures. Although they pass commands by finger pointing, which is also direction invariant, their work requires much specialised hardware to detect legs and humans, which does not support natural computing.

Pateraki et al. [14] tried to find the direction pointed out to a robot using hand gesture and head position. They did this with only one camera installed in the room, but the assumption was that the robot has prior information about its surroundings, which does not seem practical in a real scenario. Zhang et al. [15] used learned features for hand gesture recognition to control an intelligent wheelchair. This system has good real-time performance. Salem et al. [16] used a multi-modal controlling technique to interact with the Honda humanoid robot, where a study was performed on how humans understand the robot’s gestures. Tran and Trivedi [17] used 3D tracking of the upper body with multiple cameras and tracked posture–gesture at the joints. This was done to track gesture changes by the user. Ahsan et al. [18] did ANN-based gesture recognition where the gesture was described using EMG signals. Gosselin et al. [19] demonstrated a robotic hand with 15 degrees of freedom and single-degree actuation. Many other applications of real-time control in human–computer interaction can be found in the literature. Some of these are computer game control [20], indoor-robot-based games [21], human–robot interaction [22] and recognition of different sign languages [23,24].

Fig. 1. Block diagram of the angle calculation system.



3. Pre-processing and fingertips detection

The presented system uses a simple webcam connected to a PC running Windows XP® to capture video of a human hand. The frame capture frequency is configurable; currently the system takes 6 frames per second, which gives the feel of real-time video input and can be used to calculate the fingers’ position angles in different gestures. There are no restrictions requiring the user to use any kind of sensor or wired device. It is not even mandatory to wear a full-sleeved shirt. The user is free to show a hand gesture in any direction to the camera with a bare hand, parallel to the webcam. The hand can be at any angle, in any direction from the camera. In gesture recognition, colour-based methods are applicable because of the characteristic colour footprint of human skin. This colour footprint is usually more distinctive and less lighting-sensitive in HSV than in the standard (camera capture) RGB colour space. The system extracts the ROI from the input image frames. Then it detects the fingertips and the centre of palm in the extracted gesture image. Finally, it measures the distance between the centre of palm and the fingertips in pixels, and this distance is translated into angle values for each finger with the help of the ANN.

3.1. ROI extraction

First of all, the system has to find the ROI. This was implemented by applying a skin filter to the image. An HSV-colour-model-based skin filter was applied to the RGB-format image to reduce lighting effects. The resultant frame is pre-processed, and then BLOB analysis is applied to the image to obtain a smoothed, filtered image. The details of the pre-processing of the captured image sequences are discussed in [25]. The hand direction was detected based on pixel intensities in the image. The direction helps in determining the fingertips. The frame is then cropped to the ROI boundaries to make further processing faster. After cropping, most of the time only half of the image needs further processing.
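The HSV skin-filter step above can be sketched as follows. This is an illustrative reconstruction, not the authors’ code: the hue/saturation/value thresholds are assumptions drawn from common skin-colour heuristics, and the BLOB smoothing pass is omitted.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorized RGB (floats in [0, 1]) -> (H, S, V), with H in degrees [0, 360)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc = np.maximum(np.maximum(r, g), b)
    minc = np.minimum(np.minimum(r, g), b)
    delta = maxc - minc
    h = np.zeros_like(maxc)
    nz = delta > 0
    rm = nz & (maxc == r)                # red is the max channel
    gm = nz & (maxc == g) & ~rm         # green is the max channel
    bm = nz & (maxc == b) & ~rm & ~gm   # blue is the max channel
    h[rm] = ((g[rm] - b[rm]) / delta[rm]) % 6.0
    h[gm] = (b[gm] - r[gm]) / delta[gm] + 2.0
    h[bm] = (r[bm] - g[bm]) / delta[bm] + 4.0
    h *= 60.0
    s = np.where(maxc > 0, delta / np.where(maxc > 0, maxc, 1.0), 0.0)
    return h, s, maxc

def skin_mask(img):
    """Binary skin map; the threshold values are illustrative, not from the paper."""
    h, s, v = rgb_to_hsv(img)
    return (h < 50.0) & (s > 0.10) & (s < 0.75) & (v > 0.35)
```

A BLOB (connected-component) pass would then keep the largest skin region as the hand, discarding small skin-coloured objects in the background.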

Frame cropping can be formulated mathematically as shown below.


$$
im_{crop}(X, Y) =
\begin{cases}
Orig_{image}(X, Y), & X_{min} < X < X_{max},\ Y_{min} < Y < Y_{max} \\
0, & \text{elsewhere}
\end{cases}
$$

where im_crop represents the cropped image and X_min, Y_min, X_max and Y_max represent the boundary of the hand image in different directions. A few results of frame cropping for different inputs are presented in Fig. 2, which in turn shows the robustness of this method. In all the histograms shown in Fig. 2, it is clear that at the wrist point a steep inclination starts in the scanning direction, which was taken as the point at which to cut the image on continuous skin pixels.
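Under the formula above, cropping reduces to taking the bounding box of the "on" skin pixels. A minimal numpy sketch (the helper name is hypothetical):

```python
import numpy as np

def crop_roi(binary_hand):
    """Return im_crop: the bounding box X_min..X_max, Y_min..Y_max of the
    non-zero (skin) pixels, as in the cropping formula."""
    ys, xs = np.nonzero(binary_hand)
    if xs.size == 0:          # no hand pixels: nothing to crop
        return binary_hand
    return binary_hand[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
```

Because all later steps run on the cropped array only, the cost of fingertip and COP detection scales with the hand region rather than the full frame.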

3.2. Fingertips and centre of palm detection

Fingertips have been used in different systems for different purposes. Many researchers have used fingertips because they help in determining the hand position in the frame. In our approach the hand direction is already known, so we can multiply the pixel values from wrist to fingertips by an increasing factor. At the tips the value of the factor would be 255, so they come out as white. The mathematical expression for fingertip detection would be the following:

$$
Finger_{edge}(x, y) =
\begin{cases}
1 & \text{if } modified_{image}(x, y) = 255 \\
0 & \text{otherwise}
\end{cases}
$$

Here Finger_edge gives the fingertips. The centre of palm (COP) is detected automatically by applying a mask of dimension 30 × 30 to the cropped ROI image and counting the number of on pixels lying within the mask. This process was made faster using a summed-area table of the cropped binary image for calculating the masked values. The fingertip and COP detection results are shown in Fig. 3, where it is clearly visible that all fingertips and the centre of palm were detected irrespective of the hand direction.
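The COP search can be sketched with an integral (summed-area) table, which makes each 30 × 30 window count a constant-time operation of four table lookups. This is an illustrative reimplementation, not the authors’ MATLAB code:

```python
import numpy as np

def centre_of_palm(binary_hand, win=30):
    """COP = centre of the win x win window containing the most 'on' pixels."""
    img = binary_hand.astype(np.int64)
    # summed-area table with a zero row/column prepended
    sat = np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)
    h, w = img.shape
    best, cop = -1, (0, 0)
    for y in range(h - win + 1):
        for x in range(w - win + 1):
            # on-pixel count inside the window, from four table lookups
            count = (sat[y + win, x + win] - sat[y, x + win]
                     - sat[y + win, x] + sat[y, x])
            if count > best:
                best, cop = count, (y + win // 2, x + win // 2)
    return cop
```

Building the table once costs one pass over the image, after which every candidate window is scored in O(1) instead of O(win²).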

4. Neural network implementation

After the detection of the fingertips and centre of palm, we need to identify the fingers’ positions to calculate the angles. A supervised ANN was implemented using the Levenberg–Marquardt algorithm [26] with 8000 data samples covering all fingers in different positions, and the ANN was trained for 1000 iterations. The ANN architecture has five inputs for the five finger positions and five outputs for the angle of each finger, as shown in Fig. 4.

We tried different architectures and compared their performance, as shown in Table 1. It was observed that for the 6th combination in Table 1 the test error was minimum while the fitness value was maximum. The chosen ANN design includes two hidden layers and 19 neurons for processing.

The Levenberg–Marquardt algorithm was designed to approach second-order training speed without having to compute the Hessian matrix. When the performance function has the form of a sum of squares (as is typical in training feed-forward networks), the Hessian matrix can be approximated as

$$H = J^{T}J$$


Fig. 2. Results of ROI cropping from input sequence.


The gradient can be computed as


$$g = J^{T}e$$

where J is the Jacobian matrix containing the first derivatives of the network errors with respect to the weights and biases, and e is a vector of network errors. The Jacobian matrix can be computed through a standard backpropagation technique, which is much less complex than computing the Hessian matrix. The Levenberg–Marquardt algorithm applies this approximation [26] to the Hessian matrix in the following update:

$$x_{k+1} = x_k - [J^{T}J + \mu I]^{-1} J^{T} e$$

When the scalar $\mu$ is zero, this is just Newton’s method using the approximate Hessian matrix. When $\mu$ is large, this becomes gradient descent with a small step size.

Newton’s method is faster and more accurate near an error minimum, so the aim is to shift toward Newton’s method as quickly as possible. Thus $\mu$ is decreased after each successful step (reduction in the performance function) and is increased only when a tentative step would increase the performance function. Hence, the performance function is reduced at every iteration of the algorithm. This algorithm appears to be the fastest method for training moderate-sized feed-forward neural networks (up to several hundred weights). The training state during the iterations and the data validation for this system are shown in Figs. 5 and 6 respectively.
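A single update of the rule above can be sketched in a few lines. The toy residual and Jacobian functions below are hypothetical stand-ins for the network’s error vector and its Jacobian; they are not part of the paper’s system.

```python
import numpy as np

def lm_step(x, jac, res, mu):
    """One Levenberg-Marquardt update: x - (J^T J + mu*I)^{-1} J^T e."""
    J, e = jac(x), res(x)
    H = J.T @ J + mu * np.eye(J.shape[1])   # damped Hessian approximation
    return x - np.linalg.solve(H, J.T @ e)

# Toy problem: fit y = a*x to the points (1,2), (2,4), (3,6); true a = 2.
xd = np.array([1.0, 2.0, 3.0])
yd = np.array([2.0, 4.0, 6.0])
res = lambda a: a[0] * xd - yd          # "network errors" e
jac = lambda a: xd.reshape(-1, 1)       # d e / d a
a = lm_step(np.array([0.0]), jac, res, mu=1e-9)
```

With a small $\mu$ the step is essentially a Newton step on the approximate Hessian, so this linear toy problem converges in one update; a large $\mu$ would instead take a short gradient-descent step.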


Fig. 3. Results of fingertips and centre of palm detection in image sequence.

Fig. 4. ANN architecture.

A. Chaudhary, J.L. Raheja / Computers and Electrical Engineering xxx (2012) xxx–xxx 5

The input data comprises the distances from the COP to the fingertips for all fingers, at different angles, in pixels. Table 2 shows the values of one random test for all distances. The mean squared error of the ANN is of the order of 10^-12, as shown in Fig. 7.


Table 1. Architecture comparison for the ANN.

ID  Architecture  Fitness   Train error  Validation error  Test error
1   [5-1-5]       0.007762  111.6609     123.3729          128.8286
2   [5-9-5]       0.029136  21.07543     25.35231          34.32153
3   [5-14-5]      0.030816  17.03908     26.58807          32.45031
4   [5-17-5]      0.028309  22.56162     26.80747          35.32473
5   [5-18-5]      0.031086  20.17483     25.85577          32.1686
6   [5-19-5]      0.037425  12.60105     22.93995          26.71978
7   [5-20-5]      0.034877  12.1308      25.62003          28.67238
8   [5-21-5]      0.034308  12.13067     24.02896          29.14752
9   [5-23-5]      0.03166   14.48353     22.33495          31.5859

Fig. 5. Training state using 1000 iterations.


5. System description

The presented system was developed on a PC running Windows XP® on an i5 processor. A Logitech webcam was used to capture video at a resolution of 160 × 120, and images were stored in RGB format. MATLAB® was used for system development. The captured image frames are processed internally, and the system takes the reference distance (RD) from the first frame that captures the hand, where the user has to show a straight hand without bending any fingers. This RD works throughout the session as a static value. If the hand disappears from the camera view, the system discards the RD and takes a new RD in the same session, so that different users can work within one session. Now, if the user bends the fingers, the new distance (ND) is calculated from the COP. The ratio of ND to RD is sent to the ANN to match against the training


Fig. 6. Data validation state graph.

Table 2. Distances from centre of palm to each fingertip, in pixels.

Angle (°)  Index  Middle  Ring  Little  Thumb
0          81     87      82    75      57
15         79     84      79    70      53
30         77     80      76    67      50
45         73     72      68    64      45
60         60     63      61    60      40


data, and the ANN results in the corresponding angles for all five fingers after comparing the stored RDs and NDs. The output vs. training graph for the ANN is presented in Fig. 8, which shows a perfect match of the output with the training data.
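The ND/RD normalisation described above can be sketched as below. The function name is illustrative, and `reference` stands for the straight-hand distance per finger taken from the first frame; the ANN itself is not reproduced here.

```python
import numpy as np

def ann_input(fingertips, cop, reference):
    """Per-finger ND/RD ratios: the feature vector sent to the ANN.
    fingertips: (n, 2) pixel coordinates; cop: (2,); reference: (n,) RDs."""
    nd = np.linalg.norm(np.asarray(fingertips, float) - np.asarray(cop, float),
                        axis=1)               # new distances, in pixels
    return nd / np.asarray(reference, float)  # scale-free ratios
```

With the straight-hand distances of Table 2 (angle 0°) as the RDs, a fully open hand maps to ratios of 1.0 and bent fingers to ratios below 1, which keeps the ANN input independent of the hand’s distance from the camera.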

6. Experimental results

The response time of the system to display the angles for one gesture is 0.001 s, which is very satisfactory and close to real time. Fig. 9 presents a few snapshots of the angle calculation system. The five values at the top of the window in each frame of Fig. 9 show the angles of the fingers in the sequence in which they appear in the input image.

The system takes live input from the user and calculates the angles in real time. The calculated angle values can be fed to electro-mechanical robotic hand controlling software so that it can perform the same operation there. This research was focused on the Shadow Dexterous Hand as the target machine. The simulation of the robotic hand has been performed in Blender® software, where the hand moves based on the values provided by our system. Fig. 10 shows a few results of this animated simulation, where the fingers change their positions based on input values calculated from the human gesture performed in front of the camera.

6.1. Comparisons with existing work

In the past many researchers have tried to develop this kind of device to mimic hand operation, but they used special hardware or sensors to build their models. Lee et al. [27] tried to measure joint angles for robot teleoperation with 10 DOF for the hand, where a slave robot was controlled using the human arm and a sensor glove. The whole arm was part of a humanoid robot, and it took 2 μs to control the arm. Ye and Li [28] developed a bionic hand where the fingers were controlled using air cylinders; they tried to move the fingers by applying force on different knuckles. Jeong et al. [29] developed a glove for the right hand to recognise hand gestures with 5 DoF. It could measure the five fingers’ bending angles from voltage differences within a time frame. Cheng et al. [30] present a vision-based robotic arm controlling technique with 90% precision for a home robot.


Fig. 7. Mean squared error in the ANN.

Fig. 8. Output vs training graph.


Researchers [27–29] have developed gloves, which do not provide natural computing, and only Ye and Li [28] tried to calculate fingers’ angles, and only for one hand. Cheng et al. [30] developed a natural method, but their home robot consisted of only the arm; the end of the arm was not like a human hand. Therefore, in comparison with existing work, our method gives real-time performance with more robustness and without any help from a glove or extra device. Also, this is a


Fig. 9. Results of fingers bending angle computation.

Fig. 10. Finger bending angles computed through neural network.


novel approach to controlling the robotic hand using the natural hand. We apply vision-based methods to control a robotic hand that has the same structure as a human hand, whereas other researchers have tried to control only clips on a robotic arm. The performance of the system is expected to improve further after implementation of this method in hardware.



7. Conclusion

This paper has presented a new research direction in gesture recognition techniques, as very little work has been done on closed/bent finger detection. Fingers’ angle calculation from hand gestures is a new direction of research and would be very helpful in controlling many applications and devices. This work contributes a natural approach to controlling a robotic hand without using any keyboard or GUI software, and it operates in real time. These kinds of robots can help in fields like medical surgery, military operations, rescue operations and many other applications.

The presented system is able to detect the fingertips during real-time movements of the hand, and it is direction invariant. The gesture was extracted even from a complex background, and cropping of the image was performed to keep only the ROI, which made the algorithm much faster. The user shows his hand to the camera in any direction, and the fingertips and centre of palm are detected in the captured image sequences. The bending angles of the fingers were calculated using the ANN by performing the gesture in front of the camera, without wearing any gloves or markers. This system provides an easy way to control the robot’s fingers by showing the hand, and hence there is no need to feed in numbers to bend the fingers at particular angles. The results are satisfactory, and this technology can be used to save human lives in a number of applications.

Acknowledgements

This research was carried out at the Machine Vision Lab, Central Electronics Engineering Research Institute (CEERI/CSIR), a Govt. of India laboratory, Pilani, India. This research was part of our current project on controlling an electro-mechanical hand using hand gestures. The authors would like to thank the Director, CEERI Pilani for providing research facilities and for his active encouragement and support.

References

[1] Chaudhary A, Raheja JL, Singal K, Raheja S. An ANN based approach to calculate robotic fingers positions. In: Advances in computing and communications, CCIS, vol. 192. Berlin Heidelberg: Springer; 2011. p. 488–96.
[2] Chaudhary A, Raheja JL, Das K, Raheja S. A survey on hand gesture recognition in context of soft computing. In: Advanced computing, CCIS, vol. 133. Berlin Heidelberg: Springer; 2010. p. 46–55.
[3] Chaudhary A, Raheja JL, Raheja S. A vision based geometrical method to find fingers positions in real time hand gesture recognition. J Softw 2012;7(4):861–9.
[4] Nolker C, Ritter H. Visual recognition of continuous hand postures. IEEE Trans Neural Networks 2002;13(4):983–94.
[5] Lee D, Park Y. Vision-based remote control system by motion detection and open finger counting. IEEE Trans Consum Electr 2009;55(4):2308–13.
[6] Vizireanu DN, Udrea RM. Visual-oriented morphological foreground content grayscale frames interpolation method. J Electr Imag 2009;18(2):1–3.
[7] Vizireanu DN. Generalizations of binary morphological shape decomposition. J Electr Imag 2007;16(1):1–6.
[8] Raheja JL, Manasa MBL, Chaudhary A, Raheja S. ABHIVYAKTI: hand gesture recognition using orientation histogram in different light conditions. In: Proceedings of the 5th Indian international conference on artificial intelligence, Tumkur, India, 14–16 December, 2011. p. 1687–98.
[9] Wang Y, Mori G. Max-margin hidden conditional random fields for human action recognition. In: IEEE conference on computer vision and pattern recognition, Miami, Florida, USA, 20–25 June, 2009. p. 872–79.
[10] Kim H, Fellner DW. Interaction with hand gesture for a back-projection wall. In: Proceedings of computer graphics international, Crete, Greece, 19 June, 2004. p. 395–402.
[11] Sigalas M, Baltzakis H, Trahanias P. Gesture recognition based on arm tracking for human–robot interaction. In: Proceedings of international conference on intelligent robots and systems, 18–22 October, 2010. p. 5424–9.
[12] Delden SV, Umrysh MA. Visual detection of objects in a robotic work area using hand gestures. In: IEEE international symposium on robotic and sensors environments, 17–18 September, 2011. p. 237–42.
[13] Bergh MVD, Carton D, Nijs RD, Mitsou N, Landsiedel C, Kuehnlenz K, Wollherr D, Gool LV, Buss M. Real-time 3D hand gesture interaction with a robot for understanding directions from humans. In: Proceedings of 20th IEEE symposium on robot and human interactive communication, Atlanta, GA, USA, July 31–August 3, 2011. p. 357–62.
[14] Pateraki M, Baltzakis H, Trahanias P. Visual estimation of pointed targets for robot guidance via fusion of face pose and hand orientation. In: IEEE international conference on computer vision workshops, 6–13 November, 2011. p. 1060–7.
[15] Zhang Y, Zhang J, Luo Y. A novel intelligent wheelchair control system based on hand gesture recognition. In: IEEE/ICME international conference on complex medical engineering, 22–25 May, 2011. p. 334–9.
[16] Salem M, Rohlfing K, Kopp S, Joublin F. A friendly gesture: investigating the effect of multimodal robot behavior in human–robot interaction. In: 20th IEEE symposium on robot and human interactive communication, July 31–August 3, 2011. p. 247–52.
[17] Tran C, Trivedi MM. 3-D posture and gesture recognition for interactivity in smart spaces. IEEE Trans Ind Inf 2012;8(1):178–87.
[18] Ahsan MR, Ibrahimy MI, Khalifa OO. Electromyography (EMG) signal based hand gesture recognition using artificial neural network (ANN). In: 4th international conference on mechatronics, May 17–19, 2011. p. 1–6.
[19] Gosselin C, Pelletier F, Laliberte T. An anthropomorphic underactuated robotic hand with 15 DoFs and a single actuator. In: Proceedings of international conference on robotics and automation, CA, USA, May 19–23, 2008. p. 749–54.
[20] Freeman WT, Tanaka K, Ohta J, Kyuma K. Computer vision for computer games. In: Proceedings of 2nd international conference on automatic face and gesture recognition, Killington, VT, USA, October 13–16, 1996. p. 100–5.
[21] Shiraishi T, Ishitani A, Ito M, Karungaru S, Fukumi M. Operation improvement of indoor robot by gesture recognition. In: Proceedings of 4th international conference on modeling, simulation and applied optimization, Kuala Lumpur, Malaysia, April 19–21, 2011. p. 1–4.
[22] Triesch J, Malsburg CVD. Robotic gesture recognition. In: Proceedings of international gesture workshop, Bielefeld, Germany; 1997. p. 233–44.
[23] Starner T, Pentland A. Real-time American sign language recognition from video using hidden Markov models. Proc Int Symp Comput Vis 1995;23:265–70.
[24] Shon S, Beh J, Yang C, Han DK, Ko H. Motion primitives for designing flexible gesture set in human–robot interface. In: Proceedings of 11th international conference on control, automation and systems, Gyeonggi-do, Korea, 26–29 October, 2011. p. 1501–4.
[25] Raheja JL, Das K, Chaudhary A. An efficient real time method of fingertip detection. In: Proceedings of 7th international conference on trends in

industrial measurements and automation (TIMA 2011). CSIR Complex, Chennai, India, 6–8 January, 2011. p. 447–50.[26] Wilamowski B, Chen Y. Efficient algorithm for training neural networks with one hidden layer. In: Proceedings of international joint conference on

neural networks (IJCNN 99), vol. 3; 1999. p. 1725–8.

Please cite this article in press as: Chaudhary A, Raheja JL. Bent fingers’ angle calculation using supervised ANN to control electro-mechanical robotic hand. Comput Electr Eng (2012), http://dx.doi.org/10.1016/j.compeleceng.2012.07.012
[27] Lee S, Lee J, Chung W, Kim M, Lee C. A new exoskeleton-type masterarm with force reflection. In: Proceedings of IEEE/RSJ international conference on intelligent robots and systems, Kyongju, South Korea, 17–21 October, 1999. p. 1438–43.
[28] Ye Z, Li D. Principle and mechanical analysis of a pneumatic underactuated bionic hand. In: Proceedings of IEEE international conference on robotics and biomimetics, Guilin, China, 19–23 December, 2009. p. 432–6.
[29] Jeong E, Lee J, Kim D. Finger-gesture recognition glove using Velostat. In: Proceedings of 11th international conference on control, automation and systems, Gyeonggi-do, Korea, 26–29 October, 2011. p. 206–10.
[30] Cheng H, Sun Z, Zhang P. Imitok: real-time initiative robotic arm control for home robot applications. In: Proceedings of IEEE international conference on pervasive computing and communications workshops, Seattle, WA, 21–25 March, 2011. p. 360–3.

Ankit Chaudhary received his Master of Engineering in Computer Science from Birla Institute of Technology & Science (BITS), Pilani, India and Ph.D. in Computer Vision from CEERI-BITS Pilani, India. He has worked as System Programmer at CITRIX R&D and AVAYA INC. His research interests are computer vision, artificial intelligence and intelligent systems.

J.L. Raheja received his Master of Technology from Indian Institute of Technology Kharagpur, India and Ph.D. from Technical University of Munich, Germany. Currently he is Head, Machine Vision Lab at Central Electronics Engineering Research Institute (CEERI), Pilani, India. He has been a DAAD and IFTA fellow. His research interests are digital image processing, embedded systems and human computer interface.