
International Journal of Computer & Organization Trends – Volume 5 – February 2014

Real Time Hand Gesture Recognition Using CIELab Colour Space Model

Manju, Lecturer, Department of Computer Engineering, CDL Govt. Polytechnic ES, Nathusari Chopta, Sirsa - 125055
Deepika, M.Tech. Scholar, Department of Computer Science and Applications, Ch. Devi Lal University, Sirsa - 125055, Haryana (India)
Dr. Harish Rohil, Asst. Professor, Department of Computer Science and Applications, Chaudhary Devi Lal University, Sirsa - 125055

Abstract - Moving beyond the mouse and keyboard, the evolution of human-computer interaction (HCI) has been an interesting research field in recent years: interaction has changed from text-based interfaces using a keyboard, to graphical user interfaces (GUI) based on a mouse, and from cumbersome data gloves and tracking devices to vision-based computer applications. One interesting direction is the use of hand gestures to interact with the computer. Compared with many other methods, hand gestures have the advantage of being easy to use, natural and intuitive. Successful applications of hand gesture recognition include computer game control, human-robot interaction and sign language recognition. In this paper, a real time vision based hand gesture interaction method is proposed. The basis of our approach is a fast segmentation process in the CIELab colour space, which separates the hand from the image captured by the camera, followed by morphological operations for the removal of noise.

Keywords- human-computer interaction; CIELab colour space; hand gesture recognition; skin colour segmentation; morphology;

1. INTRODUCTION

Human-computer interaction (HCI) can be described as the exchange of information between a user and a computer system [14]. For many years HCI has been based on the mouse and keyboard, but many new techniques have since been introduced. Nowadays hand gesture recognition is one of the most popular methods of communication, because it is a natural way to communicate with the computer [8]. Body language is an important means of communication among humans, because a gesture is a complete message in itself. For gesture recognition, segmentation is a necessary step to separate the skin pixels from the background.


Skin colour segmentation is the technique used to segment the hand region in hand gesture recognition with the help of a colour model such as RGB, HSV, CIELab or YCbCr; gesture recognition is one of its applications [7]. The method proposed in this paper uses the CIELab colour space to segment the hand region. In real time, the method requires high detection and recognition accuracy, because noise encountered in the environment may have a large impact on the detection and recognition performance for the hand gesture [10]. To overcome the problem of noise, this approach uses morphology. In this way, hand gesture techniques are introduced as an upgrade of conventional input devices. The content of this paper is organized as follows. Section 2 discusses related work. Section 3 introduces the CIELab colour model. Section 4 discusses hand gesture recognition. Section 5 presents the proposed real time hand gesture recognition method. Experimental evaluation is given in Section 6, and the paper is concluded in Section 7.

2. RELATED WORK

Various techniques have been developed for hand gesture recognition. Broadly, these techniques can be divided into two categories: data glove based techniques [5] and vision based techniques [14]. Early hand gesture recognition technology used mechanical devices to retrieve information about the hand gesture [3]; one example is the data glove [4, 5]. Data gloves were quite popular several years ago, but vision-based hand gesture recognition has slowly replaced them with non-wearable devices, because it is more natural, requiring no device to be worn on the hand, and more user friendly, which is important in human-computer interaction. Many researchers have proposed different methods for hand gesture recognition. Can et al. [16] use a support vector machine for hand segmentation, but the results show that the speed of the system is slow. The method used by Ehsan et al. [8] is the Viola-Jones algorithm, which uses a set of Haar-like features and processes a rectangular area of the image instead of a single pixel. The authors of this paper therefore propose a new and efficient method which overcomes the limitations of the existing approaches.

3. CIELAB COLOUR MODEL

CIELab is an opponent colour system based on the earlier L, a, b system of Richard Hunter. It is widely used in colour management as the device independent model of the ICC device profile, where ICC stands for International Colour Consortium. "Lab" is the abbreviated name of two different colour spaces: the best known is CIELAB (strictly, CIE 1976 L*a*b*), and the other is Hunter Lab (strictly, Hunter L, a, b). "Lab" is an informal abbreviation and, without further checking, should not be assumed to be one or the other. L* represents lightness, running from dark (L* = 0) to light (L* = 100). The term a* represents the difference between green (-a*) and red (+a*), and b* represents the difference between yellow (+b*) and blue (-b*). Differences in these variables are written as delta L*, delta a* and delta b*, and the total colour difference delta E* is computed from them as delta E* = sqrt((delta L*)^2 + (delta a*)^2 + (delta b*)^2).
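As a small illustration of this colour difference, the MATLAB snippet below computes delta E* between two Lab triplets; the numeric values are arbitrary examples, not data from the paper.

% Two example colours in CIELab coordinates (illustrative values only)
lab1 = [52.0, 18.5, -12.3];   % [L*, a*, b*] of the first colour
lab2 = [49.5, 22.0,  -8.1];   % [L*, a*, b*] of the second colour

% CIE76 colour difference: delta E* = sqrt(dL*^2 + da*^2 + db*^2)
deltaE = sqrt(sum((lab1 - lab2).^2));
fprintf('delta E* = %.2f\n', deltaE);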


The CIELab colour space is an absolute colour space because it defines colours exactly; it does not depend on input devices such as cameras, or on output devices such as monitors and printers [7] [15]. The CIELab colour model has several characteristics:

- This colour model does not depend on input devices such as cameras, or on output devices such as monitors and printers. The model is therefore device independent: its colour values are absolute and not tied to any particular piece of hardware.

- The CIELab colour space is perceptually uniform. This means that a movement of a given size within the space, in any direction, should result in an equally perceptible colour shift.

- CIELab covers a wider range of colours than other colour spaces, including colours beyond what the human eye can see.

- In the CIELab colour space, the luminance and the chroma are obtained through a nonlinear mapping of the XYZ coordinates (see the sketch below).
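As a minimal sketch of that nonlinear mapping, the MATLAB function below computes L*, a* and b* from XYZ tristimulus values using the standard CIE definition; the D65 reference white values and the input scaling (white point Y = 100) are assumptions for illustration, not values taken from the paper.

% Convert XYZ tristimulus values (scaled so that the white point has Y = 100)
% to CIELab (CIE 1976 L*a*b*).
function lab = xyz2lab_sketch(xyz)
    white = [95.047, 100.0, 108.883];   % assumed D65 reference white [Xn Yn Zn]
    t = xyz ./ white;                   % normalise by the white point

    % Nonlinear mapping f(t): cube root above a small threshold,
    % linear segment below it (avoids an infinite slope near zero).
    d = 6/29;
    f = @(t) (t > d^3) .* t.^(1/3) + (t <= d^3) .* (t ./ (3*d^2) + 4/29);

    fx = f(t(1)); fy = f(t(2)); fz = f(t(3));
    L = 116*fy - 16;                    % lightness
    a = 500*(fx - fy);                  % green (-) to red (+) axis
    b = 200*(fy - fz);                  % blue (-) to yellow (+) axis
    lab = [L, a, b];
end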

The technique used in this space for skin colour segmentation is as follows: the RGB image is first converted into a grayscale image, the skin colour is then detected in the CIELab colour space, and the result is converted into a binary image.

4. HAND GESTURE RECOGNITION

A gesture is defined as an expressive movement of a body part that carries a particular message to be communicated precisely between a sender and a receiver [11]. Gesture recognition pertains to recognizing meaningful expressions of motion by a human, involving the hands, arms, face, head and body. There are various types of gestures performed by different body parts, but in this paper a technique for hand gesture recognition is proposed. Hand gestures are an important modality for HCI because, of all the methods, they are very natural and easy to use, requiring only the camera of the system. Gesture recognition is the process of recognizing and interpreting a stream of continuous sequential gestures from a given set of input data [4]. In this paper, we propose a gesture recognition method that uses the CIELab colour space to recognize a number of well-defined hand gestures.

5. PROPOSED ALGORITHM FOR REAL TIME HAND GESTURE RECOGNITION

In this paper, a new and efficient method for real time hand gesture recognition using the CIELab colour space is proposed, which detects the hand region in real time through skin colour segmentation with a suitable colour model, i.e. the CIELab colour model. This section explains the various steps of the proposed algorithm. The last step of this algorithm is skin colour segmentation, which is itself a sub-algorithm. Both algorithms can be implemented in MATLAB. The steps of the two algorithms, real time hand gesture recognition and skin colour segmentation, are shown in Figure 1 and Figure 2 respectively.


Hand Detection: The very first step is the detection of the hand, for which the proposed method requires setting up the webcam of the system, because the system works in real time. This step checks whether the camera is supported by the system; if it is not, the camera must be configured in MATLAB. After this, the camera is able to capture the image of the hand.
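A minimal sketch of this camera set-up with the MATLAB Image Acquisition Toolbox is shown below; the adaptor name 'winvideo' and the device number 1 are assumptions that depend on the installed hardware, not values fixed by the paper.

% Check which camera adaptors MATLAB can see (camera support check).
info = imaqhwinfo;
disp(info.InstalledAdaptors);

% Create a video input object for the webcam (adaptor and device are assumed).
vid = videoinput('winvideo', 1);
set(vid, 'ReturnedColorSpace', 'rgb');   % make sure frames come back as RGB

preview(vid);                % optional live preview while positioning the hand
frame = getsnapshot(vid);    % grab one RGB frame containing the hand
imshow(frame);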

Algorithm: Real Time Hand Gesture Recognition.
Input: An image of the hand from the webcam.
Output: The background is separated from the input image and skin colour segmentation is applied.
Begin:
Step 1: Input the image from the camera.
Step 2: Set the dimensions to get the required part of the image.
Step 3: Resize the image.
Step 4: Separate the background from the image.
Step 5: Apply skin colour segmentation using the CIELab colour space.
End.

Figure 1. Real Time Hand Gesture Recognition Algorithm
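A minimal MATLAB sketch of these steps is given below. The crop rectangle, the resize dimensions, the differencing threshold and the use of simple frame differencing against a stored background frame are all illustrative assumptions, since the paper does not fix these values; vid is the video input object from the earlier set-up sketch, and skinSegmentCIELab is a hypothetical helper implementing the Figure 2 segmentation (sketched after Figure 2 below).

% Sketch of the Figure 1 pipeline (illustrative values only).
background = getsnapshot(vid);           % background frame captured with no hand
frame      = getsnapshot(vid);           % Step 1: input image from the camera

rect    = [100 50 320 320];              % Step 2: assumed region of interest [x y w h]
roi     = imcrop(frame, rect);
roiBack = imcrop(background, rect);

roi     = imresize(roi, [240 240]);      % Step 3: resize to an assumed working size
roiBack = imresize(roiBack, [240 240]);

% Step 4: separate the background by frame differencing (an assumed choice).
diffMask   = sum(abs(double(roi) - double(roiBack)), 3) > 40;
foreground = roi .* uint8(repmat(diffMask, [1 1 3]));

% Step 5: skin colour segmentation in the CIELab colour space.
handMask = skinSegmentCIELab(foreground);
imshow(handMask);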

Hand Segmentation: Segmentation is a necessary step of the method and forms the basis of gesture recognition. Segmentation can be done with the help of many colour spaces, but the proposed method uses CIELab because it is device independent. The segmentation steps are shown in Figure 2. In this algorithm, the first step is the image from the webcam, which is in RGB format. The next step is the conversion of the RGB image into a grayscale image using a MATLAB command. The reason for converting the RGB image to grayscale is that many image processing algorithms operate on grayscale images rather than on RGB. A grayscale image uses just a single colour channel with 8 bits per pixel, so there are fewer bits to process and the complexity of the code is reduced. After this, the image is converted into the Lab space and threshold values are applied for skin colour detection.

For our method, the threshold is chosen so as to cover all types of skin colour. The conversion to the Lab space is performed with "makecform", which creates the colour transformation structure that is then applied to the image. This step marks the skin pixels in the image, which is then converted into a binary image, and we obtain the segmented image.

Morphology: After skin colour detection, morphological operations are applied to the image to remove the imperfections left in the binary image by segmentation. When skin colour segmentation is done by simply applying a threshold, the resulting image is distorted by noise. Morphological image processing removes these imperfections by accounting for the form and structure of the image [8].
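A minimal sketch of this clean-up stage is shown below, assuming a binary skin mask bw produced by the segmentation step; the median window size, the disk radius and the minimum region area are illustrative choices, not values taken from the paper.

% Morphological clean-up of a binary skin mask 'bw' (illustrative parameters).
bw = medfilt2(bw, [5 5]);            % median filter suppresses salt-and-pepper noise

se = strel('disk', 3);               % assumed structuring element
bw = imopen(bw, se);                 % opening removes small isolated specks
bw = imclose(bw, se);                % closing fills small gaps inside the hand
bw = imfill(bw, 'holes');            % fill any remaining holes in the hand region
bw = bwareaopen(bw, 200);            % discard connected components smaller than 200 px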

Algorithm: Skin Colour Segmentation with the CIELab Colour Model.
Input: An image in RGB form.
Output: The skin colour pixels are separated and the image is returned after noise removal.
Begin:
Step 1: Initialise the RGB input image.
Step 2: Convert the RGB image into a grayscale image.
Step 3: Convert the image into the Lab space and apply the threshold values for skin colour detection.
Step 4: Mark the skin pixels.
Step 5: Convert the image obtained in the CIELab colour space into a binary image.
Step 6: Filter the noisy image obtained after skin colour segmentation with a median filter.
Step 7: Apply morphological operations to further refine the image.
Step 8: Exit.
End.

Figure 2. Skin Colour Segmentation Using CIELab Colour Model
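Putting the steps of Figure 2 together, one possible MATLAB implementation is sketched below. The a* and b* threshold ranges are assumed example values (the paper does not list its thresholds), and the sketch applies the Lab conversion directly to the RGB frame, since MATLAB's 'srgb2lab' transform expects a colour image; the name skinSegmentCIELab is the hypothetical helper referred to in the Figure 1 sketch.

% Skin colour segmentation in CIELab (sketch of the Figure 2 steps).
function mask = skinSegmentCIELab(rgbImage)
    % Steps 1-3: convert the input frame to CIELab via a colour transformation
    % structure (makecform creates it, applycform applies it).
    cform = makecform('srgb2lab');
    lab   = applycform(im2double(rgbImage), cform);

    a = lab(:, :, 2);                    % green-red axis
    b = lab(:, :, 3);                    % blue-yellow axis

    % Steps 4-5: mark skin pixels with assumed threshold ranges, giving a binary mask.
    mask = (a > 5) & (a < 35) & (b > 5) & (b < 45);

    % Step 6: median filtering to suppress isolated noise pixels.
    mask = medfilt2(mask, [5 5]);

    % Step 7: morphological refinement, as in the earlier clean-up sketch.
    se   = strel('disk', 3);
    mask = imclose(imopen(mask, se), se);
    mask = bwareaopen(mask, 200);
end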


6. EXPERIMENTAL EVALUATION

An experiment was undertaken to test the functionality of the proposed approach with the help of MATLAB. The effectiveness of the method can be demonstrated on many gestures. Images were taken from the camera of the system on which the method is implemented, and the experimental evaluation was done on the basis of noise removal. Various gestures and images were taken for the experimental evaluation of the proposed work. For the segmentation process defined in Figure 2, the outputs are shown in Figure 3 to Figure 7.
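As a small aid to inspection, the hedged sketch below shows one way such stage-by-stage outputs can be displayed in MATLAB; rgbImage, rawMask and cleanMask are assumed to hold the input frame and the skin mask before and after clean-up from the earlier sketches.

% Display the pipeline stages side by side (cf. Figures 3-7).
subplot(1, 4, 1); imshow(rgbImage);           title('RGB input');
subplot(1, 4, 2); imshow(rgb2gray(rgbImage)); title('Grayscale');
subplot(1, 4, 3); imshow(rawMask);            title('Skin mask with noise');
subplot(1, 4, 4); imshow(cleanMask);          title('After filtering and morphology');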

Figure 3. RGB Image

Figure 4. Gray Scale Image

Figure 5. Marked Skin Pixels

Figure 6. Segmented Face with Noise

Figure 7. Segmented Face without Noise

Figure 3 shows the RGB image which is input to the system. This input image is converted into the grayscale image shown in Figure 4. To detect the skin pixels, threshold values are applied which separate the skin pixels from the non-skin pixels, as shown in Figure 5. The segmented image, shown in Figure 6, contains noise; to remove the noise, filtering and a morphological operator are applied, producing the image shown in Figure 7. The real time hand gesture recognition is then evaluated on different gestures. To show the efficiency of the method in real time, the authors use the full hand gesture. To capture the gesture, the camera settings must be configured, because the system works in real time. When the method proposed in this paper is run in MATLAB, the first window that appears is shown in Figure 8, and the subsequent outputs are shown in Figure 9 to Figure 16. Figure 9 shows the background frame, which is the part of the scene that is to be subtracted as background. The next step is the formation of the bounding box in which the input image is captured by the camera, shown in Figure 10. Figure 11 shows the RGB input image captured by the camera. Then, to segment the image, the steps listed in Figure 1 are applied. First the image is converted to grayscale, because the segmentation requires the conversion from RGB to grayscale, as shown in Figure 12. In the proposed method the CIELab colour model is used for segmentation, so the next image, Figure 13, shows the skin pixels detected in the image by applying the threshold values.


The threshold detects the skin pixels in the image and then separates them from the non-skin pixels, as shown in Figure 14. The image shown in Figure 14 contains unwanted noise, so the method applies filtering and morphology to remove it, as shown in Figure 15; Figure 16 is the final output of the method [8].

Figure 8. Real Time Hand Gesture Algorithm in Action

Figure 9. Background Frame


Figure 10. Bounding Box

Figure 11. RGB Image

Figure 12. Grayscale Image


Figure 13. Skin Detection

Figure 14. Binary Image

Figure 15. After Noise Removal

Figure 16. After Morphology

7. CONCLUSION

In this paper, we have proposed an alternative method to replace the traditional way in which humans interact with computers. The evolution of computer technology has shown that humans can interact with computers using non-verbal language. The main focus of this work is on hand segmentation, that is, separating skin pixels from non-skin pixels by applying a suitable threshold as a pre-processing step for gesture recognition. The CIELab colour model is used for skin colour segmentation in the proposed work, and the segmentation is applied in a real environment for gesture recognition. Experiments were performed on different images to evaluate the proposed method, taking noise removal as the assessment criterion. The results show that the proposed method works well for hand gesture recognition in real time and gives a segmented image free from noise.

REFERENCES

[1] Alasdair McAndrew, "An Introduction to Digital Image Processing with MATLAB", 2004.
[2] Adrian Ford and Alan Roberts, "Colour Space Conversions", 1998.
[3] Beifang Yi, Frederick C. Harris, Ling Wang and Yusong Yan, "Real Time Natural Hand Gestures", pp. 92-96, 2005.
[4] David Houcque, "Introduction to MATLAB", 2005.
[5] Devendra Singh Raghuvanshi and Dheeraj Aggarwal, "Human Face Detection by Using Skin Colour Segmentation", International Journal of Computer Applications, Vol. 38, No. 9, pp. 14-17, 2012.
[6] Ehsan ul Haq, Mirza Waqar Baiq and Hyunchal Shin, "New Hand Gesture Recognition Method for Mouse Operation", 2011.
[7] Harshith C., Karthik R. Shastry, Manoj Ravindran, M.V.V.N.S. Srikanth and Naveen Lakshmikhanth, "Survey on Various Gesture Recognition Techniques for Interfacing Machines Based on Ambient Intelligence", International Journal of Computer Science & Engineering Survey (IJCSES), Vol. 1, No. 2, pp. 31-42, 2010.
[8] Hebert Luchetti Ribeiro and Adilson Gonzaga, "Hand Image Segmentation in Video Sequence by GMM: A Comparative Analysis", XIX Brazilian Symposium on Computer Graphics and Image Processing, 2006.
[9] J. David and Z. David, "A Survey of Glove-based Input", IEEE Computer Graphics & Applications, Vol. 4, pp. 30-39, 1994.
[10] Kuan Yu Chen, Cheng Chin Chien, Wen Lung Chang and Jyh Tong Tang, "An Integrated Colour and Hand Gesture Recognition Approach for an Autonomous Mobile Robot", 3rd International Congress on Image and Signal Processing, pp. 2496-2500, 2010.
[11] Michael Hoffman and Paul Varcholik, "Improving Gesture Recognition with Spatially Convenient Input Devices", pp. 59-66, 2010.
[12] Chetan A. Burande, Raju M. Tungnayat and Nitin K. Choudhary, "Advanced Recognition Technique for Human Computer Interaction", pp. 480-483, 2010.
[13] R. C. Gonzalez, R. E. Woods and S. L. Eddins, "Digital Image Processing Using MATLAB", 5th Edition, Pearson, 2009.
[14] Erika Rogers, "Introduction to Human Computer Interaction", RAS/IFRR Summer School on Human-Robot Interaction, July 19-23, 2004.
[15] Sanjay Singh, D. S. Chauhan, Mayank Vatsa and Richa Singh, "A Robust Skin Colour Based Face Detection Algorithm", Tamkang Journal of Science and Engineering, Vol. 6, No. 4, pp. 227-234, 2003.
[16] Can Xie, Jun Chang and Qi Xie, "Vision Based Hand Gesture Spotting and Recognition", International Conference on Information, Networking and Automation, pp. 391-395, 2010.


