International Engineering Journal For Research & Development

E-ISSN NO:-2349-0721

Vol.4 Issue 1

Impact Factor: 3.35

GESTURE RECOGNITION OF HANDWRITTEN ALPHANUMERIC CHARACTERS USING HD CAMERA BASED DIGITAL PEN

1Prof. A. Bhujade, 2Vidya Pohane, 3Apurva Pande, 4Nikita Waghwani
1Professor, 2,3,4B.E. Student, ETC Department, SRMCEW, RTMNU University, Nagpur

________________________________________________________________________________________

ABSTRACT: This paper presents a digital pen for handwritten digit and character recognition in interactive printing applications. The system consists of a MATLAB-based processor, a microphone, and a camera. The drawing application lets the user draw on any surface by observing the movement of the index finger; information can also be accessed or manipulated directly with the fingers. To take a photo, the user simply frames the scene in a square formed with the fingers wearing color markers, and the system captures the highlighted frame. These photos can then be arranged by hand over any free space. The device has a large number of applications; it is portable, easy to carry, and can be worn around the neck. The camera also lets the user take pictures of the scene being viewed and arrange them later on any surface. The system can also be used while reading a newspaper: videos can be viewed in place of the printed photos, and live sports updates can be received while reading. Users can use the pen to write digits or characters. Our experimental results have successfully validated the effectiveness of the trajectory recognition algorithm for handwritten digits and characters using the proposed digital pen.

Keywords: MATLAB processor, camera, microphone, color markers, human-computer interfacing.

INTRODUCTION

The explosive growth of miniaturization technologies in electronic circuits and components has greatly decreased the dimensions and weight of consumer electronic products, such as smart phones and computers, and made them more handy and convenient. Owing to the continuous development of computer technology, human-computer interaction (HCI) techniques have become an indispensable component of our daily life. Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continuously connected to the digital system, there is no connection between our digital devices and our interactions with the physical world. We use our "devices" (computers, mobile phones, tablets, etc.) to go onto the internet and get the information we want. Eventually we will use a device no bigger than current cell phones, and perhaps as small as a button on our shirts, to bring the internet to us and let us interact with our world. This technology will allow us to interact with our world like never before [1]. We will be able to get information on anything, from anywhere, within a few moments. We will not only be able to interact with things on a whole



new level, but also with people! One great part of the device is its ability to scan objects or even people and project information regarding what you are looking at.

HARDWARE DESIGN

Initializing GUIDE (GUI Creator)

1. First, open up MATLAB. Go to the command window and type in guide.

2. You should see the following screen appear. Choose the first option, Blank GUI (Default).

3. You should now see the following screen (or something similar, depending on what version of MATLAB you are using and the predesignated settings):

4. Before adding components blindly, it is good to have a rough idea of how you want the graphical part of the GUI to look, so that it will be easier to lay out. Below is a sample of what the finished GUI might look like.




Creating the Visual Aspect of the GUI: Part 1

1. For the adder GUI, we will need the following components:

o Two Edit Text components
o Four Static Text components
o One Pushbutton component

Add all these components to the GUI by clicking on the corresponding icon and placing it onto the grid. At this point, your GUI should look similar to the figure below:

2. Next, it's time to edit the properties of these components. Let's start with the Static Text. Double click one of the Static Text components. You should see the following table appear. It is called the Property Inspector and allows you to modify the properties of a component.



3. We're interested in changing the String parameter. Go ahead and edit this text to +.

4. Let's also change the font size from 8 to 20.

After modifying these properties, the component may not be fully visible in the GUI editor. This can be fixed by resizing the component, i.e. using your mouse cursor to stretch the component and make it larger.

5. Now do the same for the next Static Text component, but instead of changing the String parameter to +, change it to =.

6. For the third Static Text component, change the String parameter to whatever you want as the title of your GUI. I kept it simple and named it MyAdderGUI. You can also experiment with the different font options.

7. For the final Static Text component, we want to set the String parameter to 0. In addition, we want to modify the Tag parameter for this component. The Tag parameter is basically the variable name of the component. Let's call it answer_staticText. This component will be used to display our answer, as you have probably already guessed.



8. So now, you should have something that looks like the following:

Creating the Visual Aspect of the GUI: Part 2

1. Next, let's modify the Edit Text components. Double click on the first Edit Text component. We want to set the String parameter to 0, and we also want to change the Tag parameter to input1_editText, as shown below. This component will store the first of the two numbers that will be added together.

2. For the second Edit Text component, set the String parameter to 0, but set the Tag parameter to input2_editText. This component will store the second of the two numbers that will be added together.

3. Finally, we need to modify the Pushbutton component. Change the String parameter to Add! and change the Tag parameter to add_pushbutton. Pushing this button will display the sum of the two input numbers.



4. So now, you should have something like this:

5. Rearrange your components accordingly. You should have something like this when you are done:

6. Now, save your GUI under any file name you please. I chose to name mine myAdder. When you save this file, MATLAB automatically generates two files: myAdder.fig and myAdder.m. The .fig file contains the graphics of your interface; the .m file contains all the code for the GUI.

Writing the Code for the GUI Callbacks

MATLAB automatically generates an .m file to go along with the figure that you just put together. The .m file is where we attach the appropriate code to the callback of each component. For the purposes of this tutorial, we are




primarily concerned only with the callback functions; you don't have to worry about any of the other function types.

1. Open up the .m file that was automatically generated when you saved your GUI. In the MATLAB editor, click on the function-list icon, which will bring up a list of the functions within the .m file. Select input1_editText_Callback.

2. The cursor should take you to the following code block:

function input1_editText_Callback(hObject, eventdata, handles)
% hObject    handle to input1_editText (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'String') returns contents of input1_editText as text
%       str2double(get(hObject,'String')) returns contents of input1_editText as a double

Add the following code to the bottom of that code block:

%store the contents of input1_editText as a string. if the string
%is not a number then input will be empty
input = str2num(get(hObject,'String'));

%checks to see if input is empty. if so, default input1_editText to zero
if (isempty(input))
    set(hObject,'String','0')
end
guidata(hObject, handles);




This piece of code simply makes sure that the input is well defined; we don't want the user to enter inputs that aren't numbers! The last line of code tells the GUI to update the handles structure after the callback is complete. The handles structure stores all the relevant data related to the GUI; this topic will be discussed in depth in a different tutorial. For now, you should take it at face value that it is a good idea to end each callback function with guidata(hObject, handles); so that the handles are always updated after each callback. This can save you from potential headaches later on. A minimal sketch of this pattern is shown below.
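To illustrate the read-modify-write pattern around guidata, here is a small sketch of our own (not part of the adder GUI; the callback name and the field clickCount are hypothetical):

% Hypothetical callback: count button presses across calls.
% The field name clickCount is an illustrative assumption.
function counter_pushbutton_Callback(hObject, eventdata, handles)
if ~isfield(handles, 'clickCount')
    handles.clickCount = 0;                 % initialize on the first press
end
handles.clickCount = handles.clickCount + 1;
% display the running count in the answer field
set(handles.answer_staticText, 'String', num2str(handles.clickCount));
guidata(hObject, handles);                  % write the change back so it persists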

3. Add the same block of code to input2_editText_Callback.

4. Now we need to edit add_pushbutton_Callback. Click on the function-list icon and select add_pushbutton_Callback. The following code block is what you should see in the .m file:

% --- Executes on button press in add_pushbutton.
function add_pushbutton_Callback(hObject, eventdata, handles)
% hObject    handle to add_pushbutton (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

Here is the code that we will add to this callback:

a = get(handles.input1_editText,'String');
b = get(handles.input2_editText,'String');
% a and b are variables of String type, and need to be converted
% to variables of Number type before they can be added together
total = str2num(a) + str2num(b);
c = num2str(total);
% need to convert the answer back into String type to display it
set(handles.answer_staticText,'String',c);
guidata(hObject, handles);

5. Let's discuss how the code we just added works:

a = get(handles.input1_editText,'String');
b = get(handles.input2_editText,'String');




The two lines of code above take the strings within the Edit Text components and store them in the variables a and b. Since these are variables of String type rather than Number type, we cannot simply add them together; we must first convert a and b to Number type before MATLAB can add them. We can convert a variable of String type to Number type using the MATLAB command str2num, and do the opposite using num2str. The following line of code is used to add the two inputs together:

total = str2num(a) + str2num(b);

The next line of code converts the sum to String type and stores it in the variable c.

Color Markers

Color markers are placed at the tips of the fingers; marking the user's fingers with yellow, red, blue, and green color tape helps the webcam recognize gestures. The arrangements and movements of the color markers are interpreted as gestures that act as interaction instructions for the projected application interfaces. A minimal detection sketch is given below.
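As a rough sketch of how one such marker might be located in a webcam frame (our illustration, not the paper's code; the red thresholds and file name are assumptions that would need tuning, and the Image Processing Toolbox is assumed):

% Locate a red fingertip marker in one RGB frame by color thresholding.
frame = imread('frame.jpg');               % one captured webcam frame (assumed file)
r = double(frame(:,:,1));
g = double(frame(:,:,2));
b = double(frame(:,:,3));
mask = (r > 150) & (g < 100) & (b < 100);  % crude per-pixel "red" test
mask = bwareaopen(mask, 50);               % discard small noisy blobs
stats = regionprops(mask, 'Centroid');     % centroid of each remaining blob
if ~isempty(stats)
    tip = stats(1).Centroid;               % [x y] position of the marker
    fprintf('Marker at (%.0f, %.0f)\n', tip(1), tip(2));
end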

Fig. 3. Block diagram of the digital pen for handwritten digit and character recognition for interactive printing.

SYSTEM IMPLEMENTATION

Gesture Recognition using the Phase Correlation Algorithm

In this project we use gesture recognition based on the phase correlation algorithm. Gesture recognition is a topic in computer science and language technology; its goal is to interpret human gestures via mathematical algorithms. Gestures can originate from any bodily motion, but recognition most commonly works from the face or hand. This system recognizes gestures from hand movements, and a number of camera-based approaches have been developed. Gestures can exist in isolation or involve external


objects. With respect to objects, we have a large range of gestures that are almost unique, including pointing at objects, changing the shape of objects, and activating objects. Gesture recognition is used to understand human body language and for human-computer interfacing, building a richer bridge between machines and humans than primitive text user interfaces or even GUIs (graphical user interfaces). Gesture recognition enables humans to interface with the computer (HCI) and interact naturally without any mechanical devices. Gestures can be used to communicate, and can further be categorized according to their functions. Gesture recognition is used to collect information about position, motion, and pose. The following steps are used for hand and finger tracking.

Pixel level segmentation

Regions of pixels corresponding to the hand are extracted by color segmentation or background subtraction; the detected regions are then analyzed to determine the position and orientation of the hand. The color of human skin varies greatly between individuals and under changing illumination. Advanced segmentation algorithms that can handle this have been proposed [43][5]; however, these are computationally demanding and still sensitive to quickly changing or mixed lighting conditions. In addition, color segmentation can be confused by background objects with a color similar to skin. Background subtraction only works on a known, or at least static, background, and consequently is not usable for mobile or wearable use. Alternatives are to use markers on the fingers [39] or to use infrared lighting to enhance the skin regions in the image.

Motion segmentation

Moving objects in the video stream can be detected by calculating inter-frame differences and optical flow. Systems capable of tracking moving objects on a moving background with a hand-held camera have been presented. However, such a system cannot detect a stationary hand, nor determine which of several moving objects is the hand. A minimal frame-differencing sketch follows.
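As an illustration of the inter-frame difference idea (our sketch, assuming two consecutive frames are available as image files; the threshold is arbitrary):

% Detect motion by differencing two consecutive frames.
prev = rgb2gray(imread('frame1.jpg'));   % previous frame (assumed files)
curr = rgb2gray(imread('frame2.jpg'));   % current frame
d = imabsdiff(curr, prev);               % absolute per-pixel difference
moving = d > 25;                         % threshold picks the changed pixels
ratio = nnz(moving) / numel(moving);     % fraction of the image that moved
fprintf('%.1f%% of pixels changed between frames\n', 100 * ratio);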

Contour detection

Much information can be obtained by simply extracting the contours of objects in the image. The contour represents the shape of the hand and is therefore not directly dependent on skin color and lighting conditions. Extracting contours by edge detection will, however, produce a large number of edges from both the tracked hand and the background, so some form of intelligent post-processing is needed to make a reliable system. A small contour-extraction sketch follows.
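As an illustrative sketch of contour extraction (ours, not the paper's; the input file name is an assumption, and the Image Processing Toolbox is assumed):

% Extract object contours from a grayscale image.
gray = rgb2gray(imread('hand.jpg'));     % assumed input image
edges = edge(gray, 'canny');             % Canny edge map
bw = imfill(edges, 'holes');             % fill enclosed regions into solid blobs
boundaries = bwboundaries(bw);           % one Nx2 [row col] trace per object
fprintf('Found %d contours\n', numel(boundaries));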

Correlation

A hand or fingertip can be sought in a frame by comparing areas of the frame with a template image of the hand or fingertip [4]. To determine where the target is, the template must be translated over some region of interest and correlated with the neighborhood of every pixel; the pixel yielding the highest correlation is selected as the position of the object. Because the appearance of the hand changes during tracking, this problem can be addressed by continuously updating the template [4], with the risk of ending up tracking something other than the hand. A minimal template-matching sketch is given below.
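As a sketch of the template search described above (our illustration; normxcorr2, from the Image Processing Toolbox, stands in for the correlation step, and the file names are assumptions):

% Find a fingertip template inside a frame by normalized cross-correlation.
frame    = rgb2gray(imread('frame.jpg'));     % search image
template = rgb2gray(imread('fingertip.jpg')); % small template of the target
c = normxcorr2(template, frame);              % full correlation surface
[~, idx] = max(c(:));                         % strongest match
[peakRow, peakCol] = ind2sub(size(c), idx);
% normxcorr2 pads the result, so step back by the template size
% to get the top-left corner of the match in frame coordinates
topLeft = [peakRow - size(template,1) + 1, peakCol - size(template,2) + 1];
fprintf('Best match at row %d, col %d\n', topLeft(1), topLeft(2));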


Tracking

On top of most of the low-level processing methods, a tracking layer is needed to identify hands and follow them from frame to frame. Depending on the nature of the low-level feature extraction, this can be done by directly tracking one prominent feature or by inferring the motion and position of the hand from the entire feature set. A minimal sketch of the phase correlation step named in this section is given below.
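Phase correlation estimates the global shift between two frames from the normalized cross-power spectrum of their Fourier transforms. The following is our own minimal sketch (not the paper's code), assuming two equally sized grayscale frames stored in the hypothetical files below:

% Estimate the (dy, dx) shift between two equally sized frames.
f1 = double(rgb2gray(imread('frame1.jpg')));   % assumed input files
f2 = double(rgb2gray(imread('frame2.jpg')));
F1 = fft2(f1);
F2 = fft2(f2);
X = F1 .* conj(F2);                  % cross spectrum
R = X ./ max(abs(X), eps);           % normalize: keep phase, drop magnitude
p = real(ifft2(R));                  % correlation surface; its peak marks the shift
[~, idx] = max(p(:));
[dy, dx] = ind2sub(size(p), idx);
dy = dy - 1;  dx = dx - 1;           % MATLAB indexing starts at 1
% peaks past the midpoint correspond to negative (wrapped) displacements
if dy > size(p,1)/2, dy = dy - size(p,1); end
if dx > size(p,2)/2, dx = dx - size(p,2); end
fprintf('Estimated shift: dy = %d, dx = %d\n', dy, dx);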

The flow chart for this system is as follows:

Fig. 4. Flow chart of the digital pen for handwritten digit and character recognition for interactive printing.

REFERENCES

[1]. Jeen-Shing Wang and Fang-Chen Chuang, "An Accelerometer-Based Digital Pen With a Trajectory Recognition Algorithm for Handwritten Digit and Gesture Recognition", IEEE Transactions on Industrial Electronics, Vol. 59, No. 7, July 2012.

[2]. Moniruzzaman Bhuiyan and Rich Picking, "Gesture-controlled user interfaces, what have we done and what's next?", Centre for Applied Internet Research (CAIR), Glyndŵr University, Wrexham, UK.




[3]. Piyush Kumar, Jyoti Verma and Shitala Prasad, "Hand Data Glove: A Wearable Real-Time Device for Human-Computer Interaction", International Journal of Advanced Science and Technology, Vol. 43, June 2012.

[4]. Renuka R, Suganya V and Arunkumar B, "On-line Handwritten Character Recognition Using Digital Pen for Static Authentication", Proceedings of IRAJ International Conference, 20th October 2013, Chennai, India. ISBN: 978-93-82702-34-4.

[5]. Sushmita Mitra and Tinku Acharya, "Gesture Recognition: A Survey", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, Vol. 37, No. 3, May 2007.

[6]. Santosh K.C. and Cholwich Nattee, "A Comprehensive Survey on On-line Handwriting Recognition Technology and its Real Application to the Nepalese Natural Handwriting", Kathmandu University Journal of Science, Engineering and Technology, Vol. 5, No. I, January 2009, pp. 31-55.

[7]. Priyanka M, S. Ramakrishanan and N. R. Raajan, "Electronic Handwriting Character Recognition (E-HWCR)", International Journal of Engineering and Technology (IJET), Vol. 5, No. 3, June-July 2013.

[8]. Meenaakumari M and M. Muthulakshmi, "MEMS Accelerometer Based Hand Gesture Recognition", International Journal of Advanced Research in Computer Engineering & Technology (IJARCET), Vol. 2, No. 5, May 2013.

[9]. P. Venkatesh and P. Sireeshbabu, "An Accelerometer-Based Digital Pen With a Trajectory Recognition Algorithm for Handwritten Digit and Gesture Recognition", IJAIR, Vol. 2, Issue 8, ISSN: 2278-784, 2013.

[10]. Jeen-Shing Wang and Fang-Che Chuang, "An Accelerometer-Based Digital Pen With a Trajectory Recognition Algorithm for Handwritten Digit and Gesture Recognition", IEEE Transactions on Industrial Electronics, Vol. 59, No. 7, July 2012.


