GESTURE RECOGNITION
Amreen Akthar J., Akshaya B.
III Year E.C.E, Panimalar Engineering College
Abstract—The tongue and ears play a major role in speaking and hearing for a normal person, but deaf and dumb people can neither speak nor hear. They normally communicate with others using sign gestures, which are easily understood within their own community, but they find it difficult to communicate with normal people, who cannot understand their sign symbols. To tackle this situation we design a system that converts their sign symbols to text and voice output, and converts a normal person's voice into the corresponding sign symbol, giving two-way communication. The system uses flex sensors and an IMU (Inertial Measurement Unit) to recognize the sign symbols, a speech-synthesis chip for voice output, and a speech-recognition module for converting voice to sign symbols. These are interfaced with a microcontroller that is programmed to produce the corresponding output.

Keywords—Flex sensor, IMU, SpeakJet IC, speech recognition module.

I. INTRODUCTION

Many researchers have worked to overcome the difficulties faced by physically challenged people. Several have developed systems related to prosthetic hands, which are used to study the behaviour of the human hand. This project is of a similar kind, determining sign words from hand motion. The main features of the system are that it is a hand-held, real-time embedded device, it is low cost and reliable, and it is battery operated. To obtain accurate words or sentences, an IMU is used to find the exact position of the hand, because the flex sensors alone give the same values wherever the hand is placed. The IMU is mounted on the hand along with the flex sensors, so its values change according to the position of the hand, and a particular word is recognized only when the condition that the hand is placed in the required position is satisfied. These values are fed to the microcontroller, which is pre-programmed to display the corresponding words. At the same time a voice output for those words is produced with the help of the SpeakJet chip. Conversely, the voice of a normal person is converted and displayed as the corresponding pre-stored sign symbol.

II. RELATED WORK

L. K. Simone introduced a low-cost method to measure the flexion of fingers using flex sensors, evaluating a custom glove for measuring finger motion in terms of parameters such as donning, comfort and durability [1]. Wald developed software for editing automatic speech recognition in real time for deaf and hard-of-hearing people [2]. Syed Faiz Ahmed, Syed Baber Ali and Saqib Qureshi developed an electronic speaking glove for speechless patients, which provides only one-way communication [3]. Jingdong Zhao developed a five-fingered under-actuated prosthetic hand system [4].

III. SYSTEM LAYOUT

Figure 1 below shows the proposed system. Flex sensors are used to recognize the finger positions needed to obtain words, phrases and sentences. Their output is signal-conditioned using an LM324 IC and associated components and then given as input to the microcontroller. To obtain accurate sign symbols, words or phrases, the microcontroller is also interfaced with an IMU consisting of a gyroscope, an accelerometer and a magnetometer, which is used to determine the tilt, rotation and rate of motion of the hand. By comparing the flex-sensor and IMU values, the microcontroller displays the corresponding words or phrases. Optionally, the captured words are also sent to a mobile phone using a Bluetooth module. The output of the SpeakJet IC is fed to a speaker, which speaks according to the phonemes stored in the controller for the captured flex-sensor and IMU values. Similarly, the voice of a normal person is captured using a microphone and fed to the microcontroller through the
speech-recognition module; the pre-programmed controller then displays the corresponding sign symbol.

Figure 1. Block diagram of the proposed system.
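Figure 1 implies a simple control flow for the firmware. The sketch below outlines that loop in C: read the flex sensors and the IMU, match the combined readings against the stored gestures, drive the LCD, SpeakJet and Bluetooth outputs, and poll the speech-recognition module for the reverse direction. All routine names (read_flex_channels, read_imu, match_gesture, lcd_show, speakjet_say, bluetooth_send, vrbot_poll, lcd_show_sign, word_text) are placeholders assumed for illustration; the paper does not define a software interface.

/* Minimal sketch of the main loop implied by the block diagram.
 * All lower-level routines are hypothetical placeholders. */
#include <stdint.h>

#define NUM_FLEX 5          /* one flex sensor per finger (assumption) */

/* Hypothetical hardware-access routines, assumed to exist elsewhere. */
void read_flex_channels(uint16_t flex[NUM_FLEX]);          /* ADC counts per finger */
void read_imu(int16_t accel[3], int16_t gyro[3], int16_t mag[3]);
int  match_gesture(const uint16_t flex[NUM_FLEX],
                   const int16_t accel[3], const int16_t mag[3]);  /* -1 = no match */
void lcd_show(const char *text);
void speakjet_say(int word_id);                             /* send stored phoneme codes */
void bluetooth_send(const char *text);
int  vrbot_poll(char *word, int maxlen);                    /* 1 if a spoken word was recognized */
void lcd_show_sign(const char *word);                       /* show the pre-stored sign symbol */
const char *word_text(int word_id);

void main_loop(void)
{
    uint16_t flex[NUM_FLEX];
    int16_t  accel[3], gyro[3], mag[3];
    char     spoken[16];

    for (;;) {
        /* Sign-to-speech direction: glove sensors -> word -> LCD/voice/Bluetooth */
        read_flex_channels(flex);
        read_imu(accel, gyro, mag);
        int id = match_gesture(flex, accel, mag);
        if (id >= 0) {
            lcd_show(word_text(id));
            speakjet_say(id);
            bluetooth_send(word_text(id));
        }

        /* Speech-to-sign direction: VRbot result -> pre-stored sign symbol */
        if (vrbot_poll(spoken, sizeof spoken))
            lcd_show_sign(spoken);
    }
}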
IV. IMPLEMENTATION

Signals from the fingers used to recognize sign symbols can be obtained in several ways, including EMG (electromyography), load cells, wearable conductive fibre, sliding fibre-optic cable and flex sensors [5]. In this system flex sensors are used to recognize the hand gesture because they are reliable and cost effective.

A. Flex sensor

Flex-sensor technology is based on resistive carbon elements. As a variable printed resistor, the sensor achieves a good form factor on a thin substrate. When the substrate is bent, the sensor produces a resistance output correlated to the bend radius, as shown in figure 2: the smaller the radius, the higher the resistance. The resistance varies from approximately 10 kΩ to 50 kΩ, which makes the sensor well suited to applications that require accurate measurement of deflection. The sensors are placed inside a glove that is worn by the user; as a finger is bent to form a word, the sensor resistance changes, and this value is fed to the controller after signal conditioning.

Figure 2. Flex sensor.

Signal conditioning circuit: the circuit is shown in figure 3. For a simple deflection-to-voltage conversion, the bend sensor is tied to a resistor Rm in a voltage-divider configuration, and the output is described by

Vout = (V+) / (1 + Rm / Rbend)

Figure 3. Flex sensor signal conditioning circuit.

The output of the divider is fed to an LM324 op-amp, which buffers the signal and boosts the available current. Different values of the resistor Rm give different deflection-versus-voltage curves, as shown in figure 4.

Figure 4. Deflection vs. Vout [6].
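As a worked example of the divider relation above, the short program below converts a raw ADC count back into an approximate bend-sensor resistance. The supply voltage, the value of Rm and the 10-bit ADC width are assumptions chosen for illustration, not values given in the paper.

/* Recover the bend-sensor resistance from an ADC reading, using
 * Vout = (V+) / (1 + Rm / Rbend)  =>  Rbend = Rm * Vout / (V+ - Vout).
 * V+, Rm and the ADC width below are illustrative assumptions. */
#include <stdio.h>

#define V_SUPPLY 5.0      /* V+ applied to the divider (assumed)          */
#define R_M      22000.0  /* fixed divider resistor Rm in ohms (assumed)  */
#define ADC_MAX  1023.0   /* 10-bit ADC, typical of mid-range PIC devices */

static double bend_resistance(unsigned adc_counts)
{
    double vout = V_SUPPLY * adc_counts / ADC_MAX;
    if (vout <= 0.0 || vout >= V_SUPPLY)
        return -1.0;                       /* open or shorted sensor */
    return R_M * vout / (V_SUPPLY - vout);
}

int main(void)
{
    /* Example: an ADC reading of 512 corresponds to roughly 2.5 V */
    printf("Rbend ~ %.0f ohms\n", bend_resistance(512));
    return 0;
}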
B. Inertial Measurement Unit

The IMU used in this system is the MinIMU-9 v2 from Pololu. It combines a 3-axis accelerometer, a 3-axis magnetometer and a 3-axis gyroscope on a single board. An I2C interface gives access to the nine independent rotation, acceleration and magnetic-field measurements, which can be used to calculate the sensor's absolute orientation.

L3GD20: a low-power 3-axis angular-rate sensor that makes the measured angular rates available to the outside world through its I2C interface. The directions of the measured angular rates are shown in figure 5.
Figure 5. Directions of the measured angular rates.

LSM303DLHC: a combined digital linear accelerometer and magnetometer. It supports the standard- and fast-mode I2C serial interface. The directions of the measured acceleration and magnetic field are shown in figures 6 and 7.

Figure 6. Directions of the measured acceleration.

Figure 7. Directions of the measured magnetic field.

Whenever the position of the hand changes for a particular word, the IMU values change. The IMU is interfaced with the microcontroller, and by comparing the flex-sensor values with the IMU values the correct word or phrase, with the correct hand position, can be recognized and shown on the LCD.
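The following sketch illustrates how the gyro portion of the MinIMU-9 might be read over I2C. The i2c_write_reg and i2c_read_regs helpers are assumed to be provided by the target's I2C driver; the register addresses follow the L3GD20 datasheet and should be checked against the actual device.

/* Sketch: reading the three gyro axes of the L3GD20 over I2C. */
#include <stdint.h>

#define L3GD20_ADDR      0x6B   /* 7-bit address with SA0 pulled high (assumed) */
#define L3GD20_CTRL_REG1 0x20
#define L3GD20_OUT_X_L   0x28
#define AUTO_INCREMENT   0x80   /* set MSB of the sub-address to auto-increment */

/* Hypothetical low-level I2C helpers, assumed to exist elsewhere. */
void i2c_write_reg(uint8_t dev, uint8_t reg, uint8_t val);
void i2c_read_regs(uint8_t dev, uint8_t reg, uint8_t *buf, uint8_t len);

void gyro_init(void)
{
    /* Normal mode, all three axes enabled. */
    i2c_write_reg(L3GD20_ADDR, L3GD20_CTRL_REG1, 0x0F);
}

void gyro_read(int16_t rate[3])
{
    uint8_t raw[6];
    i2c_read_regs(L3GD20_ADDR, L3GD20_OUT_X_L | AUTO_INCREMENT, raw, 6);
    for (int i = 0; i < 3; i++)
        rate[i] = (int16_t)((raw[2 * i + 1] << 8) | raw[2 * i]);  /* little-endian pairs */
}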
C. SpeakJet IC

The SpeakJet is a self-contained, single-chip sound synthesizer based on Mathematical Sound Architecture technology. It is pre-configured with 72 speech elements, 43 sound effects and 12 DTMF touch tones. It is interfaced with the microcontroller, which is pre-programmed to send serial data to the SpeakJet so that it speaks the corresponding words or sentences built by combining these elements. So that the output can be heard clearly, the SpeakJet output is amplified by an LM386 audio amplifier, connected for a particular gain, and then fed to a speaker. The connection recommended by the manufacturer is shown in figure 8.

Figure 8. SpeakJet typical connection.
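The sketch below illustrates the kind of serial traffic involved: a stored list of allophone codes is pushed to the SpeakJet over a 9600-baud UART. The uart_putc routine and the code values are placeholders; the real codes for a given word are produced with the Phrase-A-Lator tool described later.

/* Sketch: sending one stored "word" to the SpeakJet as a list of allophone
 * codes.  uart_putc() is a hypothetical UART transmit routine (9600 baud, 8N1
 * to the SpeakJet RX pin); the code values below are placeholders, not the
 * actual allophones for any particular word. */
#include <stdint.h>

void uart_putc(uint8_t byte);

static const uint8_t word_codes[] = { 183, 7, 159, 146, 164 };  /* placeholder allophones */

void speakjet_say_word(void)
{
    for (unsigned i = 0; i < sizeof word_codes; i++)
        uart_putc(word_codes[i]);
}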
D. Speech recognition module

The module used here is the VRbot. It recognizes the voice of a normal person through its built-in microphone and communicates with the microcontroller over a UART interface, as shown in figure 9.

Figure 9. VRbot interfaced with the microcontroller.

The module has built-in speaker-independent commands and also supports 32 user-defined commands. These are used to recognize the words spoken by a person; the recognized word is then displayed on the LCD, where it can be read by the physically challenged user.
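A possible polling routine for the VRbot is sketched below: the controller requests recognition, waits for a result byte and maps the returned index to one of the trained words. The UART helpers and the CMD_RECOGNIZE and STS_RESULT byte values are placeholders; the real values are defined by the VRbot communication protocol and are not reproduced here.

/* Sketch: polling the VRbot over the UART and mapping the returned index
 * to a word to display.  The command/status constants are placeholders. */
#include <stdint.h>

void uart_putc(uint8_t byte);
int  uart_getc_timeout(uint8_t *byte, uint16_t ms);  /* 1 on success, 0 on timeout */
void lcd_show(const char *text);

#define CMD_RECOGNIZE 'r'   /* placeholder, see the VRbot protocol document */
#define STS_RESULT    's'   /* placeholder */

static const char *command_words[] = { "WELCOME", "HOW ARE YOU" };  /* user-trained set */

void vrbot_poll_once(void)
{
    uint8_t status, index;

    uart_putc(CMD_RECOGNIZE);
    if (!uart_getc_timeout(&status, 3000) || status != STS_RESULT)
        return;                               /* nothing recognized */
    if (!uart_getc_timeout(&index, 100))
        return;
    if (index < sizeof command_words / sizeof command_words[0])
        lcd_show(command_words[index]);       /* show the spoken word as text */
}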
V. PROGRAMMING THE HARDWARE

The code for this system is written in embedded C and is compiled and debugged using an integrated development environment (IDE). The software used to program the hardware is described below.

A. MPLAB

MPLAB is an IDE with an integrated toolset for developing embedded applications on PIC microcontrollers. It consists of a text editor, a simulator and device drivers provided by Microchip. Code can be written in assembly or in C, and the device and language can be selected as required.

B. CCS Compiler

A compiler translates source code into object code. The compiler used
here is the CCS compiler, which is commonly used for PIC controllers. It has many built-in functions that can simply be called, which keeps the program very simple, and it implements the usual C constructs, input/output operations and so on.
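A minimal CCS-style skeleton is shown below to illustrate the built-in functions the compiler provides for the ADC, delays and serial I/O. The particular device (PIC16F877A), clock frequency and pin assignments are assumptions for illustration; the paper states only that a PIC microcontroller and the CCS compiler are used.

/* Minimal CCS C skeleton: the device, fuses, clock and pins are assumed. */
#include <16F877A.h>
#device ADC = 10
#fuses HS, NOWDT, NOPROTECT, NOLVP
#use delay(clock = 20000000)
#use rs232(baud = 9600, xmit = PIN_C6, rcv = PIN_C7)   /* UART, e.g. for debugging */

void main(void)
{
    int16 flex0;

    setup_adc_ports(ALL_ANALOG);       /* flex sensors on the analog inputs */
    setup_adc(ADC_CLOCK_INTERNAL);

    while (TRUE) {
        set_adc_channel(0);            /* first flex-sensor channel */
        delay_us(20);                  /* acquisition time */
        flex0 = read_adc();            /* 10-bit result */
        printf("%lu\r\n", flex0);      /* stream the reading out over the UART */
        delay_ms(100);
    }
}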
C. Phrase-A-Lator

Phrase-A-Lator is a demonstration program from Magnevation that allows the SpeakJet to speak. It is used to set voice qualities such as pitch, volume and speed, and it generates the codes for the required words, which are then used in the main code to make the SpeakJet speak. The main menu of the software, shown in figure 10, is used to select the communication settings and to open the editor. When the software is connected to the PC, the serial-port check box turns green if the correct COM port is selected.

Figure 10. Magnevation Phrase-A-Lator main menu.

The phrase editor is used to develop words, phrases and sound effects from the built-in phonemes and sound effects. The required words or phrases are typed into the say-data area and the corresponding codes are obtained by pressing the view-code button, as shown in figure 11.

Figure 11. Phrase editor menu with words and code.
VI. RESULT

The hardware of the module is shown in figure 12. It consists of the microcontroller interfaced with the flex sensors, the SpeakJet IC and the other components.

Figure 12. Hardware circuit of the system.

The hex file obtained after compiling the code is downloaded into the PIC controller, and the corresponding words are displayed on the LCD. The words and conversations are obtained by taking Signed English as a reference. For each word, the values from the flex sensors are compared with the IMU values in the microcontroller, the result is shown on the LCD, and the voice output is produced through the SpeakJet IC. The output is also transmitted to a mobile phone using the Bluetooth module. The accelerometer output obtained from the IMU for all three axes is shown in figure 13, the magnetometer output for all three axes in figure 14, and the gyroscope output for all three axes in figure 15.
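Since the recognized word is also forwarded to a mobile phone, a minimal sketch of that step is given below, assuming a serial-port-profile Bluetooth module wired to a UART; the paper does not name the module or its baud rate.

/* Sketch: forwarding the recognized word to a paired phone through a serial
 * Bluetooth module.  bt_uart_putc() is a hypothetical UART transmit routine
 * for whichever pins the module is wired to. */
void bt_uart_putc(char c);

void bluetooth_send(const char *word)
{
    while (*word)
        bt_uart_putc(*word++);
    bt_uart_putc('\r');          /* terminate the line for the phone app */
    bt_uart_putc('\n');
}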
Figure 13. Accelerometer output on the display.

Figure 14. Magnetometer output on the display.

Figure 15. Gyroscope output on the display.

VII. RESULT COMPARISON

Previous work either simply measured finger flexion or relied on speech-recognition software running on a computer; because such systems need a PC, they cannot be used in every situation. Other systems used only flex sensors to recognize words, obtaining the output from the change in flex-sensor values alone. Systems that need a computer are not portable, and they also provide only one-way communication: the displayed result can be understood by a normal person, but the speech of a normal person is not converted into a sign symbol. In this system flex sensors are used together with an IMU to capture the words. The flex sensors capture the changes of the fingers; because they are placed on the fingers, their values do not change when only the position of the hand changes. Most of the signs used by physically challenged people involve holding the fingers or hand in a particular position or rotating them, and the IMU, consisting of an accelerometer, a magnetometer and a gyroscope, captures these positions. It is placed on the hand along with the flex sensors, so that both the flex-sensor and the IMU values change during a conversation, and by comparing the two sets of values the output is shown on the display.
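One simple way to realize this comparison is to store, for each word, a template of expected flex readings together with an expected hand orientation, and to accept a match when every channel lies within a tolerance. The sketch below follows that idea; the template layout, tolerances and sample values are illustrative assumptions, not data from the paper.

/* Sketch: matching live flex + IMU readings against stored word templates. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define NUM_FLEX 5

typedef struct {
    const char *text;            /* word shown on the LCD / spoken          */
    uint16_t flex[NUM_FLEX];     /* expected ADC counts per finger          */
    int16_t  pitch, roll;        /* expected hand orientation from the IMU  */
} gesture_t;

static const gesture_t gestures[] = {
    { "WELCOME",     {600, 610, 590, 605, 615},  10,  0 },  /* example data only */
    { "HOW ARE YOU", {300, 320, 650, 640, 310}, -20, 45 },
};

#define FLEX_TOL  40   /* ADC counts */
#define ANGLE_TOL 15   /* degrees    */

static int match_gesture(const uint16_t flex[NUM_FLEX], int16_t pitch, int16_t roll)
{
    for (unsigned g = 0; g < sizeof gestures / sizeof gestures[0]; g++) {
        int ok = abs(pitch - gestures[g].pitch) <= ANGLE_TOL &&
                 abs(roll  - gestures[g].roll)  <= ANGLE_TOL;
        for (unsigned i = 0; ok && i < NUM_FLEX; i++)
            ok = abs((int)flex[i] - (int)gestures[g].flex[i]) <= FLEX_TOL;
        if (ok)
            return (int)g;
    }
    return -1;   /* no word matched */
}

int main(void)
{
    uint16_t sample[NUM_FLEX] = {605, 600, 585, 610, 620};
    int id = match_gesture(sample, 8, -3);
    printf("%s\n", id >= 0 ? gestures[id].text : "no match");
    return 0;
}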
VIII. OUTPUT

Sample outputs obtained for some common phrases, such as "WELCOME" and "HOW ARE YOU", are shown in figures 16a and 16b. The left portion of each figure shows the exact finger position for the word, obtained from the reference site, and the right portion shows the output obtained on the digital display when the gloves with flex sensors and the IMU are worn and the hand is held in the position shown in the picture. At the same time, the displayed word is heard through the speaker via the SpeakJet IC.
Figure 16a. Digital display showing WELCOME.
Figure 16b. Digital display showing HOW ARE YOU.

IX. CONCLUSION AND FUTURE ENHANCEMENT

This system will be useful for physically challenged people and will bridge the gap between them and normal people. Since it is a portable two-way communication system, it can be used at any time. It can be enhanced by adding extra flex sensors at the wrist and elbow, so that conversations that use these bending positions can also be captured accurately. A storage device such as an SD card could be added to store more phrases as a dictionary, both for speaking and for display, and the device could be covered with a waterproof layer so that it can be used in any situation.

REFERENCES
[1] L. K. Simone, E. Elovic, U. Kalambur, D. Kamper, "A Low Cost Method to Measure Finger Flexion in Individuals with Reduced Hand and Finger Range of Motion", 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04), vol. 2, 2004, pp. 4791-4794.
[2] M. Wald, "Captioning for Deaf and Hard of Hearing People by Editing Automatic Speech Recognition in Real Time", Proceedings of the 10th International Conference on Computers Helping People with Special Needs (ICCHP 2006), LNCS 4061, pp. 683-690.
[3] Syed Faiz Ahmed, Syed Baber Ali, Saqib Qureshi, "Electronic Speaking Glove for Speechless Patients: A Tongue to a Dumb", Proceedings of the 2010 IEEE Conference on Sustainable Utilization and Development in Engineering and Technology, University Tunku Abdul Rahman, 2010.
[4] Jingdong Zhao, Li Jiang, Shicai Shi, Hegao Cai, Hong Liu, G. Hirzinger, "A Five-fingered Underactuated Prosthetic Hand System", Proceedings of the 2006 IEEE International Conference on Mechatronics and Automation, June 2006, pp. 1453-1458.
[5] N. P. Bhatti, A. Baqai, B. S. Chowdhry, M. A. Umar, "Electronic Hand Glove for Speech Impaired and Paralyzed Patients", EIR Magazine, Karachi, Pakistan, pp. 59-63, May 2009.
[6] Flex Point Inc., USA, http://www.flexpoint.com, last accessed September 06, 2010.
[7] Pololu MinIMU-9 v2, www.pololu.com/catalog/product/1268
[8] Magnevation SpeakJet, http://www.speakjet.com
[9] VRbot module, www.VeeaR.eu
[10] http://www.sign.com.au