
Design and Development of a Hand Gesture Recognition System for Speech Impaired People

R. Ananthi, ME - Computer Science and Engineering, Dhanalakshmi Srinivasan Engineering College.

Abstract— All over the world, deaf and dumb people struggle to express their feelings to other people. Speech and hearing impaired people experience various challenges at public places in expressing themselves to normal people. This paper addresses the problem by using the Indian Sign Language symbols, which are generic to all deaf and dumb people in India. The gestures illustrated by the Indian Sign Language symbols are captured with the support of flex sensors and an accelerometer. The movements involved in gesture representation are rotation, angle tilt, and direction changes.


The flex sensors and the accelerometer are incorporated over the fingers and wrist respectively to acquire their dynamics; these sensors are fitted onto a data glove. The resulting voltage signals are then processed by a microcontroller and sent to a voice module, where voice recordings of the words are stored and played back according to each word's values, producing the appropriate spoken words through a speaker.

Keywords— Indian sign language, speech impaired, flex sensors, accelerometer, voice module.

I. INTRODUCTION

The language used by speech and hearing impaired people to express themselves is known as sign language. These languages vary from one country to another, as no single sign language is common to all people. Some of the main challenges experienced by speech and hearing impaired people while communicating with normal people are social interaction, communication disparity, education, behavioural problems, mental health, and safety concerns.

As a result of these obstacles, deaf and dumb people are discouraged from speaking out about themselves or their situations in public places, in emergencies, or in private conversations. Moreover, the language diversity in India is very vast from place to place, hence a common mode of communication was needed for speech and hearing impaired people. This resulted in the usage of the Indian Sign Language symbols among deaf and dumb people to interact with each other, but these symbols cannot be understood by other, normal people.

In this paper, Indian Sign Language (ISL) has been used. ISL has its own specific syntax, grammar, alphabets, words and numerals. Hand gestures made using these symbols are an effective way for speech impaired people to express an idea or meaning. These gestures are made with the help of finger, hand, wrist and elbow movements for different sequences of words. Two aspects are considered here: one with only finger positions, without changing hand position and orientation, and the other with changes in both finger and hand positions and orientations. The main need arises because these sign language symbols are not understood by normal people, as most of them have not studied ISL. In real-time image processing methods, only a single individual can be benefited, by capturing his or her image and processing it into text or speech.

In this paper, however, the hand gesture movements of any speech impaired person can be captured by the flex sensors and accelerometer and produced as voice output through the voice module. Research has been done for many years on hand gesture interpretation systems using various sign languages. As mentioned in [1], sign language gestures are converted into voice for a single alphabet or a complete string by concatenating each and every word, thereby forming full meaningful words, but this was done only for the American and Pakistani Sign Languages. The method described in [2] aims to help patients with wrist impairments perform some of their daily exercises. In one research method [3], American Sign Language was used, where the boundary of the gesture image depicted by the speech impaired person was approximated into a polygon; on further image processing with the Douglas-Peucker algorithm using Freeman Chain Code directions, the words were determined. ISL has also been used in a research paper [4], where each set of signs was represented by the binary values of the 'UP' and 'DOWN' positions of the five fingers.

The respective images of the symbols were dynamically loaded and converted into text. A material known as Velostat was used in one of the papers [5] for making piezoresistive sensors, and these sensors were then used to detect bending of the fingers. This data was mapped to a character set by implementing a Minimum Mean Square Error machine learning algorithm. The method used in another research paper [6] employs sensor gloves for detecting hand gestures with the British Sign Language system.

There, only general hand gestures were depicted; gestures pertaining to any particular country's sign language symbols were not captured. The outputs were produced in text format using an LCD and in audio format using the flex sensors. One of the studies [7] presented a robust approach for the recognition of bare-handed static American Sign Language using a novel combination of Local Binary Pattern histograms and linear binary Support Vector Machine (SVM) classifiers. One of the papers [8] used a device which detects and tracks hand and finger motions for Arabic Sign Language; data acquisition is followed by Multilayer Perceptron networks with a Naive Bayes classifier. The research approach discussed in [9] for American Sign Language uses a glove with six coloured markers and two cameras to extract the coordinate points; the detection of the alphabets is done by the Circle Hough Transform and backpropagation in an artificial neural network.

One of the approaches [10] to detect American Sign Language was capable of recognizing hand gestures even when the forearm and its rotation were involved; it was implemented using Principal Component Analysis to differentiate between two similar gestures. Thus there were various limitations in the previous research done so far in the field of sign language interpretation systems. Some of them arise from the usage of image processing methods, which are restricted to individually captured and processed images and cannot be dynamically adapted for different persons.

Only finger gestures and alphabets were obtained from the sign language movements, and outputs were produced for other countries' languages such as the British, American and Pakistani sign languages. Also, the distance between the camera and the person may disturb the accuracy. Therefore, in this project, the gestures for words in Indian Sign Language have been used, and eight commonly used words are produced as voice outputs. The movements are captured with the help of flex sensors and an accelerometer, and the system can adapt dynamically to changes in person and hand orientation.

II. MATERIALS AND METHODOLOGY

The hand gesture recognition setup presented in this paper comprises the data glove, the sensory part (flex sensors and accelerometer), an amplifier, a PIC microcontroller, a voice module and a speaker. Fig. 1 shows the block diagram of the ISL hand gesture recognition system.

Fig. 1. Block diagram of the ISL hand gesture recognition system

A. Data glove

A data glove is an assistive device which facilitates tactile sensing and fine-motion control. It is specially used to capture the shape and dynamics of the hand in an effective and direct manner. The flex sensors are fixed over each finger and the accelerometer over the wrist; these sensors are attached to the cloth data glove with cello tape or glue. Fig. 2 shows the data glove worn by the speech impaired person, which acquires the gestures with the aid of the flex sensors and accelerometer.

Fig. 2. Data glove fitted with sensors

B. Sensory part

The sensory part consists of flex sensors for capturing the finger arrangements and an accelerometer for the wrist rotations. In the flex sensor, the resistance varies in proportion to the bending of the sensor. This resistance is converted to a voltage by a voltage divider circuit using a 10 kΩ resistor; similarly, the voltage conversion is done for each flex sensor situated over the fingers. The accelerometer has three axes (x, y and z) and produces three sets of values corresponding to each axis, based on the wrist movement or orientation made in the hand gesture.
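For concreteness, the divider relation can be written out. The paper does not state the divider topology, so the following is a sketch assuming the flex sensor forms the lower (measured) leg, with the 10 kΩ resistor on the supply side; this choice is consistent with the readings in TABLE I, where bending, which raises the flex resistance, raises the measured voltage.

```latex
% Assumed topology: fixed 10 kOhm resistor from Vcc, flex sensor to ground,
% with the output taken across the flex sensor:
V_{out} = V_{cc}\,\frac{R_{flex}}{R_{flex} + R_{fixed}}
% With Vcc = 5 V, R_fixed = 10 kOhm and the reported no-bend reading of
% about 3.5 V, the flat-sensor resistance works out to
R_{flex} = R_{fixed}\,\frac{V_{out}}{V_{cc} - V_{out}}
         = 10\,\mathrm{k\Omega}\times\frac{3.5}{1.5}\approx 23.3\,\mathrm{k\Omega}
```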

C. PIC Microcontroller

Once all the flex sensors and the accelerometer had been tested and gave repeatable readings, the experimental setup shown in Fig. 3 was arranged: the data glove fitted with the tested sensors was connected to a PIC microcontroller, and then to a voice module, a speaker and an LCD so that the voice signals can be heard. The microcontroller governs the handling of the signal values received from the sensors. The output voltages of the flex sensors and the accelerometer are given as inputs to ports A and E of the PIC microcontroller for further processing, while the other ends of all the sensors are connected to a common ground. These signal values are converted to digital form by the inbuilt ADC of the microcontroller.
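As a concrete illustration of this acquisition step, a minimal sketch follows. The paper does not specify the PIC model or its configuration, so the register and bit names below follow the common mid-range PIC convention (e.g. a PIC16F877A under the XC8 compiler); the 20 MHz clock and 5 V reference are likewise assumptions.

```c
/* Minimal sketch of per-channel ADC acquisition on a mid-range PIC.
 * Register names, clock and reference voltage are assumptions. */
#include <xc.h>

#define _XTAL_FREQ 20000000UL   /* assumed oscillator frequency */
#define VREF 5.0f               /* assumed supply/reference voltage */

void adc_init(void)
{
    ADCON1 = 0x80;  /* right-justify result; port pins as analog inputs */
    ADCON0 = 0x41;  /* Fosc/8 conversion clock, channel 0, ADC module on */
}

/* Read one 10-bit sample from the given analog channel (0..7). */
unsigned int adc_read(unsigned char channel)
{
    ADCON0 = (ADCON0 & 0xC7) | (unsigned char)((channel & 0x07) << 3);
    __delay_us(20);                  /* acquisition (sampling) time */
    ADCON0bits.GO_nDONE = 1;         /* start the conversion */
    while (ADCON0bits.GO_nDONE)      /* hardware clears GO when done */
        ;
    return ((unsigned int)ADRESH << 8) | ADRESL;
}

/* Convert a raw 10-bit reading back to volts, e.g. for the LCD display. */
float adc_to_volts(unsigned int raw)
{
    return ((float)raw * VREF) / 1023.0f;
}
```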

D. Voice Module

Signals from the microcontroller are then given to the voice module. The voice module consists of eight channels, on which eight words can be recorded, and it can be operated in various modes, such as parallel and serial modes. The voices were recorded while both the CE (reset sound track) and RE (record) signals were held low until the rising edge of the trigger. The same voice can then be played back when only RE is high and a high-to-low edge is applied as the trigger.
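The playback handshake just described can be made concrete with a short sketch. The paper does not give the pin mapping between the PIC and the voice module, so the port assignments, channel-select wiring and the 1 ms pulse width below are hypothetical; only the RE/trigger behaviour follows the text.

```c
/* Hypothetical playback trigger for the 8-channel voice module.
 * Pin assignments and timing are assumptions, not from the paper. */
#include <xc.h>

#define _XTAL_FREQ 20000000UL      /* assumed oscillator frequency */

#define VM_RE   PORTDbits.RD0      /* record-enable line (assumed pin) */
#define VM_TRIG PORTDbits.RD1      /* trigger line (assumed pin) */

/* Play back the word recorded on the given channel (0..7). */
void voice_play(unsigned char channel)
{
    PORTB = (PORTB & 0xF8) | (channel & 0x07);  /* select channel (assumed wiring) */
    VM_RE = 1;       /* RE high alone selects playback mode */
    VM_TRIG = 1;
    __delay_ms(1);
    VM_TRIG = 0;     /* high-to-low trigger edge starts playback */
}
```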

The sound of the words can be heard loud and clear with the help of the speaker. The setup also incorporates an LCD panel to display the flex sensor and accelerometer voltages.

III. EXPERIMENTAL RESULTS

All the sensors on the data glove were first tested. The flex sensor and accelerometer readings were observed with variation in their position, rotation and bending over 5 trials. The bending of the hand is determined at three bends of the bones of the hand, known as the distal, middle and proximal phalanges, as mentioned in TABLE I of the flex sensor readings.

Inferences were drawn from these sensor readings, for the flex sensors in TABLE I and the accelerometer in TABLE II, at different positions and trials.

For the flex sensor, the voltage across the finger when the sensor is straight is 3.5 V, for a power supply of 5 V. The voltage drop across the flex sensor was maximum at the middle phalanx bend and minimum at the proximal phalanx bend. For the accelerometer, the maximum values were observed on the X-axis when the hand turns right, on the Y-axis when the hand moves up, and on the Z-axis when the hand slants to the up position. Trial readings were taken, the final readings were derived from them, and the results are given as mean ± standard deviation. The coefficient of variance, the ratio of standard deviation to mean, was calculated for each flex sensor and accelerometer value; it is much less than one, which indicates that these values have good repeatability and reproducibility.
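As an illustration of how such repeatability figures can be computed, here is a small sketch. The trial count of 5 matches the text, but the sample readings and all names are illustrative, not taken from the paper's raw data.

```c
/* Mean, standard deviation and coefficient of variance over N trials.
 * A CV well below 1 indicates good repeatability, as noted in the text. */
#include <math.h>
#include <stdio.h>

#define TRIALS 5

double mean(const double v[], int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++) s += v[i];
    return s / n;
}

double stddev(const double v[], int n, double m)
{
    double s = 0.0;
    for (int i = 0; i < n; i++) s += (v[i] - m) * (v[i] - m);
    return sqrt(s / (n - 1));   /* sample standard deviation */
}

int main(void)
{
    /* Illustrative: five no-bend voltage readings for one finger. */
    double v[TRIALS] = { 3.498, 3.499, 3.497, 3.498, 3.498 };
    double m  = mean(v, TRIALS);
    double sd = stddev(v, TRIALS, m);
    printf("%.3f +/- %.3e V, CV = %.2e\n", m, sd, sd / m);
    return 0;
}
```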

Fig. 3. Experimental setup of the ISL hand gesture recognition system

TABLE I. FLEX SENSOR READINGS

Finger name | No bend (Mean±Stddev) (V) | Distal phalanx bend (Mean±Stddev) (V) | Middle phalanx bend (Mean±Stddev) (V)
Thumb | 3.498±8.944e-4 | 3.536±0.01073 | 3.662±3.57e-3
Index | 3.504±1.788e-3 | 3.742±3.577e-3 | 3.872±8.944e-4
Middle | 3.508±3.577e-3 | 3.786±2.68e-3 | 3.898±8.944e-4
Ring | 3.506±2.68e-3 | 3.694±1.788e-3 | 3.842±3.577e-3
Little | 3.502±8.944e-4 | 3.536±1.788e-3 | 3.694±1.788e-3

Subsequently, the full plan of the system was set up, after both kinds of sensors had been tested for repeatable values.

After this, the data glove with flex sensors over each finger and the accelerometer over the wrist was worn by the speech impaired people. Once they are ready with their gestures and start expressing with their hands, the voltage signals corresponding to the bend and rotation are simultaneously fed to the microcontroller. The flex sensor voltage of each finger movement was noted for each word gesticulation, depending on the bend involved in each word, and similarly for each finger's various angle bends. Likewise, for the accelerometer, all the x, y and z axis variations were recorded corresponding to each word's rotation and its up and down positions. This scheme of measurements was repeated for different sets of speech impaired people and for a number of trials.

By this approach, the minimum and maximum thresholds of each finger and of the wrist were calculated for the eight words, and from these measurements the average values of the sensor readings were computed. The eight most commonly used words were listed, and each finger and wrist movement and their corresponding values were noted. The selected words included Monday, Tuesday, Thursday, What and Which.
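The min/max thresholds lend themselves to a simple range-matching classifier. The paper does not give its matching routine or the actual threshold tables, so the structure below, the word indexing and the matching rule are assumptions sketched for illustration:

```c
/* Hypothetical range-matching of a gesture against per-word thresholds.
 * The eight sensor values are the five finger voltages plus the three
 * accelerometer axes; no bounds from the paper appear here. */
#define NUM_SENSORS 8   /* 5 flex sensors + 3 accelerometer axes */
#define NUM_WORDS   8   /* eight recorded words */

typedef struct {
    float min[NUM_SENSORS];  /* minimum voltage threshold per sensor */
    float max[NUM_SENSORS];  /* maximum voltage threshold per sensor */
} word_profile_t;

/* Return the index of the first word whose thresholds enclose every
 * sensor reading, or -1 if no word matches. */
int match_word(const float reading[NUM_SENSORS],
               const word_profile_t profiles[NUM_WORDS])
{
    for (int w = 0; w < NUM_WORDS; w++) {
        int ok = 1;
        for (int s = 0; s < NUM_SENSORS; s++) {
            if (reading[s] < profiles[w].min[s] ||
                reading[s] > profiles[w].max[s]) {
                ok = 0;
                break;
            }
        }
        if (ok) return w;   /* e.g. index 0 might stand for "Monday" */
    }
    return -1;              /* gesture not recognized */
}
```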

The Indian sign language symbols for these words are the gesture movements obtained from the speech impaired people after they wore the glove fitted with sensors. TABLE III shows the derived average values for one of the words represented through Indian sign language symbols, and the graph in Fig. 4 represents the sensor readings for the single word 'Monday'.

Similarly, the same procedure was repeated for all the other words, calculating the minimum and maximum voltages for their corresponding ISL gestures made by speech impaired people. These values are given to the voice module after being processed by the PIC microcontroller. In the voice module, the voltages received from the sensors and microcontroller select the appropriate word's sound output; if the word and the voltage readings match, the voice of that word can be heard through the speaker. The same sequence of steps applies to all the other words' voltages, and their respective voice outputs can be heard.
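Putting the pieces together, the acquire-classify-playback cycle described in this section could look roughly like the following main loop. It reuses the hypothetical adc_read, adc_to_volts, match_word and voice_play sketches from earlier, and the channel-to-sensor ordering is again an assumption:

```c
/* Hypothetical main loop tying the earlier sketches together:
 * sample all eight sensors, match the gesture, trigger playback. */
void gesture_loop(const word_profile_t profiles[NUM_WORDS])
{
    float reading[NUM_SENSORS];

    for (;;) {
        /* Sample the five flex sensors and three accelerometer axes;
         * the channel numbering reflects an assumed wiring order. */
        for (int ch = 0; ch < NUM_SENSORS; ch++)
            reading[ch] = adc_to_volts(adc_read((unsigned char)ch));

        int word = match_word(reading, profiles);
        if (word >= 0)
            voice_play((unsigned char)word);  /* speak the matched word */
    }
}
```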

IV. CONCLUSIONS

In this paper, the hand gestures made by speech and hearing impaired people have been successfully used to interpret their expression of words. The gesture for each word was acquired with the help of the flex sensors and accelerometer, and their corresponding distinct voltages were fed serially to the setup. On processing by the microcontroller and voice module, the data generate the corresponding words, which can be heard by normal people with the help of the speaker.

Thus, the communication gap between normal people and speech and hearing impaired people is reduced. Indian sign language has been discussed, and the symbols of eight commonly used words were captured and produced as voice output. Hence this research provides a solution to the obstacles faced by speech impaired people, from which they can be satisfied, motivated, and gain the self-confidence that their feelings will also be understood by other people. At present, only the gestures made by a single hand are captured, but in future the system can be extended to symbols produced by both hands.

Also, only eight words are currently produced by the voice module; this can be enhanced to a larger number of words as voice outputs.

TABLE II. ACCELEROMETER READINGS

Axis | Up position (Mean±Stddev) (V) | Down position (Mean±Stddev) (V) | Tilt right (Mean±Stddev) (V) | Tilt left (Mean±Stddev) (V) | Slant position (Mean±Stddev) (V)
X-axis | 1.644±1.788e-3 | 1.688±3.577e-3 | 1.912±8.944e-4 | 1.358±8.944e-4 | 1.46±4.472e-3
Y-axis | 1.352±3.577e-3 | 1.858±3.577e-3 | 1.744±1.788e-3 | 1.698±8.944e-4 | 1.386±2.683e-3
Z-axis | 1.45±4.472e-3 | 1.394±2.68e-3 | 1.512±3.577e-3 | 1.442±3.577e-3 | 1.312±3.577e-3

TABLE III. SENSOR READINGS FOR THE WORD 'MONDAY'

Position | Voltage values (Mean±Stddev) (V)
Thumb finger | 4.68±0.014142136
Index finger | 3.71±0.0083666
Middle finger | 4.3±0.021679483
Ring finger | 3.96±0.008944272
Little finger | 4.37±0.010954451
X-axis | 1.83±0.01
Y-axis | 1.58±0.0083666
Z-axis | 1.42±0.008944272

Fig. 4. Graphical representation of the sensor readings for the word 'Monday'

V. ACKNOWLEDGMENT

I would like to thank the Principal, teachers and students of the St. Louis Institute, Chennai, who gave me permission to trial this hand gesture system in their school. I would also like to convey my regards to Mrs. Jayanthi, who assisted me in the techniques of learning Indian sign language.

REFERENCES

[1] SanAntonio, R., Shadaram, M., Nehal, S., Virk, M.A., Ahmed, Ahmedani, and Khambaty, Y., "Cost effective portable system for sign language gesture recognition," IEEE International Conference on System of Systems Engineering, 2-4 June 2008, Farmingdale, USA.
[2] Al-Osman, H., Gueaieb, El Saddik, A., and Karime, A., "E-Glove: An electronic glove with vibro-tactile feedback for wrist rehabilitation of post-stroke patients," IEEE International Conference on Multimedia and Expo, 11-15 July 2011, La Salle University, Spain.
[3] Menon, R., Jayan, S., James, R., Janardhan, and Geetha, M., "Gesture Recognition for American Sign Language with Polygon Approximation," IEEE International Conference on Technology for Education, 14-16 July 2011, Chennai.
[4] Balakrishnan, G. and Rajam, P.S., "Real time Indian Sign Language Recognition System to aid deaf-dumb people," IEEE 13th International Conference on Communication Technology, 25-28 September 2011, pp. 737-742, Australia.
[5] Ramakrishnan, G., Kumar, S., Tamse, A., Krishnapura, N., and Preetham, C., "Hand Talk - Implementation of a Gesture Recognizing Glove," Texas Instruments India Educators Conference, 4-6 April 2013, NIMHANS Convention Centre, Bangalore.
[6] Vikram Sharma, M. Vinay Kumar, N. Masaguppi, S.C. Suma and M.N. Ambika, "Virtual Talk for Deaf, Mute, Blind and Normal Humans," Texas Instruments India Educators Conference, 4-6 April 2013, IEEE Bangalore Section.
[7] Kamrani, M.H. and Weerasekera, "Robust ASL Fingerspelling Recognition Using Local Binary Patterns and Geometric Features," International Conference on Digital Image Computing: Techniques and Applications, 26-28 November 2013, Hobart, Australia.
[8] Mohandes, M., Aliyu, S. and Deriche, M., "Arabic Sign Language recognition using the leap motion controller," IEEE 23rd International Symposium on Industrial Electronics (ISIE), 1-4 June 2014, Istanbul.
[9] Tangsuksant, W., Adhan, S. and Pintavirooj, C., "American Sign Language recognition using 3D geometric invariant feature and ANN classification," Biomedical Engineering International Conference (BMEiCON), 26-28 November 2014, Fukuoka.
[10] Hussain, I., Talukdar, A.K. and Sarma, K.K., "Hand Gesture recognition system with real time palm tracking," Annual IEEE India Conference (INDICON), 11-13 December 2014, Pune.

