Abstract

Implementation of Bidirectional Voice Communication Between Normal & Deaf & Dumb People Using Hand Wave & Voice Recognition

B. Sivachandra Mahalingam*, D. Jaya Kishore Reddy, M. Harish and M. Kishore Kumar Reddy


Sign language is the primary means by which hearing- and speech-impaired people interact with the rest of the world. Conversation with hearing-impaired individuals becomes difficult when the listener does not know sign language, so a bridge between the two groups is needed. Deaf and dumb people generally use sign language for communication, but they find it difficult to communicate with others who do not understand it. This paper aims to provide bidirectional communication between mute people and normal people. It addresses the need for an electronic device that can translate sign language into speech, and speech into text, so that communication between the mute community and the general public becomes possible. Data from MEMS sensors placed along the length of each finger and the thumb is transmitted wirelessly using ZigBee. A mute person makes hand gestures with the sensor glove; the signal is communicated to an Android mobile, where it is converted into speech so that a normal person can understand the sign language. The normal person uses an Android mobile app with a speech recognition system that converts the spoken words into text and sends the text to the electronic hand-glove device, where it is displayed so the mute person can read what was spoken.
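The abstract describes the gesture-to-speech direction of the pipeline only at a high level. As a rough illustration of that step, the sketch below simulates how a frame of finger-sensor readings might be matched against a small gesture vocabulary before the resulting text is handed to the phone's text-to-speech engine. The gesture templates, the matching threshold, and the helper names (classify_gesture, glove_to_phone) are hypothetical assumptions for illustration, not values or functions taken from the paper or its hardware.

    # Minimal sketch of the gesture-to-speech direction (assumptions noted above).
    from math import dist

    # Hypothetical calibration table: one reference reading per recognised sign,
    # as five finger-sensor values normalised to the range 0..1.
    GESTURE_TEMPLATES = {
        "hello":     (0.1, 0.1, 0.1, 0.1, 0.1),
        "thank you": (0.9, 0.2, 0.2, 0.2, 0.8),
        "help":      (0.9, 0.9, 0.9, 0.9, 0.9),
    }

    def classify_gesture(reading, threshold=0.35):
        """Return the closest gesture label, or None if nothing matches well."""
        label, template = min(GESTURE_TEMPLATES.items(),
                              key=lambda item: dist(item[1], reading))
        return label if dist(template, reading) <= threshold else None

    def glove_to_phone(reading):
        """Simulate one hop of the glove -> ZigBee -> Android pipeline: the phone
        would pass the returned text to its text-to-speech engine."""
        label = classify_gesture(reading)
        return label if label else "unrecognised gesture"

    if __name__ == "__main__":
        sample = (0.85, 0.25, 0.15, 0.2, 0.75)   # one simulated sensor frame
        print("Text sent for speech synthesis:", glove_to_phone(sample))

Running the sketch prints "thank you" for the simulated frame; in the system described above, that string would be spoken aloud by the Android text-to-speech service, while the reverse direction (speech recognised on the phone) would travel back over ZigBee to the glove's display.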


