CHAPTER-1

 

INTRODUCTION

1.1 Introduction:

Nowadays, robots are increasingly being integrated into working tasks to replace human beings, especially for repetitive tasks. These robots are currently used in many fields of application, including offices, military tasks, hospital operations, hazardous environments and agriculture. Some tasks are also difficult or dangerous for humans, such as picking up explosive chemicals, defusing bombs or, in the worst case, picking up a bomb and placing it somewhere for containment. For repeated pick-and-place actions in industry, robots can likewise be used to sort objects by color and, in libraries, to place books on the correct shelves. A robot can therefore replace a human in such work.

 

1.1.2 Sign Language Robotic Manipulator:

To build a sign language robotic manipulator, we propose a robotic hand in which a servo motor is attached to each finger so that the fingers and hand can be moved in a particular way. The movement is coordinated by a microcontroller. We will also use a Bluetooth module to transfer the voice commands to the controller that actuates the robotic hand. The basic operation of the robotic hand is to form the sign of the word that a person speaks into the microphone. It is intended primarily for deaf people, who cannot hear but can understand sign language.
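
As a first illustration of this idea, the short Arduino sketch below moves a single finger servo between a straight and a curled position. The pin number and the two angles are assumed values for illustration only; the final hand will use one such servo per finger.

#include <Servo.h>

// Minimal single-finger test (assumed wiring: servo signal on pin 9).
// The angles 0 (straight) and 90 (curled) are placeholders for illustration.
Servo indexFinger;              // the proposed hand uses one servo per finger

const int SERVO_PIN = 9;        // assumed PWM-capable pin
const int STRAIGHT  = 0;        // finger fully extended
const int CURLED    = 90;       // finger bent

void setup() {
  indexFinger.attach(SERVO_PIN);
}

void loop() {
  indexFinger.write(CURLED);    // bend the finger
  delay(1000);
  indexFinger.write(STRAIGHT);  // straighten it again
  delay(1000);
}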

1.1.3 Industrial Perspective:

At the industrial level, gesture communication or sign language is often preferred. Our sign language manipulator helps in communicating with deaf people and those who are hard of hearing. The sign language robotic manipulator can also be used for teaching purposes at special schools, colleges and universities by a teacher who does not know sign language; such a teacher can communicate with deaf students through the robotic manipulator. The manipulator can likewise be used in noisy areas of an industrial plant, where communication becomes easy simply by speaking into the Bluetooth microphone.

1.2 Literature Survey:

The term HSI (hearing/speech impaired), or deaf, refers to a person who is unable to hear voices and sounds. Sign language (SL) is one of the most effective means used by HSI people to communicate with others, but unfortunately the number of people, including HSI people themselves, who are familiar with sign language is very limited. This is a major hindrance to communication. Since every region has its own language, there is no universal sign language. For example, a system for recognizing and interpreting Indian Sign Language gestures for human-robot interaction was developed for communication between humans and robots, but it was limited to Indian Sign Language [1].

A Tamil alphabet sign language translator was also built to ease communication with people from other areas who were not familiar with Tamil, making it an impressive use of a sign translator [2]. A real-time Indian Sign Language recognition system to aid deaf and mute people was developed by a group of Indian engineers working on real-time communication through Indian Sign Language [3]. A narrow-band video communication system for transmitting sign language over ordinary telephone lines was built to make communication easier and more understandable [4]. A gesture recognition project was also carried out using flex sensors, in which a person can train a robot by making different gestures [5]. Further related work addresses recognizing complex, parameterized gestures from monocular image sequences [6] and the broader field of human-robot interaction [7].


 

 

1.2.1 Starting Point of Technology:

 

Figure 1.1: Sign language gestures

American Sign Language is, as the name suggests, native to the United States. It has its linguistic origins in Langue des Signes Française (French Sign Language). It is a manual language with its own linguistic complexities and rules. The hands, body, and facial expressions are used to communicate without using sound. American Sign Language is considered a creolization of Langue des Signes Française, not a dialect (Padden et al., 2010). Before 1817 and the establishment of the first school for the deaf, there were only a few vague records of limited Deaf communities in the United States (Groce, 1985). Since the founding of the first school for the deaf, such schools have become epicenters for socialization and the development of Deaf culture. Because deafness affects people of any race, class, or orientation, it is found everywhere, creating a diverse community.

1.2.2 Current Status of Technology:

Sign language is recognized as the main means of communication between deaf people and others. There are government measures encouraging the media and other forms of public information to make their services accessible to persons with disabilities. Communication aids are available for people with sensory disabilities, and all supplementary equipment needed by deaf people for their vocational and private lives, as well as technical equipment, can be subsidized. Where a deaf person needs a sign language interpreter for essential business or a medical investigation, different sign language robots are now being developed to support such conversations.

 

1.3 Problem Statement:

To build a sign language translator with a robotic hand, we will construct a robotic hand in which each finger is connected to a servo motor so that the fingers and hand can be moved in a particular way, all under the control of a microcontroller. We will also use a VR shield, a voice recognition shield for Arduino. The basic operation of the robotic hand is to form the sign of the word that a person speaks into a microphone. It is intended primarily for deaf people, who cannot hear but can understand sign language.
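
The particular voice recognition shield has not been fixed here; as one hedged possibility, the sketch below assumes the widely available Elechouse Voice Recognition Module V3 together with its VoiceRecognitionV3 Arduino library, with the spoken word "A" already trained as record 0. The pin numbers, record index and finger gesture are assumptions for illustration, not final design values.

#include <Servo.h>
#include <SoftwareSerial.h>
#include "VoiceRecognitionV3.h"

// Hedged sketch: spoken command -> finger movement.
// Assumes an Elechouse Voice Recognition Module V3 on pins 2 (RX) and 3 (TX)
// of an Arduino Uno, and the word "A" trained into record 0 beforehand.
VR voiceModule(2, 3);           // software-serial pins to the module (assumed)
Servo indexFinger;

uint8_t buf[64];                // recognition result buffer

void setup() {
  Serial.begin(115200);
  voiceModule.begin(9600);
  indexFinger.attach(9);        // assumed servo pin

  voiceModule.clear();          // unload any previously loaded records
  voiceModule.load((uint8_t)0); // make record 0 ("A") active in the recognizer
}

void loop() {
  int ret = voiceModule.recognize(buf, 50);   // poll with a 50 ms timeout
  if (ret > 0 && buf[1] == 0) {               // buf[1] holds the matched record
    indexFinger.write(90);      // placeholder gesture when "A" is heard
    delay(1000);
    indexFinger.write(0);
  }
}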

1.4 Methodology of Project:

We are making a robotic hand for deaf people so that a hearing person can easily communicate with them. The hand has a servo motor for each finger, and its movement is controlled by an Arduino. Hand movement is triggered by sending a voice signal through the VR shield or the Bluetooth module. By adjusting the angle of each servo motor, we can control the movement of each finger of the robotic hand to form a particular sign of American Sign Language. For example, if a person wants to say "A" to a deaf person, he speaks the word "A" into the microphone of the VR shield or Bluetooth module, and the fingers of the robotic hand physically form the sign for "A".
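
A hedged sketch of this flow is given below. It assumes an Arduino Mega with the Bluetooth module (for example an HC-05) wired to the hardware port Serial1, one servo per finger on pins 2 to 6, and the phone sending the recognized letter as plain text. The finger angles used for "A" are placeholders, not calibrated ASL values.

#include <Servo.h>

// Hedged sketch: receive a letter over Bluetooth and pose five finger servos.
// Assumptions: Arduino Mega, Bluetooth module on Serial1 (pins 18/19),
// servos on pins 2-6, and illustrative (uncalibrated) angles for the letter "A".
const int NUM_FINGERS = 5;
const int SERVO_PINS[NUM_FINGERS] = {2, 3, 4, 5, 6};    // thumb..little finger (assumed)
Servo fingers[NUM_FINGERS];

// Placeholder pose for "A": four fingers curled, thumb extended.
const int POSE_A[NUM_FINGERS]    = {0, 150, 150, 150, 150};
const int POSE_REST[NUM_FINGERS] = {0, 0, 0, 0, 0};

void applyPose(const int pose[NUM_FINGERS]) {
  for (int i = 0; i < NUM_FINGERS; i++) {
    fingers[i].write(pose[i]);     // each servo angle bends one finger
  }
}

void setup() {
  Serial1.begin(9600);             // common default baud rate for an HC-05 (assumed)
  for (int i = 0; i < NUM_FINGERS; i++) {
    fingers[i].attach(SERVO_PINS[i]);
  }
  applyPose(POSE_REST);
}

void loop() {
  if (Serial1.available()) {
    char c = Serial1.read();       // letter sent by the Android Bluetooth app
    if (c == 'A' || c == 'a') {
      applyPose(POSE_A);           // form the sign for "A"
      delay(2000);                 // hold the sign briefly
      applyPose(POSE_REST);
    }
  }
}

In the full system the single "A" case would be replaced by a lookup table mapping every supported word to its finger-angle pose.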

1.5 Description of the Block Diagram:

Figure 1.2: Block Diagram

We use an Android Bluetooth serial application to send data from the transmitting Bluetooth module to the receiving Bluetooth module. The receiving module is connected to an Arduino Mega microcontroller, which drives the servo motors through a 16-channel servo driver IC. Each servo motor is attached to one finger of the robotic hand and is driven to specific angles by the microcontroller, so the fingers move according to the word spoken by the hearing person into the Bluetooth microphone.
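
The block diagram does not name the 16-channel driver; a common choice is a PCA9685-based board, so the sketch below assumes that board with Adafruit's PWMServoDriver library on the Mega's I2C pins. The channel numbers and pulse limits are assumptions and would need to be tuned to the actual servos.

#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

// Hedged sketch: drive the finger servos through a 16-channel PWM driver.
// Assumes a PCA9685-based board at the default I2C address 0x40, the five
// finger servos on channels 0-4, and typical hobby-servo pulse limits.
Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();

const int SERVO_MIN = 150;   // PWM count near 0 degrees (board/servo dependent)
const int SERVO_MAX = 600;   // PWM count near 180 degrees (board/servo dependent)

// Convert a 0-180 degree finger angle into a PCA9685 PWM count.
int angleToPulse(int angle) {
  return map(angle, 0, 180, SERVO_MIN, SERVO_MAX);
}

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);        // standard 50 Hz servo refresh rate
}

void loop() {
  // Sweep all five finger channels between curled and straight as a test.
  for (int channel = 0; channel < 5; channel++) {
    pwm.setPWM(channel, 0, angleToPulse(90));   // curled (placeholder angle)
  }
  delay(1000);
  for (int channel = 0; channel < 5; channel++) {
    pwm.setPWM(channel, 0, angleToPulse(0));    // straight
  }
  delay(1000);
}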

1.6 American Sign Language Symbols

Figure 1.3: American Sign Language Symbols
References:

 

[1] Nandy, S. Mondal, J. S. Prasad, P. Chakraborty and G. C. Nandi, "Recognizing & interpreting Indian Sign Language gesture for Human Robot Interaction", IEEE International Conference on Computer and Communication Technology (ICCCT), pp. 712-717, 2010.

[2] P. Jayanthi and K. K. Thyagharajan, "Tamil alphabets sign language translator", 2013 Fifth International Conference on Advanced Computing (ICoAC), IEEE, 2013.

[3] P. Subha Rajam and G. Balakrishnan, "Real time Indian sign language recognition system to aid deaf-dumb people", IEEE International Conference on Computing Communication and Networking Technologies (ICCCNT), pp. 1-9, Trichy, India, 2010.

[4] J. F. Abramatic, P. Letellier and M. Nadler, "A Narrow-Band Video Communication System for the Transmission of Sign Language Over Ordinary Telephone Lines", in Image Sequences Processing and Dynamic Scene Analysis, T. S. Huang, ed., pp. 314-336, Berlin and Heidelberg: Springer-Verlag, 20.

[5] S. Mitra, "Gesture recognition", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 37, no. 3, pp. 311-324, May 2007.

[6] T. Axenbeck, M. Bennewitz, S. Behnke and W. Burgard, "Recognizing complex, parameterized gestures from monocular image sequences", IEEE-RAS International Conference on Humanoid Robots (Humanoids'08), Daejeon, South Korea, December 2008.

[7] J. A. Adams and M. Skubic, "Introduction to the Special Issue on Human–Robot Interaction", IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 35, no. 4, pp. 433-437, July 2005.

 

 
