
State of the art

RFIWS: A Radio Frequency Identification Walking Stick (RFIWS) was designed in [9] to help blind people navigate on the sidewalk. The system detects and estimates the distance between the sidewalk border and the blind person. Radio Frequency Identification (RFID) transfers and receives information over a radio-wave medium; tags, readers, and middleware are its main components. A number of RFID tags are placed along the middle of the sidewalk, an equal, fixed distance apart. An RFID reader attached to the stick detects and processes the received signals, and sounds and vibrations notify the user of the distance between the sidewalk border and himself/herself; the sound grows louder as the user gets closer to the border. Figure I.6 shows the distance of frequency detection (Y) and the width of the sidewalk (X). Each tag needs to be tested separately because detection ranges differ.
Figure I.6 Distance of the frequency detection on sidewalk.
RFID technology offers reliable reading between tags and readers, which makes the device dependable at the detection level. However, each tag requires a specific range, demanding extensive individual testing, which limits the system's scope. The system can also easily stop working if the tags are wrapped or covered, since this prevents them from receiving the radio waves.

Fusion of Artificial Vision and GPS (FAV&GPS): An assistive device for blind people was introduced in [10] to improve mapping of the user's location and the positioning of surrounding objects using two functions: one based on a map-matching approach and one based on artificial vision. The first function helps locate the required object and allows the user to give instructions by moving her/his head toward the target; the second provides automatic detection of visual targets. As shown in Figure I.7, this is a wearable device mounted on the user's head, consisting of two Bumblebee stereo cameras for video input installed on a helmet, a GPS receiver, headphones, a microphone, and an Xsens MTi tracking device for motion sensing. The system processes the video stream using the SpikNet recognition algorithm to locate visual features in the 320 × 240 pixel images.

Assistive technology

Assistive technology represents all the systems, services, devices and appliances used by disabled people to help in their daily lives, make their activities easier, and provide safe mobility [1]. Assistive technology was introduced in the 1960s to solve daily problems related to information transmission (such as personal care) and to provide navigation and orientation aids related to mobility assistance [2]. As shown in Figure II.1, visual assistive technology is divided into three categories: vision enhancement, vision substitution, and vision replacement [3]. This assistive technology became available to blind people through electronic devices that detect and localize objects, giving users a sense of the external environment through sensor functions. The sensors also aid the user in the mobility task by determining the dimensions, range and height of objects [4].
Figure II.1 Classification of electronic devices for visually-impaired people.
The vision replacement category is more complex than the other two, as it deals with both medical and technological issues: it involves displaying information directly to the visual cortex of the brain or through an ocular nerve. Vision enhancement and vision substitution, by contrast, are similar in concept; the difference is that in vision enhancement the camera input is processed and the results are displayed visually.


Vision substitution is similar to vision enhancement, but the result is a non-visual display, which can be vibration, auditory feedback or both, based on the hearing and touch senses that can easily be controlled and felt by the blind user.

Description of the device

Our device belongs to the vision substitution category. It consists of a smart cane capable of detecting obstacles using ultrasonic sensors and giving the user different live feedback about the surrounding environment via vibration motors or voice alerts, which proves very useful for guiding the user outside without the need for input from another human. The cane communicates with the user directly through voice commands, and it can also exchange information with the user's smartphone via Bluetooth. The device can measure the user's heart rate and examine the results, and it has on-board LEDs that are switched on and off automatically by a light sensor; the LEDs make the user visible to drivers when crossing the road at night. At the heart of this system, an Arduino board controls everything stated above, as shown in Figure II.2, and it is powered by a standard 5 V battery.

How it works

When the device is turned on, the four ultrasonic sensors start working simultaneously to detect obstacles and measure the distance between the smart cane (the user) and each obstacle. If an obstacle is detected within 70–200 cm, the vibration motor of the corresponding sensor starts vibrating slowly; if the distance between the obstacle and the sensor drops below 70 cm, the vibration becomes stronger and a voice alert giving the direction of the obstacle (right, left, front, or low) is played through the speaker. These voice alerts are pre-recorded in MP3 format and played by the DFPlayer MP3 module; there are 8 possible alerts in total, providing more precise feedback so the user can act accordingly. At the top of the cane there is an integrated button the user can press to activate voice commands: when it is pressed, the system starts listening to the user's voice and takes commands. For example, on the "heart rate" command the system gives instructions to the user and starts measuring his heart rate, and the user can also ask the cane to communicate with and send commands to the smartphone, such as "request a Rekba" (a local taxi service in Algeria) or "call emergency contact".

Table des matières

Abstract
Acknowledgments
Introduction
Chapter I Visual impairment (Blindness)
I.1 Introduction
I.2 Visual impairment
I.3 Causes of blindness
I.4 History
I.5 State of the art
I.6 Conclusion
References
Chapter II Conception and realization of the device
II.1 Introduction: assistive technology
II.2 Description of the device
II.3 Arduino board
II.3.1 Characteristics
II.3.2 Arduino IDE
II.4 Ultrasound
II.4.1 Ultrasonic sensors
II.4.2 HC-SR04 module
II.4.2.1 Characteristics
II.5 Vibration motors
II.5.1 Characteristics
II.6 DFPlayer Mini MP3 module
II.6.1 Characteristics
II.6.2 Pin map
II.7 Speech recognition
II.7.1 Elechouse speech recognition module V3
II.7.2 Characteristics
II.8 Heart rate
II.8.1 SEN-11574 Pulse sensor
II.9 Bluetooth
II.9.1 BLE 4.0: Bluetooth low energy
II.9.2 HM-10 Bluetooth module
II.9.2.1 Characteristics
II.10 LM393 light level sensor
II.11 Final Product
II.12 Conclusion
References
Chapter III Creation of the android application
III.1 Introduction
III.2 Android Studio
III.2.1 Advantages
III.3 Our application
Conclusion
Perspectives
Appendix
