Abstract: The system developed is a device that helps visually impaired people lead an independent life. The system is integrated with voice commands: the input is a real-time camera feed of the surrounding scene, and the output is the spoken names of the recognized objects. The main goal of the project is obstacle avoidance, so that visually impaired users can move through their environment safely. The algorithm used is YOLO (You Only Look Once).

The YOLO algorithm uses a single neural network to simultaneously predict bounding boxes and class probabilities for multiple objects within an image. The proposed system employs a YOLO model trained on large datasets to recognize a wide range of everyday objects, including household items, food products, navigation aids, and more. Upon detecting an object in the camera feed, the system provides audio feedback through speech synthesis, conveying information about the recognized object to the user in a clear and concise manner.
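As a rough illustration of this detect-and-announce loop, the sketch below combines a camera feed, a single-pass YOLO model, and text-to-speech output. The specific libraries (Ultralytics YOLO, OpenCV, pyttsx3) and the pretrained weights file are assumptions made for illustration only; the paper does not prescribe a particular implementation.

    # Minimal sketch of the detect-and-speak loop described above.
    # Assumed libraries (not specified in the paper): Ultralytics YOLO,
    # OpenCV for camera capture, and pyttsx3 for offline speech synthesis.
    import cv2
    import pyttsx3
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")       # pretrained YOLO weights (assumed)
    engine = pyttsx3.init()          # text-to-speech engine
    cap = cv2.VideoCapture(0)        # default camera as the real-time input

    announced = set()                # avoid repeating the same object every frame
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Single forward pass: YOLO predicts boxes and class probabilities together.
        result = model(frame, verbose=False)[0]
        labels = {model.names[int(c)] for c in result.boxes.cls}
        for label in labels - announced:
            engine.say(f"{label} ahead")   # convey the recognized object by voice
            engine.runAndWait()
        announced = labels

    cap.release()

In practice the announcement step would be throttled and prioritized (for example, announcing the nearest obstacle first), but the structure of the pipeline is the same.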

User feedback and iterative improvements are fundamental to the system's development, ensuring its effectiveness and usability for individuals with diverse visual impairments. Moreover, the system's design prioritizes simplicity, portability, and affordability, aiming to make it accessible to a broader population of visually impaired individuals. The object recognition system represents a vital step toward empowering the visually impaired community by providing a reliable and intuitive tool to better navigate and interact with their environment.

Keywords: YOLO, Object recognition, Voice commands, Real-time

Cite:
Lasya Babu K M, Madhu S Salimath, Niriksha D, Tejaswini L, Roopa K Murthy, "A Survey on Guidance System for People with Limited Vision or No Vision", IARJSET International Advanced Research Journal in Science, Engineering and Technology, vol. 10, no. 12, pp. 162-169, 2023, DOI: https://doi.org/10.17148/IARJSET.2023.101221.

