Abstract: Independent navigation remains a critical challenge for the visually impaired, with existing assistive technologies often limited by computational constraints and poor real-time performance. This paper proposes an IoT-enabled smart assistive system that integrates lightweight YOLO-based deep learning models for autonomous obstacle detection, path planning, and environmental interaction. Deployed on edge devices (Raspberry Pi 5, NVIDIA Jetson Nano), the system employs optimized YOLOv8n/YOLOv5s variants with INT8 quantization, achieving 92% mAP@0.5 on custom visually impaired datasets while maintaining >35 FPS inference at …
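The abstract highlights INT8 quantization as the key optimization that lets the YOLO variants run in real time on edge hardware. As a minimal illustrative sketch (not the authors' pipeline, and independent of any specific framework), symmetric per-tensor INT8 quantization maps a float weight range onto the integers [-127, 127] via a single scale factor:

```python
def quantize_int8(values):
    # Symmetric per-tensor INT8 quantization: map the float range
    # [-m, m] (m = max absolute value) onto the integer range [-127, 127].
    m = max(abs(v) for v in values)
    scale = m / 127.0 if m > 0 else 1.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize_int8(q, scale):
    # Recover approximate float values from INT8 codes.
    return [v * scale for v in q]

# Hypothetical weight tensor for illustration only.
weights = [0.5, -1.2, 0.03, 0.99]
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)
# Round-trip error per element is bounded by scale / 2.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

In practice, toolchains such as those targeting Raspberry Pi or Jetson-class devices apply this per layer (often per channel) with calibration data, trading a small accuracy loss for roughly 4x smaller weights and faster integer arithmetic.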
DOI: 10.17148/IARJSET.2026.13220
[1] Aishwarya C, K R Sumana, "An IoT-Enabled Smart Assistive System for Autonomous Navigation and Environmental Interaction for the Visually Impaired," International Advanced Research Journal in Science, Engineering and Technology (IARJSET), DOI: 10.17148/IARJSET.2026.13220