Acme AI Ltd.

Developing ‘DrishT’, a navigation system for the visually impaired that combines computer vision, audio, and haptic feedback.
We are looking to solve the problem of navigation for people with complete blindness and low vision – initially in Bangladesh, then scaling beyond. The current selection of navigation technologies is limited to legacy systems, and the sector has seen little to no affordable, disability-centered innovation for decades.
We will develop ‘DrishT’, a product that combines computer vision, audio, and haptic motors to help the visually impaired navigate the world. The system comprises two hardware components: (a) a Raspberry Pi 4B-powered chest attachment that observes the surroundings using artificial intelligence and speaks to the user, and (b) two haptic wrist devices that use observations from the chest unit to guide the user effectively. The chest attachment would carry a camera, GPS, a GSM module, Bluetooth, a speaker unit, and LiDAR sensors; both the attachment and the wrist devices would run on rechargeable lithium-ion batteries. The machine learning model will be trained for environment detection, text recognition, and depth estimation using the camera and LiDAR sensor. It will then communicate with a large language model embedded in the system to produce audio responses to events and send the signals that drive the haptic engines on the wrists.
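To illustrate the detection-to-haptics pipeline described above, here is a minimal sketch of the logic that could map obstacle detections (with fused camera/LiDAR depth estimates) to vibration intensities for the two wrist units. All names, fields, and thresholds here are illustrative assumptions, not the actual DrishT implementation.

```python
# Hypothetical sketch: convert vision-model detections into left/right
# haptic intensities. Assumes bearing_deg < 0 means the obstacle is to
# the user's left, and depth_m is the estimated distance in metres.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # e.g. "person", "pole" (from the vision model)
    bearing_deg: float  # horizontal angle from camera centre, -90..90
    depth_m: float      # fused camera/LiDAR depth estimate


def haptic_command(detections, alert_radius_m=3.0):
    """Return (left, right) vibration intensities in [0, 1].

    Obstacles to the user's left drive the left wrist and vice versa;
    closer obstacles vibrate harder. The strongest alert on each side wins.
    """
    left = right = 0.0
    for d in detections:
        if d.depth_m >= alert_radius_m:
            continue  # too far away to warn about
        intensity = 1.0 - d.depth_m / alert_radius_m
        if d.bearing_deg < 0:
            left = max(left, intensity)
        else:
            right = max(right, intensity)
    return round(left, 2), round(right, 2)


# Example: a pole 1.5 m away, slightly to the right of centre
print(haptic_command([Detection("pole", 12.0, 1.5)]))  # → (0.0, 0.5)
```

In a real device this function would sit between the on-board inference loop and the Bluetooth link to the wrist units, running once per camera frame.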