Blind Navigation

Maps depth (in real space) to haptic feedback, allowing blind individuals to navigate - in a sense, granting a kind of depth-based "sight" to the blind.

Tested on a vest with 32 vibration motors arranged in a grid. Depth is obtained by fusing readings from a Microsoft Kinect and ultrasonic sensors; each motor receives one segment of the depth image and vibrates more strongly the closer the wearer gets to an obstacle.
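The depth-to-motor mapping described above could be sketched roughly as follows. This is an illustrative assumption, not the project's actual code: the 4x8 grid layout, depth range, and the linear inverse mapping from nearest obstacle to intensity are all hypothetical choices.

```python
import numpy as np

GRID_ROWS, GRID_COLS = 4, 8   # 32 motors; the 4x8 layout is an assumption
MIN_DEPTH_MM = 500            # hypothetical usable depth range
MAX_DEPTH_MM = 4000

def depth_to_motor_intensities(depth_mm: np.ndarray) -> np.ndarray:
    """Map a depth image (in mm) to per-motor vibration intensities in [0, 1].

    Each motor covers one grid cell of the image; the nearest valid pixel
    in the cell sets the intensity, stronger for closer obstacles.
    """
    h, w = depth_mm.shape
    intensities = np.zeros((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            cell = depth_mm[r * h // GRID_ROWS:(r + 1) * h // GRID_ROWS,
                            c * w // GRID_COLS:(c + 1) * w // GRID_COLS]
            valid = cell[cell > 0]   # 0 means "no reading" in Kinect depth frames
            if valid.size == 0:
                continue             # no obstacle detected: motor stays off
            nearest = np.clip(valid.min(), MIN_DEPTH_MM, MAX_DEPTH_MM)
            # Linear inverse mapping: nearest obstacle -> strongest vibration.
            intensities[r, c] = (MAX_DEPTH_MM - nearest) / (MAX_DEPTH_MM - MIN_DEPTH_MM)
    return intensities
```

In a real pipeline the resulting 4x8 intensity array would then be scaled to PWM duty cycles and sent to the motor drivers on each frame.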

Over time, the user adapts to the haptic feedback and learns to interpret it, eventually accepting the input almost as intuitively as natural vision itself.

Blind Navigation Video

See it in action by clicking the image above, or any of the following links: 1, 2, 3

This branch is a refactor-in-progress. Please check out "master" for a less buggy version.