Neosensory’s second developers contest has concluded and the winners are in! Our judges were thrilled with the many inventive entries leveraging Buzz, ranging from smart snoring correctors to echolocation devices. In this ongoing series, we’ll be showcasing these submissions. To see all winners, click here.
For their entry in Neosensory’s “Expand Your Senses” contest, Hayden Jones, Ryan Peng, and Jay Desai took inspiration from nature. To be precise, they looked at bats – the winged mammals capable of finding their way in the darkness – and asked themselves, “What if we could give humans the ability to use echolocation?” And so they got to work.
Their invention, the echoSense glove, communicates ultrasonic sensor data to the user via Neosensory’s Buzz wristband, a haptic feedback band with four vibrating motors.

How does it work?
An ultrasonic sensor attached to the glove has two pins: a trigger pin and an echo pin. In their entry, the team explains that when the trigger pin is activated, the sensor emits a sound wave, which bounces off nearby objects and returns to the sensor. The returning wave triggers the echo pin, and data is sent to the connected microcontroller unit (MCU).
By taking the time between when the sound wave was emitted and when it returned, multiplying this difference by the speed of sound in air, and halving the result (the wave travels to the object and back), the MCU can effectively gauge the distance to an object, with high accuracy at up to 13 feet.
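As a rough illustration of the time-of-flight math, here is a minimal sketch in Python (the actual echoSense firmware almost certainly runs on an MCU in C/C++; the function name and timing values below are illustrative assumptions, not taken from the team's code):

```python
# Illustrative time-of-flight distance calculation, as described above.
# Assumption: the echo pin's pulse duration equals the sound wave's
# round-trip travel time.

SPEED_OF_SOUND_FT_PER_S = 1125.0  # approximate speed of sound in air at ~20 °C


def echo_duration_to_distance_ft(echo_duration_s: float) -> float:
    """Convert the echo pulse duration to one-way distance in feet.

    The wave travels to the object and back, so the one-way distance
    is half of (duration × speed of sound).
    """
    return (echo_duration_s * SPEED_OF_SOUND_FT_PER_S) / 2.0


# Example: an 8 ms round trip corresponds to an object about 4.5 ft away.
print(echo_duration_to_distance_ft(0.008))
```

On a real MCU the same arithmetic would be done on a pulse duration measured in microseconds, but the calculation is identical.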
This data is then sent to the Buzz wristband, which turns the information into vibrations. The closer an object is, the stronger the vibrations; as distance grows, they gradually weaken until they stop entirely at 4 feet from an object.
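The team's exact mapping isn't spelled out, but a simple linear mapping consistent with the described behavior (strongest up close, fading to zero at 4 feet) might look like the sketch below. The 0–255 intensity range is an assumption borrowed from common haptic motor APIs, not a detail from the entry:

```python
# Hypothetical distance-to-vibration mapping: strongest when the object
# is closest, fading linearly to zero at the 4 ft cutoff described above.
# The 0-255 intensity scale is an assumption, not from the echoSense code.

MAX_DISTANCE_FT = 4.0  # vibrations stop beyond this distance
MAX_INTENSITY = 255    # full motor strength


def distance_to_intensity(distance_ft: float) -> int:
    """Map object distance (feet) to a motor intensity (0-255)."""
    if distance_ft >= MAX_DISTANCE_FT:
        return 0
    scale = 1.0 - (distance_ft / MAX_DISTANCE_FT)
    return round(MAX_INTENSITY * scale)
```

A nonlinear (e.g. logarithmic) curve could make near-range changes easier to feel, which may be part of the sensitivity tuning the team mentions below.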
Check out this video to see echoSense in action and hear from the inventors:
To top it off, Jones, Peng, and Desai added a light-sensing mode that tells users how bright or dark it is. Changing modes is as easy as toggling a sliding switch.
Next steps to refine echolocation for humans
In the future, the three creators want to make the mapping system more sensitive as users get better at interpreting the vibrations from Buzz. They’re even thinking about a training app that enables users to learn what different distances from objects feel like.
While the echoSense is fully functional, the team says the aesthetics could be improved; it’s still a prototype, after all.
Lastly, the developers are thinking about integrating AI-based computer vision using Edge Impulse machine learning models. This could allow people with vision impairment to recognize whether it’s a person or another object in front of them.
To read more about the science behind echoSense, check out the team’s full entry here.
Got an idea for your own project? Check out Neosensory’s third contest here.