The winners are in for Neosensory’s second developer contest. Our judges were thrilled with the dozens of inventive entries that leveraged Buzz to create devices ranging from health monitors to echolocation aids. In this ongoing series, we’ll be showcasing the most exciting submissions. To see all winners, click here.
Meet the team:
“Hi! I’m Emanuel and I graduated with a Master of Mechanical Engineering (Upper Second Class) from the University of Sheffield in 2019. I now work as a software engineer on a team which commissions metal rolling mills around the globe. In my spare time, I enjoy learning about new technologies, such as tinyML algorithms and their applications, and improving my coding skills – specifically in Python.
My teammate Kevin graduated from Swansea University with a First Class degree in Computer Science in 2018. Since then he has worked as a full-stack software developer at the IBM Client Innovation Centre, which primarily focuses on creating bespoke solutions to clients’ needs. His preferred programming languages are Java, Python, and Node.js.”
Why did you decide to enter the contest?
“The earliest motivation I (Emanuel) had for this project was actually just to step outside of my comfort zone while trying to build something cool. When I was choosing what Masters project to sign up for, the description of one titled ‘Becoming Superhuman’ included Dr Eagleman’s TED talk. After watching the TED talk I knew I wanted to work with that project’s supervisor, Dr Manson, at the University of Sheffield. Neither of us had heard of sensory substitution before watching the talk, but we were both interested in learning more about it. Hearing Eagleman describe the human ‘umwelt’ made me want to create a device which would allow someone to experience a sense they didn’t already have.
I chose to focus my project on firefighters. Some firefighters already use infrared (IR) cameras in situations with poor visibility, dense smoke or dark environments to help find unconscious people. However, checking a traditional video display in these settings is extremely difficult, even before you factor in the additional problems of heat, stress and fatigue. By using haptics to turn this IR information into a tactile sense – leveraging sensory substitution – we hoped to help firefighters feel their way to casualties faster, potentially saving lives.
While I didn’t manage to finish the project during my studies, I was able to detect when an object within the human body-temperature range was in an image and then vibrate a single linear resonant actuator (LRA).
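The detection step described above can be sketched in a few lines: scan a thermal frame for pixels in the human body-temperature band and decide whether enough of them are present. This is a minimal illustration, not the team’s actual code – the temperature thresholds, frame shape, and pixel-count cutoff are all assumptions.

```python
# Hypothetical sketch of body-temperature detection in a thermal frame.
# Thresholds and minimum pixel count are illustrative assumptions.

HUMAN_TEMP_MIN_C = 30.0   # assumed lower bound for exposed skin
HUMAN_TEMP_MAX_C = 40.0   # assumed upper bound
MIN_WARM_PIXELS = 4       # assumed minimum patch size to count as a person

def contains_person(frame):
    """frame: 2D list of per-pixel temperatures in degrees Celsius."""
    warm = sum(
        1
        for row in frame
        for temp in row
        if HUMAN_TEMP_MIN_C <= temp <= HUMAN_TEMP_MAX_C
    )
    return warm >= MIN_WARM_PIXELS

# Example: a cool 8x8 scene with a small warm patch in it
frame = [[20.0] * 8 for _ in range(8)]
for r in range(3, 6):
    for c in range(2, 4):
        frame[r][c] = 36.5

print(contains_person(frame))  # True
```

A real pipeline would of course use a learned detector (as the team did with tinyML) rather than a fixed threshold, but the threshold version shows the core idea behind triggering the single LRA.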
When I learned about Neosensory’s “Expand your Senses” developers contest, I saw it as a great opportunity to continue with the project and team up with Kevin.”
“The Buzz wristband, a mandatory part of the contest, fit the firefighters’ needs well: the band’s four vibrating motors could send information to the wearer in the form of vibrational patterns. The fact that the contest was hosted in collaboration with Edge Impulse gave us the chance to learn more about tinyML.
To put it simply, we captured an IR image of a scene, detected whether there was a person in the image, then, through our simple haptic language, sent directional information back to the Buzz wristband.”
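The directional step of that pipeline can be sketched simply: map the detected person’s horizontal position in the IR frame onto one of the band’s four motors. This is an illustrative sketch under assumptions – the motor indexing, amplitude range, and frame format here are hypothetical, not Neosensory’s API or the team’s haptic language.

```python
# Hypothetical sketch: map a detected person's horizontal image position
# to one of four wristband motors. Indexing and amplitude are assumptions.

NUM_MOTORS = 4

def direction_to_motor(person_x, frame_width):
    """Map a column index in the IR image to a motor index 0..3."""
    if not 0 <= person_x < frame_width:
        raise ValueError("person_x outside frame")
    return min(person_x * NUM_MOTORS // frame_width, NUM_MOTORS - 1)

def motor_frame(motor, amplitude=255):
    """Build a 4-value vibration frame with a single active motor."""
    frame = [0] * NUM_MOTORS
    frame[motor] = amplitude
    return frame

# Example: person near the right edge of a 32-pixel-wide IR image
print(direction_to_motor(28, 32))  # 3 (rightmost motor)
print(motor_frame(3))              # [0, 0, 0, 255]
```

Splitting the image into four vertical bands, one per motor, is the simplest possible mapping; the team’s own 3-sweep language (mentioned below) encodes direction differently.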
What challenges did your team experience?
“We encountered some hiccups during the project. The tinyML algorithms we tested had varying success with our data – sometimes they were unable to recognise a person at all, and at other times they identified anything and everything as a person. We chalked this up to the limitations of our IR camera: a commercial IR camera used by firefighters has, on average, 1000x more pixels than the one we used. A camera with a higher resolution would make it easier for a tinyML algorithm to learn to identify people in the image.
Another challenge was collecting enough data to train the tinyML algorithm. Luckily, family members (including a small sausage dog) were willing to be captured in the IR spectrum and added to the database.”
What are the next steps for your device?
“In the future, we hope to work alongside a local firefighting department which uses IR cameras in its search and rescue operations so that we can build up a higher-resolution IR database. With this database, we will be able to further test the effectiveness of a tinyML algorithm in detecting people and, importantly, where exactly they are in the image. We also aim to develop a more sophisticated language (than our current 3-sweep system) which will better describe where someone is in the IR image and relay some idea of distance.
Furthermore, we hope to test the effects of using a larger haptic device which firefighters could wear under their protective gear. Perhaps a jacket, or even a full-body suit, would allow for higher information throughput and the potential for a more sophisticated haptic language, thanks to the larger number of actuators which can be stimulated in varying patterns.”
Emanuel presented during the Expand your Senses Demo Day event. You can watch the presentation here.