Buzzing emotions: how Buzz can be used to help neurodiverse people interpret emotion

Chloe and Shannon are finalists from our first Neosensory developer contest. We are delighted to host the following guest write-up from them about their extraordinary submission!

When I first saw a demo of Neosensory’s vest prototype back in 2018, I was captivated by the seemingly endless possibilities for the data that could be fed into a haptic device. As our experience of reality evolves with new technology, the line between human and computer has become increasingly blurred, creating incredible opportunities to augment and improve our consciousness in previously inconceivable ways.

The plan

After hearing about the innovation challenge to build new sensory substitution applications with Buzz, I reached out to my tech-whiz friend Shannon to partner up. While brainstorming, we were intrigued by Buzz’s ability to seamlessly translate sound into haptic feedback. We wanted to create a product that fits with Neosensory’s initial premise: reimagining the way sensation is delivered to accommodate people who experience it differently. After pondering the myriad possible applications of Buzz, we landed on autism spectrum disorder (ASD), because many autistic people find emotional perception difficult; the challenge lies in interpreting the modality in which emotion is usually expressed. By shifting that emotional input to a more objective signal, haptic vibration, we hope to help autistic people learn to better discern the emotions of the people they talk to.

The solution

We created Valence, an app that pairs with a person’s Buzz to help them interpret the emotional valence of speech through vibrations on their wrist. Using the same functionality as the classic Buzz technology, our app detects emotion from vocal pitch data picked up by your smartphone’s microphone. By receiving audio input and quickly converting the audio signal into frequency representations, our data pipeline and machine learning model can determine the emotional valence of a statement; the current set of classes is neutral, surprise, disgust, happiness, sadness, fear, and anger. Valence offers a non-invasive tool to assist autistic people in learning to interpret the emotions of the people they talk to, and an opportunity for anyone to augment their perception of emotion and strengthen their conversations and relationships.
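To make that flow concrete, here is a minimal sketch of such a pipeline in Python. Everything in it is illustrative rather than Valence’s actual code: librosa stands in for the audio processing, classify_emotion is a placeholder for the trained model, statement.wav is a hypothetical recording, and the four-motor vibration patterns are made-up values.

```python
# Illustrative sketch: audio clip -> frequency features -> emotion label
# -> a vibration frame for the Buzz's four motors. The model and the
# pattern values are placeholders, not Valence's real implementation.
import numpy as np
import librosa

EMOTIONS = ["neutral", "surprise", "disgust", "happy", "sad", "fear", "angry"]

def audio_to_features(path: str, sr: int = 16000) -> np.ndarray:
    """Load a clip and convert it to a log-mel spectrogram
    (a frequency-over-time representation of the speech)."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)
    return librosa.power_to_db(mel, ref=np.max)

def classify_emotion(features: np.ndarray) -> str:
    """Placeholder for the trained model: the real app would run a
    machine-learning classifier over the spectrogram features."""
    return EMOTIONS[int(features.mean()) % len(EMOTIONS)]  # stand-in only

# Each emotion maps to a distinct frame of intensities (0-255) for the
# four motors on the Buzz wristband -- these values are illustrative.
VIBRATION_PATTERNS = {
    "neutral":  [60, 60, 60, 60],
    "surprise": [255, 0, 255, 0],
    "disgust":  [0, 255, 0, 255],
    "happy":    [0, 120, 200, 255],
    "sad":      [255, 200, 120, 0],
    "fear":     [255, 0, 0, 255],
    "angry":    [255, 255, 255, 255],
}

if __name__ == "__main__":
    feats = audio_to_features("statement.wav")
    emotion = classify_emotion(feats)
    print(emotion, VIBRATION_PATTERNS[emotion])
```

In the running app, the selected frame would then be streamed to the Buzz over Bluetooth using Neosensory’s developer tools, so each emotion class feels like a distinct pattern on the wrist.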

As full-time undergraduate students at the University of Southern California, Shannon and I worked tirelessly over many late nights after class to get a viable product and demo up and running. I drew on my neuroscience background to understand the underlying causes and consequences of difficulty in distinguishing emotion, while Shannon drew on her computer science and machine learning background to find and process the data and then build our main model, which predicts the sentiment of a statement from vocal patterns.

The future

We plan to keep refining the model and adding training data to increase its accuracy. We also hope to add a natural language processing pipeline running concurrently, which would transcribe speech on the fly to accompany the audio-based emotional classification. That transcription can then be run through a sentiment classifier to add an extra layer of accuracy to the final output; a sketch of how the two pipelines might be combined follows below. By focusing on speech patterns rather than solely on the words of a statement, we aim to build a more accurate and useful model for neurodiverse people. We are also building an app GUI for both Android and iOS to improve accessibility and ease of use.
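One common way to combine two such pipelines is late fusion: each model outputs a probability per emotion, and the results are merged with a weighted average. The sketch below assumes that shape of output; the weights, example probabilities, and function name are hypothetical, not a description of Valence’s final design.

```python
# Hypothetical late-fusion step: merge per-emotion probabilities from
# the speech-pattern model and the transcript sentiment model.
import numpy as np

EMOTIONS = ["neutral", "surprise", "disgust", "happy", "sad", "fear", "angry"]

def fuse_predictions(audio_probs: np.ndarray,
                     text_probs: np.ndarray,
                     audio_weight: float = 0.7) -> str:
    """Weighted average of the two models' probabilities; the audio
    channel is weighted more heavily since prosody carries the signal."""
    combined = audio_weight * audio_probs + (1 - audio_weight) * text_probs
    return EMOTIONS[int(np.argmax(combined))]

# Example: the audio model hears anger while the words alone read as neutral.
audio_probs = np.array([0.05, 0.05, 0.05, 0.05, 0.10, 0.10, 0.60])
text_probs  = np.array([0.60, 0.05, 0.05, 0.10, 0.10, 0.05, 0.05])
print(fuse_predictions(audio_probs, text_probs))  # -> "angry"
```

The appeal of this design is that a sarcastic or flatly worded statement, where the words and the tone disagree, still resolves toward the vocal evidence while the transcript acts as a sanity check.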

We’d love to hear any questions or comments from those who may be interested in our application, or those who may benefit from an app such as this!

This project is currently recruiting beta testers. For more information, please visit their website: www.valencevibrations.com.

Interested in building your own amazing sensory-expanding applications? Dive in by checking out our developer site and developer posts on our blog. If you have any questions regarding this project or a project you’re working on, visit the Neosensory developer Slack. If you create your own project and would like to share it, email us at developers@neosensory.com – we’ll feature select projects on our site in the coming months!

Edited to include identity-first language, the clear community preference of the Autistic Self Advocacy Network (ASAN) and most other autistic-led organizations.