We recently launched our first ever developer contest. We were excited to see what projects the growing Neosensory developer community had in mind since the launch of our open API and SDKs. But given all of the craziness going on in the world, we weren’t sure what to expect.
Developer project entries
To our pleasant surprise, we ended up with over 70 entries from all over the world. Before we get into the winners, here’s a quick breakdown of the proposed project areas:
As one can imagine, it was an exceedingly difficult task to narrow these 70 entries down to just 5 finalists for phase II of the contest. In fact, it was so difficult that we let another 2 entries slip in, for a total of 7 contestants in phase II.
Each phase II contestant received a free Neosensory Buzz wristband to turn their idea into reality. They had just a month to take their projects as far as they could, both in terms of technical implementation and gathering user data.
The Winner: Buzz-Arduino Air Quality Sensor by Chris Bartley
This was a close competition, but one competitor managed to rise above the rest by going the extra few miles on technical implementation: Chris Bartley with his Air Quality Sensing project.
Not only did Chris put together an air quality sensor network that can interface with Buzz either directly or via an iPhone app, he also produced an open-source Neosensory API Swift package and diagnostic app for our developer community.
For his submission, Chris leveraged volatile organic compound (VOC) sensors, which provide a measure for outdoor (*cough* *cough*) and especially indoor air quality. We were impressed by how immediately intuitive and well-calibrated the system is. For example, by placing a sensor next to a stove, one can immediately feel the VOC measurement spike as soon as one turns it on and starts cooking.
The vibration encoding is also effective, with little to no vibration when the air quality is at a “reasonable” level. A sweeping vibration across the Buzz’s 4 vibrational units indicates whether the VOCs are increasing or decreasing in real time.
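To illustrate the idea, here is a rough sketch of such an encoding in Python. This is our own reconstruction, not Chris’s actual code: the threshold and scaling values are made up, and the frames are just lists of 4 motor intensities like those the Buzz accepts.

```python
# Hypothetical VOC-to-vibration encoding (illustrative only, not Chris's code).
# Below a "reasonable" VOC threshold the motors stay quiet; above it, a frame
# sweeps across Buzz's 4 motors, left-to-right when the reading is rising and
# right-to-left when it is falling.

def voc_frames(current_ppb, previous_ppb, threshold_ppb=300, max_ppb=2000):
    """Return a list of 4-motor intensity frames (values 0-255)."""
    if current_ppb <= threshold_ppb:
        return [[0, 0, 0, 0]]  # air quality is fine: little to no vibration
    # Scale intensity with how far the reading sits above the threshold.
    level = min(1.0, (current_ppb - threshold_ppb) / (max_ppb - threshold_ppb))
    intensity = int(level * 255)
    # Sweep direction encodes the trend: rising VOCs sweep one way,
    # falling VOCs sweep the other.
    motor_order = range(4) if current_ppb >= previous_ppb else range(3, -1, -1)
    frames = []
    for m in motor_order:  # one motor at a time produces a sweeping sensation
        frame = [0, 0, 0, 0]
        frame[m] = intensity
        frames.append(frame)
    return frames
```

Playing these frames in sequence at a fixed interval would produce the kind of directional sweep described above.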
We appreciate how powerful and flexible Chris’s framework is: sensors can either be worn or placed in fixed locations, and other types of sensors can be readily substituted.
Our team members in California, breathing through the current West Coast wildfires, were particularly excited to test this project. Given all of the potential use cases, we can’t wait to see how this project continues to evolve and be utilized after the contest.
All of the finalists’ ideas were wonderful and ambitious. For now, we’re sharing a snapshot of their projects as a preview. Check back soon for project write-ups on our blog.
In last-name alphabetical order:
Sam Chin – Sensing CO2 as a measure of ventilation quality and COVID-19 risk
Sam produced a wearable CO2-sensing device that controls Buzz, where a stronger vibration is indicative of more CO2 in the environment. As COVID-19 is mainly airborne and people produce CO2 when they breathe, the measure of present CO2 can approximate how well ventilated an environment is, and therefore serve to approximate the risk of COVID-19 exposure. The concept is brilliant and you can read more about this preliminary work on her personal site.
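The core mapping here is simple enough to sketch. The snippet below is our own illustration of the concept, not Sam’s implementation; the baseline and ceiling values are assumptions (roughly 400 ppm is typical outdoor air).

```python
# Illustrative mapping from an indoor CO2 reading to a single overall
# vibration strength: more CO2 -> stronger vibration -> likely poorer
# ventilation. (Our sketch, not Sam's code.)

def co2_to_intensity(ppm, baseline_ppm=400, max_ppm=2000):
    """Map a CO2 reading in ppm to a Buzz motor intensity in 0-255.

    Readings at or below the outdoor baseline produce no vibration;
    readings at or above `max_ppm` are clamped to full strength.
    """
    fraction = (ppm - baseline_ppm) / (max_ppm - baseline_ppm)
    fraction = max(0.0, min(1.0, fraction))  # clamp to [0, 1]
    return round(fraction * 255)
```

A linear ramp like this keeps the percept easy to interpret: twice the excess CO2 feels roughly twice as strong.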
Chloe Duckworth, Shannon Brownlee, and Bhargava Ganti – Emotion recognition as an aid for people with autism spectrum disorder
Chloe’s team had the fantastic idea of using machine learning to train an emotion-recognition system on speech audio (a hot field right now) and map recognized emotions to Buzz as a communication aid for people with autism spectrum disorder.
Max Ladabaum – Sensing alternating current electric fields
In true Neosensory spirit, Max put together a wearable device for sensing the invisible electric fields around us emanating from alternating current (AC) sources. Interested in knowing if you have a grounding issue in your house? Well, we tested it and had a lot of fun waving our arms around TVs, wall outlets, and such, and were impressed by the diversity of patterns emanating from these devices.
Ian Pengra – Obstacle detection as a blind navigation aid
Ian had the bright idea of combining two ultrasonic sensors – one wide-angle, one narrow-angle – to detect objects in front of the wearer’s path. We love how he encoded the sensor readings to vibration using texture across the 4 motors, which created a natural sensation when passing the sensors over an object.
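One simple way to fuse two such sensors is sketched below. This is our reconstruction of the general idea, not Ian’s code: the nearer of the two distances sets the intensity, and when the narrow-angle sensor is the one seeing the obstacle, the pattern emphasizes the center motors to signal “directly ahead.” All ranges and the center-weighting scheme are assumptions.

```python
# Hypothetical fusion of a wide-angle and a narrow-angle ultrasonic reading
# into one 4-motor vibration frame (illustrative only, not Ian's encoding).

def obstacle_frame(wide_cm, narrow_cm, max_cm=200):
    """Return a single 4-motor frame (intensities 0-255)."""
    nearest = min(wide_cm, narrow_cm)
    if nearest >= max_cm:
        return [0, 0, 0, 0]  # nothing within sensing range
    intensity = int((1 - nearest / max_cm) * 255)  # closer -> stronger
    if narrow_cm <= wide_cm:
        # Obstacle in the narrow beam, i.e. dead ahead:
        # emphasize the two center motors.
        return [intensity // 2, intensity, intensity, intensity // 2]
    # Obstacle only in the wide field of view: spread evenly.
    return [intensity] * 4
```

Varying the pattern across the motors as the sensors pan over an object is what produces the texture-like sensation described above.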
Jennie Stenhouse – Heads-up navigation
Have you ever been annoyed by needing to keep your eyes glued to your phone’s maps app when walking to a new destination? No more: with Ramble, Jennie has devised several methods for getting haptic feedback to help enable varying degrees of exploration as you navigate to your destination.
Anonymous – Biofeedback
What can we learn by tapping into our body’s own signals? Perhaps with enough practice one could attain nirvana! The MakeSense project uses an Apple Watch’s heart rate sensor and an Arduino with connected galvanic skin response (GSR), sound, and heart-rate sensors, all of which are piped to an iPhone app that controls Buzz. A true sensory symphony.
So what’s next? Another competition, that’s what!
We have been overwhelmed by all of the fantastic submissions and are now planning to launch a follow-up competition later this fall. We hope you’ll join us!
Sign up for our Developer Newsletter and check back on our blog soon for details. We’ll also be sharing more blog posts on each contestant’s project.
If you’d like to get involved with our growing developer community, join the Neosensory developer Slack. If you create your own project and would like to share it, email us at firstname.lastname@example.org. We’ll feature select projects on our site in the future.