The best way to predict the future
is to make it
What is sensory augmentation?
Your brain perceives the world by learning to decode data from your existing sensory pathways. This starts with the sensory receptors in your peripheral nervous system: the various touch receptors in your skin, the sensory hair cells in your inner ear, and many other kinds.
Nothing stops your brain from adapting to new data streams; you can extend existing senses or add entirely new ones.
This extraordinary ability of the brain is known as sensory augmentation and addition.
Buzz has four motors, each of which can be independently driven at one of 256 vibration intensities (0–255). A single frame therefore has 256^4 — about 4.3 billion — possible states. In the time domain, a typical frame of data lasts a few tens of milliseconds, and the stream can carry rich information as it changes rapidly. Buzz is therefore optimized to send high-dimensional, fast-changing data.
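As a concrete sketch of what a frame looks like, the snippet below packs four motor intensities into a single frame and base64-encodes it, which is the payload shape the API's `motors vibrate` command expects. The command name is taken from Neosensory's public API documentation; verify it against the current docs before relying on it.

```python
import base64

def encode_frame(intensities):
    """Pack four motor intensities (0-255 each) into the base64
    payload used by Buzz's `motors vibrate` command (command name
    per Neosensory's public API docs; confirm against current docs)."""
    if len(intensities) != 4:
        raise ValueError("Buzz has exactly four motors")
    if any(not 0 <= v <= 255 for v in intensities):
        raise ValueError("intensities must be in the range 0-255")
    return base64.b64encode(bytes(intensities)).decode("ascii")

# Motor 1 off, motor 2 at full power, motors 3 and 4 partially on.
print(encode_frame([0, 255, 128, 64]))  # AP+AQA==
```

A stream of such frames, sent every few tens of milliseconds, is what turns the four motors into a high-bandwidth channel.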
Buzz could easily deliver notifications, but anything that is simply an alert can be handled by an Apple Watch, a popup on your phone, or a notification on your computer screen.
So using Buzz to tell you when someone is at your door would underutilize it. Its power comes through in projects like feeling infrared light (expanding your senses to include light invisible to the human eye), the ongoing state of a system of machines (such as understanding how a factory is running), your spatial relationship to the things around you (such as feeling where objects are in VR), or the moment-by-moment emotional valence of a Twitter hashtag (you can train your brain to feel the sentiment change in real time).
What is a good application for sensory augmentation or addition?
When developers start brainstorming new data streams, the list grows impressively long. So when you think about applications for Buzz, think broadly and creatively.
Be sure to take advantage of the full capabilities of Buzz, not just simple alerts. And be sure that there is some way the brain can learn the meaning of the data by correlation. Then you’ll be on your way to expanding the human experience.
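One way to give the brain something it can learn by correlation is a consistent, smooth mapping from your data stream to the motors. The sketch below (an illustrative encoding, not Neosensory's own) renders a scalar — say, a normalized sensor reading — as a "bump" of vibration shared between the two nearest motors, so small changes in the data produce small, consistent changes in sensation.

```python
def scalar_to_motors(x, n_motors=4):
    """Map a scalar in [0, 1] to a spatial vibration pattern across
    the motors. The value's position along the wristband is shared
    between the two nearest motors, so the mapping is smooth and
    learnable by correlation. Illustrative encoding only."""
    x = min(max(x, 0.0), 1.0)
    pos = x * (n_motors - 1)      # fractional motor index
    lo = int(pos)                 # nearest motor below
    hi = min(lo + 1, n_motors - 1)
    frac = pos - lo               # how far toward the next motor
    frame = [0] * n_motors
    frame[lo] = round(255 * (1 - frac))
    frame[hi] = max(frame[hi], round(255 * frac))
    return frame

print(scalar_to_motors(0.0))  # [255, 0, 0, 0]
print(scalar_to_motors(0.5))  # [0, 128, 128, 0]
print(scalar_to_motors(1.0))  # [0, 0, 0, 255]
```

Feeding frames like these at a steady rate, driven by live data, is exactly the kind of rich, correlatable stream the brain can learn to interpret.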
The Neosensory x Neurable book club is currently reading
Just Enough Research by Erika Hall
Sign up here
SDKs and API
Build apps that port real-time data to Buzz and other Neosensory products with our API. Create your own custom vibrations and empower Buzz users to expand their sensory experience with your own app or hardware device.
Read through our API documentation as well as our developer license.
Check out some of the amazing projects our community has built on hackster.io
This SDK allows you to develop your own projects for Neosensory Buzz by sending data to the wristband from a compatible Arduino board via Bluetooth.
Our SDK is built on top of Adafruit’s Bluefruit nRF52 library, which means you can make your Buzz vibrate from any microcontroller that works with that library (see Adafruit’s list of nRF52 boards here).
Arduino SDK documentation
Arduino SDK GitHub
Arduino SDK walkthrough
This Java-based SDK streamlines the process of developing your own projects for Neosensory Buzz by sending data to the wristband from a compatible Android device via Bluetooth. The SDK is built on top of Martijn van Welie’s BLESSED Bluetooth library for Android.
Android SDK documentation
Android SDK GitHub
Android SDK walkthrough
Our first contest winner, Chris Bartley, not only developed an amazing air-quality-sensing project; he also open-sourced his Swift package, so you too can create iOS applications that leverage the power of Buzz.
BuzzBLE Swift Package GitHub
Buzz explorer diagnostic app GitHub
If you’re using any other programmable device that supports Bluetooth Low-Energy, you can still control Neosensory Buzz with a little extra legwork by relying on our API documentation, which contains all of the information you need to facilitate a connection and control Buzz.
Most platforms have preexisting Bluetooth libraries you can work with to make the process even easier. We always welcome community-developed SDKs: if you have made an SDK for a platform or language, let us know.
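As a sketch of that "extra legwork," here is what a minimal connect-and-vibrate flow might look like in Python using the bleak BLE library. The characteristic UUID is a placeholder and the auth/command strings follow our recollection of the public API docs — look up the real values in the API documentation before use.

```python
import asyncio

# Placeholder: Buzz exposes a UART-style write characteristic; find
# the actual UUID in Neosensory's API documentation.
RX_CHAR_UUID = "00000000-0000-0000-0000-000000000000"

def command(text):
    """Frame an API command as the newline-terminated ASCII bytes
    a command characteristic typically expects."""
    return (text + "\n").encode("ascii")

async def vibrate(address, payload_b64):
    """Connect to Buzz over BLE and send one vibration frame.
    Command names follow Neosensory's public API docs; confirm the
    auth flow against the current documentation."""
    # Imported here so the command helper stays usable without
    # BLE hardware or the bleak package installed.
    from bleak import BleakClient
    async with BleakClient(address) as client:
        await client.write_gatt_char(RX_CHAR_UUID, command("auth as developer"))
        await client.write_gatt_char(RX_CHAR_UUID, command("accept"))
        await client.write_gatt_char(RX_CHAR_UUID, command("motors start"))
        await client.write_gatt_char(RX_CHAR_UUID,
                                     command("motors vibrate " + payload_b64))

if __name__ == "__main__":
    # Replace with your Buzz's BLE address and a base64-encoded frame.
    asyncio.run(vibrate("AA:BB:CC:DD:EE:FF", "AP+AQA=="))
```

The same handshake-then-command pattern applies on any platform; only the Bluetooth library changes.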
Meet Team Rehab
Team Rehab is part of the Neosensory Community Research Program. Neosensory is excited about exploring the frontiers of science and...
Meet Team EEG
Team EEG is part of the Neosensory Community Research Program. Neosensory is excited about exploring the frontiers of science and technology. As...
Meet team CGM
Team CGM is part of the Neosensory Community Research Program. Neosensory is excited about exploring the frontiers of science and technology. As...
Community Research Program
Here at Neosensory, we love exploring the frontiers of science and technology. Sometimes it’s hard to get involved with research when you are...
Connecting parents and children with Neosensory Buzz
Babies have one primary way of communication: crying. Deaf and hard-of-hearing parents have many ways of knowing exactly what their children need...
Creating a one-button keyboard with the help of Buzz
Most computer keyboards have a total of 101 keys. Software developer Ajay Verma thought there had to be a way to type more efficiently, so he cut...
Feel the Future: Winners Announced
Neosensory’s third contest, Feel the Future, has come to a close. After weeks of hard work, our top 4 contestants presented to a panel of expert...
Making touch-sensing prosthetics affordable with Buzz
There was no shortage of creative entries in Neosensory’s second developers contest “Expand your Senses” hosted in cooperation with Edge Impulse and...
Feel the Future: Submitting your entry
The time is almost here to get your final project submissions in! To make things easy, we are going to use the Hackster platform for your entries....