Neosensory Buzz offers a great opportunity for developers to build new sensory experiences. You can use a number of different platforms to send data to the Buzz wristband (check out our SDKs for Android or Arduino).
If you’ve seen our TED Talk, you’ve seen examples of feeding data streams to the brain from Twitter, the stock market, drones, and so on. In the years since, we’ve built examples using infrared light, virtual reality, lidar, and many more.
When developers start brainstorming new data streams, the list is impressively long. But it turns out that some ideas are better suited to Buzz than others. Should you feed in complex information (such as your spatial proximity to various machines), or should you use simple alerts (such as having it vibrate when someone rings your doorbell)?
Your brain learns correlations
The important thing to understand is that the brain learns information by finding correlations between different senses. So when deaf users wear Buzz, they can see a speaker’s mouth moving while they feel the corresponding vibrations on their skin. Their brains naturally make the connection.
Similarly, if you use the Neosensory Arduino SDK to build and wear an infrared detector connected to Buzz, the correlation is easy for the brain to discover. Take an infrared source (say, an infrared LED) and move your hand closer to it (Buzz vibrates more intensely) or farther away (the vibration weakens). You feel it. In this way, the brain makes the link, coming to understand the inputs with practice over time.
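The core of a project like this is just a mapping from sensor reading to motor intensity. Here is a minimal sketch in Python (the actual detector would run as an Arduino program using the Neosensory Arduino SDK; the function name and the 10-bit ADC range are illustrative assumptions, not part of the SDK):

```python
def ir_to_intensity(ir_reading, max_reading=1023):
    """Map a raw IR sensor reading (assumed 10-bit ADC, 0-1023)
    to a motor intensity in [0.0, 1.0].

    A closer IR source produces a stronger reading, which in turn
    produces a stronger buzz -- the correlation the brain learns.
    """
    return max(0.0, min(1.0, ir_reading / max_reading))
```

In a real build, this intensity would be sent to the wristband every frame, so the vibration rises and falls continuously as your hand moves.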
The correlation always needs to be there; otherwise it’s just random noise. Let’s imagine you input data from the Bolivian stock exchange, where each motor represents the up or down movement of one of four stocks. Without learning what means what, it would simply feel like random buzzing.
To understand Bolivian stocks in real time, you have to also watch the stocks in real time while you feel them. Or build a sandbox application in which you can manipulate fake stocks: you move this stock up, and you experience what it feels like; you push this stock down, and you learn what that feels like. You establish a correlation between your own outputs and the incoming data. You practice for a while… and you can then tap into real-time data and glean meaning from the vibrations. You no longer need to look at the stock prices to understand what’s happening.
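As a sketch of what such a sandbox might compute (the encoding choices here, like saturating a motor at a 5% move and encoding only the magnitude of each move, are hypothetical design decisions, not anything prescribed by Buzz):

```python
def stocks_to_motor_frame(changes, scale=0.05):
    """Map four stocks' fractional price changes to four motor
    intensity levels (0-255), one motor per stock.

    Only the magnitude of each move is encoded here; the sign
    (up vs. down) could be carried by a pulse pattern instead.
    `scale` is the move size (here 5%) that saturates a motor.
    """
    frame = []
    for change in changes:
        level = min(abs(change) / scale, 1.0)
        frame.append(int(round(level * 255)))
    return frame

# Four stocks moving +1%, -2%, 0%, +6% (the last one saturates):
print(stocks_to_motor_frame([0.01, -0.02, 0.0, 0.06]))  # [51, 102, 0, 255]
```

In the sandbox, you would drag the fake prices up and down yourself, feel the resulting frames, and let your brain build the mapping before switching to live data.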
Using the capabilities
Buzz has four motors, each of which can independently transmit 256 levels of vibration intensity (0–255). That gives 256^4, or about 4.3 billion, possible signals in a single frame. In the time domain, a typical frame of data lasts a few tens of milliseconds, so the signal can carry rich information as it changes rapidly. Buzz is therefore optimized for sending high-dimensional, fast-changing data.
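The arithmetic behind that figure is simple to check:

```python
# Four motors, each with 256 intensity levels (0-255),
# give 256**4 distinct single frames.
frames = 256 ** 4
print(frames)  # 4294967296 (~4.3 billion)
```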
Buzz could easily deliver notifications, but anything that is simply an alert can already be done with an Apple Watch, a popup on your cell phone, or a notification on your computer screen.
So using Buzz to notify you when someone is at your door would be an underutilization. Its power comes in projects like feeling infrared light (where you can expand your sense to include light not visible to the human eye), or the ongoing state of a system of machines (such as understanding how a factory is running), or your spatial relationship to things around you (such as understanding where objects are in VR by feel), or the emotional valence of some Twitter hashtag on a moment-by-moment basis (you can train your brain to feel the sentiments change in real time).
All this leads to an important side note about the Apple Watch. Some developers have asked us why we wouldn’t just use the Apple Watch to feed in data streams. The answer is that the Apple Watch has only one motor, and that motor can play only one of eight predefined patterns (buzzzz, bz-bz-bz-bz, buzz-buzz, etc). Even if those eight patterns were enough to convey the small trickle of data you wanted, you’re not allowed to play them continuously, because Apple protects the battery life of the watch. So the Apple Watch is good for notifications, but not for continuous data streams.
Building your Neosensory Buzz app
So when you think about applications for Buzz, think broadly and creatively. Be sure to take advantage of the full capabilities of Buzz, not just simple alerts. And be sure that there is some way the brain can learn the meaning of the data by correlation. Then you’ll be on your way to expanding the human experience.
Any questions? Join our developer Slack channel or email us at email@example.com. Don’t forget to sign up for our Developer newsletter to receive the latest about our work, our SDKs, and projects from the Neosensory developer community.
By David Eagleman, CEO and Co-founder