Team Rehab is part of the Neosensory Community Research Program.
Neosensory is excited about exploring the frontiers of science and technology. As part of this effort, we have launched the Community Research Program, which aims to bring like-minded people together from around the world and across skill sets to work on projects with real-world impact and expand the knowledge domain of sensory addition, augmentation, and substitution. Learn more here or dive right in and sign up to join a team.
It’s estimated that 1 in 10 people aged 55 and over suffer from peripheral neuropathy. This chronic medical condition can develop when nerves in the body’s extremities, such as the hands, feet, and arms, are damaged. Damage to the peripheral nervous system can lead to a reduction or loss of sensory, motor, or autonomic nerve function. For example, when patients’ feet are affected, the result is often inconsistent balance and gait, caused by a combination of reduced sensory information from the feet, diminished limb positional awareness, and impaired motor function.
The team is exploring the possibility of using additional forms of sensory feedback to reinforce impaired or missing sensory or motor function and support gait rehabilitation for peripheral neuropathy patients. The team posed the question: Could a ubiquitous device, such as a mobile phone attached to a patient, and used in conjunction with a Neosensory Buzz, create a biofeedback mechanism that provides gait awareness and corrective feedback?
Background
Gait is defined as the manner or style of walking. It’s unique to every person, and many of us never give it a second thought. But for those with peripheral neuropathy, stroke survivors, or patients recovering from physical trauma after an accident, gait can profoundly affect mobility, independence, and overall long-term health.
The study of gait analysis sees the gait cycle (defined as the interval of time between repetitive events of walking) broken down into distinct phases. When operating in harmony, the cycle is balanced, symmetric, and consistent. But when impairment affects the cycle, it can lead to instability, loss of balance, and asymmetric gait.

Using mobile technology
Mobile phones have rapidly evolved from the first single-function ‘handheld’ devices in the early 70s, through the advances introduced with the iPhone in early 2007, to the devices we’re inseparable from today. Modern mobile devices are increasingly packed with features and sensors, and in combination with wearables, that trend is set to continue. The team became interested in the variety of motion sensors present in many phones and wondered whether those sensors could be used to measure the gait cycle, and whether the phone’s computing and networking capabilities could process the motion data and communicate feedback to the Neosensory Buzz.
There were two approaches to getting motion data from a mobile device:
- Write a native app, ideally cross-platform, using something like Flutter
- Write a web-based application using JavaScript Web APIs
We explored the Web API approach for fast iteration and easy sharing with others in the team.
Web APIs are built into many modern browsers, but what can you do with Web APIs? Quite a lot, it turns out! There are APIs to access a device’s hardware interfaces and filesystem, helpers to load web content, store data, and, most interestingly, access a device’s sensors. While many of these APIs are experimental and have incomplete or differing implementations across browsers, it was a quick way of testing the idea.
We built a simple JavaScript application that used the DeviceMotionEvent Web API to fetch:
- Acceleration
- Acceleration, including gravity (orientation)
- Rotation rate
We then created some charts to show and explore the data in real-time using a web browser on the phone. You can try this on your phone here (view in landscape mode and tap request permission).
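The core of that application is a listener for the browser’s `devicemotion` event. Below is a minimal sketch of what reading those three values can look like; the function names (`readMotion`, `startMotionCapture`) are illustrative, not the app’s actual code. Note that iOS 13+ requires an explicit permission request, triggered by a user gesture — hence the “tap request permission” step above.

```javascript
// Extract the three motion readings from a DeviceMotionEvent-shaped
// object. Works with a real event in the browser, or with a plain
// mock object when testing outside the browser.
function readMotion(event) {
  const { acceleration, accelerationIncludingGravity, rotationRate } = event;
  return {
    acceleration: { x: acceleration.x, y: acceleration.y, z: acceleration.z },
    accelerationIncludingGravity: {
      x: accelerationIncludingGravity.x,
      y: accelerationIncludingGravity.y,
      z: accelerationIncludingGravity.z,
    },
    rotationRate: {
      alpha: rotationRate.alpha,
      beta: rotationRate.beta,
      gamma: rotationRate.gamma,
    },
  };
}

// Browser entry point. On iOS 13+ DeviceMotionEvent.requestPermission()
// must be called from a user gesture (e.g. a button tap) before events fire.
async function startMotionCapture(onSample) {
  if (typeof DeviceMotionEvent !== 'undefined' &&
      typeof DeviceMotionEvent.requestPermission === 'function') {
    const state = await DeviceMotionEvent.requestPermission();
    if (state !== 'granted') return;
  }
  window.addEventListener('devicemotion', (e) => onSample(readMotion(e)));
}
```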

Exploring the motion data
The device’s motion sensors provide an opportunity to track user activity if the device is located on a test subject. And if the device is securely attached to, say, the leg or foot, it should be possible to detect the gait cycle using a combination of acceleration, orientation, and rotation rate.
Acceleration is reported across three axes (x, y, and z) and measured in m/s².

Acceleration including gravity can be used to establish device orientation by subtracting the plain acceleration. At rest, each axis value ranges from −9.81 to 9.81, with 9.81 m/s² being the acceleration due to gravity.
Using the x, y and z values, we could establish whether the device is:
- Upright
- Upside down
- On the right edge
- On the left edge
- Face up
- Face down
And every combination in between.
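A coarse classifier along these lines can be written as a few threshold checks on the gravity-dominated readings while the device is roughly at rest. This is a sketch, not our production code — the sign conventions for the edge cases follow the DeviceMotionEvent axis definitions (x to the right of the screen, y toward the top, z out of the screen) and are worth verifying on an actual device.

```javascript
const G = 9.81; // m/s², magnitude of gravity

// Classify a coarse device orientation from accelerationIncludingGravity
// while the device is roughly stationary. Whichever axis carries most of
// the gravity reading points "up". If no single axis dominates, the
// device is somewhere in between.
function classifyOrientation({ x, y, z }, threshold = 0.8 * G) {
  if (y >= threshold) return 'upright';
  if (y <= -threshold) return 'upside down';
  if (x >= threshold) return 'on the left edge';  // left edge down, x axis up
  if (x <= -threshold) return 'on the right edge';
  if (z >= threshold) return 'face up';
  if (z <= -threshold) return 'face down';
  return 'in between';
}
```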

Finally, we have rotation rate: the angular velocity around three axes (alpha, beta, and gamma), measured in degrees per second.
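Because rotation rate is a velocity, integrating it over time gives an estimate of how far the device has rotated — useful, for example, for tracking leg swing within a stride. A simple sketch (gyroscope drift accumulates, so this is only reliable over short windows):

```javascript
// Integrate gyroscope readings (deg/s) over time to estimate cumulative
// rotation about each axis. dt is the sample interval in seconds, e.g.
// DeviceMotionEvent.interval / 1000 (the event reports it in milliseconds).
function integrateRotation(samples, dt) {
  return samples.reduce(
    (angle, rate) => ({
      alpha: angle.alpha + rate.alpha * dt,
      beta: angle.beta + rate.beta * dt,
      gamma: angle.gamma + rate.gamma * dt,
    }),
    { alpha: 0, beta: 0, gamma: 0 }
  );
}
```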

After we explored the real-time data, sometimes ‘wearing’ the phone on our legs(!), we extended the application to allow streaming of the motion data to cloud storage for offline analysis. It was at this stage that we recognised that while it should be possible to measure and classify the phases of the gait cycle using the available motion data and traditional programming techniques, there was a more robust approach to be explored using machine learning.
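One practical detail of streaming from a phone: motion events arrive at tens of samples per second, so it makes sense to buffer them and upload in batches rather than issue a network request per reading. The sketch below shows that pattern with an injected flush function; the endpoint URL and payload shape are illustrative, not our actual storage backend.

```javascript
// Buffer motion samples and flush them in batches, so the phone isn't
// making a network request for every sensor reading. The flush function
// is injected: fetch() in the browser, or a stub when testing.
class MotionUploader {
  constructor(flush, batchSize = 100) {
    this.flush = flush;
    this.batchSize = batchSize;
    this.buffer = [];
  }

  add(sample) {
    this.buffer.push({ t: Date.now(), ...sample }); // timestamp each sample
    if (this.buffer.length >= this.batchSize) this.drain();
  }

  drain() {
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    this.flush(batch);
  }
}

// In the browser, the flush function might POST JSON to a collection
// endpoint (URL below is purely illustrative):
// const uploader = new MotionUploader((batch) =>
//   fetch('https://example.com/motion', {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body: JSON.stringify(batch),
//   }));
```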
Next steps
We’re in the process of researching Recurrent Neural Networks (RNNs) as a means to model the gait cycle and use the outputs to drive feedback to the Buzz. Again, we’re taking a mobile-first approach where we’re hoping to leverage machine learning models running in a mobile web browser using a JavaScript library based on TensorFlow, a popular machine learning framework originating at Google. Once we can reliably identify the phases of the gait cycle, and common anti-patterns, we’ll explore sending feedback to the Buzz.
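Feeding the motion stream into an RNN typically means slicing it into fixed-length, overlapping windows — the sequence format an RNN layer (for example in TensorFlow.js) expects as input. A sketch of that preprocessing step; the window size and stride here are arbitrary choices, not values we've settled on:

```javascript
// Slice a continuous stream of samples into fixed-length, overlapping
// windows. Each window becomes one training/inference sequence for an
// RNN. Overlap (stride < windowSize) yields more sequences and lets the
// model see each gait event in several positions within a window.
function makeWindows(samples, windowSize, stride) {
  const windows = [];
  for (let start = 0; start + windowSize <= samples.length; start += stride) {
    windows.push(samples.slice(start, start + windowSize));
  }
  return windows;
}
```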

Hello! My name is Ben Dyer. I live in the U.K. My background is in software and hardware development, and I’m currently working in London as a software engineering leader. I’ve always been interested in neuroscience and curious if the brain’s amazing plasticity can be harnessed with technology. I joined this project to revive my passion for research and development, find interesting problems to solve and learn from other inquisitive minds.

Hi! My name is Julia Jorgensen, and I’m from Rancho Palos Verdes, California. I’m currently finishing up my M.S. program in Computer Science; after I graduate, I will work as a software developer for ClearPoint Neuro, a company that provides a platform for navigation and delivery to the brain during neurosurgery. I am passionate about using neurotechnology to positively impact the lives of patients and medical professionals. In my free time, I love to read, write stories, and play the piano.