
DistaNet

A system for monitoring user gestures

A new machine learning method that efficiently recognises physical movements through surface electromyography (EMG), utilising multi-channel sensors to enhance accuracy and performance.

Application

  • Prosthetics
  • VR+AR


Development Status

Ongoing validation


IP Status

PCT application filed in September 2024


Commercial Offering

Licensing partner


Opportunity

Technologies that recognise physical movements, such as surface electromyography (EMG), convert muscle contractions into digital commands, enhancing user engagement and significantly improving the independence and quality of life of people with prosthetic limbs.

Despite the benefits, challenges persist. Machine learning-based interface control lacks the continuous, intuitive feedback on task performance needed to support the acquisition and retention of myoelectric control skills, and it requires user-specific calibration owing to the variability of physiological signals. The learning curve can also be steep.

Our latest development, the DistaNet framework, directly addresses these issues by utilising a sophisticated neural network model to refine signal interpretation, enhancing accuracy and usability. DistaNet reduces the need for extensive calibration. This makes advanced control more accessible and enjoyable, ensuring users can effortlessly integrate their actions with prosthetics or virtual environments.


Technology

Researchers in Edinburgh have developed a neural network-based framework, DistaNet, that accurately recognises physical movements through the innovative use of surface electromyography (EMG). This method employs multiple channels to capture input signals, which are then processed in a low-dimensional control space. DistaNet enhances this process by creating a continuous pseudo-label that trains the neural network, allowing it to extract smooth, continuous, and refined signatures of hand grasps from myoelectric signals and provide specific grasp-oriented biofeedback.
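The pipeline described above can be sketched in simplified form. The snippet below is an illustrative toy, not the published DistaNet model: the synthetic EMG signals, RMS features, PCA projection, and simple linear regressor are all stand-in assumptions for the real recordings, feature extraction, and neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for multi-channel surface EMG (8 channels, 200-sample windows).
# Real DistaNet works on recorded myoelectric data; everything here is simulated.
def emg_window(activation, n_channels=8, n_samples=200):
    # Higher "activation" -> larger signal amplitude, unevenly across channels.
    gains = 0.2 + activation * np.linspace(1.0, 0.3, n_channels)
    return rng.normal(0.0, gains[:, None], size=(n_channels, n_samples))

activations = rng.uniform(0.0, 1.0, size=300)
windows = np.stack([emg_window(a) for a in activations])

# Per-channel RMS features (a common EMG feature choice; an assumption here).
rms = np.sqrt((windows ** 2).mean(axis=2))            # shape (300, 8)

# Project the features into a low-dimensional control space via PCA.
centred = rms - rms.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
low_dim = centred @ Vt[:2].T                          # shape (300, 2)

# Continuous pseudo-label: normalised distance from the "rest" region
# (the least-active windows) in the low-dimensional control space.
rest_centroid = low_dim[activations < 0.1].mean(axis=0)
dist = np.linalg.norm(low_dim - rest_centroid, axis=1)
pseudo_label = dist / dist.max()                      # values in [0, 1]

# Tiny linear regressor trained on the pseudo-labels
# (a stand-in for the neural network in the real framework).
W = rng.normal(0.0, 0.1, size=(2,))
b = 0.0
for _ in range(1000):                                 # plain gradient descent
    pred = low_dim @ W + b
    err = pred - pseudo_label
    W -= 0.05 * (low_dim.T @ err) / len(err)
    b -= 0.05 * err.mean()

pred = low_dim @ W + b
corr = np.corrcoef(pred, pseudo_label)[0, 1]
print(f"correlation between prediction and pseudo-label: {corr:.2f}")
```

The key idea illustrated is that the pseudo-label is continuous rather than a discrete class, so the trained model can output a smooth control signal suitable for graded biofeedback.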

This technology enables individuals to control their immediate environment, whether physical or in the metaverse, using the natural electrical impulses of their forearm muscles. The advanced machine learning method at the core of DistaNet can interpret user intentions from these myoelectric signals, integrating dimensionality reduction and continuous pseudo-labelling to support the retention of myoelectric skills in a pattern recognition context for the first time.

By decoding user gestures and offering targeted biofeedback, DistaNet supports users in retaining their newly acquired motor skills.
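The grasp-specific distance biofeedback idea can be illustrated with a minimal sketch. The grasp signatures, the 0-100 feedback mapping, and the `distance_feedback` helper below are hypothetical, not the published implementation: they show only the principle of reporting a user's distance to a target grasp signature in the low-dimensional control space.

```python
import numpy as np

# Hypothetical signature points for three grasps in a 2-D control space.
grasp_signatures = {
    "power":  np.array([1.0, 0.2]),
    "pinch":  np.array([0.3, 1.1]),
    "tripod": np.array([0.8, 0.9]),
}

def distance_feedback(current_point, target_grasp, max_dist=2.0):
    """Map the distance to the target grasp signature onto a 0-100
    feedback score (100 = exactly on target)."""
    d = np.linalg.norm(current_point - grasp_signatures[target_grasp])
    return max(0.0, 100.0 * (1.0 - d / max_dist))

# A user attempting a "pinch" grasp, drifting toward the target over time.
trajectory = np.linspace([0.0, 0.0], grasp_signatures["pinch"], num=5)
for step, point in enumerate(trajectory):
    print(f"step {step}: feedback = {distance_feedback(point, 'pinch'):.0f}")
```

Because the feedback rises continuously as the user approaches the target signature, it gives the graded, task-specific signal that discrete gesture classifiers cannot provide.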


Benefits

  • Real-time visuals and feedback for users.
  • Better user experience and improved interaction with intuitive learning features.
  • More continuous and accurate control of user movements for VR, AR, and prosthetics applications.
  • Classification of user intent in terms of the intended movement.


Publications

DistaNet: grasp-specific distance biofeedback promotes the retention of myoelectric skills

PCT/GB2024/052386


Quote: TEC1104512


License this technology

Dr Nikos Christogiannis

Technology Transfer Executive

School of Engineering
School of Informatics
School of Geosciences
School of Mathematics