
Interpreting Myoelectric Signals via Machine Learning Algorithms

ORAL

Abstract

Advances in medical prosthetics have widened the gap between quality and cost: the most capable devices remain expensive, while low-cost options, such as 3D-printed prosthetics, often sacrifice much of the natural mobility of their pricier counterparts. Our goal is to develop upper-limb prosthetics that are both low-cost and highly functional. Starting with an open-source 3D-printed hand, we fit it with electronics of our own design to measure and interpret the myoelectric signals generated by the user. These myoelectric signals control the prosthetic, giving the user intuitive operation. Before the signals can be interpreted, they are first cleaned by removing the flicker noise observed to dominate the recordings. After cleaning the signals, we tested several machine learning algorithms using signal features such as the maximum and minimum voltage and the dominant frequency component of the myoelectric signal. We found that the decision tree algorithm had the highest predictive power, correctly identifying the gesture over 75% of the time. Through this work, we move toward a low-cost but highly functional prosthetic that will enable many amputees to live with confidence and mobility.
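The pipeline described above (suppress flicker noise, extract max/min voltage and dominant-frequency features, classify with a decision tree) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, filter cutoff, and synthetic two-gesture data are all assumptions made for the demo.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.tree import DecisionTreeClassifier

FS = 1000  # assumed sampling rate in Hz (not stated in the abstract)

def remove_flicker(sig, cutoff=20.0, fs=FS, order=4):
    # Flicker (1/f) noise concentrates power at low frequencies,
    # so a zero-phase high-pass Butterworth filter suppresses it.
    b, a = butter(order, cutoff / (fs / 2), btype="high")
    return filtfilt(b, a, sig)

def extract_features(sig, fs=FS):
    # The three features named in the abstract: maximum voltage,
    # minimum voltage, and the dominant frequency component.
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return [sig.max(), sig.min(), dominant]

# Synthetic demo data: two hypothetical "gestures" whose signals
# differ mainly in their dominant frequency.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
X, y = [], []
for label, f0 in [(0, 60.0), (1, 120.0)]:
    for _ in range(20):
        raw = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.standard_normal(FS)
        X.append(extract_features(remove_flicker(raw)))
        y.append(label)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
```

On this toy data the dominant-frequency feature separates the two classes cleanly; real multi-channel EMG with overlapping gestures is what makes the reported >75% accuracy a meaningful result.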

Authors

  • Marigordon Varner

    Presbyterian College

  • Preston Robinette

    Vanderbilt University

  • Eli Owens

    Presbyterian College