Research Article | PROSTHETICS

A myoelectric prosthetic hand with muscle synergy–based motion determination and impedance model–based biomimetic control


Science Robotics  26 Jun 2019:
Vol. 4, Issue 31, eaaw6339
DOI: 10.1126/scirobotics.aaw6339
  • Fig. 1 Overview of the proposed prosthetic hand control system.

    The control system is composed of four parts: EMG measurement, EMG signal processing, prosthetic hand control, and a myoelectric prosthetic hand.
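As a rough illustration of the EMG signal-processing stage described above, the sketch below rectifies raw multichannel EMG, smooths it into an envelope, and normalizes each channel. The function names, window length, and normalization scheme are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rectify_and_smooth(emg, fs=1000, win_ms=100):
    """Full-wave rectify raw EMG and smooth each channel (rows)
    with a moving-average window to obtain the signal envelope."""
    rect = np.abs(emg)  # full-wave rectification
    win = max(1, int(fs * win_ms / 1000))
    kernel = np.ones(win) / win
    return np.array([np.convolve(ch, kernel, mode="same") for ch in rect])

def normalize(envelope, mvc):
    """Normalize each channel by a per-channel reference value,
    e.g. the maximum voluntary contraction (MVC)."""
    return envelope / mvc[:, None]

# Toy example: 4 channels, 1 s of synthetic EMG at 1 kHz
rng = np.random.default_rng(0)
raw = rng.normal(0.0, 0.2, size=(4, 1000))
env = rectify_and_smooth(raw)
norm = normalize(env, mvc=np.full(4, 1.0))
print(norm.shape)  # → (4, 1000)
```

The normalized envelope is what downstream stages (synergy estimation and motion classification) would consume.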

  • Fig. 2 Hardware structure of the proposed prosthetic hand control system.

    EMG measurement, EMG signal processing, and prosthetic hand control were implemented in the electrical apparatus, which is composed of electrodes, a microcomputer, and a motor driver. The exterior and many other parts of the prosthetic hand were printed using a 3D printer. To print the parts, we modified open-source 3D data (24).

  • Fig. 3 Experimental results of the control of five fingers.

    (A) Classified motion and corresponding normalized EMG signals for each channel, force information, and muscle contraction level. (B) Classified motion and corresponding motor output angle for each finger. In this experiment, the motions conducted by the participant were no motion (NoM), thumb flexion (M1), index finger flexion (M2), middle finger flexion (M3), ring and little finger flexion (M4), and grasp (M5).
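The article's title refers to impedance model–based biomimetic control of the motor output angles shown in (B). A generic second-order impedance model, M·θ̈ + B·θ̇ + K·(θ − θ_d) = τ_ext, can be simulated as below; the parameter values here are illustrative assumptions, not the values in the paper's Table S3.

```python
import numpy as np

def simulate_impedance(theta_d, M=0.01, B=0.05, K=1.0, tau_ext=0.0,
                       dt=0.001, steps=2000):
    """Integrate M*th'' + B*th' + K*(th - th_d) = tau_ext with forward
    Euler; returns the joint-angle trajectory toward target theta_d."""
    theta, omega = 0.0, 0.0
    traj = []
    for _ in range(steps):
        alpha = (tau_ext - B * omega - K * (theta - theta_d)) / M
        omega += alpha * dt
        theta += omega * dt
        traj.append(theta)
    return np.array(traj)

# Drive the joint toward 1 rad with no external torque;
# the response settles near the target after transient oscillation.
traj = simulate_impedance(theta_d=1.0)
print(len(traj))  # → 2000
```

Tuning M, B, and K trades off response speed, overshoot, and compliance to external forces, which is the appeal of impedance-style control for prosthetic fingers.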

  • Fig. 4 Examples of normalized EMG signals and corresponding muscle synergies estimated using the proposed system for each motion.

    (A) Results for the single motions (M1 to M5). (B) Results for the combination motions (M6 to M10). These time series are arbitrary 2-s segments taken from one participant's data.
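Muscle synergy estimation from normalized EMG is commonly formulated as non-negative matrix factorization, EMG ≈ W·H, where the columns of W are spatial synergies and the rows of H their activations. Whether the paper uses exactly this algorithm is not stated in these captions, so the following is a generic sketch using Lee–Seung multiplicative updates.

```python
import numpy as np

def nmf(V, k, iters=200, seed=0):
    """Factor non-negative V (channels x samples) into W (channels x k)
    and H (k x samples) with Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Toy data: two "true" synergies mixed across 4 channels
rng = np.random.default_rng(1)
W_true = np.array([[1.0, 0.0], [0.8, 0.1], [0.0, 1.0], [0.2, 0.9]])
H_true = rng.random((2, 500))
V = W_true @ H_true
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(W.shape, H.shape)  # → (4, 2) (2, 500)
```

On exactly factorizable data like this toy example, the relative reconstruction error drops close to zero within a few hundred iterations.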

  • Fig. 5 Experimental results for healthy participants.

    (A) Confusion matrix for the classification of motions of participants 1 to 6. The classified motions are thumb flexion (M1), index finger flexion (M2), middle finger flexion (M3), ring and little finger flexion (M4), grasp (M5), pinch with two fingers (M6), peace sign (M7), pinch with three fingers (M8), index finger pointing (M9), and thumbs-up (M10). M1 to M5 are the single motions, and M6 to M10 are the combined motions. The color scale represents the accuracy in classification between pairs of classes in the confusion matrix. Participants 1 to 3 conducted the experiments without feedback of their classification results, and participants 4 to 6 conducted them with feedback. (B) Average classification accuracies over participants for the conditions without and with feedback of classification results. Blue and red bars represent results without and with feedback, respectively. Error bars represent SE. (C) Scenes during prosthetic hand control.
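The confusion matrices in (A) and average accuracies in (B) can be computed from per-trial labels as in this minimal sketch; the labels below are hypothetical, not the paper's data.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Row-normalized confusion matrix: entry [i, j] is the fraction
    of class-i trials that were classified as class j."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    row_sums = cm.sum(axis=1, keepdims=True)
    return cm / np.where(row_sums == 0, 1, row_sums)

# Hypothetical trial labels for a 3-class example
y_true = [0, 0, 1, 1, 2, 2, 2]
y_pred = [0, 1, 1, 1, 2, 2, 0]
cm = confusion_matrix(y_true, y_pred, n_classes=3)
acc = np.mean(np.diag(cm))  # average per-class accuracy
print(round(acc, 3))  # → 0.722
```

Averaging the diagonal of a row-normalized confusion matrix weights every class equally, which is the usual choice when trial counts differ across motions.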

  • Fig. 6 Prosthetic hand system for experiment 2.

    (A) The hardware composition of the prosthetic hand system used in experiment 2. The socket was custom-designed for the amputee participant. (B) Photograph of the prosthetic hand system. The EMG electrodes are built into the inside of the socket. (C) The control circuit is housed in a 3D-printed box attached to the outside of the socket. The battery is mounted on the outside of the socket. (D) Photograph of the prosthetic hand worn by the amputee participant.

  • Fig. 7 Experimental results for the amputee participant.

    (A and B) Examples of normalized EMG signals and corresponding muscle synergies estimated using the proposed system for each motion. (A) Results for the single motions (M1 to M4). (B) Results for the combined motion (M5). (C) Confusion matrix for the classification of motions. The classified motions are pinch with three fingers (M1), index finger pointing with thumb up (M2), thumb flexion (M3), grasp (M4), and index finger pointing (M5). M1 to M4 are the single motions, and M5 is the combined motion (M2 + M3). The color scale represents the accuracy in classification between pairs of classes in the confusion matrix. (D) Average classification accuracies for all trials. Error bars represent SE. (E) Scenes during prosthetic hand control.

  • Fig. 8 Scenes of the control tasks.

    (A) Photograph of the block-picking task. (B) Photograph of the plastic bottle–picking task. (C) Photograph of the notebook-holding task. (D) Examples of the classified motions, force information, and corresponding photographs during the plastic bottle–picking task. Here, the amputee participant controlled the prosthetic hand and switched its motions among the open [no motion (NoM)], three-finger pinch (M1), and grasp (M4).

Supplementary Materials

  • robotics.sciencemag.org/cgi/content/full/4/31/eaaw6339/DC1

    Fig. S1. Exterior and inside of the prosthetic hand and structure of the finger.

    Fig. S2. General form of motion-generation model.

    Fig. S3. Biomimetic impedance control system.

    Fig. S4. Locations of electrodes.

    Table S1. List of target motions for classification in experiment 1.

    Table S2. List of modifying vectors used in experiment 1.

    Table S3. List of impedance parameters used in the experiments.

    Table S4. List of target motions for classification in experiment 2.

    Table S5. List of modifying vectors used in experiment 2.

    Movie S1. Control of five fingers.

    Movie S2. Grasping of plastic bottle.

    Movie S3. Block-picking task.

    Movie S4. Plastic bottle–picking task.

    Movie S5. Notebook-holding task.

