Focus | BIOMIMETICS

Neuroengineering challenges of fusing robotics and neuroscience


Science Robotics  09 Dec 2020:
Vol. 5, Issue 49, eabd1911
DOI: 10.1126/scirobotics.abd1911

Abstract

Advances in neuroscience are inspiring developments in robotics and vice versa.

MODELING THE BRAIN

Roboticists are making use of insights from neuroscience to build better-performing robots. This fusion of robotics and neuroscience represents a neuroengineering approach: a nascent research domain that brings together neuroscience, robotics, and artificial intelligence. This article highlights past and current perspectives and key future challenges at the intersection of robotics and neuroscience (Fig. 1).

Fig. 1 From brains to robots and back: Overview of past, current, and future perspectives at the intersection of neuroscience and robotics.

Key challenges toward future brain-robot synergy include the elaboration of neural decoders, soft- and hybrid-structured robotics, advanced feedback to the brain, and more widespread translation of neuroscience findings into robotics. An emerging challenge is the development of advanced control schemes for bidirectional brain-robot adaptation.

CREDIT: KELLIE HOLOSKI/SCIENCE ROBOTICS. PHOTO OF HUMANOID ROBOT WITH SKIN BY ASTRID ECKERT/TUM

Robotic platforms have been developed to study aspects of brain functions and body mechanisms, such as learning and sensory-motor control. Insights gained from computational models of the inner ear have enabled the emulation of vestibulo-ocular functions (the neuronal reflex mechanism that stabilizes gaze during head movements). Emulating these functions has led to the use of realistic neuro-inspired models to study balance and bipedal locomotion in humans, demonstrated on a 50-degree-of-freedom (DOF) humanoid robot (1). These developments have also enhanced the capabilities of complex robots, including high-performance active visual perception, advanced bipedal balancing, locomotion, and manipulation, as well as active learning of complex tasks. Recent work has successfully incorporated neuroscientific models, such as central pattern generators (CPGs) for robust locomotion (e.g., walking and running), into robotic systems, and the neuroscientific validity of these models has been tested. Using robots in conjunction with neuroscience research enables a better understanding of a range of brain functions, from the neural mechanisms of motor control (2) to social interactions (3).
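
To make the CPG idea concrete, the sketch below implements a pair of coupled phase oscillators locked in antiphase, a common abstraction of CPG circuits for rhythmic gaits. The oscillator frequency, coupling weights, and the mapping from phase to joint angle are illustrative assumptions, not the models used in the studies cited above.

```python
# Minimal CPG sketch: two Kuramoto-style phase oscillators coupled in
# antiphase, as might drive the left and right hip joints of a walker.
# Frequencies, gains, and the phase-to-angle mapping are assumptions
# chosen for illustration, not parameters from the cited work.
import numpy as np

def cpg_step(phases, dt, omega, offsets, weights):
    """Advance all oscillator phases by one Euler step."""
    dphi = np.full(len(phases), omega)
    for i in range(len(phases)):
        for j in range(len(phases)):
            dphi[i] += weights[i, j] * np.sin(phases[j] - phases[i] - offsets[i, j])
    return phases + dt * dphi

phases = np.array([0.0, 0.1])                     # slightly perturbed start
offsets = np.array([[0.0, np.pi], [np.pi, 0.0]])  # desired antiphase relation
weights = np.array([[0.0, 2.0], [2.0, 0.0]])      # coupling strengths

for _ in range(2000):
    phases = cpg_step(phases, dt=0.005, omega=2 * np.pi * 1.0,  # ~1 Hz gait rhythm
                      offsets=offsets, weights=weights)
hip_angles = 0.3 * np.sin(phases)                 # map phases to joint angles (rad)
```

The coupling term pulls the two oscillators toward a stable half-cycle phase difference, which is the essential property a CPG exploits to keep a gait rhythmic and robust to perturbations.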

NEUROSCIENCE-BASED ENGINEERING SOLUTIONS

Neuromorphic electronics, which emulate neurological principles in hardware so that devices function as artificial neural systems, have demonstrated advances over standard engineering solutions in computing, robot sensing, and actuation (4). An autonomous full-sized humanoid robot with human-like sensitive robot skin was developed that leverages a neural, event-driven mechanism to process tactile information (5).
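
As a rough illustration of what event-driven tactile processing means, the sketch below forwards a skin cell ("taxel") reading only when it changes by more than a threshold, instead of streaming every taxel at a fixed rate. The grid size, threshold, and class name are hypothetical; this is not the design of the robot skin described in (5).

```python
# Minimal sketch of event-driven tactile processing: only taxel changes
# above a threshold are emitted as events, so an untouched skin produces
# almost no data. Sizes and thresholds are illustrative assumptions.
import numpy as np

class EventDrivenSkin:
    def __init__(self, n_taxels, threshold=0.05):
        self.last = np.zeros(n_taxels)   # last reported pressure per taxel
        self.threshold = threshold

    def update(self, pressures, t):
        """Return (taxel index, timestamp, change) events for significant changes."""
        delta = pressures - self.last
        idx = np.flatnonzero(np.abs(delta) > self.threshold)
        self.last[idx] = pressures[idx]  # update only the taxels that changed
        return [(int(i), t, float(delta[i])) for i in idx]

skin = EventDrivenSkin(n_taxels=1000)
frame = np.zeros(1000)
frame[42] = 0.3                          # simulated touch on a single taxel
events = skin.update(frame, t=0.001)     # -> one event, not 1000 samples
```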

BRAIN-CONTROLLING MACHINES

Brain-machine interfaces (BMIs) can achieve direct brain control of robots, enabling both the restoration of motor function and the ability to probe the neural circuits of the brain (6). This dual use of BMIs has set a path toward a unifying approach for exploiting, studying, and altering neural mechanisms in a closed-loop fashion. Although the primary purpose of a BMI is to map neural activity representing motor intentions onto tasks executed by robots, such as neural prostheses or exoskeletons, there have been examples where BMI-based direct control of robotic systems has led to the incorporation of such artificial devices into the representation of the human body by neural circuits in motor and somatosensory cortical and subcortical structures (6). This points toward the possibility of seamlessly integrating complex (semi-)autonomous machines into the human body schema, beyond the well-known integration of inanimate tools. In addition to direct control of prosthetic devices (6), BMIs have been useful for rehabilitating lost sensation and motor function, e.g., in patients with stroke-induced motor impairments (7). Partial recovery of lost sensation and motor control was observed even in paraplegic patients with spinal cord injury who underwent long-term gait neurorehabilitation therapy using a BMI-controlled exoskeleton (8). This unprecedented neurological recovery points toward the large, yet underexplored, potential of combining BMIs and robotics for the treatment of disorders that are incurable to date.
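
As a minimal illustration of the decoding step at the core of a BMI, the sketch below fits a regularized linear map from binned neural firing rates to intended end-effector velocity and then applies it to a new bin of activity. The number of recorded units, the bin structure, and the ridge penalty are assumptions for illustration; the BMIs in the cited studies use considerably more elaborate decoders and calibration procedures.

```python
# Minimal BMI decoding sketch: ridge-regression mapping from binned firing
# rates to a 3D velocity command. All dimensions and data are simulated
# placeholders, not recordings from the cited studies.
import numpy as np

def fit_linear_decoder(rates, velocities, ridge=1e-2):
    """rates: (T, n_units) binned firing rates; velocities: (T, 3) intended velocity."""
    X = np.hstack([rates, np.ones((rates.shape[0], 1))])   # add bias column
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ velocities)
    return W

def decode(W, rate_bin):
    """Map one bin of firing rates to a velocity command for the robot."""
    return np.append(rate_bin, 1.0) @ W

# Calibration on simulated data, then decoding of a new bin of activity.
rng = np.random.default_rng(0)
rates = rng.poisson(5, size=(500, 96)).astype(float)   # 96 recorded units
velocities = rng.normal(size=(500, 3))                 # intended 3D velocities
W = fit_linear_decoder(rates, velocities)
cmd = decode(W, rates[0])                              # velocity command to the robot
```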

BRAIN FEEDBACK TO TEACH ROBOTS

Task-level improvements through an error-update loop between a human and a robot have been shown in several well-defined tasks, such as simplified reaching tasks with upper-limb neural prostheses (9). These reports demonstrated the successful decoding of error-sensitive brain signals to adapt robot behavior to human expectations. This approach opens the future prospect of efficiently training robots that incorporate and follow human conventions, without the need for an expert to explicitly program each task. Continuous co-adaptation during human-robot interaction has also been demonstrated, in which a robot and a human begin to adapt to each other. By measuring changes in the brain and its ability to detect unexpected circumstances, alterations of robot behavior were shown to be feasible, as indicated by emerging joint human-robot policies that enabled efficient interaction (10). Collectively, these works set out the path toward establishing truly synergistic human-robot interaction.
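
The sketch below illustrates one way such an error-update loop can be organized: a decoded error-related brain response, simulated here by a placeholder function, serves as a reward signal that shifts the robot's preference over a small set of discrete actions. The simulated brain response and the bandit-style update rule are assumptions for illustration, not the methods used in (9, 10).

```python
# Sketch of brain-error-driven robot adaptation: a (simulated) error signal
# decoded after each robot action acts as a reward that updates the robot's
# action preferences. All quantities are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_actions = 4
values = np.zeros(n_actions)      # robot's estimate of each action's acceptability
alpha = 0.2                       # learning rate

def decoded_error(action, preferred=2):
    """Stand-in for recording a post-action EEG epoch and classifying it as an
    error-related response (1) or not (0); the simulated human expects the
    robot to choose `preferred`."""
    return int(rng.random() < (0.1 if action == preferred else 0.7))

for trial in range(300):
    p = np.exp(values - values.max())
    p /= p.sum()                                      # softmax over action values
    action = rng.choice(n_actions, p=p)
    reward = -1.0 if decoded_error(action) else 1.0   # brain feedback as reward
    values[action] += alpha * (reward - values[action])
# After enough trials, `values` peaks at the action the human expected.
```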

CHALLENGES AHEAD

Closing the loop

A clear understanding of closed-loop interfaces with suitable feedback from a robot to its human user (beyond the fact that it worked) is still missing. This opens challenges in scenarios where the brain and the robot become one through a bidirectional control-feedback loop. Further challenges include the following: (i) Sensing human states and intentions through neural interfaces will become increasingly difficult as real-time neural decoding of a larger number of DOFs is required, e.g., more complex and fine-grained motor commands, beyond the few states that can currently be used to control robots. Optimizing the calibration of neural decoders to individually varying brain activity poses another challenge. (ii) Feedback to the brain that induces seamless and natural acceptance of the robot by the human poses a major challenge, specifically regarding the types and modalities of feedback, their spatial accuracy, and their timing, such as the latency of feedback sensation and the dynamic modulation of feedback by human and robot movements. (iii) Control schemes for continuous, bidirectional human-robot adaptation beyond task-level adaptation have only started to emerge. Challenges ahead include the formalization of human-robot and brain-robot interaction loops and their generalization across use cases and applications.
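
A minimal skeleton of such a bidirectional control-feedback loop is sketched below, with placeholder components for intent decoding, robot control, feedback to the user, and co-adaptation. Every function name here is hypothetical and merely marks where the open challenges above would have to be solved.

```python
# Skeleton of a bidirectional brain-robot loop. All components are
# hypothetical placeholders indicating where real neural decoding, robot
# control, sensory feedback, and co-adaptation would plug in.
import time

def read_neural_signals():          # acquire one window of neural data (placeholder)
    return None

def decode_intent(signals):         # challenge (i): real-time decoding of many DOFs
    return {"velocity": [0.0, 0.0, 0.0]}

def robot_act(command):             # execute the decoded command on the robot
    return {"contact_force": 0.0}

def feedback_to_user(robot_state):  # challenge (ii): timely, natural sensory feedback
    pass

def coadapt(decoder_state, controller_state, signals, robot_state):
    # challenge (iii): continuous bidirectional adaptation of decoder and controller
    return decoder_state, controller_state

decoder_state, controller_state = {}, {}
for _ in range(10):                 # one closed-loop cycle per iteration
    signals = read_neural_signals()
    command = decode_intent(signals)
    robot_state = robot_act(command)
    feedback_to_user(robot_state)
    decoder_state, controller_state = coadapt(decoder_state, controller_state,
                                              signals, robot_state)
    time.sleep(0.01)                # loop period (placeholder)
```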

A truly realistic functional model

Roboticists will continue to take inspiration from neuroscience to build highly efficient robots with higher levels of sophistication, and neuroscientists will continue to challenge roboticists to provide tools that serve as realistic models for their studies. Further challenges include the following: (i) Soft-structured and hybrid robots are emerging, although in comparison with biology, they are still primitive. One challenge ahead is to build realistic platforms that more closely resemble nature. (ii) Accessible platforms will be needed that are genuine and trustworthy models and, at the same time, simple enough to be used in neuroscientific studies. (iii) Sensorimotor control of highly complex structures, such as soft- and hybrid-structured robots, could also benefit from lessons from neuroscience. Yet, translating findings from neuroscience to robotics still poses an important challenge, especially for large-scale problems in robotics, such as the control of soft exoskeletons.

REFERENCES AND NOTES

Funding: This work was partially supported by the Elite Network Bavaria (ENB) through the Master Program in Neuroengineering (MSNE) and the Deutsche Forschungsgemeinschaft (DFG) through the International Graduate School of Science and Engineering (IGSSE) at the Technical University of Munich (TUM).
