Monkeys Drive Wheelchair with Their Cortical Activity

Our new study “Direct Cortical Control of Primate Whole-Body Navigation in a Mobile Robotic Wheelchair” presents the first demonstration of wheelchair navigation driven by a cortical brain-machine interface (BMI).

Previous neurophysiological and BMI research in primates mostly focused on eye and arm movements, whereas the brain mechanisms of whole-body movements (for example, jumping from one tree to another) were virtually neglected. This gap is a serious impediment to the development of invasive BMIs for wheelchair control, devices that are needed by patients suffering from severe body paralysis.

Rajangam et al. showed for the first time that rhesus monkeys can navigate while seated in a wheelchair, using their cortical activity as the control signal. The monkeys were chronically implanted with multichannel electrode arrays, which recorded from several hundred neurons in the sensorimotor cortex. The BMI transformed this neuronal ensemble activity into the wheelchair’s linear (forward and backward) and rotational (leftward and rightward) velocities.
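For readers curious what “transforming ensemble activity into velocity” can look like in practice, here is a minimal sketch of a linear, Wiener-filter-style decoder of the kind widely used in BMI work. The bin size, tap count, ridge penalty and function names below are illustrative assumptions, not the parameters of the published study.

```python
import numpy as np

# Illustrative linear (Wiener-filter-style) decoder: maps a sliding window
# of binned firing rates from an ensemble of neurons to wheelchair linear
# and rotational velocity. All constants here are assumptions chosen for
# illustration, not the values used in the published study.

N_NEURONS = 200      # ensemble size (the study recorded hundreds of neurons)
N_TAPS = 10          # how many past time bins the decoder looks at
RIDGE = 1e-3         # small regularizer to keep the least-squares fit stable

def build_design_matrix(rates):
    """Stack N_TAPS consecutive bins of rates into one row per time step.

    rates: array of shape (T, N_NEURONS), binned firing rates over time.
    Returns X of shape (T - N_TAPS + 1, N_TAPS * N_NEURONS + 1), bias included.
    """
    T = rates.shape[0]
    rows = [rates[t - N_TAPS + 1 : t + 1].ravel() for t in range(N_TAPS - 1, T)]
    X = np.asarray(rows)
    return np.hstack([X, np.ones((X.shape[0], 1))])   # bias column

def fit_decoder(rates, velocities):
    """Ridge-regularized least squares mapping ensemble activity to
    [linear_velocity, rotational_velocity] recorded during training."""
    X = build_design_matrix(rates)
    Y = velocities[N_TAPS - 1 :]                       # align targets with windows
    return np.linalg.solve(X.T @ X + RIDGE * np.eye(X.shape[1]), X.T @ Y)

def decode(window, W):
    """One decoding step: last N_TAPS bins of rates -> (v_linear, v_rotational)."""
    x = np.append(window.ravel(), 1.0)
    return x @ W
```

In closed-loop use, `decode` would run once per bin, and its two outputs would be sent to the wheelchair’s motor controller as velocity commands.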

Monkeys successfully learned to navigate the wheelchair from one corner of the room to the opposite corner, where a food reward was placed in a feeder. Their driving ability improved over several weeks of training. The navigation did not require any steering device (for example, a joystick); the monkeys produced the wheelchair movements just by imagining themselves moving.

The demonstration was made possible by Tim Hanson’s multichannel wireless recording system and the brilliant engineering of Gary Lehew, Po-He Tseng and Allen Yin.

There is still a long way to go before invasive BMIs of this type can be implemented in human patients. But the proof-of-concept demonstration is here!

Walk Again: History of the Project

Discharges of cortical neurons during bipedal locomotion in a monkey

On June 12, 2014, an EEG-controlled “Walk Again” exoskeleton was demonstrated at the opening ceremony of the World Cup in Brazil.

The Walk Again Project was first announced in the 2009 review article by Nicolelis and Lebedev, “Principles of neural ensemble physiology underlying the operation of brain-machine interfaces,” published in Nature Reviews Neuroscience:

Ultimately, we expect that the identification of principles of neural ensemble physiology will guide the development of a generation of cortical neuroprosthetic devices that can restore full-body mobility in patients suffering from devastating levels of paralysis, due either to traumatic or degenerative lesions of the nervous system. We believe that such devices should incorporate several key design features. First, brain-derived signals should be obtained from multi-electrode arrays implanted in the upper- and lower-limb representations of the cortex, preferably in multiple cortical areas. Custom-designed microchips (also known as neurochips), chronically implanted in the skull, would be used for neural signal-processing tasks. To significantly reduce the risk of infection and damage to the cortex, multi-channel wireless technology would transmit neural signals to a small, wearable processing unit. Such a unit would run multiple real-time computational models designed to optimize the real-time prediction of motor parameters. Time-varying, kinematic and dynamic digital motor signals would be used to continuously control actuators distributed across the joints of a wearable, whole-body, robotic exoskeleton. High-order brain-derived motor commands would then interact with the controllers of local actuators and sensors distributed across the exoskeleton. Such interplay between brain-derived and robotic control signals, known as shared brain–machine control, would assure both voluntary control and stability of bipedal walking of a patient supported by the exoskeleton.

Touch, position, stretch and force sensors, distributed throughout the exoskeleton, would generate a continuous stream of artificial touch and proprioceptive feedback signals to inform the patient’s brain of the neuroprosthetic performance. Such signals would be delivered by multichannel cortical microstimulation directly into the patient’s somatosensory areas. Our prediction is that, after a few weeks, such a continuous stream of somatosensory feedback signals, combined with vision, would allow patients to incorporate, through a process of experience-dependent cortical plasticity, the whole exoskeleton as an extension of their body.

These developments are likely to converge into the first reliable, safe and clinically useful cortical neuroprosthetic. To accelerate this process and make this milestone a clinical reality, a worldwide team of neurophysiologists, computer scientists, engineers, roboticists, neurologists and neurosurgeons has been assembled to launch the Walk Again Project, a non-profit, global initiative aimed at building the first cortical neuroprosthetic capable of restoring full-body mobility in severely paralysed patients.
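The “shared brain–machine control” described in this excerpt is easy to state in code. Below is a toy sketch of one way such sharing could work: the decoder supplies the voluntary high-level command, and a local robotic layer enforces safety limits and smoothness. All limits, gains and names are illustrative assumptions, not a published controller.

```python
import numpy as np

# Toy sketch of shared brain-machine control: a decoded voluntary command
# (desired forward speed and turn rate) is clamped and smoothed by a local
# robotic controller so the exoskeleton stays within safe operating limits.
# Every constant below is an assumption made for illustration.

MAX_SPEED = 0.8    # m/s, assumed hardware safety limit
MAX_TURN = 0.5     # rad/s, assumed turning limit
SMOOTHING = 0.2    # low-pass factor applied by the local controller

class SharedController:
    def __init__(self):
        self.cmd = np.zeros(2)   # [forward_speed, turn_rate] actually sent

    def step(self, brain_cmd):
        """One control cycle: take the decoded intent, let the robotic side
        enforce limits and smooth transitions, return the actuator command."""
        safe = np.clip(brain_cmd, [-MAX_SPEED, -MAX_TURN], [MAX_SPEED, MAX_TURN])
        self.cmd += SMOOTHING * (safe - self.cmd)   # robot smooths the command
        return self.cmd
```

The division of labor is the point: voluntary intent comes from the brain, while stability and safety are guaranteed by the machine.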

Screenshots of neuronal ensemble activity taken many months after the implantation surgery

Similar ideas were expressed in the 2009 original research article by Fitzsimmons, Lebedev et al., “Extracting kinematic parameters for monkey bipedal walking from cortical neuronal ensemble activity”:

Based on these results, we propose an approach to restore locomotion in patients with lower limb paralysis that relies on using cortical activity to generate locomotor patterns in an artificial actuator, such as a wearable exoskeleton (Figure 8G; Fleischer et al., 2006; Hesse et al., 2003; Veneman et al., 2007). This approach may be applicable to clinical cases in which the locomotion centers of the brain are intact, but cannot communicate with the spinal cord circuitry due to spinal cord injury. The feasibility of employing a cortically driven BMI for the restoration of gait is supported by fMRI studies in which cortical activation was detected when subjects imagined themselves walking (Bakker et al., 2007, 2008; Iseki et al., 2008; Jahn et al., 2004) and when paraplegic patients imagined foot and leg movements (Alkadhi et al., 2005; Cramer et al., 2005; Hotz-Boendermaker et al., 2008). Event-related potentials also demonstrated cortical activations in similar circumstances (Halder et al., 2006; Lacourse et al., 1999; Muller-Putz et al., 2007). Further support for this idea comes from recent studies of EEG-based brain-computer interfaces for navigation in a virtual environment in healthy subjects (Pfurtscheller et al., 2006) and paraplegics (Enzinger et al., 2008).

While a cortical BMI-based neuroprosthesis that derived all its control signals from the user would have to cope with the lack of signals normally derived from subcortical centers, such as the cerebellum, basal ganglia and brainstem (Grillner, 2006; Grillner et al., 2008; Hultborn and Nielsen, 2007; Kagan and Shik, 2004; Matsuyama et al., 2004; Mori et al., 2000; Takakusaki, 2008), these problems may be avoided by an approach which only derives higher level leg movement signals from brain activity, while allowing robotic systems to produce a safer, optimum output. The challenge of efficient low-level control could be overcome by implementing “shared brain–machine” control (Kim et al., 2006), i.e. a control strategy that allows robotic controllers to efficiently supervise low-level details of motor execution, while brain-derived signals are utilized to derive higher-order voluntary motor commands (step initiation, step length, leg orientation).

A cortically driven BMI for the restoration of walking may become an integral part of other rehabilitation strategies employed to improve the quality of life of patients. In particular, it may supplement the strategy based on harnessing the remaining functionality and plasticity of spinal cord circuits isolated from the brain (Behrman et al., 2006; Dobkin et al., 1995; Grasso et al., 2004; Harkema, 2001; Lunenburger et al., 2006). Indeed, cortically driven exoskeletons may facilitate spinal cord plasticity, helping to recover locomotion automatisms. Additionally, cortically driven neuroprostheses may work in concert with rehabilitation methods based on functional electrical stimulation (FES; Hamid and Hayek, 2008; Nightingale et al., 2007; Wieler et al., 1999; Zhang and Zhu, 2007). In such an implementation, the BMI output could be connected to a FES system that stimulates the subject’s leg muscles. Finally, there is the intriguing possibility of connecting the BMI to an electrical stimulator implanted in the spinal cord, a strategy that may help induce plastic reorganization within these circuits.

Altogether, our results indicate that direct linkages between the human brain and artificial devices may be utilized to define a series of neuroprosthetic devices for restoring the ability to walk in people suffering from paralysis.
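As one concrete illustration of the BMI-to-FES coupling suggested in the excerpt above, the hypothetical sketch below maps a decoded gait phase onto stimulation channels. The phase boundaries and channel names are invented for illustration and do not come from the paper.

```python
# Hypothetical sketch of BMI-driven FES: a decoded gait phase (0..1 within
# one stride) selects which muscle group's stimulation channel is active.
# The schedule below is an invented example, not a clinical protocol.

FES_SCHEDULE = [
    (0.00, 0.40, "right_quadriceps"),   # right stance / push-off
    (0.40, 0.50, "right_tibialis"),     # right swing, foot clearance
    (0.50, 0.90, "left_quadriceps"),    # left stance / push-off
    (0.90, 1.00, "left_tibialis"),      # left swing, foot clearance
]

def fes_channels(gait_phase):
    """Return the stimulation channels active at this point in the stride."""
    phase = gait_phase % 1.0
    return [ch for lo, hi, ch in FES_SCHEDULE if lo <= phase < hi]
```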

An update was given in 2011 by Lebedev, Tate, et al. in “Future developments in brain-machine interface research”:

As follows from our results on BMIs that enact leg movements, BMIs for the whole body are likely to become a real possibility in the near future. We propose the development of a whole-body BMI in which neuronal ensemble activity recorded from multiple cortical areas in rhesus monkeys controls the actuators that enact movements of both upper and lower extremities. This BMI will be first implemented in a virtual environment (monkey avatar) and then using a whole-body exoskeleton. In these experiments we will also examine the plasticity of neuronal ensemble properties caused by their involvement in the whole-body BMI control and the ability of cortical ensembles to adapt to represent novel external actuators.

Furthermore, we will also explore the ability of an animal to navigate a virtual environment using both physical and neural control that engages both the upper and lower limbs. The first phase of these experiments will be to train the animals to walk in a bipedal manner on a treadmill while assisting the navigation with a hand steering control. We have already built a virtual environment needed for the monkey to navigate using 3D visualization software. Within this environment, the monkey’s body is represented by a life-like avatar. This representation is viewed in the third person by the monkey and employs real-world inverse kinematics to move, allowing the avatar’s limbs to move in close relation to the experimental animal.

Initially, the direction that the avatar is facing will be dictated by the monkey moving a handlebar with its hands. As the animal moves the handlebar left or right, the avatar will rotate in the corresponding direction. The avatar’s legs will mimic the exact motion of the monkey’s legs on the treadmill. The simplest task will be for the animal to simply move the avatar forward to an object that represents a reward, a virtual fruit. Virtual fruits will appear at different angular positions relative to the monkey, which will let us measure the neuronal representation of navigation direction and modulations in cortical arm representation related to the steering. The monkey will have to make several steps while steering in the required direction to approach a virtual reward and to obtain an actual reward. The next set of experiments will allow the animal to control the virtual BMI in a manner similar to how we anticipate that the eventual application will be used: with no active movement of the subject’s body parts. The animals will use the neural control of the environment to obtain rewards when they are seated in a monkey chair. We expect that the monkey will be able to generate periodic neural modulations associated with individual steps of the avatar even though it does not perform any actual steps with its own legs.

Finally, we will use the algorithms developed in these experiments to control a full-body monkey exoskeleton in a non-human primate which has been subjected to a spinal cord anesthetic block to produce a temporary and reversible state of quadriplegia. This exoskeleton will encase the monkey’s arms and legs. It will be attached to the monkey using bracelets molded in the shape of the monkey’s limbs. A full-body exoskeleton prototype will be utilized. The basic design and controller will be based on the humanoid robot Computational Brain (CB). The exoskeleton will provide full sensory feedback to the BMI setup: joint position/velocity/torque, ground contacts and orientations. In BMI mode, the exoskeleton will guide the monkey’s limbs with smooth motions while at the same time monitoring its range of motion to ensure it is within the safety limits. This demonstration will provide the first prototype of a neural prosthetic device that would allow paralyzed people to walk again.
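To make the avatar-navigation setup concrete, here is a minimal sketch of the update loop it implies: steering (from the handlebar or, later, the decoder) rotates the avatar, leg movements advance it, and the task checks proximity to the virtual fruit. Geometry, units, names and thresholds are assumptions made for illustration.

```python
import math

# Minimal sketch of the avatar navigation loop described above. Each control
# cycle, a steering command turns the avatar and the current step advances it;
# reward is triggered by proximity to a virtual fruit. All values are
# illustrative assumptions, not parameters of the actual experiment.

REWARD_RADIUS = 0.3   # how close the avatar must get to the virtual fruit

class Avatar:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0

    def step(self, turn_rate, step_length):
        """Advance one cycle: rotate by the steering command (handlebar or
        decoder output), then move forward by the current step's length."""
        self.heading += turn_rate
        self.x += step_length * math.cos(self.heading)
        self.y += step_length * math.sin(self.heading)

    def reached(self, fruit_x, fruit_y):
        """True when the avatar is close enough to the fruit for a reward."""
        return math.hypot(self.x - fruit_x, self.y - fruit_y) < REWARD_RADIUS
```

In the physical-control phase, `turn_rate` would come from the handlebar and `step_length` from treadmill walking; in BMI mode, both would come from the decoder while the monkey sits still.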

At the time these papers were published, we were the only group developing brain-machine interfaces for primate locomotion. (Jealous competitors still succeeded in blocking Fitzsimmons et al. from publication in high-impact journals.) Now rivalry is building up, as is evident from some of the critiques of the World Cup demo. And this is actually good: a competitive environment means speedier progress.