Different types of grasps and bionic design: Technological developments in recent decades have already led to advanced artificial hands. They can enable amputees who have lost a hand through accident or illness to regain some movement. Some of these modern prostheses allow independent finger movements and wrist rotation. These movements can be selected via a smartphone app or by using muscle signals from the forearm, typically detected by two sensors. For instance, the activation of the wrist flexor muscles can be used to close the fingers and grip a pen. If the wrist extensor muscles are contracted, the fingers re-open and the hand releases the pen. The same approach also makes it possible to switch between different finger movements, which are selected by activating the flexor and extensor muscle groups simultaneously. “These are movements that the patient has to learn during rehabilitation,” says Cristina Piazza, a professor of rehabilitation and assistive robotics at TUM. Now, Prof. Piazza’s research team has shown that artificial intelligence can enable patients to control advanced hand prostheses more intuitively by using the “synergy principle” and with the help of 128 sensors on the forearm.
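The two-sensor scheme described above can be sketched as a small state machine: flexor activity closes the hand, extensor activity opens it, and a simultaneous contraction of both switches to the next grip pattern. The threshold value, grip names, and update logic below are illustrative assumptions, not the control law of any particular prosthesis.

```python
GRIPS = ["pinch", "power", "point"]  # example grip patterns (illustrative)

class TwoChannelController:
    """Minimal sketch of two-sensor myoelectric control."""

    def __init__(self, threshold=0.3):
        self.threshold = threshold  # activation level counted as "contracted"
        self.grip_index = 0
        self.aperture = 1.0         # 1.0 = fully open, 0.0 = fully closed

    def update(self, flexor, extensor, step=0.1):
        """Advance the hand state from one pair of EMG envelope samples."""
        flex_on = flexor > self.threshold
        ext_on = extensor > self.threshold
        if flex_on and ext_on:
            # co-contraction: cycle to the next grip pattern
            self.grip_index = (self.grip_index + 1) % len(GRIPS)
        elif flex_on:
            # flexor signal: close the fingers
            self.aperture = max(0.0, self.aperture - step)
        elif ext_on:
            # extensor signal: re-open the fingers
            self.aperture = min(1.0, self.aperture + step)
        return GRIPS[self.grip_index], self.aperture
```

Real controllers work on filtered signal envelopes and add debouncing and proportional speed control, but the branching logic mirrors the flexor/extensor/co-contraction mapping the article describes.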
The synergy principle: the brain activates a pool of muscle cells
What is the synergy principle? “It is known from neuroscientific studies that repetitive patterns are observed in experimental sessions, both in kinematics and muscle activation,” says Prof. Piazza. These patterns can be interpreted as the way in which the human brain copes with the complexity of the biological system. In other words, the brain activates a pool of muscle cells, including in the forearm. The professor adds: “When we use our hands to grasp an object, for example a ball, we move our fingers in a synchronized way and adapt to the shape of the object when contact occurs.” The researchers are now using this principle to design and control artificial hands by creating new learning algorithms. This is key to intuitive movement: grasping a pen with an artificial hand, for example, involves multiple steps. First, the patient orients the artificial hand toward the grasping location, slowly moves the fingers together, and then grasps the pen. The goal is to make these movements more and more fluid, so that it is hardly noticeable that the overall process is made up of numerous separate movements. “With the help of machine learning, we can understand the variations among subjects and improve the control adaptability over time and the learning process,” concludes Patricia Capsi Morales, the senior scientist in Prof. Piazza’s team.
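The synergy principle can be illustrated numerically: many muscle channels are recorded, but a small number of shared activation patterns explain most of the signal. The sketch below builds synthetic "EMG" from two underlying synergies spread across eight channels and recovers the low dimensionality with PCA via a singular value decomposition. All numbers are illustrative; real synergy analyses typically apply factorization methods such as non-negative matrix factorization to recorded EMG envelopes.

```python
import numpy as np

rng = np.random.default_rng(0)
T, channels, n_synergies = 500, 8, 2

# Two time-varying synergy activations (e.g., "close hand", "rotate wrist")
activations = np.abs(rng.normal(size=(T, n_synergies)))
# Fixed non-negative weights mapping each synergy onto the 8 muscle channels
weights = np.abs(rng.normal(size=(n_synergies, channels)))
# Observed multichannel signal = synergies mixed through the weights, plus noise
emg = activations @ weights + 0.01 * rng.normal(size=(T, channels))

# PCA via SVD: fraction of variance explained by each principal component
centered = emg - emg.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
explained = s**2 / np.sum(s**2)
print(f"variance explained by first 2 components: {explained[:2].sum():.3f}")
```

Although eight channels are recorded, two components capture nearly all of the variance, which is exactly the redundancy that synergy-based controllers exploit.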
Discovering patterns from 128 signal channels
Experiments with the new approach already indicate that conventional control methods could soon be surpassed by more advanced strategies. To study what is happening at the level of the central nervous system, the researchers are working with two thin sensor films: one for the inside and one for the outside of the forearm. Each contains up to 64 sensors to detect muscle activation. The method also estimates which electrical signals the spinal motor neurons have transmitted. “The more sensors we use, the better we can record information from different muscle groups and find out which muscle activations are responsible for which hand movements,” explains Prof. Piazza. Depending on whether a person intends to make a fist, grip a pen or open a jam jar, “characteristic features of muscle signals” result, according to Dr. Capsi Morales – a prerequisite for intuitive movements.
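The pattern-discovery step can be sketched as a classification problem: reduce 128 channels of EMG to per-channel amplitude features and match them against the "characteristic features" of each intended grasp. The channel count below matches the article (2 films × 64 sensors), but the synthetic data, RMS features, and nearest-centroid classifier are illustrative assumptions, not the team's actual method.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, window = 128, 200  # 2 films x 64 sensors; 200-sample window
grasps = ["fist", "pen_grip", "jar_open"]

# Hypothetical characteristic activation profile per grasp (one amplitude
# per channel); in reality these patterns are learned from recordings.
profiles = {g: np.abs(rng.normal(size=n_channels)) for g in grasps}

def rms_features(emg_window):
    """Root-mean-square amplitude per channel over one time window."""
    return np.sqrt(np.mean(emg_window**2, axis=0))

def simulate(grasp):
    """Zero-mean noise whose per-channel amplitude follows the grasp profile."""
    return rng.normal(size=(window, n_channels)) * profiles[grasp]

# One "calibration" recording per grasp defines the class centroids
centroids = {g: rms_features(simulate(g)) for g in grasps}

def classify(emg_window):
    """Return the grasp whose centroid is closest to the window's features."""
    f = rms_features(emg_window)
    return min(grasps, key=lambda g: np.linalg.norm(f - centroids[g]))
```

With many channels, even this simple amplitude-based classifier separates the grasps reliably, which is why denser sensor grids make the decoding problem easier.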