EPFL scientists are working to improve control of robotic hands. The idea is to combine neuro-engineering with robotics to give amputees individual finger control and more. The tests were done on 3 amputees and 7 non-amputees to compare results, and the findings are published in Nature Machine Intelligence.
Combining Neuro-Engineering with Robotics
The study is unique because, for the first time, concepts from these two fields have been merged and implemented together, making it a significant step for neuroprosthetics. On the neuro-engineering side, the system observes muscular activity around the amputee's stump to decode intended finger movement. On the robotics side, the scientists enable the hand to grasp objects and maintain its hold.
As Aude Billard, head of EPFL's Learning Algorithms and Systems Laboratory, notes, holding an object is only the first step; the second is ensuring that the grasp stays firm. Until now, an object would begin to slip after a few seconds, leaving only a small window to react. The hand developed in this study can react within 400 milliseconds and stabilize the object before the brain has even processed that it is slipping. All this is possible thanks to pressure sensors along the fingers.
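To make that reaction concrete, here is a minimal sketch of how such a slip-detection loop might look in Python. The `read_pressure` and `set_grip_force` callables, the slip threshold, and the force increment are all assumptions for illustration; the study's actual controller is not published in this form.

```python
import time

# Hypothetical interfaces: read_pressure() returns one reading per fingertip,
# set_grip_force() commands the hand. Both stand in for the real device APIs.
SLIP_THRESHOLD = 0.15   # assumed fractional pressure drop that signals a slip
LOOP_PERIOD_S = 0.01    # 10 ms control loop, well inside the 400 ms budget

def detect_slip(prev_pressures, curr_pressures, threshold=SLIP_THRESHOLD):
    """Flag a slip when pressure on any fingertip drops sharply."""
    for prev, curr in zip(prev_pressures, curr_pressures):
        if prev > 0 and (prev - curr) / prev > threshold:
            return True
    return False

def control_loop(read_pressure, set_grip_force, base_force=1.0):
    """Tighten the grip as soon as a slip is detected."""
    force = base_force
    prev = read_pressure()
    while True:
        curr = read_pressure()
        if detect_slip(prev, curr):
            force *= 1.2          # increase grip force incrementally
            set_grip_force(force)
        prev = curr
        time.sleep(LOOP_PERIOD_S)
```

The key design point is that the loop runs locally on the sensor data, so the hand can stabilize the object without waiting for any signal from the user.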
Decoding Shared Control
The algorithm first decodes the user's intentions from muscle signals and then translates those signals into finger movement. Like other machine-learning systems, it needs to be trained: the amputee moves the prosthetic hand a number of times so the algorithm can learn the patterns. Once a pattern has been learned, the robotics takes over. It is important to note, however, that muscle signals are noisy, so the algorithm has to be intelligent enough to separate meaningful muscular activity from noise. Only then can the prosthesis achieve the level of fine movement the study aims at.
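The study's own decoder is not reproduced here, but the idea of separating meaningful activity from noise can be sketched with standard tools. Below is an illustrative Python example that extracts root-mean-square amplitude features from windowed muscle signals and trains a simple classifier; the window size, channel count, synthetic data, and labels are all assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

WINDOW = 200  # assumed number of samples per analysis window

def rms_features(emg, window=WINDOW):
    """Root-mean-square amplitude per channel over fixed windows.

    emg: array of shape (n_samples, n_channels). RMS is a common,
    simple feature that suppresses zero-mean noise.
    """
    n = (emg.shape[0] // window) * window
    windows = emg[:n].reshape(-1, window, emg.shape[1])
    return np.sqrt((windows ** 2).mean(axis=1))

# Training phase: the user repeats intended movements while windows are
# labeled (0 = rest/noise, 1 = gesture). The data here is synthetic.
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 0.05, (50 * WINDOW, 8))    # noise only
gesture = rng.normal(0.0, 0.4, (50 * WINDOW, 8))  # stronger muscle activity
X = np.vstack([rms_features(rest), rms_features(gesture)])
y = np.array([0] * 50 + [1] * 50)

clf = LogisticRegression().fit(X, y)

# At run time, each new window is classified; only non-rest predictions
# would be passed on to drive the fingers.
new_window = rng.normal(0.0, 0.4, (WINDOW, 8))
intent = clf.predict(rms_features(new_window))[0]
```

This division of labor mirrors the shared control the article describes: the learned decoder supplies the user's intent, while the robotic layer handles the low-level details of the grasp.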