Inflatable robotic hand gives amputees real-time tactile control by Jennifer Chu for MIT News Boston MA (SPX) Aug 17, 2021
For the more than 5 million people in the world who have undergone an upper-limb amputation, prosthetics have come a long way. Beyond traditional mannequin-like appendages, there is a growing number of commercial neuroprosthetics: highly articulated bionic limbs engineered to sense a user's residual muscle signals and robotically mimic their intended motions. But this high-tech dexterity comes at a price. Neuroprosthetics can cost tens of thousands of dollars and are built around metal skeletons, with electrical motors that can be heavy and rigid.

Now engineers at MIT and Shanghai Jiao Tong University have designed a soft, lightweight, and potentially low-cost neuroprosthetic hand. Amputees who tested the artificial limb performed daily activities, such as zipping a suitcase, pouring a carton of juice, and petting a cat, just as well as - and in some cases better than - those with more rigid neuroprosthetics.

The researchers found that the prosthetic, designed with a system for tactile feedback, restored some primitive sensation in a volunteer's residual limb. The new design is also surprisingly durable, quickly recovering after being struck with a hammer or run over with a car.

The smart hand is soft and elastic, and weighs about half a pound. Its components total around $500 - a fraction of the weight and material cost associated with more rigid smart limbs.

"This is not a product yet, but the performance is already similar or superior to existing neuroprosthetics, which we're excited about," says Xuanhe Zhao, professor of mechanical engineering and of civil and environmental engineering at MIT. "There's huge potential to make this soft prosthetic very low cost, for low-income families who have suffered from amputation."

Zhao and his colleagues have published their work in Nature Biomedical Engineering. Co-authors include MIT postdoc Shaoting Lin, along with Guoying Gu, Xiangyang Zhu, and collaborators at Shanghai Jiao Tong University in China.
Big Hero hand
Rather than controlling each finger with mounted electrical motors, as most neuroprosthetics do, the researchers used a simple pneumatic system to precisely inflate the fingers and bend them into specific positions. This system, including a small pump and valves, can be worn at the waist, significantly reducing the prosthetic's weight.

Lin developed a computer model relating a finger's desired position to the corresponding pressure the pump would have to apply to achieve it. Using this model, the team developed a controller that directs the pneumatic system to inflate the fingers into positions that mimic five common grasps, including pinching two and three fingers together, making a balled-up fist, and cupping the palm.

The pneumatic system receives signals from EMG (electromyography) sensors, which measure the electrical signals generated by motor neurons to control muscles. The sensors are fitted at the prosthetic's opening, where it attaches to a user's limb. In this arrangement, the sensors can pick up signals from the residual limb, such as when an amputee imagines making a fist.

The team then used an existing algorithm that "decodes" muscle signals and relates them to common grasp types, and used it to program the controller for their pneumatic system. When an amputee imagines, for instance, holding a wine glass, the sensors pick up the residual muscle signals, which the controller translates into corresponding pressures. The pump then applies those pressures to inflate each finger and produce the amputee's intended grasp.

Going a step further, the researchers looked to enable tactile feedback - a feature not incorporated in most commercial neuroprosthetics. To do this, they stitched to each fingertip a pressure sensor, which, when touched or squeezed, produces an electrical signal proportional to the sensed pressure.
Each sensor is wired to a specific location on an amputee's residual limb, so the user can "feel" when the prosthetic's thumb is pressed, for example, versus the forefinger.
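The grasp-control pipeline described above can be sketched in a few lines: a decoded grasp label is looked up in a table of target finger positions, and each position is converted to a pump pressure by a calibrated model. Everything here is an illustrative assumption - the grasp names, bend angles, and the linear position-to-pressure model are stand-ins, not the researchers' actual implementation.

```python
# Illustrative target bend angles (degrees) for each finger in each grasp.
# The real system decodes five common grasps from EMG signals; these
# postures and values are invented for the sketch.
GRASP_POSTURES = {
    "fist":   {"thumb": 60, "index": 90, "middle": 90, "ring": 90, "pinky": 90},
    "pinch2": {"thumb": 45, "index": 60, "middle": 0,  "ring": 0,  "pinky": 0},
    "pinch3": {"thumb": 45, "index": 60, "middle": 60, "ring": 0,  "pinky": 0},
    "cup":    {"thumb": 30, "index": 40, "middle": 40, "ring": 40, "pinky": 40},
    "open":   {"thumb": 0,  "index": 0,  "middle": 0,  "ring": 0,  "pinky": 0},
}

def angle_to_pressure(angle_deg: float) -> float:
    """Toy stand-in for the calibrated model relating a finger's desired
    bend angle to the pump pressure (kPa) that achieves it. A real model
    would be fit from measurements of the inflatable finger."""
    return 0.5 * angle_deg  # assumed linear gain, kPa per degree

def grasp_to_pressures(grasp: str) -> dict:
    """Translate a decoded grasp label into per-finger pressure commands
    that the pneumatic controller would send to the pump and valves."""
    posture = GRASP_POSTURES[grasp]
    return {finger: angle_to_pressure(a) for finger, a in posture.items()}
```

For example, `grasp_to_pressures("fist")` would command the thumb to 30.0 kPa and each of the other fingers to 45.0 kPa under these assumed values.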
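The tactile-feedback mapping described above can be sketched the same way: each fingertip sensor drives a distinct site on the residual limb, with a signal proportional to the sensed pressure. The site names, units, and proportionality gain below are hypothetical, chosen only to illustrate the one-sensor-per-site wiring.

```python
# Fixed wiring: each fingertip pressure sensor maps to its own site on
# the residual limb, so the user can tell the thumb from the forefinger.
STIM_SITE = {
    "thumb": "site_A",
    "index": "site_B",
    "middle": "site_C",
    "ring": "site_D",
    "pinky": "site_E",
}

def feedback_commands(sensor_kpa: dict, gain: float = 0.05) -> dict:
    """Convert fingertip pressure readings (kPa) into per-site feedback
    amplitudes proportional to the sensed pressure. Units and gain are
    illustrative assumptions."""
    return {STIM_SITE[f]: round(gain * p, 3) for f, p in sensor_kpa.items()}
```

So squeezing only the prosthetic thumb (say, 20 kPa at its sensor) would produce a signal at site_A and nothing elsewhere, which is what lets a blindfolded user discern which finger was touched.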
Good grip
After completing this 15-minute training, the volunteers were asked to perform a number of standardized tests of manual strength and dexterity. These tasks included stacking checkers, turning pages, writing with a pen, lifting heavy balls, and picking up fragile objects like strawberries and bread. They repeated the same tests with a more rigid, commercially available bionic hand and found that the inflatable prosthetic was as good as, or better than, its rigid counterpart at most tasks.

One volunteer was also able to use the soft prosthetic intuitively in daily activities - for instance, to eat food like crackers, cake, and apples, and to handle objects and tools such as laptops, bottles, hammers, and pliers. He could also safely manipulate the squishy prosthetic to shake someone's hand, touch a flower, and pet a cat.

In a particularly exciting exercise, the researchers blindfolded the volunteer and found he could discern which prosthetic finger they poked and brushed. He was also able to "feel" bottles of different sizes placed in the prosthetic hand, and lifted them in response. The team sees these experiments as a promising sign that amputees can regain a form of sensation and real-time control with the inflatable hand.

The team has filed a patent on the design, through MIT, and is working to improve its sensing and range of motion. "We now have four grasp types. There can be more," Zhao says. "This design can be improved, with better decoding technology, higher-density myoelectric arrays, and a more compact pump that could be worn on the wrist. We also want to customize the design for mass production, so we can translate soft robotic technology to benefit society."
Research Report: "A soft neuroprosthetic hand providing simultaneous myoelectric control and tactile feedback"