Robots sense human touch using cameras and shadows
by Staff Writers
Ithaca NY (SPX) Feb 12, 2021
Soft robots may not be in touch with human feelings, but they are getting better at feeling human touch. Cornell University researchers have created a low-cost method for soft, deformable robots to detect a range of physical interactions, from pats to punches to hugs, without relying on touch sensors at all. Instead, a USB camera located inside the robot captures the shadows that hand gestures cast on the robot's skin and classifies them with machine-learning software.

The group's paper, "ShadowSense: Detecting Human Touch in a Social Robot Using Shadow Image Classification," was published in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. The paper's lead author is doctoral student Yuhan Hu.

The new ShadowSense technology is the latest project from the Human-Robot Collaboration and Companionship Lab, led by the paper's senior author, Guy Hoffman, associate professor in the Sibley School of Mechanical and Aerospace Engineering.

The technology originated as part of an effort to develop inflatable robots that could guide people to safety during emergency evacuations. Such a robot would need to communicate with humans in extreme conditions and environments - imagine a robot physically leading someone down a noisy, smoke-filled corridor by detecting the pressure of the person's hand.

Rather than installing a large number of contact sensors - which would add weight and complex wiring to the robot, and would be difficult to embed in a deforming skin - the team took a counterintuitive approach: to gauge touch, they looked to sight.

"By placing a camera inside the robot, we can infer how the person is touching it and what the person's intent is just by looking at the shadow images," Hu said. "We think there is interesting potential there, because there are lots of social robots that are not able to detect touch gestures."

The prototype robot consists of a soft inflatable bladder of nylon skin stretched around a cylindrical skeleton, roughly four feet tall, mounted on a mobile base. Under the robot's skin sits a USB camera connected to a laptop. The researchers developed a neural-network-based algorithm that uses previously recorded training data to distinguish between six touch gestures - touching with a palm, punching, touching with two hands, hugging, pointing and not touching at all - with an accuracy of 87.5% to 96%, depending on the lighting.

The robot can be programmed to respond to certain touches and gestures, such as rolling away or issuing a message through a loudspeaker, and the robot's skin has the potential to be turned into an interactive screen. By collecting enough data, a robot could be trained to recognize an even wider vocabulary of interactions, custom-tailored to its task, Hu said.

The sensing surface doesn't even have to be a robot: ShadowSense technology can be incorporated into other materials, such as balloons, turning them into touch-sensitive devices.

In addition to providing a simple solution to a complicated technical challenge, and making robots more user-friendly to boot, ShadowSense offers a comfort that is increasingly rare in these high-tech times: privacy.

"If the robot can only see you in the form of your shadow, it can detect what you're doing without taking high-fidelity images of your appearance," Hu said. "That gives you a physical filter and protection, and provides psychological comfort."
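To make the approach concrete, the sketch below shows the general kind of pipeline the article describes: a small convolutional network classifies low-resolution grayscale frames from an internal camera into the six gestures listed above, and the result is mapped to a simple robot response. This is not the Cornell team's code; the network architecture, the 64x64 input size, the class labels and the gesture-to-response table are all assumptions made for illustration.

# Illustrative sketch only (not the authors' released code): classify
# grayscale "shadow" frames from an internal USB camera into six touch
# gestures, then look up a hypothetical robot response. Layer sizes, the
# 64x64 input resolution, and the response mapping are assumed here.
import torch
import torch.nn as nn

GESTURES = ["palm_touch", "punch", "two_hands", "hug", "point", "no_touch"]

class ShadowGestureNet(nn.Module):
    def __init__(self, num_classes: int = len(GESTURES)):
        super().__init__()
        # Two small convolutional blocks suffice for coarse shadow silhouettes.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Hypothetical mapping from a recognized gesture to a robot behavior,
# mirroring the responses mentioned in the article.
RESPONSES = {
    "punch": "roll_away",
    "hug": "play_greeting_message",
    "palm_touch": "stop_and_wait",
}

def classify_frame(model: nn.Module, frame: torch.Tensor) -> str:
    """frame: a (1, 1, 64, 64) grayscale tensor from the internal camera."""
    model.eval()
    with torch.no_grad():
        logits = model(frame)
    return GESTURES[int(logits.argmax(dim=1))]

if __name__ == "__main__":
    model = ShadowGestureNet()           # would be trained on recorded shadow data
    dummy_frame = torch.rand(1, 1, 64, 64)  # stand-in for a real camera frame
    gesture = classify_frame(model, dummy_frame)
    print(gesture, "->", RESPONSES.get(gesture, "no_action"))

In practice the network would be trained on the previously recorded shadow images the article mentions, with one label per gesture class; the lookup table stands in for whatever behavior the robot designer attaches to each recognized gesture.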