Artificial bug eyes
by Staff Writers
Washington DC (SPX) Jan 10, 2019
Single-lens eyes, like those in humans and many other animals, can create sharp images, but the compound eyes of insects and crustaceans have an edge when it comes to peripheral vision, light sensitivity and motion detection. That's why scientists are developing artificial compound eyes to give sight to autonomous vehicles and robots, among other applications. Now, a report in ACS Nano describes the preparation of bioinspired artificial compound eyes using a simple, low-cost approach.

Compound eyes are made up of tiny, independent, repeating visual receptors, called ommatidia, each consisting of a lens, cornea and photoreceptor cells. Some insects have thousands of units per eye; creatures with more ommatidia have greater visual resolution.

Attempts to create artificial compound eyes in the lab are often limited by cost; the devices tend to be large and sometimes include only a fraction of the ommatidia and nanostructures typical of natural compound eyes. Some groups use lasers and nanotechnology to generate artificial bug eyes in bulk, but the structures tend to lack uniformity and are often distorted, which compromises sight. To make artificial insect eyes with better visual properties, Wenjun Wang and colleagues developed a new fabrication strategy that yields greater structural homogeneity.

As a first step, the researchers shot a laser through a double layer of acrylic glass, focusing on the lower layer. The laser caused the lower layer to swell, creating a convex dome shape. The researchers created an array of these tiny lenses that could themselves be bent along a curved structure to form the artificial eye. Then, through several steps, they grew nanostructures on top of the convex glass domes that, up close, resemble a shag carpet. The nanostructures endowed the microlenses with desirable antireflective and water-repellent properties.
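The link between ommatidia count and resolution follows from simple geometry: each facet samples one direction, so packing more facets onto the same eye shrinks the angular step between samples. A minimal Python sketch, assuming a hemispherical eye with the standard interommatidial-angle approximation (angular pitch roughly facet diameter divided by eye radius); the eye radius and facet counts are illustrative, not taken from the paper:

```python
import math

def interommatidial_angle(eye_radius_mm: float, n_ommatidia: int) -> float:
    """Approximate interommatidial angle (degrees) for a hemispherical
    compound eye.

    The hemisphere area 2*pi*R^2 is divided evenly among n circular
    facets, giving a facet diameter D; the angular pitch between
    neighbouring viewing directions is then roughly D / R.
    """
    hemisphere_area = 2.0 * math.pi * eye_radius_mm ** 2
    facet_area = hemisphere_area / n_ommatidia            # area per ommatidium
    facet_diameter = 2.0 * math.sqrt(facet_area / math.pi)
    return math.degrees(facet_diameter / eye_radius_mm)   # small-angle approx.

# Illustrative: a 1 mm eye with 3,000 vs. 30,000 ommatidia
for n in (3_000, 30_000):
    print(f"{n:>6} ommatidia -> ~{interommatidial_angle(1.0, n):.2f} deg between facets")
```

Tenfold more ommatidia shrinks the angular pitch by about a factor of the square root of ten (here, from roughly 3.0 to 0.9 degrees), which is why facet count, and the uniformity of the fabricated microlens array, matters so much for resolution.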
How game theory can bring humans and robots closer together
Sussex UK (SPX) Jan 08, 2019
Researchers at the University of Sussex, Imperial College London and Nanyang Technological University in Singapore have for the first time used game theory to enable robots to assist humans in a safe and versatile manner. The research team used adaptive control and Nash equilibrium game theory to programme a robot that can understand its human user's behaviour in order to better anticipate their movements and respond to them. The researchers believe the breakthrough could help robots complement ...
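The teaser names Nash equilibrium as the key idea: each party's control is a best response to the other's, so neither can improve its own outcome by unilaterally changing behaviour. The published work formulates this as a continuous-time differential game between human and robot controllers; as a rough, self-contained illustration of the equilibrium idea only (not the authors' actual controller), here is a one-shot quadratic shared-control game solved by best-response iteration. The cost structure and all weights below are assumptions for the example:

```python
def nash_shared_control(target: float, q_h: float = 4.0, r_h: float = 1.0,
                        q_r: float = 1.0, r_r: float = 1.0,
                        iters: int = 50) -> tuple[float, float]:
    """Best-response iteration for a one-shot quadratic game.

    Human and robot each choose an effort u_i to drive a shared output
    x = u_h + u_r toward `target`, trading tracking error against effort:
        J_i = q_i * (target - u_h - u_r)**2 + r_i * u_i**2
    Setting dJ_i/du_i = 0 gives the best response
        u_i = a_i * (target - u_j) / (1 + a_i),  with a_i = q_i / r_i.
    Alternating best responses converges to the Nash equilibrium, since
    each factor a_i / (1 + a_i) < 1 makes the update a contraction.
    """
    a_h, a_r = q_h / r_h, q_r / r_r
    u_h = u_r = 0.0
    for _ in range(iters):
        u_h = a_h * (target - u_r) / (1.0 + a_h)
        u_r = a_r * (target - u_h) / (1.0 + a_r)
    return u_h, u_r

u_h, u_r = nash_shared_control(target=1.0)
print(f"human effort {u_h:.3f}, robot effort {u_r:.3f}")  # ~0.667 and ~0.167
```

At the equilibrium, neither party can lower its own cost by unilaterally changing effort; raising the human's error weight q_h shifts effort toward the human and the robot yields, which captures, in miniature, the kind of adaptive give-and-take the Sussex/Imperial/NTU work builds on.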