Scientists attempt to teach robots human values
by Brooks Hays
Ithaca, N.Y. (UPI) Sep 8, 2016
A pair of artificial intelligence experts from Cornell University has joined a nationwide effort to ensure that nightmare science fiction scenarios -- the ones involving corrupted, human-killing computers -- don't become reality. The effort is organized by the Center for Human-Compatible Artificial Intelligence, based at the University of California, Berkeley.

To keep computers behaving properly as they take on more responsibility and autonomy, researchers are working to instill human values in their software. They want the algorithms that govern AI decision-making to include an understanding of human ethics.

"We are in a period in history when we start using these machines to make judgments," researcher Bart Selman, a professor of computer science at Cornell, explained in a news release. "If decisions are properly structured, the horrors we've seen in the movies won't happen."

Selman is an expert in programming computer decision-making processes. He recently helped Tesla with the company's self-driving car technology.

Self-driving cars must be programmed to make a variety of difficult decisions. For example, they must calculate the risks of passing a slow car or veering into another lane to avoid an object. These sorts of decisions aren't necessarily all that different from ethical dilemmas. Given only two poor options, should a car hit a dog or a group of schoolchildren? Self-driving cars -- and the humans who program them -- must decide whether to prioritize the driver's safety or that of the public at large. Should a car risk the driver's life to save a group of pedestrians?

The stakes will only be magnified as computers take on bigger, more comprehensive management tasks, such as running an entire air traffic control tower or hospital.

Joseph Halpern, a professor of computer science at Cornell and an expert in decision theory, says providing an artificially intelligent agent with as much information as possible will make these difficult decisions more manageable. "If you have lots of data you can estimate the probabilities well and get a much better handle on uncertainty," Halpern said.

Scientists at Georgia Tech have been working to instill human values by teaching robots fairy tales. The approach isn't all that different from the strategy suggested by Selman and Halpern. Ultimately, Halpern says, computers may be best served by watching how humans respond to ethical dilemmas.
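Neither Selman nor Halpern has published the code behind the systems described here, but Halpern's point about data and probabilities can be illustrated with a minimal expected-utility sketch. Everything in the example below is hypothetical -- the action names, probability estimates, and utility values are illustrative assumptions, not figures from the research -- but it shows the standard decision-theoretic pattern: score each action by its probability-weighted utilities, and pick the best.

    # Minimal expected-utility sketch (all numbers are hypothetical).
    # A decision-theoretic agent scores each action by summing the
    # utility of every possible outcome, weighted by its estimated
    # probability, then chooses the highest-scoring action.

    def expected_utility(outcomes):
        """Probability-weighted sum of utilities for one action."""
        return sum(p * u for p, u in outcomes)

    # A self-driving car deciding whether to pass a slow vehicle.
    # Each action lists (estimated probability, utility) outcome pairs.
    actions = {
        "stay_behind": [(1.00, -1.0)],        # certain, small delay
        "pass_slow_car": [(0.98, 5.0),        # likely: time saved
                          (0.02, -500.0)],    # rare: collision
    }

    for name, outcomes in actions.items():
        print(f"{name}: expected utility = {expected_utility(outcomes):.2f}")

    best = max(actions, key=lambda name: expected_utility(actions[name]))
    print("chosen action:", best)

With only a rough 2 percent collision estimate, the sketch stays behind the slow car; given enough driving data to revise that estimate down to, say, 0.1 percent, the expected utility of passing turns positive and the choice flips. That sensitivity to the quality of the probability estimates is the "better handle on uncertainty" Halpern describes.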