People may trust computers more than humans
by Staff Writers
Athens GA (SPX) Apr 14, 2021
Despite increasing concern over the intrusion of algorithms into daily life, people may be more willing to trust a computer program than their fellow humans, especially when a task becomes too challenging, according to new research from data scientists at the University of Georgia. From picking the next song on a playlist to choosing the right size of pants, people are relying more on the advice of algorithms to help make everyday decisions and streamline their lives.

"Algorithms are able to do a huge number of tasks, and the number of tasks that they are able to do is expanding practically every day," said Eric Bogert, a Ph.D. student in the Terry College of Business Department of Management Information Systems. "It seems like there's a bias towards leaning more heavily on algorithms as a task gets harder, and that effect is stronger than the bias towards relying on advice from other people."

Bogert worked with management information systems professor Rick Watson and assistant professor Aaron Schecter on the paper, "Humans rely more on algorithms than social influence as a task becomes more difficult," which was published April 13 in Nature's Scientific Reports. Their study, which involved 1,500 individuals evaluating photographs, is part of a larger body of work analyzing how and when people work with algorithms to process information and make decisions.

For this study, the team asked volunteers to count the number of people in a photograph of a crowd and supplied suggestions generated by a group of other people alongside suggestions generated by an algorithm. As the number of people in the photograph grew, counting became more difficult and participants were more likely to follow the algorithm's suggestion rather than count themselves or follow the "wisdom of the crowd," Schecter said.
Schecter explained that counting was an important choice of trial task because the task becomes objectively harder as the number of people in the photo increases. It is also the type of task that laypeople expect computers to be good at.

"This is a task that people perceive that a computer will be good at, even though it might be more subject to bias than counting objects," Schecter said. "One of the common problems with AI is when it is used for awarding credit or approving someone for loans. While that is a subjective decision, there are a lot of numbers in there - like income and credit score - so people feel like this is a good job for an algorithm. But we know that dependence leads to discriminatory practices in many cases because of social factors that aren't considered."

Facial recognition and hiring algorithms have also come under scrutiny in recent years because their use has revealed cultural biases in the way they were built, which can cause inaccuracies when matching faces to identities or screening for qualified job candidates, Schecter said. Those biases may not be present in a simple task like counting, but their presence in other trusted algorithms is a reason why it is important to understand how people rely on algorithms when making decisions, he added.

This study is part of Schecter's larger research program into human-machine collaboration, which is funded by a $300,000 grant from the U.S. Army Research Office.

"The eventual goal is to look at groups of humans and machines making decisions and find how we can get them to trust each other and how that changes their behavior," Schecter said. "Because there's very little research in that setting, we're starting with the fundamentals."

Schecter, Watson and Bogert are currently studying how people rely on algorithms when making creative and moral judgments, such as writing descriptive passages and setting bail for prisoners.