Smarter AI: Machine learning without negative data
by Staff Writers
Tokyo, Japan (SPX) Nov 27, 2018
A research team from the RIKEN Center for Advanced Intelligence Project (AIP) has developed a new machine learning method that allows an AI to make classifications without what is known as "negative data," a finding that could widen the range of tasks to which classification technology can be applied.

Classifying things is critical for our daily lives: we need to detect spam mail and fake political news, as well as recognize more mundane things such as objects or faces. In AI, such tasks rely on "classification technology" in machine learning: having the computer learn the boundary that separates positive data from negative data. For example, "positive" data might be photos that include a happy face, and "negative" data photos that include a sad face. Once a classification boundary has been learned, the computer can determine whether a given data point is positive or negative.

The difficulty with this technology is that it requires both positive and negative data for learning, and negative data are unavailable in many cases. For instance, it is hard to find photos labeled "this photo includes a sad face," since most people smile in front of a camera. In terms of real-life problems, a retailer trying to predict who will make a purchase can easily find data on customers who purchased from it (positive data), but it is essentially impossible to obtain data on customers who did not (negative data), since the retailer has no access to its competitors' records. Another example is a common task for app developers: predicting which users will keep using an app (positive) and which will stop (negative). When a user unsubscribes, however, the developers lose that user's data, because the privacy policy requires them to delete it completely to protect personal information.

According to lead author Takashi Ishida of RIKEN AIP, "Previous classification methods could not cope with the situation where negative data were not available, but we have made it possible for computers to learn with only positive data, as long as we have a confidence score for our positive data, constructed from information such as buying intention or the active rate of app users. Using our new method, we can let computers learn a classifier only from positive data equipped with confidence."

Ishida, together with researcher Gang Niu from his group and team leader Masashi Sugiyama, proposed letting the computer learn from positive data augmented with a confidence score, which mathematically corresponds to the probability that a data point belongs to the positive class. They developed a method that lets a computer learn a classification boundary from positive data and its confidence (positive reliability) alone, for binary classification problems that divide data into positive and negative classes (a minimal illustrative sketch appears after the article).

To see how well the method worked, they applied it to a set of photos of fashion items carrying various labels. For example, they chose "T-shirt" as the positive class and one other item, e.g., "sandal," as the negative class, and attached a confidence score to the "T-shirt" photos. They found that, without ever accessing the negative data (e.g., the "sandal" photos), their method was in some cases just as good as one trained on both positive and negative data.

According to Ishida, "This discovery could expand the range of applications where classification technology can be used. Even in fields where machine learning has been actively used, our classification technology could be used in new situations where only positive data can be gathered due to data regulation or business constraints. In the near future, we hope to put our technology to use in various research fields, such as natural language processing, computer vision, robotics, and bioinformatics."
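The core idea reported here admits a compact illustration. If every positive sample x comes with a confidence r(x) = p(y = +1 | x), the usual two-class risk can be rewritten so that the missing negative-class term is replaced by a (1 - r)/r reweighting of the positive data, so a classifier can be trained without a single negative example. Below is a minimal sketch of that idea; the linear model, logistic loss, synthetic Gaussian data, and all function names are assumptions for demonstration, not the authors' released code.

# A minimal, illustrative sketch of positive-confidence learning with a
# linear model and logistic loss. Everything here is an assumption for
# demonstration, not the RIKEN team's released implementation.
import numpy as np

rng = np.random.default_rng(0)

def pconf_risk_grad(w, X, r):
    """Gradient of the empirical positive-confidence risk
    R_hat(w) = mean_i [ ell(f(x_i)) + ((1 - r_i) / r_i) * ell(-f(x_i)) ]
    for f(x) = w . x and logistic loss ell(z) = log(1 + exp(-z)),
    where r_i = p(y = +1 | x_i) is the confidence of positive sample x_i."""
    z = X @ w
    sig = 1.0 / (1.0 + np.exp(-z))          # sigmoid(z)
    # d ell(z)/dz = -(1 - sigmoid(z));  d ell(-z)/dz = sigmoid(z)
    coeff = -(1.0 - sig) + ((1.0 - r) / r) * sig
    return (X * coeff[:, None]).mean(axis=0)

# Positives come from one Gaussian; negatives exist in the world (a second
# Gaussian) but are never shown to the learner.
mu_pos, mu_neg = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
X_pos = rng.normal(mu_pos, 1.0, size=(2000, 2))

def true_confidence(X):
    # p(y = +1 | x) under equal-prior unit-variance Gaussians; in practice the
    # confidence would come from side information such as buying intention.
    lp = -0.5 * ((X - mu_pos) ** 2).sum(axis=1)
    ln = -0.5 * ((X - mu_neg) ** 2).sum(axis=1)
    return 1.0 / (1.0 + np.exp(ln - lp))

r = np.clip(true_confidence(X_pos), 1e-3, 1.0 - 1e-3)

# Plain gradient descent on the positive-confidence risk:
# no negative sample is ever used during training.
w = np.zeros(2)
for _ in range(2000):
    w -= 0.1 * pconf_risk_grad(w, X_pos, r)

# Evaluate against fresh data from both classes.
X_test = np.vstack([rng.normal(mu_pos, 1.0, size=(1000, 2)),
                    rng.normal(mu_neg, 1.0, size=(1000, 2))])
y_test = np.concatenate([np.ones(1000), -np.ones(1000)])
accuracy = np.mean(np.sign(X_test @ w) == y_test)
print(f"test accuracy trained without negative data: {accuracy:.3f}")

The (1 - r)/r factor acts as an importance weight: high-confidence positives contribute almost nothing to the surrogate "negative" term, while low-confidence positives stand in for the unseen negative class. This is why, when the confidence scores are accurate, training on positives alone can match ordinary supervised training, consistent with the experimental result described above.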