Amazon, Microsoft, 'putting world at risk of killer AI': study
By Issam AHMED
Washington (AFP) Aug 22, 2019

Amazon, Microsoft and Intel are among leading tech companies putting the world at risk through killer robot development, according to a report that surveyed major players from the sector about their stance on lethal autonomous weapons.

Dutch NGO Pax ranked 50 companies by three criteria: whether they were developing technology that could be relevant to deadly AI, whether they were working on related military projects, and whether they had committed to abstaining from contributing in the future.

"Why are companies like Microsoft and Amazon not denying that they're currently developing these highly controversial weapons, which could decide to kill people without direct human involvement?" said Frank Slijper, lead author of the report published this week.

The use of AI to allow weapon systems to autonomously select and attack targets has sparked ethical debates in recent years, with critics warning they would jeopardize international security and herald a third revolution in warfare after gunpowder and the atomic bomb.

A panel of government experts debated policy options regarding lethal autonomous weapons at a meeting of the United Nations Convention on Certain Conventional Weapons in Geneva on Wednesday.

Google, which last year published guiding principles eschewing AI for use in weapons systems, was among seven companies found to be engaging in "best practice" in the analysis, which spanned 12 countries, as was Japan's SoftBank, best known for its humanoid Pepper robot.

Twenty-two companies were of "medium concern," while 21 fell into a "high concern" category, notably Amazon and Microsoft, which are both bidding for a $10 billion Pentagon contract to provide cloud infrastructure for the US military.

Others in the "high concern" group include Palantir, a company with roots in a CIA-backed venture capital organization; Palantir was awarded an $800 million contract to develop an AI system "that can help soldiers analyse a combat zone in real time."

"Autonomous weapons will inevitably become scalable weapons of mass destruction, because if the human is not in the loop, then a single person can launch a million weapons or a hundred million weapons," Stuart Russell, a computer science professor at the University of California, Berkeley told AFP on Wednesday.

"The fact is that autonomous weapons are going to be developed by corporations, and in terms of a campaign to prevent autonomous weapons from becoming widespread, they can play a very big role," he added.

The development of AI for military purposes has triggered debate and protests within the industry: last year, Google declined to renew a Pentagon contract called Project Maven, which used machine learning to distinguish people and objects in drone videos.

It also dropped out of the running for Joint Enterprise Defense Infrastructure (JEDI), the cloud contract that Amazon and Microsoft are hoping to bag.

The report noted that Microsoft employees had also voiced their opposition to a US Army contract for an augmented reality headset, HoloLens, that aims at "increasing lethality" on the battlefield.

- What they might look like -

According to Russell, "anything that's currently a weapon, people are working on autonomous versions, whether it's tanks, fighter aircraft, or submarines."

Israel's Harpy is an autonomous drone that already exists, "loitering" in a target area and selecting sites to hit.

More worrying still are new categories of autonomous weapons that don't yet exist -- these could include armed mini-drones like those featured in the 2017 short film "Slaughterbots."

"With that type of weapon, you could send a million of them in a container or cargo aircraft -- so they have destructive capacity of a nuclear bomb but leave all the buildings behind," said Russell.

Using facial recognition technology, the drones could "wipe out one ethnic group or one gender, or using social media information you could wipe out all people with a political view."

The European Union in April published guidelines for how companies and governments should develop AI, including the need for human oversight, working towards societal and environmental wellbeing in a non-discriminatory way, and respecting privacy.

Russell argued it was essential to take the next step in the form of an international ban on lethal AI, which could be summarized as "machines that can decide to kill humans shall not be developed, deployed, or used."

