
AI Companies, Scientists, and Others Sign Pledge Not to Develop Lethal Autonomous Weapons

July 20, 2018

According to a recent press release, “After years of voicing concerns, AI leaders have, for the first time, taken concrete action against lethal autonomous weapons, signing a pledge to ‘neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.’ The pledge has been signed to date by over 160 AI-related companies and organizations from 36 countries, and 2,400 individuals from 90 countries. Signatories of the pledge include Google DeepMind, University College London, the XPRIZE Foundation, ClearPath Robotics/OTTO Motors, the European Association for AI (EurAI), the Swedish AI Society (SAIS), Demis Hassabis, British MP Alex Sobel, Elon Musk, Stuart Russell, Yoshua Bengio, Anca Dragan, and Toby Walsh. Max Tegmark, president of the Future of Life Institute (FLI) which organized the effort, announced the pledge on July 18 in Stockholm, Sweden during the annual International Joint Conference on Artificial Intelligence (IJCAI), which draws over 5,000 of the world’s leading AI researchers. SAIS and EurAI were also organizers of this year’s IJCAI.”

The release goes on, “Said Tegmark, ‘I’m excited to see AI leaders shifting from talk to action, implementing a policy that politicians have thus far failed to put into effect. AI has huge potential to help the world – if we stigmatize and prevent its abuse. AI weapons that autonomously decide to kill people are as disgusting and destabilizing as bioweapons, and should be dealt with in the same way.’ Lethal autonomous weapons systems (LAWS) are weapons that can identify, target, and kill a person, without a human ‘in-the-loop.’ That is, no person makes the final decision to authorize lethal force: the decision and authorization about whether or not someone will die is left to the autonomous weapons system. (This does not include today’s drones, which are under human control. It also does not include autonomous systems that merely defend against other weapons, since “lethal” implies killing a human.)”

Read more at Future of Life Institute.

Photo credit: Neil Rosenstech on Unsplash
