Tesla CEO Elon Musk and 115 other tech leaders announced Sunday that they had sent a letter to the United Nations asking it to ban “killer robots,” formally known as lethal autonomous weapons.
The coalition signed the open letter to urge the international body to address the inherent dangers of such advanced technology, specifically its use on the battlefield, by effectively outlawing it.
“Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,” the letter states, according to a press release issued by the Future of Life Institute, an organization whose board of advisers includes Musk and physicist Stephen Hawking. “These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”
Musk has expressed his dire concerns over artificial intelligence and robotics several times in recent months, despite developing and employing the technology for his self-driving cars.
“Musk of all people should know the future is always rife with uncertainty—after all, he helps construct it with each new revolutionary undertaking,” Ryan Hagemann, director of technology policy at the think tank the Niskanen Center, wrote in a blog post. “Imagine if there had been just a few additional regulatory barriers for SpaceX or Tesla to overcome.”
During a National Governors Association meeting in July, Musk described AI as the “biggest risk we face as a civilization.”
“Until people see robots going down the street killing people,” Musk said, “they don’t know how to react because it seems so ethereal.”
Musk said earlier in August that AI poses more of a risk than a nuclear weapon-armed North Korea.
Tech-focused groups have criticized Musk for his seemingly anti-AI position.
The Information Technology and Innovation Foundation (ITIF), for example, called the Tesla CEO an “alarmist” in 2015 for pledging $1 billion to prevent the proliferation of autonomous robots, adding that he and his ilk stoke fear about an upcoming artificial intelligence revolution.
Certain tech executives have also indirectly criticized Musk for his doomsday clamoring. Facebook CEO Mark Zuckerberg said Sunday that people who raise fears over the advent of AI are “pretty irresponsible.”
“I think you can build things and the world gets better. With AI especially, I am really optimistic,” Zuckerberg said, according to Axios. “I think people who are naysayers and try to drum up these doomsday scenarios — I just I don’t understand it … In the next five to 10 years, AI is going to deliver so many improvements in the quality of our lives.”
While he did not name anyone in particular and was ostensibly referring to general fears about the prospect of such capabilities, the tech wunderkind was in all likelihood alluding to specific comments made by fellow bigwig Musk, and perhaps even Hawking. (RELATED: Elon Musk: Zuckerberg Has ‘Limited’ Understanding Of Artificial Intelligence)
But uneasiness about autonomous technology does not just originate from some of the people who purportedly understand it the most. Majorities across political affiliations believe there should be some form of regulation on AI, according to a fairly recent Morning Consult poll: 73 percent of Democrats, 74 percent of Republicans, and 65 percent of independents answered in the affirmative when asked if the U.S. should impose regulations.
Other noteworthy signatories of the letter were Mustafa Suleyman, co-founder of Google’s DeepMind, and Esben Østergaard, founder and CTO of Denmark’s Universal Robots.
Despite the apparent split of opinion on AI within the larger tech industry, Musk and several others in the robotics field are pressing ahead with their campaign against specific applications of the technology. (RELATED: Sex Robots Are Here And Could Change Society Forever)
“We do not have long to act,” the letter urged. “Once this Pandora’s box is opened, it will be hard to close.”