Google Says No to Doing AI Weapons Work

Google won’t do artificial intelligence work for weapons, the company said Thursday.  

The company will not work on “technologies that cause or are likely to cause overall harm,” wrote Sundar Pichai, Google’s CEO, in a blog post.


Google has come under fire in recent months for its contract with the U.S. Department of Defense to use AI for sifting through drone footage. AI is a field of study in which computers are made capable of doing things typically associated with human behavior, such as making decisions, planning and learning.


Google and other tech firms have been bringing advances in AI to fields such as medicine, natural disaster planning, energy, transportation and manufacturing.

But these advances have also led to ethical concerns about the kinds of decisions being made without human input.

“How AI is developed and used will have a significant impact on society for many years to come,” Pichai wrote. “As a leader in AI, we feel a special responsibility to get this right.”


Sifting through drone footage

In recent months, more than 4,000 Google employees signed a petition calling for the cancellation of the company’s contract with the Department of Defense as part of the DoD’s Project Maven initiative. They joined other critics in raising alarms that the project could lead to the use of autonomous weapons.


Last week, a Google executive reportedly told employees that the company would not seek to renew its Project Maven contract with the military.  

Kirk Hanson, the executive director of the Markkula Center for Applied Ethics at Santa Clara University, which counts Google as a financial supporter, said Google’s contract highlights a larger debate about AI and military applications.

“Until we have trust that those systems will not make mistakes, we’re going to have a lot of doubts about the use of artificial intelligence for autonomous weapons,” Hanson said.


Google will continue some work for the military, Pichai said.


“We want to be clear that while we are not developing AI for use in weapons, we will continue our work with governments and the military in many other areas,” he said. “These include cybersecurity, training, military recruitment, veterans’ healthcare, and search and rescue.”


