Elon Musk and Google promise not to build AI weapons

R13...

Honorary Master
Joined
Aug 4, 2008
Messages
30,827
Right...
"Thou shalt not make a machine in the likeness of a human mind."

and then sometime later in another universe:
Church and Pax: "Thou shalt not build a thinking machine equal to or superior to humankind"
 

eg2505

Honorary Master
Joined
Mar 12, 2008
Messages
16,535
Didn't Google develop drone technology a while back?

Isn't this now a bit hypocritical?
 

saor

Honorary Master
Joined
Feb 3, 2012
Messages
20,796
Didn't Google develop drone technology a while back?

Isn't this now a bit hypocritical?
The pledge requires that companies and high-profile individuals “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons”.

“AI weapons that autonomously decide to kill people are as disgusting and destabilising as bioweapons, and should be dealt with in the same way,” said Tegmark.
The focus here is on strong AI involvement and lethality. Can't recall what the drone project was about, but I don't think it's really covered by these two things.
 

garyc

Expert Member
Joined
Jun 30, 2010
Messages
2,604
Good idea to stay out of this. There is still a legal minefield around who would be criminally liable if one of these weapons kills the wrong person.
 

DMNknight

Expert Member
Joined
Oct 17, 2003
Messages
3,280
This is so illogical it's stupid.
Someone is going to develop lethal AI weapons, and if it's not someone moral, it will be someone immoral.

We've already seen this with Middle Eastern governments and chemical warfare, despite the Geneva Convention.

I would rather someone build a moral AI capable of wielding lethal weapons than have someone build an uncontrolled or immoral one.

The fundamental truth of any truly capable AI is that the first one out of the door will forever hold a time advantage over all other AI that follows it. A time advantage means a technological advantage, and the machine equivalent of an IQ advantage, over any of its younger siblings.
 