Godfather Of AI Says Artificial Intelligence Could Destroy Humanity In 30 Years
This story is fairly straightforward, and it's not as alarmist as you might think. Geoffrey Hinton, who won the Nobel Prize this year for his pioneering work in artificial intelligence, told BBC Radio 4 that he puts the chances of a potential AI apocalypse at "around 10 to 20%" within the next 30 years. Well, fair enough, but why?
Hinton went on to explain that the existential threat posed by advanced AI comes down primarily to our ability to control something smarter than we are. "How many examples do you know of a more intelligent thing being controlled by a less intelligent thing? There are very few examples," he stated. He drew a comparison to a mother and her child: a toddler can control its mother, but only through instincts that have evolved over millions of years. We've developed alarmingly capable ...
Copyright of this story solely belongs to hothardware.com.