“If an AI [Artificial Intelligence] possessed any one of these skills—social abilities, technological development, economic ability—at a superhuman level, it is quite likely that it would quickly come to dominate our world in one way or another. And as we’ve seen, if it ever developed these abilities to the human level, then it would likely soon develop them to a superhuman level. So we can assume that if even one of these skills gets programmed into a computer, then our world will come to be dominated by AIs or AI-empowered humans.”
― Stuart Armstrong, Smarter Than Us: The Rise of Machine Intelligence
Science fiction has an uncanny way of sometimes becoming science fact.
Just consider author Jules Verne, who wrote From the Earth to the Moon in 1865, about men traveling to the Moon. (The story is also notable because Verne attempted some rough calculations of the requirements and, given how little data existed on the subject at the time, some of his figures are surprisingly close to reality.) That journey actually happened with Apollo 11 in 1969.
Or the 2004 movie I, Robot (based on fiction writer Isaac Asimov’s 1950 short-story collection of the same name), in which “humanoid” robots with AI serve humanity in their daily domestic lives. It was reported back in November 2015 that UBTECH Robotics was developing Alpha 2, a small humanoid robot expected to cost around $700 (US) that can move on its own, talk freely (entertaining you, your children, and guests with conversation), follow verbal commands, act as a personal secretary (managing your calendar and verbally reminding you of tasks), monitor your home’s security, and more.
And consider the 1984 movie The Terminator, in which AI robotic war machines turn on mankind and try to wipe out humanity. Today, this headline is in the news: “Pentagon program works to develop hunter-killer robots.”
In the quote above, Armstrong suggests that if an AI develops an understanding of human social skills, the ability to make or use technology, or the means to exploit our economic systems, it will come to dominate our world. And that means we humans will be its servants, not vice versa.
But in war, an AI needs none of those understandings. All it needs is the ability to “think” independently and, once instructed, to know where and against whom to attack and kill.
And it isn’t farfetched to believe that, given enough time, an AI could ultimately become a sentient (conscious, self-aware) being, no longer in need of human direction and capable of independent thought.
One shudders to wonder whether this might be the next science fiction to become science fact.