Cambridge University will open a center for "Terminator studies", a facility focused on understanding the four greatest threats to mankind: artificial intelligence, nuclear war, climate change and rogue biotechnology.
The Centre for the Study of Existential Risk, or CSER, will be co-launched by the Astronomer Royal Lord Rees, a leading cosmologist. The center will bring together academics from a wide range of disciplines, including astronomy, biology, robotics, neuroscience and even philosophy and economics.
Rees is the man who warned that mankind could destroy itself completely by the year 2100. He is launching the center alongside Huw Price, a philosophy professor at Cambridge. Also an integral part of the program is Skype co-founder, Jaan Tallinn.
Price said: "Nature didn't anticipate us, and we in our turn shouldn't take artificial general intelligence or AGI for granted. We need to take seriously the possibility that there might be a 'Pandora's box' moment with AGI that, if missed, could be disastrous."
The concept of machines wiping out mankind and taking over the world has been explored again and again in movies and books, the most prominent example being the "Terminator" films, in which Arnold Schwarzenegger plays a powerful robot sent on a mission.
Professor Price added that, while machines have bested man in speech and handwriting recognition, financial trading, chess and other tasks, there is a concern that mankind could risk yielding control over the planet to intelligences with different agendas.
In 1965, Irving John 'Jack' Good sat down and wrote a paper called "Speculations Concerning the First Ultraintelligent Machine". Good, a Cambridge-trained mathematician, Bletchley Park cryptographer and pioneering computer scientist, wrote that in the near future an ultra-intelligent machine would be built, that it would be the "last invention" mankind need ever make, and that it would trigger an intelligence explosion and render its creators obsolete.
by RTT Staff Writer
For comments and feedback: email@example.com