Stephen Hawking: 'I fear AI may replace humans altogether'

The theoretical physicist, cosmologist and author talks Donald Trump, tech monopolies and humanity's future
Professor Stephen Hawking was photographed by WIRED at his office in Cambridge in 2016. Photograph: Platon

Without science, it's all fiction. And yet our world increasingly resembles a fictional one, accelerating towards a dystopian reality that few would have predicted just a few years ago. Despite science's inexorable march of progress - from the discovery of new cancer drugs to the development of quantum computation - extremist political movements and the wanton spread of falsehoods frustrate its dissemination. This opposition to scientific culture has real consequences: diseases once eradicated re-emerge as anti-vaccination beliefs spread; cataclysmic hurricanes batter entire cities as climate-change denial prevents global solutions; democratic elections are undermined by shadowy adversaries using digital technology.

In March this year, the scientific community, beleaguered by the anti-science sentiment stoked by conservative populism, took to the streets, marching for science across cities around the world. But as science becomes politicised, should scientists become political? In times when facts are considered optional rather than essential, should scientists defend their empirical view of the world against demagogy and sensationalism?

Professor Stephen Hawking - theoretical physicist, cosmologist, author and a key member of the advisory board for the Starmus Festival - talked to WIRED to give his views on today's issues. From Donald Trump to fake news, from the digital duopoly of Facebook and Google to the potential perils of AI, WIRED presents some scientific perspective to cut through the noise.

Stephen Hawking on the anti-science movement

People distrust science because they don't understand how it works. It seems as if we are now living in a time in which science and scientists are in danger of being held in low, and decreasing, esteem. This could have serious consequences. I am not sure why this should be, as our society is increasingly governed by science and technology, yet fewer young people seem to want to take up science as a career. One answer might be to announce a new and ambitious space programme to excite them, and stimulate interest in other areas such as astrophysics and cosmology.

On what he'd say to Donald Trump

I would ask him why he thinks his travel ban is a good idea. This brands as Daesh terrorists all citizens of six mainly Muslim countries, but doesn't include America's allies such as Iraq, Saudi Arabia and Qatar, which allegedly help finance Daesh. This blanket ban is inefficient and prevents America from recruiting skilled people from these countries. I would also ask him to renounce his denial of climate change. But again, I fear neither will happen as Trump continues to appease his electorate.

On the potential perils of AI

The genie is out of the bottle. We need to move forward on artificial intelligence development but we also need to be mindful of its very real dangers. I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that replicates itself. This will be a new form of life that will outperform humans.

On the digital duopoly of Facebook and Google

I worry about the control that big corporations have over information. The danger is that we get into the situation that existed in the Soviet Union with its newspapers, Pravda, which means "truth", and Izvestia, which means "news". The joke was that there was no truth in Pravda and no news in Izvestia. Corporations will always promote stories that reflect well on them and suppress those that don't.

On what scientific research should be pursued most urgently

My preference would be to pursue rigorously a space-exploration programme, with a view to eventually colonising suitable planets for human habitation. I believe we have reached the point of no return. Our Earth is becoming too small for us, global population is increasing at an alarming rate and we are in danger of self-destructing. Whether this would be the result of damage to the environment or a nuclear war of devastating proportions, we need to actively pursue an alternative way of living if the human race is to survive for another 1,000 years.

This article was originally published by WIRED UK