Hawking Warns That a Rich, ‘Superhuman’ Species Could Wipe Out Humanity

The cosmologist’s last writings predict that humanity risks being replaced by genetically modified "superhumans"

Superhumans

In a newly released book, Brief Answers to the Big Questions, Stephen Hawking, the world-famous British theorist and author who died in March aged 76, predicted that advances in genetic engineering might lead to a new breed of genetically modified “superhumans” that could destroy the rest of humanity.

In the book, a collection of the esteemed scientist’s final writings, Hawking argues, among other things, that an apocalyptic scenario becomes all but inevitable once the ultra-rich are able to make themselves smarter, stronger, and more resistant to disease.

“Once such superhumans appear, there will be significant political problems with unimproved humans, who won’t be able to compete,” Hawking wrote before adding “presumably, they will die out, or become unimportant.”

The physicist said he expected laws to be put in place to prevent genetic engineering in humans, but that ultimately some people would not be able “to resist the temptation to improve human characteristics, such as size of memory…and length of life.”

Hawking himself seemed curiously enthusiastic about the prospect of enhancement, writing: “There is no time to wait for Darwinian evolution to make us more intelligent and better natured.”

Earth’s Bleak Future

Given the current state of society, in which economic divisions continue to drive social inequality higher, it is easy to follow Hawking’s argument for such a genetically polarized future, and to see how an individual’s wealth or sense of class essentialism would factor into it. Well-intentioned emerging technologies such as rapid DNA sequencing and CRISPR gene editing, if accessible only to the ultra-rich, could produce exactly the kind of genetically enhanced class divide Hawking described.

In his book, Hawking also delivered a serious warning about the importance of regulating artificial intelligence (AI) and the need to study the field’s impact on humanity, from the workplace to the military. According to the renowned physicist, in the future “AI could develop a will of its own, a will that is in conflict with ours.”

Hawking also touched on the biggest threat facing planet Earth: climate change. “A rise in ocean temperature would melt the ice caps and cause the release of large amounts of carbon dioxide,” he noted, adding that “Both effects could make our climate like that of Venus with a temperature of 250°C.”

On a more hopeful note, the physicist believed one of the best ideas humanity could pursue is nuclear fusion power, a thermonuclear process that would provide safe, reliable power and, most importantly, clean energy with no pollution or contribution to global warming.

References: The Sunday Times, QZ
