Stephen Hawking feared race of 'superhumans' able to manipulate their own DNA

Stephen Hawking, the physicist whose bodily paralysis turned him into a symbol of the soaring power of the human mind, feared a race of “superhumans” capable of manipulating their own evolution.

Before he died in March, the Cambridge University professor predicted that people this century would gain the capacity to edit human traits such as intelligence and aggression. And he worried that the power of genetic engineering would be concentrated in the hands of the wealthy.

Hawking mulled this future in a set of essays and articles being published posthumously Tuesday as “Brief Answers to the Big Questions,” a postscript of sorts to his 1988 “A Brief History of Time: From the Big Bang to Black Holes,” which has sold more than 10 million copies.

An excerpt released two days in advance by the Sunday Times sheds light on the final musings of the physicist and best-selling author beset by a degenerative motor neuron disease similar to amyotrophic lateral sclerosis, or Lou Gehrig’s disease.

Humanity, he wrote, was entering “a new phase of what might be called self-designed evolution, in which we will be able to change and improve our DNA. We have now mapped DNA, which means we have read ‘the book of life,’ so we can start writing in corrections.”

Initially, he predicted, these modifications would be reserved for the repair of certain defects, such as muscular dystrophy, that are controlled by single genes and therefore make for relatively simple corrections.

“Nevertheless, I am sure that during this century people will discover how to modify both intelligence and instincts such as aggression,” Hawking wrote.

There would be attempts to pass laws restricting the genetic engineering of human traits, he anticipated. “But some people won’t be able to resist the temptation to improve human characteristics, such as size of memory, resistance to disease and length of life,” he wrote.

“Once such superhumans appear, there are going to be significant political problems with the unimproved humans, who won’t be able to compete,” Hawking reasoned. “Presumably, they will die out, or become unimportant.”

Ultimately, he envisioned a “race of self-designing beings who are improving themselves at an ever-increasing rate. If the human race manages to redesign itself, it will probably spread out and colonise other planets and stars.”

In the excerpted material, Hawking did not elaborate on the disparities between “superhumans” and the “unimproved,” but an article accompanying the excerpt in the Sunday Times suggests that Hawking’s fear was specifically that “wealthy people will soon be able to choose to edit their own and their children’s DNA.” The article, by the newspaper’s science editor, draws a parallel to the 20th-century eugenics movement — premised similarly on the notion that human improvement could arise from genetic manipulation.

Some researchers and ethicists already fear that DNA editing is outpacing existing ethical standards. Apprehension mainly surrounds CRISPR — “clustered regularly interspaced short palindromic repeats” — which has evolved from a component of bacterial defense into a means of altering specific DNA sequences with an eye toward enhancement.

The gene-editing technology has already been a source of major patent disputes. There are hopes that the tool could be used for low-cost disease detection. Meanwhile, applications in the food industry have alarmed consumer and environmental groups.

Though Hawking saw himself as an “optimist,” as he wrote in “Brief Answers to the Big Questions,” the book is a warning of runaway technological ambition. He was particularly worried that the promise of artificial intelligence — the possibility of the eradication of disease and poverty — would blind its developers to long-term costs, namely that humans could lose control over its growth.

“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders and potentially subduing us with weapons we cannot even understand,” he wrote. “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”

Artificial intelligence that is able to improve upon itself without human assistance may be able to amass intelligence that “exceeds ours by more than ours exceeds that of snails,” Hawking warned.

Hawking, whose exploration of gravity and black holes marked a transformation in modern physics, proffered answers to a number of other mysteries. Among them are the long-term feasibility of life on Earth (“I regard it as almost inevitable that either a nuclear confrontation or environmental catastrophe will cripple the Earth at some point in the next 1,000 years”); the existence of God (“If you like, you can call the laws of science ‘God,’ but it wouldn’t be a personal God that you would meet and put questions to”); and the biggest threat to the future of the planet (“asteroid collision”).

https://www.washingtonpost.com/news/morning-mix/wp/2018/10/15/stephen-hawking-feared-race-of-superhumans-able-to-manipulate-their-own-dna/
