"We're all gonna die!" says Brit Astronomer
From al-Guardian; edited to eliminate redundant babbling and irrelevant (but cute) puppy.
Martin Rees is rather chirpy for a horseman of the apocalypse. He welcomes me to his house in Cambridge with a firm handshake and optimistic smile. In his new book, Our Final Century, the astronomer royal predicts that we're doomed. Well, almost. The subtitle is not quite so hopeless: "Will the human race survive the 21st century?" it asks. Ultimately, Rees concludes that we have no more than a 50-50 chance of surviving. He acknowledges that many people have been surprised by the book. After all, Rees is an internationally respected astrophysicist best known for highly technical work on black holes, cosmic evolution and the six numbers that define the universe. And this is certainly not a technical book — in many ways, it is not even a science book. . . .
. . . or even one grounded in reality . . .
At heart it is a series of generalised but coherent essays, written by a deeply messed-up worried man. Rees says he doesn't know why others are so puzzled about the book. "Some people have asked me why I've written a book that seems a departure from the books I've written before as though I've suddenly shifted my interests, and that's not the case at all." After all, he's voiced his fears about the abuse of technology for many years, notably about the nuclear arms race. Now, he says, he's simply expanded his thesis to incorporate new and even nastier risks.
Has he always been so pessimistic? "I don't think the new book is irredeemably pessimistic. It says there are threats but there are also opportunities. There are some technologies that are benign socially and benign environmentally. Miniaturisation means we consume less raw materials and the internet democratises information and access." In short we should be able to feed the world, save the planet and redistribute power on a global basis. Which is fantastic. But (and this is where Rees gets into his stride): "We are inevitably empowering more people with the potentiality to harm on an ever growing scale. And we're in a society which is more brittle and interconnected, and I think this is something we are going to have to confront. Also some new technologies don't require very large-scale equipment. It needs a hell of a big facility to make a nuclear weapon, but it doesn't in order to tinker with a virus. So when we have people all over the world who experiment with biotech, then of course we are concerned that even one or two of them might misuse that knowledge with the possibility of disastrous outcomes. What I am saying is that a weirdo, someone with a mindset that could now make a computer virus, may one day be able to create a genetically modified real virus, which could cause thousands of fatalities."
Artificial intelligence is another worry. Soon enough, he says, we may make robots that are smarter than us and they may decide we are redundant. Or we may start inserting chips into our brain to make ourselves that little bit smarter or fitter and find that we end up more computer than human. There is a chapter on asteroid impact, but he's relatively sanguine on this front. "The reason I introduced asteroid impact is that they set a baseline level of risk. But the point is that the risk of impacts is not getting any worse. The environmental risks that have increased are ones made by humans."
The more he talks about what is already happening with terror groups and anthrax scares, the more convincing he becomes. Rees says that by its nature, this jeremiad has to be generalised. When he talks about the horror that awaits us, he refers to it urbanely as a "major setback". It's a lovely way of putting it, I say, strangely comforting. "Well I can't be more specific than that. Although we can foresee what is going to happen in the next 10-20 years in science, we can't foresee anything like the next 100 years ahead. In 1900 people like HG Wells and Lord Rutherford failed to foresee most of what happened in 20th-century science. All we can suspect is that the world is going to change faster and in more dimensions. Faster because certain technologies are running away, but in more dimensions because for the first time human nature isn't a fixed quantity. Over the past 2,000 to 3,000 years human beings as such haven't changed, but in this century human beings — as we know from the potentiality of genetics and targeted drugs and implants into our brains and all kinds of things — could change."
. . . "We can't dismiss as crazy those Californian futurologists who say that we will have super-human intelligences 50 years from now. I think the main message is that the more catastrophic the potential downside is, the more careful we have to be." He concludes the radio interview with immense charm. "It's been a great pleasure and privilege to be with you. Thank you very much indeed." Responsibility, he says, is ultimately what it's all about. Rees, who describes himself as old Labour, says that today's politicians have an even greater duty to act justly, not to fuel hatreds. "Because of the greater risks it is all the more important to minimise the number of people who have grounds for being disaffected or aggrieved."
[rant mode]
Given that the prof is "old Labour," one suspects that "act justly" is a code word for "regulate everything." What the good professor forgets--what most Luddites and prophets of doom from Parson Malthus on forget--is that human ingenuity is the ultimate resource: non-polluting, renewable, and in near-infinite supply. There will be mischief and miscalculation in the future--and also innovation and creativity that we simply cannot imagine. The way to protect ourselves from risk is not to button up society in the control of elites, but to let freedom ring.
[/rant mode]
Posted by: Mike 2003-04-24