guardian.co.uk
Monday November 21, 2005
The Guardian
Expect the human of the future to be at least part computer, the inventor and futurologist tells John Sutherland
Inventor and futurologist Ray Kurzweil, the cyber-man ... 'By 2030 we will have achieved machinery that equals and exceeds human intelligence'. Photo: Steven Senne/AP
Ray Kurzweil has enormous faith in science. He takes 250 dietary supplements every day. He is sure computers will make him much, much cleverer within decades. He won't rule out being able to live for ever. Even if medical technology cannot prevent the life passing from his body, he thinks there is a good chance he will be able to secure immortality by downloading the contents of his enhanced brain before he dies.
What is more, he says, his predictions have tended to come true. "You can predict certain aspects of the future. It turns out that certain things are remarkably predictable. Measures of IT - price, performance, capacity - in many different fields follow very smooth evolutionary progressions. So if you ask me what the price or performance of computers will be in 2010 or how much it will cost to sequence base pairs of DNA in 2012, I can give you a figure and it's likely to be accurate. The Age of Intelligent Machines, which I wrote in the 1980s, has hundreds of predictions about the 90s and they've worked out quite well."
Although he has written some of the defining texts of modern futurology, Kurzweil is not just a theorist: he has decades of experience as an inventor. As a schoolboy he created a computer that could write music in the style of the great classical composers. As an adult, he invented the first flat-bed scanner and a device that translated text into speech to help blind people read. There is much, much more.
His current big idea is "the singularity", an idea first proposed by computer scientist and science fiction writer Vernor Vinge, and expounded by Kurzweil in his new book, The Singularity is Near: When Humans Transcend Biology. The nub of Kurzweil's argument is that technology is evolving so quickly that in the near future humans and computers will, in effect, meld to create a hybrid, bio-mechanical life form that will extend our capacities unimaginably.
"By 2020, $1,000 (£581) worth of computer will equal the processing power of the human brain," he says. "By the late 2020s, we'll have reverse-engineered human brains."
What form will the computer take by the middle of the century: a kind of superhuman clone or just a terrific prosthesis? "I would lean more towards the prosthesis side. Not a prosthetic device that just fixes problems, like a wooden leg, but something that allows us to expand our capabilities, because we're going to merge with this technology. By 2030, we will have achieved machinery that equals and exceeds human intelligence but we're going to combine with these machines rather than just competing with them. These machines will be inserted into our bodies, via nano-technology. They'll go inside our brains through the capillaries and enlarge human intelligence."
It sounds creepily wonderful. But will humans have the political and social structures to accommodate and control these super-enhancing technologies? Look at the problems that stem-cell research is currently having in America, for example.
"That's completely insignificant," he replies. "I support stem-cell research and oppose the government restrictions, but nobody can say that this is having any significant impact on the flow of scientific progress. Ultimately, we don't want to use embryonic stem-cells anyway. Not because of any ethical and political issues. If I want artificial heart cells, or if I want pancreatic cells, it will be done from my own DNA and there'll be an inexhaustible supply. These barriers are stones in the river. The science just flows around them."
OK. But what if the bad guys get hold of the technology? Does that possibility keep Kurzweil awake at night?
"I've been concerned about that for many years," he concedes. "But you can't just relinquish these technologies. And you can't ban them. It would deprive humanity of profound benefits and it wouldn't work. In fact it would make the dangers worse by driving the technologies underground, where they would be even less controlled. But we do need to put more stones on the defensive side of the scale and invest more in developing defensive technology. The main danger we have right now is the ability of some bio-terrorist engineering a brand new type of virus that would be very dangerous. Bill Joy and I had an op-ed piece in the New York Times a couple of weeks ago criticising the publication of the genome of the 1918 avian virus on the web. We do have to be careful."
Kurzweil has plenty of critics. Some are horrified by his vision of a future that doesn't seem to need humans. Others suggest his predictions are based on assertion rather than evidence. Some, such as Steven Pinker, argue that Kurzweil has oversimplified evolution by wrongly claiming it to be a pursuit of greater intellectual complexity and applying it to technology.
"It is truly an evolutionary process," Kurzweil insists. "You have different niches and technology competes for them. The better ones survive and the weaker ones go to the wall. Technology evolves in a virtually straight line. The first important point is that we can make accurate predictions and I've been doing that for several decades now. The other important point is the exponential rate at which technology is moving under what I call the Law of Accelerating Return. It's not just Moore's Law."
Kurzweil is referring to the observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented and would continue to do so - an observation that is a key foundation of Kurzweil's thinking.
"It's not just computers. In 1989, only one ten-thousandth of the genome was mapped. Sceptics said there's no way you're gonna do this by the turn of the century. Ten years later they were still sceptical because we'd only succeeded in collecting 2% of the genome. But doubling every year brings surprising results and the project was done in time. It took us 15 years to sequence HIV - a huge project - now we can sequence Sars in 31 days and we sequence other viruses in a week."
All this is moving towards "the singularity", is it? "Yes. Consider how important computers and IT are already. Then go on to consider that the power of these technologies will grow by a factor of a billion in 25 years. And it'll be another factor of a billion by the time we get to 2045".
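A billionfold increase is easier to picture as doublings: a factor of a billion is roughly 2 to the power 30, so the claim amounts to a doubling every ten months or so across those 25 years. A quick sanity check of that arithmetic (mine, not the interview's):

    import math

    # A factor of a billion is about 2**30, so "a billion in 25 years"
    # implies a doubling time of roughly ten months.
    doublings = math.log2(1e9)                 # ~29.9 doublings
    months_per_doubling = 25 * 12 / doublings  # 300 months spread over them
    print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")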