As you might have already noticed from the Essay against AI-Singularity thread, I am sceptical about the practicability of the singularity concept. Obviously there are a lot of people among the transhumanism and life-extension crowd who subscribe to Mr. Kurzweil's concept/timeframe. I therefore asked myself what position they/you take on the long-term problems of our time. I mean, since in your belief system the singularity rapture will arrive in the 2040s, there should be no need to care about issues such as climate change and many other problems. For example:
Climate Change:
According to your premise, from the 2040s on we will have incredible technology and resources at hand, which can surely deal with pre-existing greenhouse gases easily. So do you oppose international climate protection efforts as the distraction and waste of resources that they are? Especially since the additional temperature rise up to 2050 could not be stopped even by the most extreme anti-carbon plans proposed anyway?
The national debt:
Vastly accelerated technological progress implies rapid economic growth, thereby offsetting any national debt - e.g. shrinking a 60% debt-to-GDP ratio to less than 10% purely through rising GDP within the first few years of the singularity (a rough calculation of the growth rates that would require follows below). Absolute tax revenues will also rise accordingly. So do you see long-term national debt as the irrelevant issue that it is, at least from the 2030s on?
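To make that debt arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The growth rates are assumptions I picked purely for illustration (they are not from any forecast), and the calculation holds the nominal debt constant so the ratio falls through GDP growth alone:

# Sketch: how long it takes a 60% debt-to-GDP ratio to drop below 10%
# if the nominal debt stays fixed and only GDP grows.
initial_ratio = 0.60   # 60% debt-to-GDP at the start
target_ratio = 0.10    # "less than 10%"

for annual_growth in (0.03, 0.10, 0.30, 1.00):  # 3%, 10%, 30%, 100% growth per year
    ratio = initial_ratio
    years = 0
    while ratio > target_ratio:
        ratio /= (1 + annual_growth)  # GDP grows, debt stays constant
        years += 1
    print(f"{annual_growth:6.0%} GDP growth per year -> below 10% after {years} years")

The output: at a historical ~3% growth rate it takes about 60 years, at 30% about 7 years, and only with singularity-style growth of roughly 100% per year does the ratio fall below 10% within about 3 years - which is the scenario the argument above relies on.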
Human development programmes / environmental protection:
Poverty is prevalent globally and the world population is rising, putting pressure on natural resources and also contributing to climate change etc. Should anyone care if, by the 2040s, our science-fiction world can deal with everything? Even with many species extinct and the rain forests gone it should be no problem, and with widespread mind uploading and simulated realities the state of the rain forest is a non-issue - life in heaven (upload reality) is perfect.
SENS-research:
Especially if you are 40 or younger, SENS-research appears to be pretty redundant - the singularity salvation will mean not only biological but near-physical immortality. So do you regard SENS-research as the superfluous agenda that it is, instead of promoting it?
Non-AI-research:
As we are already talking about the issue - all non-AI/computer/IT research should be unnecessary; the money is better spent on AI research anyway. So do you oppose spending on climate science, particle physics, the chemical sciences etc. as the waste of resources that it is? In the end the singularity will figure out everything instantly, so the money is better spent on tax deductions and National Ice Cream Day?
Hypothetical scenarios:
A 100 km asteroid is detected on a collision course with Earth in 2070. Do you care at all? Do you see the resources spent on deflecting it as the total waste that they are? More than 20 years of singularity are the solution!
I'd appreciate any comments.