  LongeCity
              Advocacy & Research for Unlimited Lifespans





Singularity belief hurts long-term goals


6 replies to this topic

#1 Guest

  • Guest
  • 320 posts
  • 214

Posted 20 July 2011 - 07:36 PM


As you might have already noticed from the Essay against AI-Singularity thread, I am sceptical about the practicability of the singularity concept. Obviously there are a lot of people among the transhumanism and life extension crowd who subscribe to Mr. Kurzweil's concept/timeframe. I therefore asked myself what position they/you hold concerning the long-term problems of our time. I mean, if the singularity rapture will arrive in the 2040s in your belief system, there should be no need to care about issues such as climate change and many other problems. E.g.:


Climate Change:

According to your premise, from the 2040s on we will have incredible technology and resources at hand, which surely can deal with preexisting greenhouse gases easily. So do you oppose international climate protection efforts as the distraction and waste of resources that they are? Especially as the additional temperature rise up to 2050 could not be stopped even with the most extreme anti-carbon plans proposed?


The national debt:

Vastly accelerated technological progress implies rapid economic growth, thereby offsetting any national debt - e.g. transforming a 60% debt-to-GDP ratio into less than 10% just through the rising GDP in the first couple of years of the singularity. Absolute tax revenues will also rise accordingly. So do you see long-term national debt as the irrelevant issue that it is, at least from the 2030s on?


Human development programmes / environmental protection:

Poverty is prevalent globally and the world population is rising, putting pressure on natural resources and also contributing to climate change etc. Should anyone care, if by the 2040s our science fiction world can deal with everything? Even with many species and rain forests extinct it should be no problem, and with widespread mind uploading and simulated realities the state of the rain forest is a non-issue - life in heaven (upload reality) is perfect.


SENS-research:

Especially if you are 40 or younger, SENS-research appears to be pretty redundant - the singularity salvation will mean not only biological, but near-physical immortality. So do you look at SENS-research as the superfluous agenda that it is instead of promoting it?


Non-AI-research:

As we are already talking about the issue - all non-AI/computer/IT research should be unnecessary; the money is better spent on AI research anyway. So do you oppose spending on climate science, particle physics, chemical sciences etc. as the waste of resources that it is? In the end the singularity will figure out everything instantly, so the money is better spent on tax deductions and national Ice Cream Day?


Hypothetical scenarios:

A 100 km asteroid is detected that will hit Earth by 2070. Do you care at all? Do you see the resources spent on deflecting it as the total waste that they are? More than 20 years of singularity are the solution!



I'd appreciate any comments.

Edited by TFC, 20 July 2011 - 07:40 PM.


#2 Traclo

  • Guest, F@H
  • 101 posts
  • 3
  • Location:Ontario

Posted 21 July 2011 - 12:17 AM

I'm not sure about the die-hard Kurzweil fans, but I suspect that there is a great deal more skepticism towards his ideas (or at least precise dates) than you might think. But I'm in no position to judge the general attitude towards the singularity. In principle I see no reason why it won't happen. Provided we can build an intelligence greater than our own it will eventually enhance itself.

As to the question of why we should care about future problems when there may be some kind of global solution down the line, I think the natural uncertainty of the future is a very good reason to continue caring. Even if you are mostly convinced that the singularity will happen, buying into the exact dates and outline of a prediction seems foolish. Maybe it will be delayed 20 years from its predicted 'advent'. The amount of suffering that could occur in the meantime with regard to the problems you've raised is not trivial, so even if you think that the singularity will eventually happen, you still have good reason to act on them now.

But I guess if you are utterly convinced that it will happen on schedule there really is no reason to plan for those problems beyond their immediate effects. But will people really do that? Some perhaps (the people who give away all their possessions before a predicted apocalypse come to mind), but I think even if I were confident everything I did would be meaningless in a short while, I'd still continue in spite of it, along with many others. It may be roughly analogous to death for an atheist; after that point nothing matters but you still live as though it does.

So there IS good reason to care about it, and even if there isn't people will anyway.


#3 niner

  • Guest
  • 16,276 posts
  • 2,000
  • Location:Philadelphia

Posted 21 July 2011 - 01:35 AM

TFC, the future has a positive bias, but it would be foolish in the extreme to bet on the Singularity as you suggest. I think we still have enough time to wreck the economy, if not the whole planet, before the Singularity happens. If we behave foolishly enough, we can probably forestall the Singularity, perhaps forever. Maybe we could kill off all life on Earth. Even after "The Singularity" or something like it occurs, we don't know what it's going to be like. Should be interesting, though.

#4 robomoon

  • Guest
  • 209 posts
  • 18

Posted 24 July 2011 - 01:38 PM

It looks foolish that the software used to generate this reply does not work well enough to inform the http://www.longecity...earth-required/ authorities with a timely computer presentation against an unfriendly human group intelligence acting to kill off all life with non-AI research in particle physics.

"... If we behave foolishly enough, we can probably forestall the Singularity, perhaps forever. Maybe we could kill off all life..."

#5 Kolos

  • Guest
  • 209 posts
  • 37
  • Location:Warszawa

Posted 31 July 2011 - 02:14 PM

From what I see, most transhumanists and singularitarians are very positive about almost any kind of progress, because it brings the singularity closer one way or another. It's not about passively waiting for the rapture to come, because the Singularity is supposed to be a man-made "rapture", so it requires much work - our work - to happen. I doubt many people believe it will happen in a day, when all of a sudden we have the most advanced technologies at our disposal and know how to solve any problem; the singularity is actually supposed to be a rather long (although accelerating) process. TFC seems to think that we see the singularity as something independent from today's science and contemporary reality in general, which is not the case: the singularity is the science of the future, and more than just science is needed to solve our problems.

#6 Guest

  • Topic Starter
  • Guest
  • 320 posts
  • 214

Posted 01 August 2011 - 04:04 AM

From what I see, most transhumanists and singularitarians are very positive about almost any kind of progress, because it brings the singularity closer one way or another. It's not about passively waiting for the rapture to come, because the Singularity is supposed to be a man-made "rapture", so it requires much work - our work - to happen. I doubt many people believe it will happen in a day, when all of a sudden we have the most advanced technologies at our disposal and know how to solve any problem; the singularity is actually supposed to be a rather long (although accelerating) process. TFC seems to think that we see the singularity as something independent from today's science and contemporary reality in general, which is not the case: the singularity is the science of the future, and more than just science is needed to solve our problems.


Well, of course my initial post is intended to be provocative. Nonetheless the fact remains that Kurzweil is proposing a timeframe that makes it unattractive for his followers, or for people who buy into his theories in general, to actually contribute. And as a matter of fact, since they believe in incredibly improved technology by the time AI arrives (the 2040s according to RK), logic suggests that we should not bother about things such as climate change or other long-term problems if we believe in a singularity within, say, the next 60 or 70 years.

The singularity is basically about AI. It does not matter so much what level we reach in other sciences, as the AI will develop new technology very rapidly - doing in 2 years, say, what would normally take 15, and after 5 years of self-improvement doing it in 6 months. Those kinds of statements are the essence of the AI-Singularity. Due to self-improvement the AI will get so "good" that it will eventually do all kinds of research in a fraction of the time we need today. So we can just stop any non-AI-related research right now.

Also, as you are apparently not a passive kind of person, I invite you to have a look at the SENS forum.




