  LongeCity
              Advocacy & Research for Unlimited Lifespans


POLL: Which Future Tech Frightens You Most?


63 replies to this topic

Poll: Which Future Tech Frightens You Most? (174 members have cast votes)

Which of the following do you think we should be most cautious about?

  1. Nanotechnology: 18 votes (10.29%)
  2. Gene Therapy: 4 votes (2.29%)
  3. Designer Babies: 12 votes (6.86%)
  4. Nuclear Fusion: 11 votes (6.29%)
  5. Biometric Identification: 15 votes (8.57%)
  6. Artificial Intelligence: 50 votes (28.57%)
  7. Particle Accelerators: 11 votes (6.29%)
  8. Designer Drugs: 3 votes (1.71%)
  9. Virtual Reality: 7 votes (4.00%)
  10. Genetically Modified Organisms: 24 votes (13.71%)
  11. Future Media: 20 votes (11.43%)


#61 redbaron4321

  • Guest
  • 4 posts
  • 2
  • Location:United States

Posted 30 October 2018 - 01:52 AM

I voted for AI, but this is pretty terrifying if not managed correctly, and since the government is involved, poor management is a given.

 

https://nexusnewsfee...difying-insects

 

If they can make bugs that genetically alter plants, they could just as easily make biting insects that genetically alter humans: make humans more docile, say, or make them die the day before the first Social Security check gets mailed out. In reality, though, if it is used as a bioweapon and then accidentally escapes, a Social Security check may be the least of our problems. Once that genie is out of the bottle, it can't be put back in.



#62 kurdishfella

  • Guest
  • 2,397 posts
  • -69
  • Location:russia
  • NO

Posted 25 March 2020 - 09:50 PM

Either bio-weapon viruses, or some creature they create by mixing existing animals: something almost impossible to kill, very smart and fast, that escapes the lab, runs rampant on Earth, and kills every human it encounters.



#63 orion22

  • Guest
  • 186 posts
  • -1
  • Location:Romania
  • NO

Posted 28 March 2020 - 01:32 PM

If we can't beat a stupid virus, who thinks we can beat Artificial Intelligence?


  • Good Point x 1
  • Agree x 1


#64 Question Mark

  • Registrant
  • 25 posts
  • 4
  • Location:Pennsylvania
  • NO

Posted 09 June 2021 - 10:57 PM

AI is the only one of these technologies that is truly capable of leading to fates far worse than death. All the other technologies on this list will only lead to suffering and death on a limited scale, or extinction at worst. S-risks are far more terrifying. Misaligned AI could potentially create truly astronomical levels of suffering resembling the Biblical Hell, with suffering so extreme that it makes the worst forms of torture humans have hitherto invented feel like mere pinpricks by comparison.

 

The Center on Long-Term Risk is the only AI-focused organization I'm aware of with a primary focus on reducing S-risks. There's also the Center for Reducing Suffering, but they are less AI-focused. With regards to AI alignment, Brian Tomasik believes that a slightly misaligned AI has far more risk than a totally unaligned AI, and that AI organizations like MIRI may be actively harmful for this reason.

 

Regarding S-risks, David Pearce said the following:

 

However, the practical s-risk I worry most about is the dark side of our imminent mastery of the pleasure-pain axis. If we conventionally denote the hedonic range of Darwinian life as -10 to 0 to +10, then a genetically re-engineered civilisation could exhibit a high hedonic-contrast +70 to +100 or a low-contrast +90 to +100. Genome-editing promises a biohappiness revolution: a world of paradise engineering. Life based on gradients of superhuman bliss will be inconceivably good. Yet understanding the biological basis of unpleasant experience in order to make suffering physically impossible carries terrible moral hazards too – far worse hazards than anything in human history to date. For in theory, suffering worse than today’s tortures could be designed too, torments that would make today’s worst depravities mere pinpricks in comparison. Greater-than-human suffering is inconceivable to the human mind, but it’s not technically infeasible to create. Safeguards against the creation of hyperpain and dolorium – fancy words for indescribably evil phenomena – are vital until intelligent moral agents have permanently retired the kind of life-forms that might create hyperpain to punish their “enemies” – lifeforms like us. Sadly, this accusation isn’t rhetorical exaggeration. Imagine if someone had just raped and murdered your child. You can now punish them on a scale of -1 to -10, today’s biological maximum suffering, or up to -100, the theoretical upper bounds allowed by the laws of physics. How restrained would you be? By their very nature, Darwinian lifeforms like us are dangerous malware.

Mercifully, it’s difficult to envisage how a whole civilisation could support such horrors. Yet individual human depravity has few limits – whether driven by spite, revenge, hatred or bad metaphysics. And maybe collective depravity could recur, just as it’s practised on nonhuman animals today. Last century, neither Hitler and the Nazis nor Stalin and the Soviet Communists set out to be evil. None of us can rationally be confident we understand the implications of what we’re doing – or failing to do. Worst-case scenario-planning using our incomplete knowledge is critical. Safeguards are hard to devise because (like conventional “biodefense”) their development may inadvertently increase s-risk rather than diminish it. In the twenty-first-century, unravelling the molecular basis of pain and depression is essential to developing safe and effective painkillers and antidepressants. More suicidally depressed and pain-ridden people kill themselves, or try to kill themselves, each year than died in the Holocaust. A scientific understanding of the biology of suffering is necessary to endow tomorrow’s more civilised life with only a minimal and functional capacity to feel pain. A scientific understanding of suffering will be needed to replace the primitive signalling system of Darwinian life with a transhuman civilisation based entirely on gradients of bliss.
But this is dangerous knowledge – how dangerous, I don’t know.


Edited by Question Mark, 09 June 2021 - 11:12 PM.




