  LongeCity
              Advocacy & Research for Unlimited Lifespans


Something easy to prevent robots from killing us

robots

5 replies to this topic

#1 Florian Xavier

  • Guest
  • 242 posts
  • 37

Posted 07 July 2015 - 06:31 AM


We all know the risk: when the singularity happens, robots will improve themselves and may kill us.

 

But what could they do if we kept robots from having any physical capability? I mean, use them just for calculation. If they don't have arms, they can't do anything bad, but they can still help us with scientific work.


Edited by Florian Xavier, 07 July 2015 - 06:31 AM.

  • Enjoying the show x 1

#2 Antonio2014

  • Guest
  • 634 posts
  • 52
  • Location:Spain

Posted 07 July 2015 - 12:40 PM

Create suicide booths for robots.

 


 



#3 kaskiles

  • Guest
  • 7 posts
  • 2
  • Location:Merritt Island, FL

Posted 07 July 2015 - 04:29 PM

We will probably need genetically adapted, human-sympathetic cyborgs with the technology to override and control the rogue robots. These cyborgs will need their existing organic human intelligence fully integrated with the same synthetic artificial-intelligence engines driving the robots.


Edited by kaskiles, 07 July 2015 - 04:30 PM.



#4 PWAIN

  • Guest
  • 1,288 posts
  • 241
  • Location:Melbourne

Posted 08 July 2015 - 07:38 AM

I trust humans with that much power even less. A super-smart AI will figure out a way to get physical form; it may trick someone, gain access to a 3D printer, or find some other route. Our only hope is that we can somehow make it friendly.

Edited by PWAIN, 08 July 2015 - 07:39 AM.

  • like x 1

#5 Kalliste

  • Guest
  • 1,148 posts
  • 159

Posted 08 July 2015 - 09:03 AM

Yep, Florian, you should do some transhumanist reading. There are many books and websites. Some very smart people have been thinking about FAI (Friendly AI) for a long time, and there are no easy solutions. Boxing certainly will not work. No safety mechanism devised by humans is likely to work. A mind that has a subjective reality a million times faster than ours, with a million times the memory capacity and hand-made modules for writing code, will overcome any human barrier in a split second. Read the intro to A Fire Upon the Deep by Vernor Vinge for a scary insight into how such a thing would behave.




#6 A941

  • Guest
  • 1,027 posts
  • 51
  • Location:Austria

Posted 12 July 2015 - 10:28 PM

Hm, there is no way we can control a being which is more intelligent than all of us by multiple orders of magnitude, and which thinks and learns faster than all of us.

 

To keep it under control, the best approach would be to completely isolate the thing, but why make one in the first place if you will not use it for anything?

To keep it in a "bottle" and show it to friends during a BBQ party?

 

In other posts I wrote that such a being doesn't have to be hostile toward us, but it doesn't need to be hostile (Skynet-level hostile) to take away power and freedom from us for our own good, and to keep us "inside" so we won't do stupid things.

 

So the only way to make sure that we stay in control is to upgrade ourselves to AI levels, or at least a few of us... which again doesn't sound that good, since we already know that we are capable of, and willing to, crush, kill, and destroy, even if only for the entertainment value of those activities.

 

 







