Paul Crowley (ciphergoth) wrote,

Ways the singularity could fail to happen - the poll

Coo, free accounts can do LJ polls now! Following on from this post, a poll.

I often find that discussion of the idea of superintelligence quickly devolves into a less interesting discussion about how people feel about those who talk about superintelligence. I'm interested to know what you think will happen.

Read carefully: three important caveats:

  • For each option, what I'm interested in is how likely you think it is GIVEN THAT NONE OF THE ABOVE ARE TRUE. So for the third one, what I want to know is: ASSUMING THAT human minds are NOT fundamentally different to other physical things, AND that it is meaningful to think of one mind being greatly more efficient than another, would you say it's more likely than not that human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe? Likewise, for the penultimate option, the question is: given that the idea of superintelligence is meaningful and we have the capacity to build one, will we? (There's a sketch after these caveats of how such conditional estimates combine.)
  • And there's one big assumption that I don't even put in the poll: the assumption that we don't wipe ourselves out or otherwise permanently limit our potential some other way before doing the things we discuss here. I think there's an excellent chance we will; I just want to set that aside for a moment in order to have a distinct discussion about this.
  • Given that, the bar is "more likely than not", not merely "likely enough to be worth taking seriously". I emphasize this because I'm very surprised to see some boxes being ticked - do you really think those things are more likely than the converse?
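To make the first caveat concrete: because each option is judged on the assumption that all the options above it are false, the chain rule lets you multiply the complements of your answers to get an overall estimate. A minimal sketch with made-up numbers (the probabilities below are purely illustrative, not drawn from the poll):

```python
# Minimal sketch (not from the original post): if p[i] is your probability
# that failure mode i holds GIVEN that none of the earlier modes hold, the
# chain rule gives P(no failure mode holds) as the product of (1 - p[i]).

# Purely illustrative estimates for the seven failure modes, in poll order.
p = [0.05, 0.10, 0.10, 0.10, 0.05, 0.01, 0.05]

p_no_failure = 1.0
for p_i in p:
    p_no_failure *= 1.0 - p_i

print(f"P(none of the failure modes holds) = {p_no_failure:.3f}")  # ~0.619
```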

Poll #1861629 Ways the Singularity can fail to happen

I think it's more likely than not that:

  • Human minds are fundamentally different to other physical things, and not subject to thinking about like an engineer - 4 (9.1%)
  • The idea of one mind being greatly more efficient than another isn't meaningful - 5 (11.4%)
  • Human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe - 4 (9.1%)
  • Humanity will never be able to design superintelligences that we can build - 5 (11.4%)
  • Humanity will never study the fields needed in sufficient detail to gain the ability to design vastly more efficient minds - 4 (9.1%)
  • Humanity will never build superintelligence (given the ability to do so) - 0 (0.0%)
  • Humanity building superintelligence won't make a really big difference to the world - 2 (4.5%)
  • None of the above - 20 (45.5%)
