I often find that discussion of superintelligence quickly devolves into a less interesting discussion about how people feel about those who talk about superintelligence. I'm interested to know what you think will actually happen.
- For each option, what I'm interested in is how likely you think it is GIVEN THAT NONE OF THE ABOVE ARE TRUE. So for the third one, what I want to know is: ASSUMING THAT human minds are NOT fundamentally different to other physical things, AND that it is meaningful to think of one mind being greatly more efficient than another, would you say it's more likely than not that human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe? Likewise for the penultimate option, the question is: given that the idea of superintelligence is meaningful and we have the capacity to build one, will we?
- And there's one big assumption that I don't even put in the poll: the assumption that we don't wipe ourselves out or otherwise permanently limit our potential before doing the things we discuss here. I think there's an excellent chance we will; I just want to set that aside for a moment in order to have a distinct discussion about this.
- Note also that the question is "more likely than not", not merely "likely enough to be worth taking seriously". Emphasized because I'm very surprised to see some boxes being ticked - do you really think those things are more likely than their converses?
Poll #1861629 Ways the Singularity can fail to happen
Open to: All, detailed results viewable to: All, participants: 37
I think it's more likely than not that
|Human minds are fundamentally different to other physical things, and not amenable to being reasoned about like an engineer would|
|The idea of one mind being greatly more efficient than another isn't meaningful|
|Human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe|
|Humanity will never be able to design superintelligences that we can build|
|Humanity will never study the fields needed in sufficient detail to gain the ability to design vastly more efficient minds|
|Humanity will never build superintelligence (given the ability to do so)|
|Humanity building superintelligence won't make a really big difference to the world|
|None of the above|