I often find that discussion of superintelligence quickly devolves into a less interesting discussion about how people feel about those who talk about superintelligence. I'm interested to know what you think will happen.
- For each option, what I'm interested in is how likely you think it is GIVEN THAT NONE OF THE ABOVE ARE TRUE. So for the third one, what I want to know is: ASSUMING THAT human minds are NOT fundamentally different from other physical things, AND that it is meaningful to think of one mind being greatly more efficient than another, would you say it's more likely than not that human minds are within a few orders of magnitude of the most efficient minds possible in principle in our corner of the Universe? Likewise, for the penultimate option, the question is: given that the idea of superintelligence is meaningful and we have the capacity to build them, will we?
- And there's one big assumption that I don't even put in the poll: the assumption that we don't wipe ourselves out, or otherwise permanently limit our potential, before doing the things we discuss here. I think there's an excellent chance we will; I just want to set that aside for a moment in order to have a distinct discussion about this.
- Given that, the question is whether each option is more likely than not, not merely "likely enough to be worth taking seriously". Emphasized because I'm very surprised to see some boxes being ticked - you really think those things are more likely than the converse?
I think it's more likely than not that