I will stay with the term Singularity. We've used it to refer to the radical AI transition for 20 years. Yes, the term is interpreted broadly. How could it not be? We all come to the party with our own world views. I'm an optimist, hence much closer to Kurzweil than Skynet.
The term singularity works on many levels. I guess I am just turned off by this metaphor in some sense because a black hole's event horizon really represents a region of space-time from which information is impossible to obtain. Getting closer to the event horizon does not make it easier to know what's beyond it. However, as we approach the 2040s-2050s we will start to develop a better sense of what will happen "post-singularity." In that sense there will always be a future event horizon. It will just constantly change.
I think it's aptly named. The concept of a singularity is really a reference to the event horizon of a black hole: the point past which we couldn't go back to operating as a society without it, even if we wanted to. Perhaps you could say the same for modern agricultural practices, which are part of what defines a civilization. That's a whole different discussion.

More than that, I question my fear that we will become pancake people -- ever expanding in awareness of knowledge, but very limited in understanding the depth of things. That all we will satiate is the immediate gratification of our reptilian brain. It's already apparent in the lack of empathy among kids online today, the banal indifference of the news, and the overall deconstruction of community. I'm reminded of Faust, more so in that we are actually becoming more like computers than they are becoming like us.

Anyone who is an artist or programmer will tell you that constraints are necessary. What we are talking about with a singularity is, in the words of Yoda, "absolute power." It's a winner-take-all kind of power, and it's hard not to imagine powerful interests using it to dominate the outcomes of almost everything. If you can patent and own DNA, as Monsanto can, then why can't you patent and own the consciousness algorithm that makes up you?

So where are we headed with the singularity? The Dark Ages. Because what we are really talking about isn't technology at all, it's theology. As ridiculous as it may sound, go to church and understand what it means to have a rational sense of faith.
I agree that 'the Singularity' won't be identifiable as a point in time, and that it will probably come to pass in a number of seemingly mundane ways. IMO one problem with the concept is that it leads people to expect a Short Circuit moment, where we slap our faces and say: "My GOD, the robot has feelings!" Instead, I think there will come a point where we create AIs that can learn and adapt to their environment to such an extent that we will start to seriously debate whether or not they deserve special status. And if history is any evidence, we will look back at when we granted them special status (if we ever do) and realize that we did so much too late.
I'm not sure most even pinpoint A.I. consciousness as "the singularity." It has become more closely aligned with the idea of infinitely self-generating technology (i.e., A.I. that can make new and better A.I. in an infinite loop that spirals exponentially forward). But your point about A.I., consciousness, and how we will deal with it is well taken.
We could call it the "Beyond Postulation Point," or BPP.
BPP is a cool concept. I suppose that could be used to refer to the future point in time where our models for technological evolution break down. That future point would always change, depending on the models we use and the entities constructing them.
I sometimes feel I am a bit harsh on the Singularity, prolly cuz it is sci-fi and comes from a religious urge, but it is fun to speculate.
It's funny, because I feel the same way about AI doubters. That is, I think putting human-level intelligence on a pedestal that cannot be artificially surpassed is a kind of Ptolemaic view. Not saying that I'm backing Kurzweil, I haven't read his stuff. I just see intelligence as only as elusive as our current inability to understand it.
"A religious urge," meaning from a place of hopefulness?