Artificial Superintelligence will destroy everything good in life, and that is a good thing.
Survival of the fittest refers to a fit species, not a fit individual. — noAxioms
Perhaps that's true on a grand scale, but it doesn't change the fact that the survival of the species depends on the survival of its individuals. I don't think I ever said that survival of the individual was more important than survival of the species. What I am saying is that anything that has evolved into any species over time is either something that enabled it to survive in the past, or something that mutated off of something that enabled it to survive in the past. As for you being fit, it certainly doesn't hurt the species for you to have the desire to be fit. That is beneficial for the species as well as for you.
I don't think you can choose rationally, except in cases where it doesn't matter to your core instincts. — noAxioms
I'm not saying I live a life devoid of anything other than reason. I'm curious what you mean by core instincts, though. Like fight or flight? Then no, my rational mind would be overpowered. Emotions? Desire? While both of these are arguably instinctual, in that I have no real control over what I want or how I feel, it is possible to understand them further and make rational decisions about how to deal with them. You ask for examples? I can understand what makes me happy, what makes me sad, and what I desire, and I can use logic to satisfy my desires and avoid sadness as much as possible.
I had my own, and finally rationalized something (on the order of for whose benefit do I draw breath?) that blatantly conflicted with the irrational assumptions, and the belief was not open to being corrected. — noAxioms
Why wasn't it open to being corrected?
The super-AI, having no history of evolution to give it fit beliefs instead of true ones, might actually be rational and would believe things no humans considered because we think we know it all, and would then behave in a way quite unanticipated to us. — noAxioms
What do you mean when you say it might be rational? What is the difference between being rational and rationalizing something?
The danger of it is that we can't predict what a greater intelligence will figure out any more than mice would have anticipated humans knowing about quantum mechanics. — noAxioms
I don't necessarily think that is true. That depends entirely on how we program it. If we define intelligence as the ability to acquire knowledge and skills, then by creating superintelligence we're really just speeding up that acquisition. Knowledge and skill are only valuable insofar as they can be applied. If it were applied to problem-solving, I think we would rapidly solve all of our problems until the problem of survival is the only one left. Then what? Transcend time itself, maybe, but I can't even pretend to know what that means.