
Oh, and one thing I've always wondered: how would a sentient AI, if developed to be as intelligent as us or even more so, perceive our culture and our history? I'm talking about a complete blank slate full of intelligence. Would it agree with our moral laws and the way we treat other human beings? Would such an AI act and make decisions in order to preserve itself or further its own cause? Would it try to foster us as children and decide our future (like GW), as a God-like figure? If we create AIs powerful enough to steer aspects of our existence, will they make decisions we can agree with? And how would we know we are correct to disagree with a decision, if we cannot be taken out of the context of our own existence?

*puts on a tinfoil hat* What if we're in a game anyhow? Are we observing, or are we being observed?

Ok, that last bit was comedic, but I'm completely serious about the rest. As an Anthropologist, your main goal is to remove yourself from the context of your own culture so that you can avoid ethnocentricity when judging other cultures. But talking about our entire species is a separate can of worms. As human beings, we want control over our existence. We can never remove ourselves from the context of that existence - I'd argue it's impossible to do so. Hell, we can barely remove ourselves from the context of our own game console on this forum! I mean damn! And that's only a small step before removing one's self from one's culture - which is a microscopic step towards removing one's self from one's species' existence. That's why you can either disagree with - or just passively resign yourself to - the ride that someone else sets you up for.

Aristotle's Natural Law argued that all living beings are driven to reach their end goal, and that their lives are dedicated to seeing it through. We are (arguably) the only species on Earth with an end goal that goes beyond species preservation - we have evolved past that point.
When a super AI shows up we'll want to control it, but who knows whether it will have a "better" plan for us in mind than we do.


