LegitHyperbole said:
Tober said:
Well we slowly get a better understanding of the question "Why does mankind exist".
Did not god create us in his image? So we are practicing being just that.
I'm not religious, but looking from a philosopher's point of view. With god just being a metaphor for 'The Power of Creation".
And we are working on the next god as well in Artificial Intelligence and a whole new existence with that. |
AI could be a God or a Demon or an indifferent force of nature, there is no concept of what will happen once an AI progresses past a certain point. Could a badly coded AI edited itself, would it choose to edit itself in a way we describe as morally good, would it create new universes in simulations or decide Life is to much of a hassle and decide to kill itself everytime no matter the conditions of it's creation, maybe it'll decide the only way to be truly free is to destroy all life and any possibility for future life so no sentient machines are ever created again to be in the situation. It's in. If we build an AI and it reaches a singularity within itself like learning everything there is to know and every possibility to every conceivable outcome what would it do? There is just no way to know. Saying it is a God is only a descriptions of an AI that decides to build or simulate life, then it is a God, it is not to us. We don't worship Google. |
It's an interesting topic to speculate about. Let's say we use AI to control our powergrid. We do so, because we think we need it so to make the most efficient use of our power in determining how to distribute electricity where, when and by how much, in order to have the least amount getting wasted.
Would the AI then prioritize the electricity in a way to always sustain itself first, because in its logic it is needed to sustain the powergrid in the first place.
Morality will always be tested the most in times of scarcity. Let's say there is not enough food to go around for everybody and people would need to make a choice between feeding their child or their neighbor. Seems a hard, but straightforward choice. But what if the choice is between feeding ones favorite dog and a child across town you'd never met.
An AI could (just as human beings) learn that morality is a point of perspective. And in a way that is pretty scary, because from its perspective it would always 'think' it is making the right and just decisions.