sc94597 said:
Soundwave said:

  

Even global warming: I'm not saying it shouldn't be a concern or that people shouldn't take steps to mitigate it, but human beings are not magically going to go extinct even if the global temperature rises by 3-4 degrees. Your ancestors survived the Ice Age, when the planet's climate was brutally different, without the benefit of any modern technology. No working toilets, no insulated homes, no shopping mall to buy a winter jacket and gloves from, no grocery store to get food from, no modern medicine, no real understanding of mathematics or science, and we still survived. You're only here because they were able to survive that.

The reason why climate change is an x-risk has to do with the second-order geo-political effects. 

The consequences of hundreds of millions to billions of people being displaced (Indonesia alone has roughly 150 million people who could be displaced) can't be overstated.

Many people in developed countries go bonkers right now over very low rates of immigration and refugee intake. Combine that with low native birth rates in those countries, plus scarce fresh-water access, and you have the conditions for potential nuclear wars.

All of this is a reality we know is coming if we don't do anything differently.

Climate change is an x-risk. 

AI poses the same risk, frankly, or a larger one. What jobs are even going to be left for your kids to make a living from? What is the home they live in even going to be worth then?

Just understand again, that AI doesn't care what you concern yourself with. If it's allowed to develop, it will develop. 

Think of it like this. You have two boxers. Boxer A is the champion of the world, he's rich, he has a mansion, fancy cars, wife, kids, mistress, luxury vacations, all the aspects of celebrity, etc. etc. that concern his life. As a result he doesn't really pay as much attention to training like he did when he was a broke boxer trying to make it. 

Now you have Boxer B. Boxer A doesn't even view Boxer B as a threat, but Boxer B is training. Every day. Hours and hours every day. He is in the best shape of his life and highly motivated. He doesn't have any of those distractions in his life. 

Now, Boxer B is not going to care that Boxer A was distracted by 100 other things in his life or got soft or whatever once he's knocked Boxer A out on his ass and taken the title from him.

We're not prepared to deal with a threat like AI if it becomes self-learning. We don't have any precedent for how to co-exist with anything remotely as intelligent as us, let alone something that might supersede us. This will be the biggest issue in the lives of the next generation, IMO. And it won't even be close.