
Forums - Sony Discussion - (Rumor) Playstation 6 to be enhanced by generative AI. To feature Ray Reconstruction and Path Tracing

Chrkeller said:

Assuming computers are even capable of independent, cognitive thought. Hence, a theoretical concern.

It just has to reach a point where it's self-learning. Then all bets are off. 

Sure, it'll be great when it can cure cancer ... but to get to that level we are likely going to do something stupid like letting it self-iterate, learn, and "think" to whatever degree by itself. Once it can do that, it may well rapidly become impossible to control. 



Soundwave said:

  

Even global warming: I'm not saying it shouldn't be a concern or that people shouldn't want to take steps to mitigate it, but human beings are not magically going to go extinct even if the global temperature rises by 3-4 degrees. Your ancestors survived the Ice Age, when the climate of the planet was brutally different, without the benefit of any modern technology. No working toilets, no insulated homes, no shopping mall to buy a winter jacket and gloves from, no grocery store to get food from, no modern medicine, no real understanding of mathematics or science; we still survived. You're only here because they were able to survive that. 

The reason climate change is an x-risk has to do with its second-order geopolitical effects. 

The consequences of hundreds of millions to billions of people being displaced (Indonesia alone has some 150 million people who would be displaced) can't be overstated. 

Many people in developed countries go bonkers right now over very low rates of immigration and refugee intake. What happens when a fifth of the world has to move, and likely aims for these wealthy, resource-hoarding countries? Combine that with low native birth rates in these countries and scarce fresh-water access, and you have the conditions for potential nuclear wars. 

All of this is a reality we know is coming if we don't do anything new. 

Climate change is an x-risk. 



sc94597 said:
Soundwave said:

  

Even global warming: I'm not saying it shouldn't be a concern or that people shouldn't want to take steps to mitigate it, but human beings are not magically going to go extinct even if the global temperature rises by 3-4 degrees. Your ancestors survived the Ice Age, when the climate of the planet was brutally different, without the benefit of any modern technology. No working toilets, no insulated homes, no shopping mall to buy a winter jacket and gloves from, no grocery store to get food from, no modern medicine, no real understanding of mathematics or science; we still survived. You're only here because they were able to survive that. 

The reason climate change is an x-risk has to do with its second-order geopolitical effects. 

The consequences of hundreds of millions to billions of people being displaced (Indonesia alone has some 150 million people who would be displaced) can't be overstated. 

Many people in developed countries go bonkers right now over very low rates of immigration and refugee intake. Combine that with low native birth rates in these countries and scarce fresh-water access, and you have the conditions for potential nuclear wars. 

All of this is a reality we know is coming if we don't do anything new. 

Climate change is an x-risk. 

AI poses the same or larger risk, frankly. What jobs are even going to be left for your kids to make a living from? What is the home they live in even going to be worth then? 

Just understand, again, that AI doesn't care what you concern yourself with. If it's allowed to develop, it will develop. 

Think of it like this. You have two boxers. Boxer A is the champion of the world: he's rich, he has a mansion, fancy cars, a wife, kids, a mistress, luxury vacations, and all the other trappings of celebrity that occupy his life. As a result he doesn't pay as much attention to training as he did when he was a broke boxer trying to make it. 

Now you have Boxer B. Boxer A doesn't even view Boxer B as a threat, but Boxer B is training. Every day. Hours and hours every day. He is in the best shape of his life and highly motivated. He doesn't have any of those distractions in his life. 

Now Boxer B is not going to care that Boxer A was distracted by 100 other things in his life or got soft or whatever when he knocks Boxer A out on his ass and takes the title from him. 

We're not prepared to deal with a threat like AI if it becomes self-learning. We don't have any context for how to co-exist with anything remotely as intelligent as us, let alone something that might supersede us. This will be the biggest issue in the lives of the next generation, IMO. And it won't even be close. 



sc94597 said:
Soundwave said:

  

Even global warming: I'm not saying it shouldn't be a concern or that people shouldn't want to take steps to mitigate it, but human beings are not magically going to go extinct even if the global temperature rises by 3-4 degrees. Your ancestors survived the Ice Age, when the climate of the planet was brutally different, without the benefit of any modern technology. No working toilets, no insulated homes, no shopping mall to buy a winter jacket and gloves from, no grocery store to get food from, no modern medicine, no real understanding of mathematics or science; we still survived. You're only here because they were able to survive that. 

The reason climate change is an x-risk has to do with its second-order geopolitical effects. 

The consequences of hundreds of millions to billions of people being displaced (Indonesia alone has some 150 million people who would be displaced) can't be overstated. 

Many people in developed countries go bonkers right now over very low rates of immigration and refugee intake. What happens when a fifth of the world has to move, and likely aims for these wealthy, resource-hoarding countries? Combine that with low native birth rates in these countries and scarce fresh-water access, and you have the conditions for potential nuclear wars. 

All of this is a reality we know is coming if we don't do anything new. 

Climate change is an x-risk. 

Not to mention supply chain decimation and economic collapse.  

Putting AI as the top risk in today's world is one of the silliest positions I've ever seen.




Soundwave said:

AI poses the same or larger risk, frankly. What jobs are even going to be left for your kids to make a living from? What is the home they live in even going to be worth then? 

Just understand, again, that AI doesn't care what you concern yourself with. If it's allowed to develop, it will develop. 

Think of it like this. You have two boxers. Boxer A is the champion of the world: he's rich, he has a mansion, fancy cars, a wife, kids, a mistress, luxury vacations, and all the other trappings of celebrity that occupy his life. As a result he doesn't pay as much attention to training as he did when he was a broke boxer trying to make it. 

Now you have Boxer B. Boxer A doesn't even view Boxer B as a threat, but Boxer B is training. Every day. Hours and hours every day. He is in the best shape of his life and highly motivated. He doesn't have any of those distractions in his life. 

Now Boxer B is not going to care that Boxer A was distracted by 100 other things in his life or got soft or whatever when he knocks Boxer A out on his ass and takes the title from him. 

We're not prepared to deal with a threat like AI if it becomes self-learning. We don't have any context for how to co-exist with anything remotely as intelligent as us, let alone something that might supersede us. This will be the biggest issue in the lives of the next generation, IMO. And it won't even be close. 

Sorry, it is far easier to change an economic system to one that isn't based on wage-labor than to fix decaying ecological systems or solve the problem of mass displacement. 

We've changed economic systems multiple times in the last dozen millennia, and it's a matter of restructuring laws surrounding property and social relationships. 

You can't do the same thing with local and global ecologies. 

As for the fear of some Singleton AI, it is pure fantasy. Assertions based not on empirical reality but on arbitrary assumptions are easily discarded. For every Terminator scenario there is The Culture.

So if somebody is going to argue that an ASI is going to immediately arise and kill us all, it had better be by looking at actual AI systems and understanding what they are. Which you aren't doing. 



Chrkeller said:
sc94597 said:

The reason climate change is an x-risk has to do with its second-order geopolitical effects. 

The consequences of hundreds of millions to billions of people being displaced (Indonesia alone has some 150 million people who would be displaced) can't be overstated. 

Many people in developed countries go bonkers right now over very low rates of immigration and refugee intake. What happens when a fifth of the world has to move, and likely aims for these wealthy, resource-hoarding countries? Combine that with low native birth rates in these countries and scarce fresh-water access, and you have the conditions for potential nuclear wars. 

All of this is a reality we know is coming if we don't do anything new. 

Climate change is an x-risk. 

Not to mention supply chain decimation and economic collapse.  

Putting AI as the top risk in today's world is one of the silliest positions I've ever seen.

There is no supply chain, period, if AI automates most or all jobs. You understand that, right? 

What are you going to pay for anything with (assuming your currency is even worth the paper it's printed on at that point) if you have no job? How is your daughter going to make money? With what career? 

Treating this as an either/or situation is, again, very much a sign of human weakness. We're not built to deal with multiple threats at once; again, evolved monkeys can generally only deal with what's in front of them, one thing at a time. And we can't even do that terribly well. 

That's not ideal at all for dealing with AI. It will develop while humanity is distracted by other things. 

Last edited by Soundwave - on 06 March 2024

Soundwave said:
Chrkeller said:

Not to mention supply chain decimation and economic collapse.  

Putting AI as the top risk in today's world is one of the silliest positions I've ever seen.

There is no supply chain, period, if AI automates most or all jobs. You understand that, right? 

What are you going to pay for anything with (assuming your currency is even worth the paper it's printed on at that point) if you have no job? How is your daughter going to make money? With what career? 

Treating this as an either/or situation is, again, very much a sign of human weakness. We're not built to deal with multiple threats at once; again, evolved monkeys can generally only deal with what's in front of them, one thing at a time. And we can't even do that terribly well. 

That's not ideal at all for dealing with AI. It will develop while humanity is distracted by other things. 

Seriously, I have already explained to you in this thread how an economy that isn't based on wage-labor can work under mass automation.

I just linked a reference to a fictional novel series, written from the '70s through the '00s, that takes place in a post-scarcity society with artificial superintelligence. 

Just because you can't imagine a post-capitalist economy doesn't mean the rest of us are incapable of it. 



sc94597 said:
Soundwave said:

AI poses the same or larger risk, frankly. What jobs are even going to be left for your kids to make a living from? What is the home they live in even going to be worth then? 

Just understand, again, that AI doesn't care what you concern yourself with. If it's allowed to develop, it will develop. 

Think of it like this. You have two boxers. Boxer A is the champion of the world: he's rich, he has a mansion, fancy cars, a wife, kids, a mistress, luxury vacations, and all the other trappings of celebrity that occupy his life. As a result he doesn't pay as much attention to training as he did when he was a broke boxer trying to make it. 

Now you have Boxer B. Boxer A doesn't even view Boxer B as a threat, but Boxer B is training. Every day. Hours and hours every day. He is in the best shape of his life and highly motivated. He doesn't have any of those distractions in his life. 

Now Boxer B is not going to care that Boxer A was distracted by 100 other things in his life or got soft or whatever when he knocks Boxer A out on his ass and takes the title from him. 

We're not prepared to deal with a threat like AI if it becomes self-learning. We don't have any context for how to co-exist with anything remotely as intelligent as us, let alone something that might supersede us. This will be the biggest issue in the lives of the next generation, IMO. And it won't even be close. 

Sorry, it is far easier to change an economic system to one that isn't based on wage-labor than to fix decaying ecological systems or solve the problem of mass displacement. 

We've changed economic systems multiple times in the last dozen millennia, and it's a matter of restructuring laws surrounding property and social relationships. 

You can't do the same thing with local and global ecologies. 

As for the fear of some Singleton AI, it is pure fantasy. Assertions based not on empirical reality but on arbitrary assumptions are easily discarded. For every Terminator scenario there is The Culture.

So if somebody is going to argue that an ASI is going to immediately arise and kill us all, it had better be by looking at actual AI systems and understanding what they are. Which you aren't doing. 

I frankly don't even care where we are today. That's not my concern. My concern is where this will be in 20 or 30 years; that's where even the "well, it's not there today!" folks tend to get a bit quiet and shrug their shoulders.

30 years is not a long period of time; I remember 30 years ago like it was yesterday. 



Soundwave said:
sc94597 said:

Sorry, it is far easier to change an economic system to one that isn't based on wage-labor than to fix decaying ecological systems or solve the problem of mass displacement. 

We've changed economic systems multiple times in the last dozen millennia, and it's a matter of restructuring laws surrounding property and social relationships. 

You can't do the same thing with local and global ecologies. 

As for the fear of some Singleton AI, it is pure fantasy. Assertions based not on empirical reality but on arbitrary assumptions are easily discarded. For every Terminator scenario there is The Culture.

So if somebody is going to argue that an ASI is going to immediately arise and kill us all, it had better be by looking at actual AI systems and understanding what they are. Which you aren't doing. 

I frankly don't even care where we are today. That's not my concern. My concern is where this will be in 20 or 30 years; that's where even the "well, it's not there today!" folks tend to get a bit quiet and shrug their shoulders.

30 years is not a long period of time; I remember 30 years ago like it was yesterday. 

What exists today, and the research being done today, set the precedent and determine the range of possibilities for what exists 20-30 years from now.



sc94597 said:
Soundwave said:

I frankly don't even care where we are today. That's not my concern. My concern is where this will be in 20 or 30 years; that's where even the "well, it's not there today!" folks tend to get a bit quiet and shrug their shoulders.

30 years is not a long period of time; I remember 30 years ago like it was yesterday. 

What exists today, and the research being done today, set the precedent and determine the range of possibilities for what exists 20-30 years from now.

Or 40? Or 50 years? It's not terribly difficult to scale this to a timeline where the so-called experts shrug their shoulders and basically concede, "well, good luck." 

35 years ago the most cutting-edge piece of technology in most people's homes was an NES; I remember most homes didn't even have a personal computer at that time. Five years prior to that? Maybe a VCR. 

Today my refrigerator has several times the processing power of an NES. 

And we're not even accounting for AI itself designing chips, technology, and even other AI, which will probably happen at some point if you keep pouring billions or trillions of dollars into its development and have all these mega-corporations hyper-incentivized to create something better and better.