
Forums - Sony Discussion - (Rumor) Playstation 6 to be enhanced by generative AI. To feature Ray Reconstruction and Path Tracing

Chrkeller said:
Soundwave said:

You really want to co-exist with an intelligence that is smarter than any human on the planet? 

You really can't see how that could go very wrong very quickly? 

A human being is not being malicious (in most cases) when they have a fly in the room. But if the fly is buzzing all night long and the human decides to squash it dead with a newspaper, it's not personal. The person has just reached a point where they don't want to co-exist with a fly because they're trying to get some sleep, and the fly was too stupid to understand its existence was under threat by flying around.  

Now think about your enthusiasm for this kind of a scenario long and hard once again. 

Yes, it might help you with COVID. It may also decide a few years down the road that it's tired of just working for a bunch of morons that are beneath its intelligence level, and then good luck. 

I'm honestly not worried.  I'm far more concerned about Putin and Trump than I am with Skynet.  The US might soon have no democracy and a madman with the nuclear codes.  AI is the least of my concerns.  AI would make a better president.  

All that tells me is that kind of attitude is exactly how AI will grow out of control. People will just willingly bend over for it and probably not realize what they have done, then they'll panic and try to shut it off but it'll be far too late. 



Soundwave said:
Chrkeller said:

I'm honestly not worried.  I'm far more concerned about Putin and Trump than I am with Skynet.  The US might soon have no democracy and a madman with the nuclear codes.  AI is the least of my concerns.  AI would make a better president.  

All that tells me is that kind of attitude is exactly how AI will grow out of control. People will just willingly bend over for it and probably not realize what they have done, then they'll panic and try to shut it off but it'll be far too late. 

I'm far more concerned about current issues. The Middle East, Russia, Trump, etc.  



Chrkeller said:
Soundwave said:

All that tells me is that kind of attitude is exactly how AI will grow out of control. People will just willingly bend over for it and probably not realize what they have done, then they'll panic and try to shut it off but it'll be far too late. 

I'm far more concerned about current issues. The Middle East, Russia, Trump, etc.  

Exactly. Human beings more or less can only really process what's in front of them. Pitted against an intelligence that theoretically doesn't just concern itself with the immediate goings-on (serious or not) of "current human events" but can plan things far further in advance ... that would be a massive problem. 

We're not equipped to handle this kind of threat, not even close. If it develops to the point where it can think to some degree, and can learn, and learn exponentially ... we could be in trouble very quickly, probably not even able to understand what's happened before it's too late. 

Last edited by Soundwave - on 06 March 2024

Soundwave said:
Chrkeller said:

I'm far more concerned about current issues. The Middle East, Russia, Trump, etc.  

Exactly. Human beings more or less can only really process what's in front of them. Pitted against an intelligence that theoretically doesn't just concern itself with the immediate goings-on (serious or not) of "current human events" but can plan things far further in advance ... that would be a massive problem. 

We're not equipped to handle this kind of threat, not even close. If it develops to the point where it can think to some degree, and can learn, and learn exponentially ... we could be in trouble very quickly, probably not even able to understand what's happened before it's too late. 

Maybe.  But for now I'm concerned about my daughters' bodily rights.  

Fear of AI just doesn't make my list of concerns.  



Chrkeller said:
Soundwave said:

Exactly. Human beings more or less can only really process what's in front of them. Pitted against an intelligence that theoretically doesn't just concern itself with the immediate goings-on (serious or not) of "current human events" but can plan things far further in advance ... that would be a massive problem. 

We're not equipped to handle this kind of threat, not even close. If it develops to the point where it can think to some degree, and can learn, and learn exponentially ... we could be in trouble very quickly, probably not even able to understand what's happened before it's too late. 

Maybe.  But for now I'm concerned about my daughters' bodily rights.  

Fear of AI just doesn't make my list of concerns.  

I would say it probably should.

It will likely be the biggest issue in your daughters' lives, not the politics of today. That will at some point probably be seen as quaint, and people will be nostalgic for this time right now. 

But that's human nature. We're terrible at sizing up threats that aren't directly in front of us as an obvious, immediate danger. We're just evolved monkeys; our brains can only handle so much, so they prioritize what's immediately in front of us. We're not evolved to handle much more than that. But that evolutionary limitation won't serve us well if AI goes out of control. 



Soundwave said:
Chrkeller said:

Maybe.  But for now I'm concerned about my daughters' bodily rights.  

Fear of AI just doesn't make my list of concerns.  

I would say it probably should.

It will likely be the biggest issue in your daughters' lives, not the politics of today. That will at some point probably be seen as quaint, and people will be nostalgic for this time right now. 

But that's human nature. We're awful at sizing up threats that aren't directly in front of us. We're just evolved monkeys; our brains can only handle so much, so they prioritize what's immediately in front of us. But that evolutionary limitation won't serve us well if AI goes out of control. 

Lol.  I'm pretty sure my two daughters losing their constitutional rights and having to potentially weather another insurrection is far more relevant than a movie-based fear of something that might happen in 20 years.  

Edit

Assuming they don't get shot while in the classroom.  



Chrkeller said:
Soundwave said:

I would say it probably should.

It will likely be the biggest issue in your daughters' lives, not the politics of today. That will at some point probably be seen as quaint, and people will be nostalgic for this time right now. 

But that's human nature. We're awful at sizing up threats that aren't directly in front of us. We're just evolved monkeys; our brains can only handle so much, so they prioritize what's immediately in front of us. But that evolutionary limitation won't serve us well if AI goes out of control. 

Lol.  I'm pretty sure my two daughters losing their constitutional rights and having to potentially weather another insurrection is far more relevant than a movie-based fear of something that might happen in 20 years.  

I'm not saying don't be concerned with the politics of today. I'm just saying: an A.I. that doesn't sleep, isn't distracted by petty human squabbles and politics, and isn't interested in distractions like entertainment, social media, family matters, work, career ambitions, or what's going to happen in the next season of House of the Dragon ... an entity like that, one that could learn exponentially, 24/7, non-stop ... just understand it's not going to care what you're preoccupied with. 

For children today, I absolutely think AI will be the biggest issue of their future. For us, in the here and now, ChatGPT or Midjourney spitting out pictures is fine and dandy; the problem is where this will be in 10, 20, 30 years. 



Soundwave said:
Chrkeller said:

Lol.  I'm pretty sure my two daughters losing their constitutional rights and having to potentially weather another insurrection is far more relevant than a movie-based fear of something that might happen in 20 years.  

I'm not saying don't be concerned with the politics of today. I'm just saying: an A.I. that doesn't sleep, isn't distracted by petty human squabbles and politics, and isn't interested in distractions like entertainment, social media, family matters, work, career ambitions, or what's going to happen in the next season of House of the Dragon ... an entity like that, one that could learn exponentially, 24/7, non-stop ... just understand it's not going to care what you're preoccupied with. 

For children today, I absolutely think AI will be the biggest issue of their future. For us, in the here and now, ChatGPT or Midjourney spitting out pictures is fine and dandy; the problem is where this will be in 10, 20, 30 years. 

The divide is you are concerned about a theoretical threat.  I'm concerned about actual threats.  



Chrkeller said:
Soundwave said:

I'm not saying don't be concerned with the politics of today. I'm just saying: an A.I. that doesn't sleep, isn't distracted by petty human squabbles and politics, and isn't interested in distractions like entertainment, social media, family matters, work, career ambitions, or what's going to happen in the next season of House of the Dragon ... an entity like that, one that could learn exponentially, 24/7, non-stop ... just understand it's not going to care what you're preoccupied with. 

For children today, I absolutely think AI will be the biggest issue of their future. For us, in the here and now, ChatGPT or Midjourney spitting out pictures is fine and dandy; the problem is where this will be in 10, 20, 30 years. 

The divide is you are concerned about a theoretical threat.  I'm concerned about actual threats.  

The "theoretical threat" could be species extinction, though. The "current-day politics threats" are not. 

Even with global warming: I'm not saying at all that it shouldn't be a concern or that people shouldn't want to take steps to mitigate it, but human beings are not magically going to go extinct even if the global temperature rises by 3-4 degrees. Your ancestors survived the Ice Age, when the planet's climate was brutally harsher, without the benefit of any modern technology. No working toilets, no insulated homes, no shopping mall to go buy a winter jacket and gloves from, no grocery store to get food from, no modern medicine, no real understanding of mathematics or science, and we still survived. You're only here because they were able to survive that. 

You may not be concerned with A.I., but your kids will be when they are your age. Don't kid yourself about that. I think it will be the pressing issue for their generation. 

We don't have any experience of co-existing with a human-level (or better) intelligence. The closest intelligences we've gone head to head with are bonobos and chimpanzees (basically our cousins) or dolphins; we're not remotely ready for what could be coming. 



Assuming computers are even capable of independent, cognitive thought. Hence, a theoretical concern.