
One of the biggest issues I have with x-risk arguments is that they don't consider how much AI can mitigate other x-risks.

Climate change and its effects, for example, are far more likely to cause human extinction, in my opinion, than some fantasy unaligned singleton.

The same goes for the demographic collapse occurring in developed countries due to low birth rates.

When accounting for the x-risk of AI, we need to weigh it against other x-risks and consider how AI can be used to reduce them.

And that is on top of the more immediate benefits, like how AlphaFold effectively solved the protein-folding problem.