Superintelligence as a Cause or Cure for Risks of Astronomical Suffering
Discussions of the possible consequences of creating superintelligence have included the possibility of existential risk, usually understood as the risk of human extinction. We argue that suffering risks (s-risks), risks of suffering on an astronomical scale, are comparable to existential risks in both severity and probability. Just as with existential risks, s-risks can be caused as well as reduced by superintelligent AI.