Not for this thread, I know, but it's the most serious threat mankind has ever faced, and hardly anyone talks about it. Quite conceivably every single human being could be dead in 20 or 30 years, wiped out. How serious does a threat have to be? Nuclear armageddon is piffling compared to this; climate change, irrelevant fluff.
“There is a 10-20 percent probability that within the next thirty years, Artificial Intelligence will cause the extinction of humanity.” - Geoffrey Hinton
“Superintelligent AI ... could lead to ‘the disempowerment of humanity or even human extinction,’ since engineers are unable to prevent AI from ‘going rogue.’” - Ilya Sutskever
“It is entirely possible that artificial intelligence (AI) could lead to the extinction of Homo sapiens." - Nick Bostrom (Philosopher, University of Oxford)
“The real problem with AI is not malice, it's incompetence. ... Humanity's strategy is to learn from mistakes. When the end of the world is at stake, that is a terrible strategy.” - Max Tegmark (MIT)
“I have a sense that our current remaining timeline looks more like five years than 50 years. Could be two years, could be 10. The difficulty is, people do not realise, we have a shred of a chance that humanity survives.” - Eliezer Yudkowsky