Sean Carroll interviewed Martin Rees on his podcast, Mindscape. The episode description reads:
"Anyone who has read histories of the Cold War, including the Cuban Missile Crisis and the 1983 nuclear false alarm, must be struck by how perilously close humanity has come to wreaking incredible destruction on itself. Nuclear weapons were the first technology humans created that was truly capable of causing such harm, but the list of potential threats is growing, from artificial pandemics to runaway super-powerful artificial intelligence. In response, today’s guest Martin Rees and others founded the Cambridge Centre for the Study of Existential Risk. We talk about what the major risks are, and how we can best reason about very tiny probabilities multiplied by truly awful consequences. In the second part of the episode we start talking about what humanity might become, as well as the prospect of life elsewhere in the universe, and that was so much fun that we just kept going."