Human extinction risks are poorly understood and severely underestimated by society. Most worrying is the subset of existential risks that arise from human technology, a subset that is expected to grow in number and potency over the next century.
Bostrom is doing a phenomenal job at the Future of Humanity Institute. I find him to be one of the most exciting thinkers at the moment. We need more people and more institutes supporting research that helps us understand existential risks so that we can reduce the chances of them occurring.
Perhaps this is a bit too cynical, but it may highlight a real shortcoming: we don't seem to have overcome the effect that spatial distance has on our sense of moral obligation. We only seem strongly motivated when there are active cultural bonds.

Well, suppose you have a moral view that counts future people as being worth as much as present people. You might say that, fundamentally, it doesn't matter whether someone exists at the current time or at some future time, just as many people think that, from a fundamental moral point of view, it doesn't matter where somebody is located spatially: somebody isn't automatically worth less because you move them to the moon or to Africa.