Short-term thinking hinders governments from preparing for human-driven global risks
Governments all over the world lack the understanding needed to plan effectively for global catastrophe, says a recently published report (pdf) by the University of Cambridge's Centre for the Study of Existential Risk. This is as terrifying as it sounds: global interconnectivity, rising population densities, and the spread of ever more powerful technologies, from AI to advanced biotechnology and nuclear weapons, could compound the impact of catastrophes we may not be able to fully control. And that's before we even get into the risks we haven't anticipated yet.
Why are governments underprepared, and what can be done about it? The problem is essentially a disconnect between science and policy, the report says. Policy decisions are often driven by short-term perceptions of what matters, and policymakers are unlikely to make global risk mitigation a priority because it doesn't help them win elections. Long-term planning is seen as important for monetary policy or infrastructure, but not for natural hazards, climate change, or nuclear war. The report recommends that governments hire more scientists and build high-quality internal teams to research risk, strengthen the links between science and policy (so that the former informs the latter), fund independent academic research on catastrophic risk, and devote more resources to attracting scientific and technological expertise.