Managing the runway to Doomsday

A full-spectrum analysis of the science of doomsdayology might see humanity’s deliberations divide broadly into three categories. At one end, where ignorance and insanity fight it out for a place at the bottom of the table, the brimstone peddlers of Revelation and the snake-oil salesmen of cults and corruption cry out, never calling the End of Days so soon that they cannot profit in some way, yet never so distant as to prevent them innervating the credulous with the literal Fear of God.

At the other end, providing more than enough ballast for that intellectual vacuity, are the impressively serious academic institutes that will leave you marvelling at the cognitive compass of our species. These include but are not limited to the Centre for the Study of Existential Risk in Cambridge; the Future of Humanity Institute in Oxford; the geographically decentralised Global Catastrophic Risk Institute; and the Machine Intelligence Research Institute (MIRI) in California.

Midway, we have the crooked timber of humanity just goofing along, although perhaps inclining more to our first polarity when wondering whether street lighting, polyphonic music or lightning rods might impede God’s will. More responsibly, but without marked improvement in predicting the end of civilisation, we have the deliberators over the Industrial Revolution and the advent of the railway; the physicists who worried that nuclear explosions would ignite the atmosphere, or that the Large Hadron Collider would cleave the planet in two. Then there’s recombinant DNA, Y2K, genetically modified food, global warming and, bringing us up to date, killer robots. And have fun spotting the ones that really are existential threats.

Maybe the answer lies in recombinant AI&HI, on the basis that HI will only get so far with academic reflection, bereft as it still is of the revelations AI has yet to deliver.
