The Register Lecture: AI turning on us? Let's talk existential risk
Cambridge comes to Clerkenwell to talk Existential Risk
A sneaking fear that the machines might turn on us is just not good enough: we need to be able to quantify that risk if we want to avoid it, or at least manage it. Or we could just push on regardless and see how things work out.
Whatever your take, we’re thrilled to have Dr Adrian Currie of Cambridge’s Centre for the Study of Existential Risk joining us for a Register Lecture on April 25 to discuss How Can We Develop a Science of Existential Risk?
As Adrian puts it, existential risks are threats to the very existence of the human species. Old-school ones, such as meteor strikes, massive volcanic eruptions and climate change, leave traces for us to study. Others are much trickier to track, such as the technological developments that have enabled our species to have unprecedented effects on a global level.
So, to reap the benefits of AI, automation, synthetic biology, advanced gene-editing techniques and the like without, well, imperilling our very existence, we need to find a way of understanding, communicating and minimizing those risks.
Adrian believes that a science of existential risk must be speculative and creative: “Its targets are complex, obstinate and little understood and the incentive structures which govern science (I’ll argue) actually discourage that kind of work. Which means we need to rethink what science looks like, and perhaps the role of scientists in society.”
This journey into the future begins at the Yorkshire Grey on Theobalds Road, London, on April 25. Doors will be open from 6.30pm, with the lecture proper starting at 7pm. As ever, refreshments of the liquid and solid variety will be available.
We’ll break for a drink and a bite following Adrian’s presentation, after which the floor will be open to questions. It promises to be a fascinating evening, and we look forward to seeing you there. ®
You can see all our upcoming lectures, and videos of our previous lectures, right here.