Humanity can defeat SkyNet with BOOKS, says IT think tank
If CompSci kiddies read Neal Stephenson and Dave Eggers, our species will endure
A group of researchers working for National ICT Australia reckons computer science courses need to look at artificial intelligence from an ethical point of view – and the popularity of sci-fi among comp.sci students makes that a good place to start.
As the research team – which included NICTA's Nicholas Mattei, the University of Kentucky's Judy Goldsmith and Centre College's Emanuelle Burton – explain in their paper, ethical questions arise in a variety of AI environments: the “mechanics of the modern military” and the “slow creep of a mechanized workforce”, for example.
“We have real, present ethics violations and challenges arising from current AI techniques and implementations, in the form of systematic decreases in privacy; increasing reliance on AI for our safety, and the ongoing job losses due to mechanization and automatic control of work processes,” the paper states.
Computer science courses, they reckon, fall short in the ethical debate, even though “AI professionals come up against ethical issues on a regular basis”.
Such things have, they note, been argued in sci-fi for decades, and from a teaching point of view the genre provides settings that allow “students to detach from personal preconceptions.”
Sci-fi also has the advantage of being a popular framing for teaching AI and ethics.
What's interesting to The Register is the corpus cited in the paper: Melissa Scott's The Jazz, various Neal Stephenson works (Reamde, The Diamond Age), Scott Westerfeld's Extras, Gary Shteyngart's Super Sad True Love Story, Mary Doria Russell's The Sparrow, Dave Eggers' The Circle, and a nod to film by way of Charlie Chaplin's Modern Times.
Vulture South would be interested in what readers think: what other works in the world's vast store of science fiction encapsulate the ethics of artificial intelligence?
We'll offer Philip K Dick's Ubik, in which people have to negotiate payment with their apartments to get in or out – but we're certain that you can do better … ®