Terrified robots will take middle class jobs? Look in a mirror
Reg man gives talk on displacement anxiety
Andrew at Large

At the Battle of Ideas Festival at the Barbican last year, Claire Fox chaired a panel titled "Is Technology Limiting Our Humanity?", and invited me to take part. Panellists could give a seven-minute introduction.
It's now online as a video and podcast*.
Two avenues looked promising, and I'll give you excerpts from each. One comes up almost every week: the great behavioural data slurp that passes for a "digital economy." That requires us to give up valuable rights (like owning stuff, aka copyright and patents).
The other was flagged in the session blurbs: the fear of humans being made redundant by robots and rapid advances in AI. This has suddenly become a concern of the chattering classes, who didn't mind machines taking the place of unskilled labour, then skilled labour, but are alarmed at the prospect of machines doing the jobs they do.
But there's something oddly missing from this techno panic. Here's what I had to say about robots taking your job.
So have a read and sound off.
"The outline of the talks really covers two different but related things. One is huge advances in robotics and AI. Robots and systems, we're told, will make complex ethical judgments and do creativity, and so replace middle class jobs. The other is the fear of choices made by planners and designers over our lives – this involves no AI magic – which is a more incremental kind of change.
Fear of supersmart robots or AI destroying jobs that require human judgement is an example of what I call – for want of a better phrase – displacement anxiety. It's an anxiety about things that aren't really a problem, or the most pressing problem, as a way of avoiding things that are. Displacement anxiety requires fantasies, perhaps even conspiracies, but what do these tell us? What are we running away from?
My reading of the AI literature doesn't bear out the first scenario. The idea that robots will be taking middle class jobs because they're better at judgment or creativity seems to me a masturbatory fantasy for the chatterati.
As Jaron Lanier points out, to make computers seem intelligent, we first have to make ourselves really dumb.
And once we're dumb, we're super impressed by what a computer just did! But what I see is a hollowing out of the professions as they surrender their own independent judgment. And it's something they've been doing for years.
GPs are a ready example. In France, you don't see a GP for less than an hour, while in the UK the length of your visit will be measured in seconds. GPs rarely look up from reading the-NHS-version-of-Wikipedia on their computer, and without too much study, the processes of the GP can be learned. They're already mechanical.
(I recently changed my GP, because far from being pleased that I'd cut down from 25 to 3 cigarettes a day, he began to warn me about vaping. In his case, moralising has replaced healthcare.)
Another example can be found in the media.
Editors look at which hashtags are trending, and instruct that material be generated to feed that demand. The material is published in such a way as to lure the Google News or Facebook algorithms. Behind the scenes, robots place the ads that accompany the material. More robots then click the ads ... around half of the ads served are never seen. It's not really a "writer-reader" relationship any more. It's a robot-robot relationship. At which point utilitarian logic dictates that it makes sense to remove the middleman – the human.
And professional politics is increasingly run on behaviourist lines. To paraphrase Anton Wylie writing in The Register in 2008:
"The tendency remains to treat people as parametrically determined objects. The phrase 'hearts and minds' admits that people feel and think, but implies that what matters is to ascertain which feelings and thoughts affect them most strongly. Modern politics consists to a large extent of this type of appeal."
The social sciences have followed down this rabbit hole. Behaviourism and the cybernetics of Facebook are actually close cousins. Is politics borrowing from Facebook? Or is Facebook becoming a 'better focus group' and prodding laboratory for politicians? It doesn't matter. To each, people are just cogs in the system.
My point is: when the professional classes hollow themselves out, as the media, GPs, academics and political professionals have done, they can't really blame automation. They've automated themselves.
And perhaps that's why they are so ready to fantasise about machines?"
Want more? Here's part two.
Comments welcome. ®
* We know you want a transcript, though. We read the comments. – Subs desk