The Modern MACH Man
So there we have it: the ultimate in wearable computing, a compact, subcutaneous computer able to communicate by speech. This being the modern age, of course it would have wireless links too - cellular for range and ubiquity - for interrogating remote databases, plus GPS to keep our secret agent on track to his destination and to find the quickest escape route. Again, all this is possible now: we have the technology, we have the capability.
The question is, of course, whether our man would need it at all. The very technology that would allow a computer to be miniaturised and embedded in a secret agent’s skull as per the original MACH 1 is already there in his hand: it’s his phone.
Give him a hands-free headset and a feed from base, and he can be continually aided by humans at home who can monitor his progress using military grade GPS tech and, at least while he’s outdoors, satellite-mounted cameras. Modern agents don’t need to undergo expensive surgery to get real-time data feeds - just a trip to a local Carphone Warehouse.
That’s the trouble with looking ahead and wondering how technology might evolve. Writers often get the broad thrust right - as Pat Mills did, anticipating the miniaturisation of advanced computing power - but when it comes to the details they are usually out by miles. They’re interested in what we can’t do now but would like to. What they forget is that it’s the technologies we discover without knowing what they might make possible that end up changing the world.
Take the microprocessor. Integrating circuits is one thing, and a necessary step toward the greater miniaturisation of electronics, but using the technique to create a programmable logic device that, at first, was less powerful than any computer of the time... well, no wonder Intel, back in the early 1970s when the first microprocessors were being created, thought its future lay in memory chips. Even if the company could have foreseen the personal computer revolution, and perhaps made the logical leap off the desktop and into users’ hands, would it have conceived a world just 50 years hence in which people own laptops, desktops, tablets and smartphones, all of them general purpose computing devices?
Pat Mills at least envisaged an interesting new form-factor: the cranium. Of course, he created MACH 1 to entertain ten-year-old boys, not forecast the future. John Probe’s computer ally is a plot device first, prediction second, and an unintentional one at that. Post World War II science-fiction is rarely deliberately predictive. Any fiction set in the future is going to make implicit predictions about what technology humans will be using at that time, but it’s important not to forget that these ‘forecasts’ are first and foremost tools to help tell a story.
Think about Star Trek, perhaps the most enduring example of SF widely believed to be intentionally predictive, especially by Americans working in the technology industry, most of whom seem to be Trekkies. The show’s Communicator is held by some to be a mobile phone prototype, but it’s really just a futuristic walkie-talkie. There’s no indication that everyone in the Federation or beyond possesses such a device and uses it not only to talk to people over great distances but to read the news, play games, navigate on foot, tell other folk what they’re currently doing, and take pictures. Star Trek: The Next Generation may have miniaturised the Communicator into a badge, but it’s still just a voice device, not a smartphone.
Not to pick on Trek - few novels, comics, films or TV series forecast what mobile phones have become and their impact on today’s society, let alone tomorrow’s. Those that hinted at widespread phone use didn’t see text messaging coming or - despite the Sony Walkman - that we’d be downloading music to them. Most SF authors who got wind of the internet in the 1970s and 1980s didn’t suggest we’d be turning to it on mobile devices for daily news and telling World+Dog in 140 characters or fewer what we’re doing at any given moment. There is no Twitter, no Facebook in William Gibson’s early novels, for instance.
Just like MACH 1’s cranial computer, it’s easy to come up with ‘impossible now, possible sometime’ technologies, but rather harder to work out whether they’ll find a use - and even more to spot other technologies that make them redundant. ®
Making MACH 1: Can we build a cranial computer today?
Re: Two words
Well, sort of. It's not the brain per se but the ability to learn language that's crucial. Every child will learn to speak between the ages of one and five. During that window they learn (by imitation and emulation) the sounds and words that make up their language. This is an involuntary process and it's on a timer - if the process has not been kicked off by five or six years old, then it becomes harder and harder. This is why many deaf people who have been taught to speak sound odd to us. It's also where accents come from, because the sounds that you learn to use to make language become unconscious. This is why many continentals have such trouble with the English "th". They have to work to learn it as older children and most of them can't or won't. (Disclaimer: I speak four languages, have a German wife and live in the Netherlands, so this is not Euro-bashing - it's experience).
Cochlear implants have a huge advantage over hearing aids because they are more sensitive; they have a huge disadvantage because the "sounds" that they generate in the wearer's head are not sounds that we would readily recognise. If they are the first sounds that children hear then yes, they can form the basis of learning to speak. My daughter grew up in an environment where German, Dutch and English were spoken interchangeably, and so she speaks all three without accent, as does her (hearing) younger sister.
With adults who have never heard, results of implantation are almost always disappointing. Hearing adults deafened by age or misfortune are better at adapting because they know what to expect and they can adapt to it.
The fact is, this is a game-changer for the human race. When we were still in the diagnostic phase a German doctor said to me "From now on, no German child will ever need to learn sign language". The doctor who implanted my daughter describes the CI as "die einzige Sinnesprothese" - the only sensory prosthesis. During the last ten years I have seen serious effort being put in by the deaf community to <a href="http://en.wikipedia.org/wiki/Sound_and_Fury_(film)">turn back the tide</a>, and I have seen with my own eyes children who could have learned to speak being crippled by withholding implantation until they were five years old. The difficulty the child then has learning to speak is held up as an example by defenders of sign language that "these things don't work, see? Sign language is better".
My daughter attends a normal secondary school, although she is a year younger than the rest of her class. That's not attributable to the CI, but the opportunity that she has to complete a regular education is. As is the fact that she has a clear speaking voice even when she's not "plugged in". A relatively small investment by my medical insurer (40k Euros) has made the difference between a future taxpayer and a charity case. It's a no-brainer.
BTW sorry about the trumpet-blowing but I'm proud of my daughter and I'm not ashamed of that.
"Fanbois would have to undergo painful upgrade surgery"
I fail to see the downside...
Re: Memory is the second thing to go
Hundreds of years? I think decades is far more likely. Hundreds of years back from today, the fastest anyone had ever travelled was the top speed of a horse, and the most complicated device of the day was probably a clock - which was rarer and more expensive than a spacecraft is today.
When you think of the enormous strides in science and technology we've made in just the last 50 years, I think it's quite reasonable to say that in another fifty years we will be able to create and implant artificial nerves and memory.
Re: Star trek's communicator?
Read it anyway, even if you don't.
Sci-fi has also been amazingly prescient.
Imagine a small handheld book that contained a vast amount of, usually inaccurate or unhelpful, information about every subject known to man (or Vogon).
There you go, The Hitch-Hiker's Guide to the Galaxy was only Wikipedia on a tablet.