Singapore invests in TIA snake oil
Poindexter's dream comes true
Comment Retired US Admiral and convicted felon John Poindexter has been a busy man since Congress scrapped his Total Information Awareness (TIA) system and punctured his Orwellian dream of linking every government database imaginable in pursuit of evildoers, as Wired News reports.
Indeed, the Iran-Contra scandal alum has a seat on the board of BrightPlanet, a provider of data extraction and mining tools, and has just been speaking in Singapore, where officials welcomed him to the unveiling of his TIA counterterrorist brainchild, finally sold to an unsuspecting government.
The Singaporean version of TIA, currently a working prototype scheduled for deployment in the autumn, is called Risk Assessment and Horizon Scanning (RAHS). Its development has been influenced by two former consultants for Poindy's TIA system: John Peterson of the Arlington Institute, and Dave Snowden of Cognitive Edge.
Peterson bears the marks of a techno-utopian who might be harmless if he weren't involved in something this serious. His mission statement says it all, in words we have heard a thousand times, from a thousand tech-enthused lotus eaters. He speaks of an "exponential increase of human knowledge, and the acceleration of its application through technology", which is "propelling humanity towards a new era of thought and endeavour". He believes that "we are living in an era of global transition, to a degree that our species has never seen before."
An "exponential increase of human knowledge" indeed. Noise, not knowledge, is the thing that information technology is helping to increase exponentially. Instant messaging, email, social networking, BlackBerries, mobile phones, cable television, satellite radio...noise, noise, noise. Everyone's got something to say, and everyone is saying it. But still no cure for cancer, as they like to point out at Fark.com.
Snowden appears more pragmatic, displaying prominently on his website the model Poindexter used to impress the masses at the RAHS Symposium, and taking time to disagree with it to some extent. Still, like any good pragmatist, Snowden reckons there might be some useful bits in what Poindy had to say on the subject of data mining, and he intends to evaluate those bits with greater care.
Point-and-drool national security
We have heard a technologically illiterate mainstream press agonising over the notion that major terrorist attacks in the West could have been prevented if only our tech-savvy national security geeks had "connected the dots" in time. If only they had been able to separate the signal from all that noise.
Of course, all the data needed to detect the impending attacks in New York, Madrid, and London existed, but there has been no end to speculation by bureaucrats, legislators, and naive journalists that these atrocities could have been prevented if only the right sort of information technology had been in place at the time. If only some magic filter could have sifted through the noise and saved those people's lives.
Such a push-button solution is what everyone wants, so it, or rather the illusion of it, is what companies like BrightPlanet, Cognitive Edge, and scores of others are selling. "Instead of having analysts trawl through huge amounts of data to decide what it means, the data is tagged very quickly, then they decide what the patterns in the metadata mean," Cognitive Edge's Snowden explains to Wired.
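The tag-first, analyse-later workflow Snowden describes can be sketched in a few lines. The keyword tagger and the tag names below are illustrative assumptions, not anything from RAHS or Cognitive Edge; the point is only the shape of the pipeline: a cheap first pass attaches metadata, and analysts then look at patterns in that metadata rather than reading the raw data.

```python
# Minimal sketch of "tag the data quickly, then analyse the metadata".
# The keyword-to-tag table and the toy documents are assumptions for
# illustration only.
from collections import Counter

KEYWORD_TAGS = {
    "shipment": "logistics",
    "transfer": "finance",
    "meeting": "contact",
}

def tag(document: str) -> set[str]:
    """Cheap first pass: attach metadata tags, no deep analysis."""
    words = document.lower().split()
    return {t for kw, t in KEYWORD_TAGS.items() if kw in words}

documents = [
    "wire transfer confirmed before the meeting",
    "shipment delayed, reschedule the meeting",
    "routine transfer of funds",
]

# Second pass: analysts inspect patterns in the metadata, not the text.
pattern_counts = Counter(frozenset(tag(d)) for d in documents)
for tags, n in pattern_counts.items():
    print(sorted(tags), n)
```

Of course, the sketch also shows where the scheme leans hardest: everything downstream depends on the quality of that quick first-pass tagging.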
The idea of some massively complex "system" silently beavering away, sifting and gathering and analysing disparate bits of data from the vast ocean of noise surrounding us has an appeal that is universal and everlasting. A silent guardian, a tireless mechanical brain; a slave intelligence without distractions like hunger and thirst and sexual urges, immune to fear, anger, laziness, or pettiness. What a wonderful world it would be.
But it's Peterson, not surprisingly, who provides the money quote, feeding the dream with words that every tech believer longs to hear. "Essentially, [RAHS is] a strategic tool that ties together every one of the agencies in a government into a large network that is constantly scanning the horizon looking for weak signals that point toward the possibility of a significant event that would have important implications for Singapore," Wired quotes him as saying.
There are other, more meaningful ways to describe what Peterson means by his exceptionally vague phrase, "weak signals". A vague phrase is useful whenever a more accurate one would exude an unfortunate air of truth, and here one is deployed with care. "False positives" would give us some precision in place of "weak signals", and it names one of the defining features of data mining, a moderately useful marketing tool now promoted to the status of a national security crystal ball.
Only, it's never going to work. For example, in the past five years we've seen our airports become hubs of data mining and analysis. Not surprisingly, we've seen many thousands of innocent passengers detained, questioned, bullied, inconvenienced, and embarrassed, while not one terrorist has ever been caught. The rate of false positives appears to be one hundred per cent.
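The arithmetic behind that dismal record is simple base-rate reasoning, and it is worth making explicit. The numbers below are illustrative assumptions, not figures from any real screening programme: even granting the system a generously accurate detector, the rarity of actual terrorists among passengers means nearly everyone it flags is innocent.

```python
# Base-rate arithmetic: why screening for very rare events drowns in
# false positives. All numbers here are illustrative assumptions.

def screening_outcomes(population, prevalence, sensitivity, false_positive_rate):
    """Return (true_positives, false_positives) for a screening test."""
    targets = population * prevalence
    innocents = population - targets
    true_positives = targets * sensitivity
    false_positives = innocents * false_positive_rate
    return true_positives, false_positives

# Suppose 1 terrorist per 10 million passengers, a detector that catches
# 99 per cent of them, and a mere 1 per cent false-positive rate --
# generous assumptions for any data-mining system.
tp, fp = screening_outcomes(
    population=10_000_000, prevalence=1e-7,
    sensitivity=0.99, false_positive_rate=0.01)

print(f"true positives:  {tp:.2f}")   # roughly one actual terrorist flagged
print(f"false positives: {fp:,.0f}")  # roughly 100,000 innocents flagged
precision = tp / (tp + fp)
print(f"chance a flagged passenger is a terrorist: {precision:.5%}")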
And as for false negatives, surely, in the past five years, at least a few terrorists have flown commercially, and perhaps quite a few. They're not being caught because, unlike the dumb technological tools deployed against them, human adversaries learn. When one thinks of data mining as a threat, one takes steps to avoid detection. Innocent people don't take steps to protect themselves so they get "caught" every day. Meanwhile, the terrorists run rings around the national security agencies and their magic machinery.
Sponsored: Protecting mobile certificates