Original URL: https://www.theregister.com/2007/01/06/wtf_is_information_part2/

A Brief History of Information

From Shannon to Dar Interweb

By Ted Byfield

Posted in Bootnotes, 6th January 2007 10:02 GMT

Part 2 In the centuries of use before its modern redefinition, as we've seen in Part 1, "information" had already toted up a formidable list of ambiguities. For example, it's an action in some usages and a thing in others, it's both singular and plural, and it's both an informal assertion of fact and a procedure for making a formal statement.

These slippery qualities made "information" a very amenable candidate as cybernetics pioneer Claude Shannon (and others) sought to name their developing, doubly negative idea of the reduction of uncertainty. They also seem to have made the word resistant to efforts to fix it with a precise or stable new meaning. So, in addition to its long-standing contradictory substance, Shannon's efforts added still more paradoxical attributes: information became something abstract yet measurable, significant but not meaningful, and, last but not least, present wherever communication occurs yet nowhere to be found.
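To get a rough feel for what "measurable" means here - a minimal sketch, not drawn from the article or from Shannon's papers - his measure assigns a source of messages an average uncertainty, in bits, of H = -Σ p·log2(p) over the probabilities of its possible symbols. A few lines of Python, using an invented four-symbol source, show that the figure depends only on probabilities, never on what the symbols mean:

    import math

    def entropy_bits(probabilities):
        # Shannon entropy: H = -sum(p * log2(p)), the average uncertainty per symbol.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A hypothetical source with four symbols; they could "mean" anything at all.
    fair = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty: 2.0 bits per symbol
    skewed = [0.7, 0.1, 0.1, 0.1]     # more predictable: about 1.36 bits per symbol

    print(entropy_bits(fair), entropy_bits(skewed))

The number says nothing about what a message signifies; it says only how much uncertainty is resolved when a symbol arrives - the "significant but not meaningful" quality described above.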

In another context, Shannon's research across such maddeningly disparate fields might have been the mark of a polymath or a dilettante; but in the decades surrounding the war, his erratic trajectory (along with those of many others) was both a sign of, and a key to, the development of a new field, cybernetics.

Cybernetics examined the level, or levels, of abstraction "above" disciplinary differences. Norbert Wiener, widely credited as the father of cybernetics, called it the science of "control and communication in the animal and the machine"; for Louis Couffignal, an early French pioneer in the field, it was "the art of ensuring the efficacy of action". For our purposes, the study of how feedback systems work was best summed up by its dandyish British practitioner, Gordon Pask:

"Its interdisciplinary character emerges when it considers economy not as an economist, biology not as a biologist, and engineering not as an engineer," he wrote. The common coin of this field was information, in this sense, description extracted from the obvious or immediate contexts that gave rise to it.

Stretching a point

Early on, in 1953, Shannon acknowledged that -

[t]he word 'information' has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field.

In this light, Shannon's suspicion about speculative efforts to generalize his work outside of his specific application in communications seems amazingly parochial. Shannon described the fundamental problem of communication as "reproducing at one point either exactly or approximately a message selected at another point". In the context of telegraphy, we can assume that the message, whether meaningful or not, is a "text" of sorts, and that the points are geographically and maybe temporally separate.

However, if we follow Pask's cue and look at Shannon's work not as a communications engineer, the nature of the message and the points become open-ended in the extreme.

The "message" could potentially be whatever feedback a "system" produces: for example, the complexity of the oxygen level of a bloodstream, or the walk of a dog, or the economy of a stone-age tribe, or the profitability a financial system needs to continually steer its oscillations toward a sustainable equilibrium. The "points" might be separated by seconds or centuries and by microns or miles - to say nothing of the various "media" through which this feedback is produced and reproduced. And, perhaps most important, the information potentially becomes useful to non-economists, non-biologists, non-engineers - in other words, to actors and forces outside of the circuits of the immediate setting.

It's no great mystery why information, seen in this light and combined with the pervasive applications of Shannon's work, would capture the imagination of people well outside the fields of study where it developed.

But it's wrong to assume that the idea of information itself has been reproduced outside of Shannon's context with any accuracy at all. It hasn't been; but how could it be? According to what authority, and grounded in what discipline or context?

The result is that the concept of information is less like a lump of coal and more like a diamond - glittering with complementary, refractory definitions. Above all, information became, and in many ways remains, almost impossible to pin down.

Thus, at one disciplinarily precise extreme, for example, even such a gifted scholar as Luciano Floridi falls into hilarious contortions in his meticulous survey of "Semantic Conceptions of Information" in the Stanford Encyclopedia of Philosophy when he concedes that -

[s]ometimes the several concepts of information ... can be variously coupled together. This should not be taken as necessarily a sign of confusion, for in some philosophers it may be the result of an intentional bridging.

Heaven help the teeming masses of non-philosophers as they unintentionally and confusedly couple away at making sense of the "information society" in the "information age."

Fads and bubbles

Phrases like "information age" are bandied about with such numbing frequency that we can be sure that Shannon's subtly counterintuitive specification of information hasn't been reproduced with any accuracy at all. And those phrases are too often invoked as justifications for some sort of action or reaction - often "radical" adapting, moving, hiring, firing, restructuring, or replacing, for example - so we can be equally sure that the ways in which specific information is translated into specific consequences haven't, in the main, been guided by much science or art. On the contrary: it doesn't require much discipline at all to see that, empirically speaking, most uses of the word "information" are puffery.

Yet how is it that, with each new layer of elusiveness, whether technically specific or diffusely vague, "information" seems to have become more useful to more and more people?

The assumption underlying this sort of question is that a word's worth can be measured by the consistency or specificity of its meanings. That kind of assumption seems all the more relevant in the case of a word that often suggests objectivity, accuracy, precision, consistency, or clarity. But that assumption is false in everyday ways: there are very common words - "stuff," say, or "power" - that are useful because they're indiscriminate or polysemic.

Moreover, the value of a word in a technical context is hardly diminished by its use in other settings: "power" can be used by political scientists and nuclear plant operators alike without either group, as Floridi says of information, "complain[ing] about misunderstandings and misuses of the very idea."

Still, what of a word that has increasingly been used to describe literally anything and everything?

Surely, it has to reach a point at which it becomes vaporously useless, a distinction without meaning. Why would it be exempt from another kind of dynamic, also seen and heard every day, in which faddish phrases or ideas come and go? After all, it's been almost sixty years since Shannon published his theory, and the litany of trends, big and small, that came and went is boggling. How has "information" been exempt from this?

Sooner or later, of course, a certain amount of faddish babble will die down - if it hasn't begun to do so already. But it would be a mistake to assume that the word's gradual disappearance will merely mark the quiet passing of a trend.

One could point of course to the propagation of thousands of television and radio channels over the last decades, the penetration of digital devices into everyday life throughout much of the world, the rise of companies like Microsoft and Google whose reach extends deeply into people's "private" lives, or the vast capital accumulations and transfers that have attended the rise of new technologies as evidence of the impact of "information." And, while undeniably true, these phenomena obscure some of the more blatant aspects of the rise of information, which we'll examine in the third and final installment.®

Ted Byfield is Associate Chair of the Communication Design and Technology Department at Parsons the New School for Design in New York City; he co-moderates the Nettime-l mailing list. This article is based on an essay in Matthew Fuller (ed.), Software Studies: A Lexicon (Cambridge, Mass: MIT Press, forthcoming 2007).

References

Pask: An Approach to Cybernetics (New York: Harper, 1961), p. 11.

Shannon: abstract to "The Lattice Theory of Information," published in IEEE Transactions on Information Theory 1.1 (Feb. 1953), p. 105; reprinted in Claude Elwood Shannon: Collected Papers, ed. N. J. A. Sloane and A. D. Wyner (New York: IEEE Press, 1993), p. 180.