
A Brief History of Information

The word that means everything - and nothing

Part 1 Kierkegaard said that irony was "as baffling as depicting an elf wearing a hat that makes him invisible." He's lucky he never encountered information.

The word seems to stand for everything, and nothing. "Information" describes everything from a precise mathematical property of communication systems, to discrete statements of fact or opinion (for example, the time a film begins or someone's perspective on a situation), to a staple of marketing rhetoric, to a world-historical phenomenon on the order of agriculture or industrialization.

The frequency and disparity of its use, by specialists and lay people alike, to describe countless general and specific aspects of life make it difficult to analyze; no single academic discipline or method can offer an adequate explanation of the phenomenon.

A typical approach to a problem of this kind is to start with the word as such: to gather examples of its use, codify their meanings, and arrange them into a taxonomy. This has been done with varying degrees of success; for example, one prominent American-English dictionary defines the word in slightly fewer than 200 words.

These efforts are admirable; but if we grant any credence at all to the widely made claim that we live in an "information society" or, even more grandly, in an "information age," then surely information must be more than the sum of the word's multiple meanings. Apparently, it - the word or, more properly, the category - is sui generis, and in a particularly compelling way. What qualities would make it so?

From meaning to noise

The word itself dates in English to the late fourteenth century, and almost from the beginning showed ambiguities very similar to current usages. The Oxford English Dictionary cites early uses - in, among other sources, Chaucer's Canterbury Tales - as evidence for defining it variously as

[t]he action of informing ... communication of instructive knowledge (I.1.a)

communication of the knowledge or ... 'news' of some fact or occurrence (I.2)

and,

[a]n item of training; an instruction (I.1.b)

That is to say, generally, an action in the first two cases, and a thing in the last.

Even the ambiguity over whether it is singular or plural, which is still prevalent, seems to date to the early sixteenth century, when the OED records the sense

"an item of information or intelligence" - curiously, "with _an_ and _pl_[_ural_]" (I.3.b)

As the word came into wider use in the centuries leading up to 1900, it took on a variety of additional meanings. Of these, the most striking trend was its increasingly legalistic aspect. This included informal usages (for example, related to or derived from "informing" on someone) as well as narrow technical descriptions of charges lodged "in order to the institution of criminal proceedings without formal indictment" [sic].

This disparity - in one aspect referring to particular allegations of a more or less precise factual nature and, in other aspects, to a formal description of a class or type of assertion - is still central to current usage of the word; so are connotations that information relates to operations of the state.

Yet it was in the twentieth century that the word was given decisively different meanings. The first of these modern usages appears in the work of the British statistician and geneticist R. A. Fisher.

In his 1925 article, Theory of Statistical Estimation, published in the Proceedings of the Cambridge Philosophical Society, he described "the amount of information in a single observation" in the context of statistical analysis. In doing so, he appears to have introduced two crucial aspects of "information": first, that it is abstract yet measurable; and second, that it is an aspect or byproduct of an event or process.
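A standard modern statement of what came to be called "Fisher information" - a later textbook formulation, not Fisher's own 1925 notation - makes both aspects visible. For a model with likelihood f(x; θ), the information a single observation X carries about the parameter θ is

I(\theta) = E\left[\left(\frac{\partial}{\partial\theta} \log f(X;\theta)\right)^{2}\right]

The more sharply the log-likelihood responds to changes in θ, the more "information" the observation carries - an abstract yet measurable quantity, and a property of the observation process rather than of any message.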

"Fisher information" has had ramifications across the physical sciences, but its most famous elaboration has been in the applied context of electronic communications. These, and related definitions differ from Fisher's work, but they remain much closer to his conception than to any earlier meanings.

Three years after Fisher's paper appeared, the American-born electronics researcher Ralph V. L. Hartley - who had studied at Oxford University during almost exactly the same years that Fisher studied at Cambridge (1909-1913) before returning to the United States - published a seminal article in the Bell System Technical Journal. In it, he built upon the work of the Swedish-American engineer Harry Nyquist (who was working mainly at AT&T and Bell Laboratories), specifically on Nyquist's 1924 paper Certain Factors Affecting Telegraph Speed, which sought in part to quantify what Nyquist called "intelligence" in the context of a communication system's limiting factors.

However, Hartley's 1928 article, titled Transmission of Information, seems to have fused aspects of Fisher's conception of information with Nyquist's technical context - albeit without citing either of them, or any other source. Hartley specifically proposed to "set up a quantitative measure whereby the capacities of various systems to transmit information may be compared." He also added another crucial aspect by explicitly distinguishing between "physical as contrasted with psychological considerations" - meaning, more or less, by the latter, "meaning." According to Hartley, information is something that can be transmitted but has no specific meaning.
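In the notation that later became standard - a reconstruction rather than Hartley's exact symbols - his measure is strikingly simple: a message of n symbols, each chosen from an alphabet of s distinguishable symbols, transmits an amount of information

H = n \log s

This builds directly on Nyquist's 1924 result that the "speed of transmission of intelligence" grows as W = K \log m for m distinguishable current values (K a constant). The logarithm makes the measure additive - twice as many symbols means twice as much information - regardless of what, if anything, the symbols mean.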

It was on this basis that, decades later, the American mathematician and geneticist-turned-electrical engineer Claude Shannon made the most famous of all modern contributions to the development of the idea of information.

