Plucking the Fruit of the Tree of Knowledge

Illustration: Jesse Lenz

James Gleick asks: as scientists crunch and quantize the world, will they ever reach the end?

A countryman came into a telegraph office in Bangor, Maine, with a message, and asked that it be sent immediately. The operator took the message as usual, put his instrument in communication with its destination, ticked off the signals upon the key, and then, according to the rule of the office, hung the message paper on the hook with others that had been previously sent. … The man lounged around some time, evidently unsatisfied. “At last,” says the narrator of the incident, “his patience was exhausted, and he belched out, ‘Ain’t you going to send that dispatch?’” The operator politely informed him that he had sent it. “No, yer ain’t,” replied the indignant man; “there it is now on the hook.”—Harper’s New Monthly Magazine, 1873

A hard lesson to learn was the difference between a message and the paper on which it was written. The telegraph was a great teacher. The information had to be divorced from the physical object. It was abstracted—encoded, first as dots and dashes and then again as electrical impulses, to be sent along wires and, soon, beamed through the ether. In our sophisticated age, we process it with computers, we store it in “the cloud,” and we carry it about in our portable devices and our very cells. Information is the vital principle of our world.

But what is information? Thanks to the mathematical information theory created by Claude Shannon in 1948, we can measure information in bits. As the fundamental unit of information, Shannon decided, a bit would represent the amount of uncertainty that exists in the flipping of a coin: 1 or 0. Using his tool kit of theorems and algorithms, a mathematician or engineer could quantify not just the number of symbols (words or phonemes or letters or interruptions in an electrical circuit) but also the relative probabilities of each symbol’s occurrence. Information, as Shannon defined it, became a measure of surprise, of uncertainty. These are abstractions; a message is no longer something tangible or material, like a piece of paper.
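Shannon's definition can be made concrete with a few lines of arithmetic: the entropy of a source is the average surprise of its symbols, each weighted by its probability, H = −Σ p·log₂(p). What follows is a minimal sketch in Python; the formula is Shannon's, but the function name and the example distributions are illustrative choices, not taken from the article.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Each p is the probability of one symbol occurring; symbols with
    probability zero contribute nothing to the average surprise.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip is Shannon's fundamental unit: exactly one bit.
print(entropy_bits([0.5, 0.5]))            # 1.0

# A biased coin is more predictable, hence less informative.
print(entropy_bits([0.9, 0.1]))            # ~0.47 bits

# Four symbols with unequal probabilities carry less than the
# log2(4) = 2 bits that four equally likely symbols would.
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))  # ~1.36 bits
```

The biased-coin case captures the intuition above: the more predictable a symbol, the less surprise, and therefore the less information, its occurrence carries.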

“The fundamental problem of communication,” he declared, “is that of reproducing at one point either exactly or approximately a message selected at another point.” Simple enough—or so it seemed. “What is significant is the difficulty of transmitting the message from one point to another.” “Point” was a carefully chosen word. The origin and destination of a message could be separated in space or in time; information storage, as in a phonograph record, counts as a communication. Messages are formed from symbols, not from atoms. Those distant points could be the telegraph offices of Baltimore and Washington, or they could be planets light-years apart, or they could be neurons in a human brain. But even though information is weightless, transmission has a cost.

Warren Weaver, who wrote a companion essay for Shannon’s classic book, The Mathematical Theory of Communication, saw the sweep and grandeur of this abstract view of information, one that embraced “not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior.” No wonder information theory quickly began to influence researchers in fields as diverse as genetics and psychology.
