The Technium

Infinite Order in All Directions


I found myself dissatisfied with one major point while reading James Gleick’s new book The Information (previous posts here and here): his definition of information. For such a long book on the subject, I felt that in the end he did not adequately explain what it was. Or rather, he defined only one aspect of information. He acknowledges the problems of definition in the later chapters of his book, but does not resolve them. The problem is that we use the word in two different ways, and the two senses are not unified.

In fact, the two ways we use the term information are almost diametrically opposed to each other.

The definition that Gleick explores in greatest scientific detail is that of the mathematician Claude Shannon. Here “information” is a measurement of entropy, of disorder. More information means more disorder, more randomness. As Gleick says, “Information is surprise.”
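To make Shannon’s sense concrete, here is a minimal sketch (my illustration in Python, not anything from Gleick’s book) of Shannon’s measure, H = -Σ p log₂ p. The more evenly spread, and therefore the more unpredictable, a source of symbols is, the more Shannon “information” it carries:

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally surprising: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is predictable, so in Shannon's sense
# it carries far less information per flip.
print(shannon_entropy([0.99, 0.01]))  # ~0.081
```

In this sense a perfectly random stream of bits holds the maximum possible information, which is exactly what collides with the vernacular meaning below.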

But in most of the rest of the book, Gleick uses information in a more vernacular sense, of structure and order. It is the flood of which he speaks in his subtitle. Here information is meaning. That is the sense most of us intend when we talk about “more information,” although we also sense that more information can flood out meaningful signals.

I tend to use the term “information” in the latter sense of increasing order, structure, and meaning. I would say that over time, evolution increases the amount of information in a biological lineage, or that we increase the amount of information in our society. But that only makes sense in a non-Shannon use, for in Shannon’s terminology, evolution and technology decrease information by increasing order.

For a long time, I thought that the problem was on my end. But the physicist Freeman Dyson reviewed Gleick’s book in the New York Review of Books and mentioned something in passing that I thought important.

The explosive growth of information in our human society is a part of the slower growth of ordered structures in the evolution of life as a whole. Life has for billions of years been evolving with organisms and ecosystems embodying increasing amounts of information. The evolution of life is a part of the evolution of the universe, which also evolves with increasing amounts of information embodied in ordered structures, galaxies and stars and planetary systems. In the living and in the nonliving world, we see a growth of order, starting from the featureless and uniform gas of the early universe and producing the magnificent diversity of weird objects that we see in the sky and in the rain forest. Everywhere around us, wherever we look, we see evidence of increasing order and increasing information. The technology arising from Shannon’s discoveries is only a local acceleration of the natural growth of information…. Information and order can continue to grow for billions of years in the future, as they have evidently grown in the past.

Freeman says that the total amount of information can continue to expand without end, but by information he means increasing order. Yet according to physics, entropy, or decreasing order, will also expand without end. (In the same article, Freeman explains why he does not believe that.) It seems to me both can’t expand without end. Does the universe hold increasing disorder or increasing order? Is information increasing disorder or increasing order?


I wrote to Dyson about this conflict of meaning between his use of information and science’s use of it. I asked him:

Do you have any suggestions of a better word for the second use of information (increasing order), or for replacing the first use (decreasing order)? I find it very confusing when both contradictory meanings (less order, more order?) are used in the same context of thermodynamics.

He replied by email:

In scientific discourse there is no problem since we have two words with clear meanings, information meaning order and entropy meaning disorder. Information is negative entropy, as Szilard first observed. The problem arose in my review because the NYRB readers do not like the word entropy. So I avoided entropy and used other words instead. I apologize for the resulting ambiguity. The cure is to make the word entropy a part of colloquial discourse. That is a slow process.

I wasn’t that happy with his reply, because avoiding the term “entropy” did not avoid the association of information with decreased order. So I pressed him further:

That distinction might prevail in popular culture, except I think the confusion arises from the scientific language of Shannon’s mathematical theory. As you say, “Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity…” Shannon was the core of Gleick’s book, and he seems to make an equivalency between “information” (that’s the word used) and entropy. So what would be a good word to use in Shannon’s equation instead of “information”? Data? Code? Should the theory become Data Theory or Code Theory? And should we replace the “I” of I(x) in the equivalency with a different term? It seems to me that as long as science talks about “information” as entropy, popular culture will remain confused.

And as a writer (and editor), what can I do to hasten this change for clarity? Should I stop using the word information in reference to Shannon? What do you do?

Dyson’s reply was surprising but very Freeman Dyson-ish:

I never read any of Shannon’s papers. For me information was always the opposite of entropy. So there was no problem. If anyone says information and entropy are the same thing, that is rubbish. Not just confusing, but wrong.

Well, as far as I can tell, Shannon is saying that information and entropy are the same thing. But I agree with Dyson that in the way we ordinarily use the word, information should not mean entropy; it should mean order. However, because I hate the double-negative term that Dyson (and most physicists) use for order above, negative entropy, or “negative no-order,” I use the term exotropy, which is a single positive term. It means the same thing as information, as Dyson would use it. That in turn means we should have a different word for what Shannon is talking about. As mentioned above, I propose “data.” We should call Shannon’s theory “Data Theory.”
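To show what a single positive term for order could measure, here is one way to extend the earlier sketch. This framing (order as the distance below maximum entropy, sometimes called redundancy or negentropy) is my illustrative assumption, not a formula from Dyson or Shannon:

```python
import math

def shannon_entropy(probs):
    """Shannon's H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def exotropy(probs):
    """Order as a positive quantity (my illustrative definition):
    how far a distribution falls below log2(n), the maximum
    entropy of a uniform distribution over its n outcomes."""
    return math.log2(len(probs)) - shannon_entropy(probs)

print(exotropy([0.5, 0.5]))    # 0.0    -> pure randomness, no order
print(exotropy([0.99, 0.01]))  # ~0.919 -> highly ordered, predictable
```

Under this convention, “information” in Dyson’s sense rises exactly as Shannon’s quantity falls, which is why the two vocabularies keep colliding.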

That would release “information” to soar in the direction of evolution, towards more order. And if Freeman Dyson is correct in his other heresy, that information is unbounded, then order will increase in all directions.



