ENTROPY


In "Order and Chaos", Angrist and Hepler write:

"Even though society can effect local reductions in entropy, the general and universal trend of entropy increase easily swamps the anomalous but important efforts of civilized man. Each localized, man-made or machine-made entropy decrease is accompanied by a greater increase in entropy of the surroundings, thereby maintaining the required increase in total entropy."

The point of this quote is that our human endeavours, synergetically inspired or otherwise, do not decrease the net entropy of the universe. There is no need to be depressed by this, or to be put off from reforming the environment and so on. It seems to me that Fuller's description of our local universe activity is only partly correct. Here are two more quotes from Angrist and Hepler:

"The ceaseless urge of man to bring order out of his experiences so that he may understand them gives rise to science, which is an example of entropy reduction."

"Brillouin has shown that any information resulting from a physical observation must be paid for by an increase in entropy in the laboratory."

This means that our efforts to reduce entropy in one area or system produce a net increase in entropy for the universe as a whole, in accordance with the second law of thermodynamics.

Fuller says in Synergetics 305.04:

"The scientist was able to define physical Universe by virtue of the experimentally verified discovery that energy can neither be created nor lost; therefore that energy is conserved; therefore it is finite. Thus, man has been able to define successfully physical Universe..."

I interpret this to mean that it is presumably correct to accept the implications of the second law of thermodynamics as well. This in turn seems to imply that standard reckonings of entropy apply, whatever the metaphysical component may comprise, precisely because the latter is not physical.

Here's another source of comment on entropy, in New Scientist 2168, 9 Jan 1999:

"There is, says Igor Novikov in 'The River of Time' (Cambridge, ISBN 0521461774), a 'psychological arrow of time'. This points to the future, as defined by the thermodynamic arrow of time, because thought processes create neural order at the expense of global disorder, hastening the chaotic 'heat death' of the Universe. Thinking, in short, reduces the life of the cosmos."

This tends to confirm the impression that there is necessarily an entropic cost of syntropy, if that's the right way to put it, although, if he's right, maybe we shouldn't think about it too hard.

It would be wrong to insist that the "true" meaning of entropy is that which is expressed only in terms of thermal energy exchanges. The concept is generalized in statistical mechanics as a measure of the disorder among the atoms of a system. This is again generalized in cybernetics to measure the tendency of any closed system to move from a less to a more probable state, such that information is defined precisely as negative entropy.
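
To make the cybernetic formulation concrete, here is a minimal sketch (in Python, purely for illustration; the function name is mine) of Shannon's measure, on which the definition of information as negative entropy rests. A flat, maximally disordered distribution has maximum entropy; a sharply peaked, more "ordered" one has less:

    import math

    def shannon_entropy(probs):
        # Shannon entropy H = -sum(p * log2 p), in bits;
        # zero-probability outcomes contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximum disorder
    print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: more "order"

It is in this probabilistic form that entropy remains a measurable quantity, a point that matters below.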

In the definitions given, it is usually mentioned that the entropy of any closed system never decreases. This I take to be part of the meaning of the term: how it operates in physical cases. The above generalizations don't leave this principle behind; they just express it in different terms, retaining the structure. Hence the statistical mechanics way of expressing the phenomenon can logically cover the state of affairs in steam boilers and so on.

An analogous relationship exists between Einstein's and Newton's systems: the latter is a special case of the former. But note that Newton's system is not irrelevant to Einstein's, just subsumed in it.

What is difficult to detect is this kind of relationship operant between the Fullerian concept of entropy and the standard one. If it's there, it's not clear. If it's not there, can we still be talking about entropy?

Fuller's statements on the subject do not seem to construe information in terms of probability, as one would expect from the standard cybernetic generalization. If that is not the appropriate procedure, how can entropy remain a measurable quantity, in the way that it is in standard terms?

"The second law [of thermodynamics] allows us to calculate only differences in the entropy of a system when something happens to it. The third law allows an absolute estimation of the entropy of a system."
Silver (1996)
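
Silver's distinction can be illustrated with the standard textbook calculation of an entropy difference. A minimal sketch, assuming reversible heating at constant pressure with a constant specific heat (the figures are mine, purely illustrative):

    import math

    def entropy_change(mass_kg, specific_heat, t1_kelvin, t2_kelvin):
        # dS = integral of dQ/T = m * c * ln(T2/T1),
        # for a constant specific heat c.
        return mass_kg * specific_heat * math.log(t2_kelvin / t1_kelvin)

    # Warming 1 kg of water from 20 C to 80 C, with c ~ 4186 J/(kg.K):
    print(entropy_change(1.0, 4186.0, 293.15, 353.15))  # ~780 J/K

Note that only the difference is obtained; it is the third law, by fixing the entropy of a perfect crystal at absolute zero, that makes absolute values calculable.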


These are some of the problems here:

In Feynman's (1965) words:

"There is a great difference between energy and availability of energy...The availability of energy is always decreasing. This is... what is called the entropy law, which says the entropy is always increasing."

If the standard definitions of entropy can be shown to be operationally equivalent, so that we are free to choose whichever we care to use in any given situation, then we are stuck with the implications of the first, thermodynamic definition:

"The thermodynamic sense of order decrease that is enshrined in the second law is at first sight in conflict with many of the complicated things that we see going on around us. We see complexity and order increasing with time in many situations: when we tidy up our office,... the evolution of complex life-forms from the simpler ones...

"In many of these cases, we must be careful to pay attention to all the order and disorder that is present in the problem. Thus the process of tidying the office requires physical effort on someone's part. This causes ordered biochemical energy stored in starches and sugars to be degraded into heat. If one counts this into the entropy budget, then the decrease in entropy or disorder associated with the tidied desk is more than compensated for by the other increases."
Barrow (1990)


What counts is what sort of information we are dealing with. Toffler's "Powershift" is a convincing story of the gigantic importance of information as a global factor in our efforts to run our economies and lives. But he is careful at the outset to distinguish between different meanings and levels of "information", "data", and "knowledge". The key term is "knowledge", needless to say.

The global distribution and reproduction of knowledge has great effect in terms of creating order in our local universe. Wealth is not mere gold, nor is it just energy, without our appropriate use of it.

Freeman Dyson (1989) figures that:

"For a society with the same complexity as the present human society on Earth, starting from the present time and continuing forever, the total reserve of energy required is about equal to the energy now radiated by the Sun in eight hours."

We may then take some budgetary comfort from this solar bounty. If we synergetically organize this input without translating it into slaughter, we might stand a decent chance of enduring indefinitely.

No-one has shown why the steady increase in entropy would not apply to the general doings of humanity. However, while this may be applicable to individual and sum-totalled transactions, it does not herald imminent bankruptcy, thanks to sunshine. Indeed, the open, sun-fuelled character of Spaceship Earth may make it inappropriate to think in terms of such entropy calculations. But if that is the case, we are also disabled from making claims about sum-totally defeating entropy, either on a case-by-case basis or in the long run.

It is this energy input which we must chiefly thank for our survival prospects, but the factor of order must be taken to another level. It is ordered information that really counts. Ultimately, I suggest, what is most synergetically valuable is the development and distribution of rationality. (See Sutherland for a discussion of the opposite.)

Suppose someone devises a brilliant way of tidying offices, and he memetically spreads the information to thousands of people. Then he may effectively save people all kinds of wasted effort and mental disorder.

Suppose, on the other hand, that I become smitten by feng-shui, and get thousands of people to re-arrange the furniture and so on. I will have perhaps reproduced various kinds of pattern and order, but wasted everyone's time and energy.



There's a story that the physicist Arnold Sommerfeld was asked why he hadn't written a book about thermodynamics. Apparently he replied:

"Thermodynamics is a funny subject. The first time you go through the subject, you don't understand it at all. The second time you go through it, you think you do understand it, except for one or two small points. The third time you go through it, you know you don't understand it, but by that time you are so used to the subject, it doesn't bother you any more."

Silver mentions the American historian Henry Adams as someone who misapplied the concept of entropy to history. Once again, it is unfortunate that Fuller hardly ever refers to any other sources for his ideas. It is possible, though this is mere speculation, that he studied Adams' writings at some stage.



Summary

Either local universe energy transactions (including humanity's ordering activities) result in a net increase in entropy, however slight; or, because of the gargantuan solar energy income, and because the quality of information matters more than its mere quantity, entropy and its notional opposite, syntropy, may not be usefully applied to Continuous Man.






© Paul Taylor 2001