[Book] Reason in Revolt: Marxist Philosophy and Modern Science

8. The Arrow of Time

The second law of thermodynamics

“This is the way the world ends
Not with a bang but a whimper.”
(T. S. Eliot)

Thermodynamics is the branch of theoretical physics that deals with the laws of heat motion and the conversion of heat into other forms of energy. The word is derived from the Greek words therme (“heat”) and dynamis (“force”). It is based upon two fundamental principles, originally derived from experiment, but which are now regarded as axioms. The first principle is the law of the conservation of energy, which assumes the form of the law of the equivalence of heat and work. The second principle states that heat cannot of itself pass from a cooler body to a hotter body without some accompanying change in other bodies.

The science of thermodynamics was a product of the industrial revolution. At the beginning of the 19th century, it was discovered that energy can be transformed in different ways, but can never be created or destroyed. This is the first law of thermodynamics, one of the fundamental laws of physics. Then, in 1850, Rudolf Clausius discovered the second law of thermodynamics. This states that “entropy” (roughly, the heat absorbed by a body divided by its absolute temperature) always increases in any transformation of energy, for example, in a steam engine.
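
For reference, Clausius' definition can be written in modern notation (this is standard textbook material rather than part of the original argument). For a small quantity of heat transferred reversibly to a body at absolute temperature T, the change in entropy, and the second law itself, read:

    dS = \frac{\delta Q_{\text{rev}}}{T}, \qquad \Delta S \ge 0 \quad \text{(isolated system)}

The first relation defines entropy; the second states that, for a system isolated from its surroundings, the total entropy can never decrease.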

Entropy is generally understood to signify an inherent tendency towards disorganisation. Every family is well aware that a house, without some conscious intervention, tends to pass from a state of order to disorder, especially when young children are around. Iron rusts, wood rots, dead flesh decays, the water in the bath gets cold. In other words, there appears to be a general tendency towards decay. According to the second law, atoms, when left to themselves, will mix and randomise themselves as much as possible. Rust occurs because the iron atoms tend to combine with the oxygen in the surrounding air to form iron oxide. The fast-moving molecules on the surface of the bath water collide with the slower-moving molecules in the cold air and transfer their energy to them.

This is a limited law, which has no bearing on systems consisting of a small number of particles (microsystems) or on systems with an infinitely large number of particles (the universe). However, there have been repeated attempts to extend its application well beyond its proper sphere, leading to all kinds of false philosophical conclusions. In the middle of the 19th century, R. Clausius and W. Thomson (Lord Kelvin), the authors of the second principle of thermodynamics, attempted to apply the second law to the universe as a whole, and arrived at a completely false theory, known as the “thermal death” theory of the end of the universe.

This law was redefined in 1877 by Ludwig Boltzmann, who attempted to derive the second law of thermodynamics from the atomic theory of matter, which was then gaining ground. In Boltzmann's version, entropy appears as a function of the probability of a given state of matter: the more probable the state, the higher its entropy. In this version, all systems tend towards a state of equilibrium (a state in which there is no net flow of energy). Thus, if a hot object is placed next to a cold one, energy (heat) will flow from the hot to the cold, until they reach equilibrium, i.e., they both have the same temperature.
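
Boltzmann's probabilistic definition can be stated compactly (again a standard formula, not quoted in the text). If W is the number of microscopic arrangements of atoms compatible with a given macroscopic state, and k_B is Boltzmann's constant, then:

    S = k_B \ln W

The more microscopic arrangements that yield the same overall state, the more probable that state is and the higher its entropy; equilibrium is simply the most probable, and therefore the highest-entropy, state available to the system.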

Boltzmann was the first to deal with the problems of the transition from the microscopic (small-scale) to the macroscopic (large-scale) level in physics. He attempted to reconcile the new theories of thermodynamics with the classical physics of trajectories. Following Maxwell's example, he tried to resolve the problem through the theory of probability. This represented a radical break with the old Newtonian methods of mechanistic determinism. Boltzmann realised that the irreversible increase in entropy could be seen as the expression of growing molecular disorder. His principle of order implies that the most probable state available to a system is one in which the multitude of events taking place simultaneously within it cancel each other out statistically. Although individual molecules move randomly, on average, at any given moment, as many will be moving in one direction as in another.

There is a contradiction between energy and entropy. The unstable equilibrium between the two is determined by temperature. At low temperatures, energy dominates and we see the emergence of ordered (low-entropy), low-energy states, as in crystals, where the molecules are locked in fixed positions relative to one another. At high temperatures, however, entropy prevails and expresses itself in molecular disorder. The structure of the crystal is disrupted, and we get the transition, first to a liquid, then to a gaseous state.
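
One conventional way of expressing this competition, not used in the text but standard in thermodynamics, is the Helmholtz free energy, which a system held at a fixed temperature T tends to minimise:

    F = U - TS

At low temperatures the energy term U dominates, favouring ordered, low-energy structures such as crystals; at high temperatures the entropy term TS dominates and molecular disorder prevails.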

The second law states that the entropy of an isolated system always increases, and that when two systems are joined together, the entropy of the combined system is at least as great as the sum of the entropies of the individual systems. However, the second law of thermodynamics is not like other laws of physics, such as Newton's law of gravity, precisely because it is not always applicable. Originally derived from a particular sphere of classical mechanics, the second law is limited by the fact that Boltzmann took no account of such forces as electromagnetism or even gravity, allowing only for atomic collisions. This gives so restricted a picture of physical processes that it cannot be taken as generally applicable, although it does apply to limited systems, like boilers. The second law does not hold true in all circumstances. Brownian motion contradicts it, for example. As a general law of the universe in its classical form, it is simply not true.
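
In symbols, the two statements that open this paragraph are usually written as follows, for an isolated system and for two subsystems A and B brought into contact:

    \frac{dS}{dt} \ge 0, \qquad S_{A+B} \ge S_A + S_B

It is the attempt to apply these inequalities far beyond simple isolated systems that is being criticised here.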

It has been claimed that the second law means that the universe as a whole must tend inexorably towards a state of maximum entropy. By analogy with a closed system, the entire universe must eventually end up in a state of equilibrium, with the same temperature everywhere. The stars will run out of fuel. All life will cease. The universe will slowly peter out in a featureless expanse of nothingness. It will suffer a “heat death”. This bleak view of the universe is in direct contradiction to everything we know about its past evolution, or see at present. The very notion that matter tends to some absolute state of equilibrium runs counter to nature itself. It is a lifeless, abstract view of the universe. At present, the universe is very far from being in any sort of equilibrium, and there is not the slightest indication either that such a state ever existed in the past, or that it will do so in the future. Moreover, if the tendency towards increasing entropy is permanent and linear, it is not clear why the universe has not long ago ended up in a tepid soup of undifferentiated particles.

This is yet another example of what happens when attempts are made to extend scientific theories beyond the limits within which they have a clearly proven application. The limitations of the principles of thermodynamics were already shown in the 19th century in a polemic between Lord Kelvin, the celebrated British physicist, and the geologists, concerning the age of the earth. The predictions made by Lord Kelvin on the basis of thermodynamics ran counter to everything that was known about geological and biological evolution. His calculations implied that the earth must have been molten as recently as 20 million years ago. A vast accumulation of evidence proved the geologists right, and Lord Kelvin wrong.

In 1928, Sir James Jeans, the English scientist and idealist, revived the old arguments about the “heat death” of the universe, adding elements taken from Einstein's relativity theory. Since matter and energy are equivalent, he claimed, the universe must finally end up in the complete conversion of matter into energy: “The second law of thermodynamics,” he prophesied darkly, “compels materials in the universe (sic!) to move ever in the same direction along the same road which ends only in death and annihilation.” 46

Similar pessimistic scenarios have been put forward more recently. In the words of Paul Davies:

“The universe of the very far future would thus be an inconceivably dilute soup of photons, neutrinos, and a dwindling number of electrons and positrons, all slowly moving farther and farther apart. As far as we know, no further basic physical processes would ever happen. No significant event would occur to interrupt the bleak sterility of a universe that has run its course yet still faces eternal life—perhaps eternal death would be a better description.

“This dismal image of cold, dark, featureless near-nothingness is the closest that modern cosmology comes to the 'heat death' of nineteenth century physics.” 47

What conclusion must we draw from all this? If all life, indeed all matter, not just on earth, but throughout the universe, is doomed, then why bother about anything? The unwarranted extension of the second law beyond its actual scope of application has given rise to all manner of false and nihilistic philosophical conclusions. Thus, Bertrand Russell, the British philosopher, could write the following lines in his book Why I Am Not a Christian:

“All the labours of the ages, all the devotion, all the inspiration, all the noonday brightness of human genius, are destined to extinction in the vast death of the solar system, and…the whole temple of man's achievement must inevitably be buried beneath the debris of a universe in ruins—all these things, if not quite beyond dispute, are yet so nearly certain that no philosophy which rejects them can hope to stand. Only within the scaffolding of these truths, only on the firm foundation of unyielding despair, can the soul's habitation henceforth be safely built.” 48

Order out of chaos

In recent years, this pessimistic interpretation of the second law has been challenged by a startling new theory. The Belgian Nobel Prize winner Ilya Prigogine and his collaborators have pioneered an entirely different interpretation of the classical theories of thermodynamics. There are some parallels between Boltzmann's theories and those of Darwin. In both, a large number of random fluctuations lead to a point of irreversible change, one in the form of biological evolution, the other in the form of the dissipation of energy and evolution towards disorder. In thermodynamics, time implies degradation and death. The question arises: how does this fit in with the phenomenon of life, with its inherent tendency towards organisation and ever-increasing complexity?

The law states that things, if left to themselves, tend towards increased entropy. In the 1960s, Ilya Prigogine and others realised that in the real world atoms and molecules are almost never “left to themselves”. Everything affects everything else. Atoms and molecules are almost always exposed to the flow of energy and material from the outside, which, if it is strong enough, can partially reverse the apparently inexorable process of disorder posited in the second law of thermodynamics. In fact, nature shows numerous instances not only of disorganisation and decay, but also of the opposite processes—spontaneous self-organisation and growth. Wood rots, but trees grow. According to Prigogine, self-organising structures occur everywhere in nature. Likewise, M. Waldrop concluded:

“A laser is a self-organising system in which particles of light, photons, can spontaneously group themselves into a single powerful beam that has every photon moving in lockstep. A hurricane is a self-organising system powered by the steady stream of energy coming in from the sun, which drives the winds and draws rainwater from the oceans. A living cell—although much too complicated to analyse mathematically—is a self-organising system that survives by taking in energy in the form of food and excreting energy in the form of heat and waste.” 49

Everywhere in nature we see patterns. Some are orderly, some disorderly. There is decay, but there is also growth. There is life, but there is also death. And, in fact, these conflicting tendencies are bound up together. They are inseparable. The second law asserts that all of nature is on a one-way ticket to disorder and decay. Yet this does not square with the general patterns we observe in nature. The very concept of “entropy”, outside the strict limits of thermodynamics, is a problematic one.

“Thoughtful physicists concerned with the workings of thermodynamics realise how disturbing is the question of, as one put it, 'how a purposeless flow of energy can wash life and consciousness into the world.' Compounding the trouble is the slippery notion of entropy, reasonably well defined for thermodynamic purposes in terms of heat and temperature, but devilishly hard to pin down as a measure of disorder. Physicists have trouble enough measuring the degree of order in water, forming crystalline structures in the transition to ice, energy bleeding away all the while. But thermodynamic entropy fails miserably as a measure of the changing degree of form and formlessness in the creation of amino acids, of microorganisms, of self-reproducing plants and animals, of complex information systems like the brain. Certainly these evolving islands of order must obey the second law. The important laws, the creative laws, lie elsewhere.” (Gleick) 50

The process of nuclear fusion is an example, not of decay, but of the building-up of the universe. This was pointed out in 1931 by H. T. Poggio, who warned the prophets of thermodynamic gloom against unwarranted attempts to extrapolate a law that applies in certain limited situations on earth to the whole universe. “Let us not be too sure that the universe is like a watch that is always running down. There may be a rewinding.” 51

The second law contains two fundamental elements: one negative and one positive. The first says that certain processes are impossible (heat flows spontaneously from a hot body to a cold one, never the reverse); the second, which flows from the first, states that the growth of entropy is an inevitable feature of all isolated systems. In an isolated system, all non-equilibrium situations produce evolution towards the same kind of equilibrium state. Traditional thermodynamics saw in entropy only a movement towards disorder. This, however, refers only to simple, isolated systems (e.g., a steam engine). Prigogine's new interpretation of Boltzmann's theories is far wider, and radically different.

Chemical reactions take place as a result of collisions between molecules. Normally, the collision does not bring about a change of state; the molecules merely exchange energy. Occasionally, however, a collision produces changes in the molecules involved (a “reactive collision”). Such reactions can be speeded up by catalysts. In living organisms, these catalysts are specific proteins called enzymes. There is every reason to believe that this process played a decisive role in the emergence of life on earth. What appear to be chaotic, merely random movements of molecules at a certain point reach a critical stage, where quantity suddenly becomes transformed into quality. And this is an essential property of all forms of matter, not only organic, but also inorganic.
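
A minimal illustration of how a reactive collision can become self-reinforcing is the simplest autocatalytic step, A + X → 2X, in which the product X speeds up its own formation (this particular scheme is chosen for illustration and is not one discussed in the text). Its rate equation is:

    \frac{d[X]}{dt} = k\,[A]\,[X]

So long as the supply of A is replenished, any small fluctuation in X is amplified exponentially; it is this kind of non-linear feedback that produces the critical point at which quantity passes over into quality.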

“Remarkably, the perception of oriented time increases as the level of biological organisation increases and probably reaches its culminating point in human consciousness.” 52

Every living organism combines order and activity. By contrast, a crystal in a state of equilibrium is structured, but inert. In nature, equilibrium is not normal but, to quote Prigogine, “a rare and precarious state”. Non-equilibrium is the rule. In simple isolated systems, like a crystal, equilibrium can be maintained for a long time, even indefinitely. But matters change when we deal with complex processes, such as living things. A living cell cannot be kept in a state of equilibrium, or it would die. The processes governing the emergence of life are not simple and linear, but dialectical, involving sudden leaps, where quantity is transformed into quality.

“Classical” chemical reactions are seen as essentially random processes. The molecules involved are evenly distributed in space, and their spread is distributed “normally”, i.e., in a Gaussian curve. These kinds of reaction fit into Boltzmann's conception, wherein all side-chains of the reaction will fade out and the reaction will end up in a stable state, an immobile equilibrium. In recent decades, however, chemical reactions have been discovered that deviate from this ideal and simplified picture. They are known under the common name of “chemical clocks”. The most famous examples are the Belousov-Zhabotinsky reaction and the Brussels model devised by Ilya Prigogine.
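
For readers who want to see a chemical clock tick, the following is a minimal numerical sketch of the Brussels model in Python. It uses the standard two-variable form of the Brusselator with all rate constants set to one; the parameter choice A = 1, B = 3 and the simple Euler integration are illustrative assumptions, not details taken from the text. Beyond the critical value B = 1 + A², the concentrations X and Y no longer settle into a steady state but rise and fall with a regular period:

# Minimal Euler integration of the Brusselator rate equations
# (all rate constants set to 1):
#     dX/dt = A - (B + 1)*X + X**2 * Y
#     dY/dt = B*X - X**2 * Y
# For B > 1 + A**2 the steady state (X, Y) = (A, B/A) becomes unstable
# and the concentrations settle onto a limit cycle -- a "chemical clock".

A, B = 1.0, 3.0        # B = 3 exceeds the critical value 1 + A**2 = 2
x, y = 1.0, 1.0        # arbitrary initial concentrations
dt = 0.001             # Euler time step
steps = 50000          # total integration time of 50 time units

for i in range(steps):
    dx = A - (B + 1.0) * x + x * x * y
    dy = B * x - x * x * y
    x += dx * dt
    y += dy * dt
    if i % 5000 == 0:  # sample the oscillating concentrations
        print(f"t = {i * dt:5.1f}   X = {x:6.3f}   Y = {y:6.3f}")

Plotting X against time (or X against Y) displays the regular, clock-like alternation between states that is described below.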

Linear thermodynamics describes a stable, predictable behaviour of systems that tend towards the minimum level of activity possible. However, when the thermodynamic forces acting on a system reach the point where the linear region is exceeded, stability can no longer be assumed. Turbulence arises. For a long time turbulence was regarded as a synonym for disorder or chaos. But now, it has been discovered that what appears to be merely chaotic disorder on the macroscopic (large-scale) level, is, in fact, highly organised on the microscopic (small-scale) level.

Today, the study of chemical instabilities has become common. Of special interest is the research done in Brussels under the guidance of Ilya Prigogine. The study of what happens beyond the critical point where chemical instability commences is of enormous interest from the standpoint of dialectics. Of particular importance is the phenomenon of the “chemical clock”. The Brussels model (nicknamed the “Brusselator” by American scientists) describes the behaviour of gas molecules. Suppose there are two types of molecules, “red” and “blue”, in a state of chaotic, totally random motion. One would expect that, at any given moment, there would be an irregular distribution of molecules, producing a “violet” colour, with occasional flashes of red or blue. Beyond the critical point, however, this is not what happens in a chemical clock. The system is all blue, then all red, and these changes occur at regular intervals. According to Prigogine and Stengers:

“Such a degree of order stemming from the activity of billions of molecules seems incredible, and indeed, if chemical clocks had not been observed, no one would believe that such a process is possible. To change colour all at once, molecules must have a way to 'communicate'. The system has to act as a whole. We will return repeatedly to this key word, communicate, which is of obvious importance in so many fields, from chemistry to neurophysiology. Dissipative structures introduce probably one of the simplest physical mechanisms for communication.”

The phenomenon of the “chemical clock” shows how, in nature, order can arise spontaneously out of chaos at a certain point. This is an important observation, especially in relation to the way in which life arises from inorganic matter.

“'Order through fluctuations' models introduce an unstable world where small causes can have large effects, but this world is not arbitrary. On the contrary, the reasons for the amplification of a small event are a legitimate matter for rational inquiry.”

In classical theory, chemical reactions take place in a statistically ordered manner. Normally, there is an average concentration of molecules, with an even distribution. In reality, however, local concentrations appear which can organise themselves. This result is entirely unexpected from the standpoint of the traditional theory. These focal points of what Prigogine calls “self-organisation” can consolidate themselves to the point where they affect the whole system. What were previously thought of as marginal phenomena turn out to be absolutely decisive. The traditional view was to regard irreversible processes as a nuisance, caused by friction and other sources of heat loss in engines. But the situation has changed. Without irreversible processes, life would not be possible. The old view of irreversibility as a subjective phenomenon (a result of ignorance) is being strongly challenged. According to Prigogine, irreversibility exists at all levels, both microscopic and macroscopic. For him, the second law leads to a new concept of matter. In a state of non-equilibrium, order emerges. “Non-equilibrium brings order out of chaos.” 53

46. Quoted in Lerner, E. op. cit., p. 134.

47. Davies, P. The Last Three Minutes, pp. 98-9.

48. Quoted by Davies, P. op. cit., p. 13.

49. Waldrop, M. Complexity, The Emerging Science at the Edge of Order and Chaos, pp. 33-4.

50. Gleick, J. op. cit., p. 308.

51. Lerner, E. op. cit., p. 139.

52. Prigogine, I. and Stengers, I. op. cit., p. 298.

53. Prigogine, I. and Stengers, I. op. cit., pp. 148, 206 and 287.
