Economics as if the laws of thermodynamics mattered

Have you ever considered the question: what is life? If we aim for a new economic system that preserves and enhances life, rather than the current one, which more often than not seems to destroy and degrade it, perhaps we should consider what life is and how it is made possible. I recall learning about “living things” in high-school biology classes, but always found the definitions somewhat vague. Let me try a physicist’s definition, then, which might feel unfamiliar at first. A living thing is a kind of low-entropy-maintenance machine: a configuration of differentiated parts that succeeds in performing complex, inter-dependent functions for a prolonged period of time.

I used the word “entropy” in the previous sentence, so I should try to explain what it is. All living and non-living things (and hence all human economies, whether or not economists pay attention to the fact!) obey the laws of thermodynamics. The second law, in particular, introduces the concept of entropy and states that the entropy of a closed system must either remain constant or increase, but never fall. Entropy is a measure of how “special” a particular arrangement of parts is – the lower the entropy, the more “special”. Life is “special”.

To illustrate this concept of “specialness”, imagine first a set of red and blue gas molecules, fifty of each say, bouncing around in a room. Which is more likely: that all 50 red molecules will be in one half of the room and all 50 blue in the other half, or that some roughly even mixture of red and blues will be present in both halves? The second is more likely, less “special” therefore, but why? The answer is that there are many ways of arranging the molecules to have “some roughly even mixture” of red and blue – a great many pairs of molecules can be swapped between the halves without making a difference. However, with the perfect red and blue split, if any molecule is swapped with a partner in the other half of the room, then each half gets “contaminated” with one molecule of the “wrong” color – such a swap does make a difference. Hence what we see tends to be an equal mixture of each color, just because there are vastly many more ways of seeing an equal mixture.
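The counting argument above can be made concrete with a toy simulation. This is a sketch, not statistical mechanics proper: the molecule counts, the number of swaps, and the random seed are all illustrative assumptions. We start from the “special” separated state and repeatedly swap randomly chosen molecules between the two halves of the room:

```python
import random

random.seed(0)  # for a repeatable run; any seed shows the same tendency

# Start in the "special" low-entropy state: all red on the left, all blue on the right.
left = ["red"] * 50
right = ["blue"] * 50

# Repeatedly swap one randomly chosen molecule from each half.
for _ in range(10_000):
    i, j = random.randrange(50), random.randrange(50)
    left[i], right[j] = right[j], left[i]

# After many swaps the left half holds roughly 25 red and 25 blue molecules:
# the even mixture, simply because vastly more arrangements look like that.
print(left.count("red"), left.count("blue"))
```

Run it a few times with different seeds: the red count in the left half hovers around 25 and essentially never returns to 50, for exactly the reason given above.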

Now we are able to state the notion of entropy precisely – the entropy of such a set of molecules is a number that is large when there are many ways of swapping pairs of molecules without changing the overall state, and small when there are few. Explicitly, an entropy S is given by Boltzmann’s entropy law:

S = k log W

Here k ≈ 1.38 × 10⁻²³ joules per kelvin is Boltzmann’s constant, W is the number of ways of swapping the components of a state (say red and blue molecules) without making an overall difference to that state, and log W means “the natural logarithm of W” – the power you have to raise Euler’s number e = 2.718… to in order to get W (for example, if W is e then log W is 1, because e to the power 1 is e – and so on).
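Boltzmann’s formula can be applied directly to the red-and-blue example. Here is a short Python sketch; treating the left half of the room as holding exactly 50 of the 100 molecules is a simplifying assumption:

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann's constant in joules per kelvin

def entropy(W: int) -> float:
    """Boltzmann's entropy law: S = k log W (natural logarithm)."""
    return k * log(W)

# Suppose the left half of the room holds exactly 50 of the 100 molecules.
# A macrostate with r red (and 50 - r blue) molecules on the left can be
# realised in C(50, r) * C(50, 50 - r) ways, where C is the binomial coefficient.
def microstates(r: int) -> int:
    return comb(50, r) * comb(50, 50 - r)

W_separated = microstates(50)  # perfect red/blue split: exactly 1 way
W_mixed = microstates(25)      # even mixture: about 1.6 * 10**28 ways

print(W_separated)             # 1, so S = k log 1 = 0 for the "special" state
print(entropy(W_mixed))        # small in everyday units, but strictly positive
```

The separated state has a single arrangement and hence zero entropy on this count, while the even mixture has about 10²⁸ arrangements – which is why the mixture is what we see.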

Boltzmann’s tomb, with his famous entropy law above the bust

That little equation of Boltzmann’s explains a huge number of things. For example, why do hot things tend to get colder and cold things hotter? Easy – bring a hot thing and a cold thing into contact and it’s like the red and blue molecules all over again – there are many, many more ways for hot molecules and cold ones to all get mixed together equally than for them to stay separated into a hot part and a cold part. So the temperature equalizes.
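The Clausius form of the second law puts a number on this: moving a small amount of heat Q from a hot body at temperature T_hot to a cold body at T_cold changes the total entropy by ΔS = Q/T_cold − Q/T_hot, which is positive whenever T_hot > T_cold. A minimal sketch (the temperatures below are made-up illustrative values):

```python
def entropy_change(Q: float, T_hot: float, T_cold: float) -> float:
    """Total entropy change (J/K) when heat Q (joules) flows from a body
    at temperature T_hot to a body at temperature T_cold (both in kelvin)."""
    return Q / T_cold - Q / T_hot

# Heat flowing from hot (350 K) to cold (290 K) raises total entropy...
print(entropy_change(Q=1.0, T_hot=350.0, T_cold=290.0) > 0)  # True
# ...while the reverse flow would lower it, which is why it never happens.
print(entropy_change(Q=1.0, T_hot=290.0, T_cold=350.0) < 0)  # True
```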

Another example: why do balls bounce lower and lower, but never start bouncing higher and higher? Easy – after the ball has fallen, its molecules are moving faster on average than the floor’s. During each bounce, there are more ways of sharing out this motion randomly amongst the ball and floor than there are of keeping all the faster molecules in the ball and all the slower molecules in the floor. So this sharing out is what happens – the ball eventually stops bouncing. The opposite case – a ball spontaneously bouncing higher and higher – never happens in practice because it is so unlikely. That’s how you tell a film is being played backwards – everything that happens is so unlikely it is never seen to happen in practice. This explains the second law: the total entropy always increases and never decreases because a decrease is so incredibly unlikely.

What about life and entropy? A living thing has a very low entropy compared to its surroundings, because there are not many ways of swapping its constituent parts and leaving it in an invariant state. For example, swapping molecules between your heart and brain wouldn’t leave you in “an invariant state” – it would kill you! In fact, coming into thermodynamic equilibrium with your surroundings is also known as “being dead”!

Next question: how is life able to maintain this low entropy state, in apparent defiance of the second law? Well, life is part of the Earth-Sun system. We can regard this as “a closed system” to a very good approximation – a vast ocean of space separates it from other systems. But the Earth alone (plus moon, of course!) is not “a closed system”: the Sun – a nuclear fusion reactor – provides the Earth with a constant input of low entropy “organised” energy in the form of high-energy photons (particles of light). Plants can use this energy to make food which animals such as ourselves can eat, keeping the low-entropy-maintenance machinery of life running.

The Earth-Sun (plus moon) system, of which the human economy is a sub-system

Save for a few ocean vent ecosystems, this low entropy input from the Sun makes all life on Earth possible – and hence all human economies (again, whether or not economists pay attention to the fact!). Even when humans are reckless enough to burn reserves of oil and coal laid down over many millions of years in a geological eye-blink (all the while giving minimal consideration to the question “what next?”) they are merely liberating the low entropy energy captured from ancient sunlight and buried deep underground.

What, then, does the second law of thermodynamics, which I have just endeavored to explain, mean for economics and for our economies today? We now understand why a constant stream of low entropy energy from the Sun is required to maintain life’s organized state – without this “entropy gradient” the machinery of life would soon wind down, like the bouncing balls or mixing molecules did. And now that we understand this, we realise that to prolong life on Earth we should try to use this vital low-entropy input as efficiently as possible, and to recycle it through all sectors of our human economy (as the picture above hints at) – certainly not waste it wantonly and assume we will be able to increase our use of it more and more, forever.

Unfortunately, mainstream economists don’t seem to have heard of the second law of thermodynamics. Perhaps this isn’t really their fault, since it’s not in their textbooks. But it should be – it makes all life on Earth (and hence all human economies) possible. If our economists, in their ignorance, build economic models that must defy the laws of thermodynamics to work indefinitely – and, worse still, pay no attention to the question of how quickly their erroneous models must stop working (in ten years? one hundred? or perhaps tomorrow?) – then sooner or later it will all end in tears for the human race and much of life on Earth. I worry that the tears have already begun.

_____________________________________

My title was in homage to E. F. Schumacher’s book “Small is Beautiful: Economics as if People Mattered”. People do matter of course and our social systems should aim to make their lives better. What is economics for then? Prof. Herman Daly has introduced the idea of a means-ends spectrum:

Given the above spectrum, a possible definition of economics is the following:

Economics is how we manage our ultimate means – what the laws of thermodynamics state we can do – in the service of our ultimate ends – what a system of ethics decides we ought to do.

The above definition will look alien to many economists and to many members of the public. Today, we have an understandable aversion to the word “economics”. It has become associated with ruthless, competitive behaviors and forms of modern-day serfdom and technocratic rule: in short profoundly anti-human tendencies. But the word “economics” derives from the Greek word οἰκονομία meaning “household management”. It is how we decide to manage our global “house-hold”: the Earth’s finite resources and its vital and increasingly fragile ecosystems. Perhaps today’s “economic” orthodoxy should be called “household mis-management” instead!

Schumacher’s aim was to restore human values to the discipline of economics; to make its strivings for the ultimate ends of Daly’s spectrum explicit. This used to be the goal of economics – classical economists like Adam Smith and John Stuart Mill saw themselves as moral philosophers rather than narrow “economists”. They rejected the goal of economics as mere accumulation of material goods, to continue forever; nor was the idea of an end to growth and the ensuing steady state anathema to them. As Mill wrote more than a century and a half ago in his Principles of Political Economy:

“In contemplating any progressive movement, not in its nature unlimited, the mind is not satisfied with merely tracing the laws of the movement; it cannot but ask the further question, to what goal? Towards what ultimate point is society tending by its industrial progress? When the progress ceases, in what condition are we to expect that it will leave mankind? It must always have been seen, more or less distinctly, by political economists, that the increase of wealth is not boundless: that at the end of what they term the progressive state lies the stationary state, that all progress in wealth is but a postponement of this, and that each step in advance is an approach to it. …

It is scarcely necessary to remark that a stationary condition of capital and population implies no stationary state of human improvement. There would be as much scope as ever for all kinds of mental culture, and moral and social progress; as much room for improving the art of living and much more likelihood of its being improved, when minds ceased to be engrossed by the art of getting on. Even the industrial arts might be as earnestly and as successfully cultivated, with this sole difference, that instead of serving no purpose but the increase of wealth, industrial improvement would produce their legitimate effect, that of abridging labour.”

One wonders what the classical economists would make of “neo-classical” economics!

So much for ultimate ends and the need for economics to rehabilitate them. As I have argued in this essay, any economics worth its name (and thus intending to preserve life) must also make our ultimate means – the laws of thermodynamics – explicit and build its models upon their foundations. Then we will understand how to make the wisest use of what we have to provide what we truly need as people – not what today’s economics tells us we want and what nature tells us we cannot have.

Life only exists in accordance with the second law of thermodynamics (whether or not economists pay attention to the fact!) and as John Ruskin put it so succinctly “There is no wealth but life”. Any economics cut adrift from these foundations of life on Earth, free-floating and drunk on the hubris of man’s final victory over nature, is not an ideology that can serve human beings.


“Now, why was Laputa destroyed?
I know perfectly well.
There’s a song in the valley of Gondoa:
“We need roots in the Earth;
Let’s live with the wind;
With seeds, make fat the winter;
With the birds, let’s sing of spring.”
No matter how many weapons you may have,
or how many poor robots you use,
you can’t live separated from the ground.”
— Sheeta in Hayao Miyazaki’s film Laputa: Castle in the Sky.


~ by freedomthistime on May 7, 2012.

2 Responses to “Economics as if the laws of thermodynamics mattered”

  1. I hope you don’t mind, but I have some quibbles about the physics you discuss regarding the workings of life. I completely agree with nearly everything you say, except that the definition of life as being a low entropy state compared to its surroundings can be a bit misleading. As you say there is an entropy gradient. We lie along it. But it is important to recognize that this is not a gradient in entropy but rather specific entropy, i.e. entropy per unit matter. The fuel we consume (whether current or stored sunlight) lies along lower specific isentropic surfaces, while outer space lies along a higher specific isentropic surface. So are we low or high? It depends on perspective.

    Further, the *total* entropy of life will grow in proportion to its amount of matter: more matter, more available states if you like. Or just take the Clausius expression (divided by k) and it is straightforward to derive an expression showing that total entropy grows with the amount of matter along specific isentropic surfaces (i.e. surfaces of constant temperature (or enthalpy) and pressure). More of us means more of our entropy. The overall requirement of entropy increasing with time is still satisfied because the net material flow is across isentropes from low to high.

    Personally, I find entropy a non-intuitive and entirely unnecessary concept. For example, sunlight is low entropy, but what is unorganized about radiation concentrated into 1/10,000th of a steradian? We’re organized? Says who? My life sure as hell doesn’t feel that way. The concept of available energy can express everything relevant just as well physically, and is much easier to comprehend. As Gibbs so neatly expressed, with an external energetic input, stuff can fall down and spread out. The economy consumes concentrated available potential energy to do work, and ultimately dissipates the energy through diffuse thermal radiation to space.

    • Yeah, I suppose I should have talked about specific entropy. I should probably also say that I may be a little out of my depth here, hehe – I essentially wrote this off the back of some half-remembered undergraduate physics courses and some of Roger Penrose’s popular accounts of the second law. I am a research physical scientist, but my research isn’t really much connected with thermodynamics, whereas yours very much is.

      RE your last paragraph: I guess I liked entropy as a concept when I learnt about it at university because it helped me to understand what it was microscopically that made the available potential energy “available” in the first place. As in, if you say that something happens in some time-irreversible way because of “available energy”, that’s almost just giving a name to your ignorance – a bit like Richard Feynman’s “gorce”, if you’ve read his lectures? Whereas if you can actually picture the atoms colliding and the time-irreversibility emerging out of those collisions, your understanding is “deeper” somehow. I could have that wrong though, maybe “available energy” lets you do that too? – as I said, I suspect I’m a little out of my depth here! Perhaps you could recommend something nice to read on “available energy”? It’s not something I really remember coming across as an undergraduate.
