Macroscopic and Molecular Entropy
From the thread "Entropy and chaotic systems"
Post of the Month: August 2001
by Gordon Davisson
Subject: Re: Entropy and chaotic systems
Newsgroups: talk.origins
Date: August 18, 2001
Author: Gordon Davisson
Message-ID: gordon-19FC62.13375718082001@[127.0.0.1]
In article <3B6F834B.ABCC3C1F@rcn.com>, Wade Hines wrote:
I'd agree with this, with a couple of exceptions: first, I believe you
made a thinko; the difference is proportional to the ratio of parts to
molecules, not their respective logarithms. In general, the entropy
contribution from n objects, each of which can be in any of m states,
is proportional to n*log(m). This means that what's going on at the
atomic scale (like temperature, crystal structure, etc) is generally a
couple of dozen orders of magnitude more significant than what's going
on at the human-discernible scale.
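To put rough numbers on that n*log(m) scaling, here's a quick
back-of-the-envelope sketch in Python (the object counts and per-object
state counts are made-up illustrative values, not measurements of any
real clock):

    import math

    k = 1.380649e-23   # Boltzmann constant, J/K

    def entropy(n, m):
        """Entropy S = k * n * ln(m) for n objects with m states each."""
        return k * n * math.log(m)

    S_macro = entropy(500, 100)    # ~500 macroscopic parts, 100 states each
    S_micro = entropy(1e24, 100)   # ~a mole of molecules, 100 states each

    print(S_macro)                         # ~3.2e-20 J/K
    print(S_micro)                         # ~64 J/K
    print(math.log10(S_micro / S_macro))   # ~21 orders of magnitude apart

Note that when the m's are equal they cancel out of the ratio, which is
why the difference comes down to the ratio of parts to molecules.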
Second, even if we look only at (what I would call) the macroscopic
contribution to entropy, I still don't see a direct connection to
complexity. Well, ok, let me moderate that: I do see a (somewhat
dubious) connection, but it's in the wrong direction. Take my clock
example: a highly complex clock with many moving parts will tend to
have more degrees of freedom, and hence higher macro-entropy, than a
simpler clock with fewer moving parts (and they'll both have higher
macro-entropy than the original block of metal, which has only one
moveable part). We tend to think of complex machines as being highly
constrained, but consider a particular 20-tooth gear in my clock. Even
if it seems to be totally constrained by its bearings and the need to
mesh with its neighboring gears, it still has 20 different states it
could be in. And a more-complex 400-tooth gear would have 400 states,
and thus contribute twice the macro-entropy of the 20-tooth gear.
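As a quick sanity check on that factor of two (my own arithmetic, just
plugging each gear into S = k*ln(m)):

    import math

    k = 1.380649e-23            # Boltzmann constant, J/K
    print(k * math.log(20))     # ~4.1e-23 J/K for the 20-tooth gear
    print(k * math.log(400))    # ~8.3e-23 J/K -- double, since
                                # ln(400) = ln(20**2) = 2*ln(20)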
If you're interested, look for articles by Wojciech H. Zurek (e.g.
"Algorithmic randomness and physical entropy" in _Physical Review A_, v.
40 #8 pp 4731-4751; and "Thermodynamic cost of computation, algorithmic
complexity and the information metric" in _Nature_ v. 341 pp 119-124).
I haven't grokked them in detail, but as I understand it he argues that
the Kolmogorov complexity of the macroscopic information in a system
contributes to the system's entropy, at the rate of 1 bit of complexity
-> k*ln(2) of thermo-entropy (which is to say, negligibly).
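For a sense of scale, one bit at that exchange rate works out to (my
arithmetic):

    import math
    print(1.380649e-23 * math.log(2))   # ~9.6e-24 J/K per bit of complexity

so even megabytes of macroscopic description barely register on a
thermodynamic scale.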
>[...]
Unfortunately, I think the problem (both in Heilman's case and in
general) has a number of roots -- what you're pointing out is only one
of them. The other major ones, as I see it, are the tendency to think
of entropy as synonymous with everything bad (disorder, decay, etc)
and as the opposite of everything good (order, complexity, etc); and
the idea that 2lot somehow applies differently when intelligence is
involved (e.g. that intelligence can somehow circumvent 2lot).
(Actually, I'm tempted to claim that all of those "roots" derive
from a single deeper root: thinking 2lot is a more formal version of
the general intuition that things tend to go to pot unless someone
intervenes. This leads to conflating entropy with "going to pot", and
mistaking "someone intervening" for the sort of nonisolation that
matters in 2lot (which is really heat, matter, and energy flows).)
As for illustrating the relative importance of the molecular and
macroscopic scales, my favorite example is a deck of cards: which
makes a bigger change in the deck's entropy, sorting/shuffling it, or
cooling/heating it by one degree Celsius (= 1 Kelvin = 1.8 degrees
Fahrenheit)? The answer, of course, is that even a small temperature
change matters far *far* *FAR* more, because while shuffling increases
the number of states each of the 52 cards might be in, heating
increases the (effective) number of states each of the billions and
billions -- well, sextillions and sextillions actually -- of atoms
might be in.
But let's do the math. Shuffling a sorted deck increases the number
of states it might be in by a factor of 52! (pronounced "fifty-two
factorial", equal to about 8e57), and thus increases its entropy by
k*ln(52!) = 2.16e-21 Joules/Kelvin.
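If you want to check that figure, here's a quick Python verification
(math.lgamma(53) gives ln(52!) directly, without building the huge
factorial):

    import math
    print(math.factorial(52))               # ~8.07e67 possible orderings
    print(1.380649e-23 * math.lgamma(53))   # k*ln(52!) = ~2.16e-21 J/K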
Heating the deck, on the other hand... well, doing the actual state-
counting here would require far more knowledge of the exact physical
and chemical state of the deck than I have or want. Fortunately,
there's an easier way: if we know the deck's heat capacity and initial
temperature, we can apply the classical definition of entropy, dS=dQ/T
(or if we want to stick to pure stat mech, we could get essentially the
same result via the statmechian definition of temperature). Let's
assume the deck weighs 100g (about 3.5oz) and has a specific heat of
0.2 cal/(g*K) (about 0.84 J/(g*K)), giving it a heat capacity of 84
Joules/Kelvin. That means heating
it 1 Kelvin requires adding 84 Joules of heat. If the deck started at
a temperature of 300 Kelvin (= 27 Celsius = 80 Fahrenheit), the entropy
change will be dS=dQ/T=0.28 Joules/Kelvin.
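The same kind of quick check works here, and also gives the ratio in
the next paragraph (the 100g mass and 0.2 cal/(g*K) specific heat are
the assumed values from above):

    # Entropy of heating the deck by 1 K, via dS = dQ/T
    mass_g        = 100.0
    specific_heat = 0.2 * 4.184          # cal/(g*K) -> J/(g*K)
    T             = 300.0                # starting temperature, K

    dQ = mass_g * specific_heat * 1.0    # heat added for a 1 K rise, ~84 J
    dS_heat = dQ / T                     # ~0.28 J/K

    dS_shuffle = 2.16e-21                # from the k*ln(52!) calculation above
    print(dS_heat / dS_shuffle)          # ~1.3e20, i.e. ~100 quintillion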
So the entropy change due to a small change in temperature is about 100
quintillion times bigger than the change due to shuffling. For all
practical purposes, the entropy contribution due to deck ordering (and
other such macroscopic forms of order/disorder) can be safely ignored.
>Gordon Davisson wrote:
> ...
>> Now, on the other hand, suppose I take a block of metal, and make a
>> clock out of it. Complexity increased, right? But as far as I can see
>> that doesn't relate to a change in its entropy. Mind you, its entropy
>> might've changed for any of a number of reasons:
>>[...thermal and chemical reasons...]
>> ...and probably others I haven't thought of (and a few I thought of and
>> considered too unimportant to mention). But as far as I can see the
>> complexity of the final piece has no direct connection to its entropy.
>I have argued elsewhere, and still would maintain, that it does have
>a direct connection to its entropy but that the macroscopic complexity
>is generally swamped by non-obvious aspects of entropy at the molecular
>level.
>
>It is significant both that the order in a clock does have some associated
>entropy and that that entropy is hugely insignificant compared to
>molecular contributions. At a superficial but revealing level, the
>magnitudes of difference are proportional to log(parts) vs log(molecules).
>
>While practical considerations say we should ignore the log(parts) as
>overwhelmingly insignificant, it is worth holding the term to satisfy
>and track perceptions regarding macroscopic order. By that I do mean
>to maintain the rhetorical advantage of acknowledging a point
>while putting it in its proper place.
>> As I see it, your identification of entropy with decay, and as the
>> opposite of complexity, is at the base of your perception of a conflict
>> between 2lot and evolution. If you can provide support for these
>> identifications, please provide it. But absent that, I think the
>> conflict is pure illusion.
>I do think that the difference between log(parts) and log(molecules/atoms)
>is closer to the root of the problem but fear I am inadequate
>to the task of presenting a useful didactic.
--
Human: Gordon Davisson ><todd>
HASA: Member, S division. o o
Internet: gordon@tardigrade.org