Thermodynamics and classical entropy

First part on entropy. Next: Entropy in information theory - Entropy in statistical physics - proof of the ideal gas law - links.

Let us introduce entropy in its classical view, that is, how it behaves for macroscopic observers.
The behavior of entropy can be described by comparison to that of energy. Energy can be transferred between systems but is always globally preserved. In some kinds of processes, entropy can likewise be transferred between systems while being preserved, like energy. These processes are called reversible, which means that they can take place backwards as well, symmetrically with respect to time. Other processes happen to create some entropy (its total amount increases). But no process can ever eliminate entropy, which is why the processes that create it are irreversible.

This approach by macroscopic physics (that is, at human scale, or generally non-microscopic scales; unfortunately the only one presented in many courses of thermodynamics) remains unsatisfactory, as it leaves the nature of entropy, its creation process and its irreversibility, looking like mysteries.
In the fundamental laws (quantum physics without measurement) that describe elementary (microscopic) processes, entropy has a clear definition, but all processes are reversible, so that this defined entropy is preserved. Thus, the process of entropy creation is understood as an emergent process, which "occurs" only relatively to the approximation of how things can be summed up in practice when they involve rather disordered large groups of particles. This approximation operation affects the conception of effective states of the system at successive times, and thus the successive values of entropy that are calculated from these effective states. Another form of this emergent process of entropy creation will be quantum decoherence, which is the circumstance usually required to qualify a process as a measurement in quantum physics. These deeper explanations of the microscopic definition and creation of entropy will be presented in the next pages.

Heat and temperature

Let us introduce the concept of temperature as a physical quantity, which is positive in most cases in the correct physical sense, i.e. with respect to the true physical zero value, from which the usual conventions of naming temperatures differ by an additive constant: the true physical zero value of temperature, called "absolute zero", corresponds to the conventional figure of −459.67 °F or −273.15 °C. Temperature in kelvins (K) is defined as respecting this absolute zero, with the same unit size as Celsius degrees (1/100 of the difference between the melting and boiling temperatures of water under atmospheric pressure).

There is a maximum amount for the entropy of any material system evolving within given limits of volume and energy. A body that has reached its maximal amount of entropy within these limits is said to be in thermal equilibrium, a state often determined by these conditions (material content, volume and energy). To each system in thermal equilibrium is also attributed another important physical quantity called its temperature, defined as follows.
Entropy is usually not transferred alone, but together with an amount of energy. A mixture of amounts of energy and entropy that can flow from one system to another is an amount of heat. (Energy and entropy are not like distinct objects that move, but rather like two-substance fluids "mixing" themselves by diffusion during contacts; only the resulting variations of amounts on each side matter.) Heat can be transferred in several ways: either by direct contact or by radiation.

The ratio energy/entropy of an amount of heat is called its temperature.
The temperature of a system in a state of thermal equilibrium is the temperature of the small amounts of heat that it is ready to transfer to or from the outside, evolving between states of thermal equilibrium of nearby temperatures. Precisely, temperature is defined as the differential (during heat exchanges as the system follows a smooth, reversible change between thermal equilibrium states)
T = δQ/dS.
In this way, the variations dE of energy, dS of entropy and dV of volume of a system with temperature T, during a (reversible) transfer of a small amount of heat at the same temperature, are related by

dE = TdS − PdV

where TdS is the energy received from heat and − PdV is the energy received from the work of pressure.
Metaphorically, the properties of heat and temperature can be compared to the idea of a garbage market with negative prices; the dictionary is

Thermodynamics | Garbage market
entropy | mass of garbage
energy | money
−temperature | (negative) price of garbage

As the flow of heat must preserve energy but can create entropy, it can only go from "warm" objects (with higher temperature, which lose less entropy for the transferred energy) to "cold" ones (with lower temperature, which gain more entropy for this energy). Thus an amount of heat increases its entropy when reaching the object with lower temperature.
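
As a small numeric illustration, here is a minimal Python sketch of the entropy balance of a heat flow between two bodies (the values of Q and of the temperatures are arbitrary examples):

```python
# Entropy balance of a heat flow: an amount of heat Q leaving a warm body
# at T_hot removes Q/T_hot of entropy from it, and adds Q/T_cold to the
# cold body it reaches. Values are illustrative.

Q = 100.0        # J, heat transferred
T_hot = 350.0    # K, temperature of the warm body
T_cold = 280.0   # K, temperature of the cold body

dS_hot = -Q / T_hot     # entropy lost by the warm body
dS_cold = Q / T_cold    # entropy gained by the cold body
created = dS_hot + dS_cold

print(f"entropy lost by the warm body:   {-dS_hot:.4f} J/K")
print(f"entropy gained by the cold body: {dS_cold:.4f} J/K")
print(f"entropy created:                 {created:.4f} J/K")  # > 0 since T_hot > T_cold
```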

Usually, any transfer of entropy between systems has a cost: it is an irreversible process, which itself creates more entropy. For example, near a given temperature, flows of heat are roughly proportional to the difference of temperature between the bodies. To make them faster, the difference of temperature must increase, so that the transfer creates more entropy. Also, a release of heat makes the environment temporarily warmer, which makes this release more costly. This cost can be reduced (approaching reversibility) by slowing down the transfer.

Heat flows from the warm to the cold because warm bodies send their heat faster than cold ones. So, heat transfer is faster at higher temperatures, already in terms of energy, but also usually in terms of entropy (a possible speed that can be traded against producing less entropy). In particular, the radiation from warmer objects has both more energy and more entropy, as we shall see below. In the limit, pure energy (which can be seen as heat with infinite temperature) can often be transferred reversibly.

For the entropy-creating processes of life (and machines) to continue their work, they need to transfer their entropy away. As this can usually only happen carried by energy in the form of heat, these systems need to receive pure energy (or warmer energy, with less entropy) in return. The purity of the received energy is what makes it useful, unlike the very abundant heat energy in the environment. Still, releasing heat in sufficient flow can also be an issue, which is why, for example, power plants need to be near rivers to release their heat into the water.

The entropy in the Universe

For example, life on Earth involves many irreversible processes, which continuously create entropy. As there is a limit to the amount of entropy that can be contained in given limits of volume and energy, the stability of this quantity around average values far below this maximum (letting life continue) is made possible by the continuous transfer of the created entropy from Earth to outer space, in the form of infrared radiation (which carries much more entropy than sunlight in proportion to its amount of energy, because it is colder).
This radiation then crosses interstellar space and mainly ends up in intergalactic space. Thus, the development of life is fed not only by sunlight energy (heat with high temperature) but also by the ever larger and colder intergalactic space, which the universal expansion provides like a huge bin for entropy. Both are complementary, like two markets with different prices providing an opportunity for profit by trade between them.

Still, all the entropy of visible and infrared light from stars and planets is only a tiny part of the entropy in the universe. Among electromagnetic radiations alone, the cosmic microwave background already has an energy comparable to that of visible and infrared light (1), and thus, being much colder, much more entropy (ignoring the entropy of practically undetectable particles: dark matter, neutrinos, gravitons...).

But most of the entropy of the universe is made of the giant black holes in galactic centers. Indeed, the fall of matter into black holes, contributing to the growth of their size and thus of their entropy (proportional to the area of their horizon), is among the most radically irreversible processes of the Universe (which will only be "reversed", after very unreasonable times, by "evaporation" in a much, much colder universe...)

Amounts of substance

To express the behaviors of temperature and entropy quantitatively, we need to relate them to other physical quantities. Precisely, amounts of entropy happen to be essentially convertible into amounts of substance. Let us first explain what this is.

The amount of substance counts the very large number of atoms or molecules contained in macroscopic objects. Thus its deep meaning is that of a natural number, but one too big for the unit (an individual atom or molecule) to be of any significance.
This concept comes from chemistry, as chemical reactions involve ingredients in precise proportions to form molecules containing the right numbers of atoms (this was first an observed fact at the beginning of the 19th century, until its explanation in terms of atoms was clearly established later that century).

The conventional unit for amounts of substance is the mol: 1 mol means NA molecules, where the number NA ≈ 6.022×10²³ is the Avogadro constant. Thus, n mol of some pure substance contains n×NA molecules of this substance.
This number comes from the choice that 1 mol of carbon-12 weighs 12 grams (thus roughly, 1 mol of hydrogen atoms weighs 1 gram = 0.001 kg, with a slight difference due to the nuclear binding energy, converted into mass by E=mc²).
The Avogadro constant can thus be seen as the quantity NA ≈ 6.022×10²³ mol⁻¹.
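
As a quick illustration of this mole arithmetic, here is a Python sketch (the rounded molar mass of water is an assumption for simplicity):

```python
# Converting between mass, amount of substance and number of molecules,
# taking water as an example.

N_A = 6.022e23            # Avogadro constant, molecules per mol
molar_mass_water = 18.0   # g/mol (approximately 2*1 + 16)

mass = 1000.0                  # g of water (about one liter)
n = mass / molar_mass_water    # amount of substance, mol
molecules = n * N_A

print(f"{mass:.0f} g of water = {n:.1f} mol = {molecules:.3e} molecules")
```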

The ideal gas law

As we shall later deduce from the nature of entropy, gases obey in good approximation the ideal gas law. This approximation holds when their density is much lower than in the liquid phase, so that each item of gas (atom or molecule) spends on average much more time freely moving than significantly interacting with its neighbors.

The ideal gas law is PV = nRT
where P is the pressure, V is the volume, T is the temperature, and n is the amount of substance of gas (that is, nNA is the number of independent items of gas: atoms or molecules that are isolated from each other at any given time).

The quantity PV is homogeneous to an energy (with conventional unit J = joule): the energy needed to push a volume V of gas (the volume swept by the push) at its pressure P. This is also (2/3)E, where E is the kinetic energy of just the translational speed of gas molecules (ignoring their energy of rotation and other internal motions), provided this speed is much smaller than the speed of light. Here in E = (3/2)PV, the factor 3 comes from the number of space dimensions and the factor 1/2 comes from the formula of kinetic energy.(2)
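
As a sketch of these relations in Python (standard atmospheric conditions are used as example values), one can recover the familiar molar volume of about 22.4 liters and the corresponding translational energy E = (3/2)PV:

```python
# Ideal gas law PV = nRT, and translational kinetic energy E = (3/2) PV.

R = 8.314        # gas constant, J/(mol K)
n = 1.0          # mol of gas
T = 273.15       # K (0 degrees Celsius)
P = 101325.0     # Pa (one standard atmosphere)

V = n * R * T / P        # volume, m^3
E = 1.5 * P * V          # translational kinetic energy, J

print(f"V = {V * 1000:.2f} L")   # about 22.41 L
print(f"E = {E:.0f} J")          # about 3406 J
```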

Physical units of temperature and entropy

In the ideal gas law, the gas constant R = 8.314 J·mol⁻¹·K⁻¹ is the natural conversion constant by which the temperature T (expressed in kelvins), in the form of the product RT, is physically involved as a composite of other physical quantities. This gas constant is never far from any phenomenon involving temperature, even for solids instead of gases, so that temperature (and thus entropy) only conventionally has its own unit, while its true physical nature is that of a composite of other physical quantities.

Namely, the ideal gas law presents the physically meaningful expression RT of the temperature as an energy per amount of substance (which explains the units involved in the value of R). It also makes entropy (initially expressed in J/K) comparable with an amount of substance.
Microscopically, moles are replaced by numbers of molecules, so that the conversion factor is the Boltzmann constant k = R/NA: a temperature T microscopically appears as a typical amount of energy E = kT, while an amount of entropy S becomes a real number S/k.
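
A minimal sketch of these conversions in Python (constants rounded):

```python
# Macroscopic <-> microscopic units of temperature and entropy,
# through the Boltzmann constant k = R/N_A.

R = 8.314         # J/(mol K)
N_A = 6.022e23    # 1/mol
k = R / N_A       # about 1.38e-23 J/K

T = 300.0                              # K, around room temperature
print(f"k  = {k:.3e} J/K")
print(f"kT = {k * T:.3e} J at {T} K")  # typical thermal energy per particle

S = 1.0                                # an entropy of 1 J/K...
print(f"S/k = {S / k:.3e}")            # ...as a huge dimensionless number
```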

This naturally suggests that the variations of temperature of a given object would be proportional to the transferred amounts of heat energy.

Heat capacity

The heat capacity of an object is C = δQ/dT (the ratio of a small amount of thermally received energy δQ to the resulting variation dT of temperature; its value depends on whether the volume is held constant). It can be seen as representing a kind of thermally relevant "amount of substance" in an object.

Indeed, for some kinds of ordinary matter across some ranges of temperature, it is rather stable (independent of the temperature), near a value that is R times the product of the amount of substance by some precise number (usually half an integer) representing the degrees of freedom involved in thermal agitation. For example, the heat capacity of water at 25 °C is 8.965 nR, where n is its amount of substance (of water molecules). This number is close to 9 = 3×3, that is 3 atoms per molecule times 3 space dimensions, as each atom can move in 3 dimensions.

As long as the heat capacity C of an object is stable, the variations of entropy, obtained by integrating dS = C dT/T, are those of C·ln T.
Such cases mainly occur around familiar ranges of temperatures, in successive temperature intervals, each with a different value of heat capacity. But other situations may occur: the heat capacity may change abruptly during phase transitions (between solid, liquid and gas), and smoothly in some other cases.
The hotter an object is, the more its matter is divided into smaller components able to move independently of each other (from molecules to atoms to individual particles in plasmas), and thus the higher its heat capacity. At higher temperatures, increasing temperature by a given additive value requires more energy; thus, increasing temperature by a given multiplicative factor involves a larger increase of entropy.
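
For a temperature interval where C can be taken constant, here is a minimal Python sketch of the resulting entropy variation, reusing the above value for water (assuming, just for illustration, that it stays valid over the whole interval):

```python
# Entropy gained by heating at constant heat capacity:
# integrating dS = C dT/T from T1 to T2 gives C ln(T2/T1).

import math

R = 8.314              # J/(mol K)
n = 10.0               # mol of water (about 180 g), arbitrary example
C = 8.965 * n * R      # heat capacity, J/K (value for water from the text)

T1, T2 = 280.0, 360.0  # K
dS = C * math.log(T2 / T1)
print(f"entropy gained: {dS:.1f} J/K")   # C ln(T2/T1), about 187 J/K
```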

Entropy zero (third law of thermodynamics)

Near the absolute zero of temperature, a logarithmic function (as entropy is when the heat capacity is constant) would decrease to minus infinity. But entropy cannot decrease indefinitely. Instead, the third law of thermodynamics states that for any system there exists an absolute zero of entropy, thus an absolute definition of the entropy of the state of any system, as a quantity that always remains positive. This zero entropy is often reached at the absolute zero of temperature by perfect crystals and some other systems kept in limited volumes, but not always: in addition to gases expanded in unlimited volumes, a few other substances keep a positive entropy near zero temperature, called residual entropy.

So, at very low temperatures, the heat capacity of an object must converge to zero too (not exactly a logical consequence, but it happens in practice), thus become much less than its number of molecules: atoms and simple molecules no longer move individually, but only collectively or scarcely. This will be explained by the nature of entropy and its foundation on quantum physics. Let us introduce a first approach, with the case of thermal radiation.

Black body radiation

A space region filled with a uniform thermal (black-body) radiation of a given temperature can be understood by saying that its electromagnetic field is in a state of thermal equilibrium at that temperature. Its content is not exactly a pure heat: the ratio of its total amounts of energy and entropy does not coincide with its temperature. Instead, like for any other physical body, its temperature is the ratio of its variations of energy and entropy during heat exchange.
The order of magnitude of its densities of energy and entropy can be deduced as follows.
The temperature T defines a typical energy E = kT (photons have energies about πE), from which the Planck constant ℏ gives a time interval t = ℏ/E = ℏ/kT (photon periods are about 2t). In that time, light travels a distance x = ct = cℏ/kT, proportional to the average inter-photon distance. This gives an order of magnitude for the photon number density, and thus the entropy density, S/kV ~ x⁻³ = (k/cℏ)³T³.

In the case of an ideally black object (absorbing light of all wavelengths), the radiation for every temperature T combines an energy flow proportional to T⁴ with an entropy flow proportional to T³. In a volume V, the energy E and entropy S of radiation are related by E = (3/4)ST, where the coefficient 3/4 < 1 comes from T = dE/dS = (E/ST)·d(T⁴)/d(T³) = (4/3)(E/S).
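
This coefficient can be checked symbolically, for instance with sympy (a sketch assuming E = aVT⁴ and S = bVT³ with constants a, b):

```python
# With E = a V T^4 and S = b V T^3 at fixed volume, the condition
# T = dE/dS forces b = 4a/3, hence E = (3/4) S T.

import sympy as sp

a, b, V, T = sp.symbols('a b V T', positive=True)
E = a * V * T**4
S = b * V * T**3

# T = dE/dS = (dE/dT)/(dS/dT) at fixed volume
relation = sp.Eq(T, sp.diff(E, T) / sp.diff(S, T))
b_value = sp.solve(relation, b)[0]
print(b_value)                                    # 4*a/3
print(sp.simplify(E / (S.subs(b, b_value) * T)))  # 3/4
```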

The exact values are

E/V = (π²/15)(kT)⁴/(cℏ)³, where π²/15 ≈ 0.65797
S/V = (4π²/45)k⁴T³/(cℏ)³, where 4π²/45 ≈ 0.8773.
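
As a numeric check, here is a Python sketch applying these formulas, for illustration, to the cosmic microwave background temperature T ≈ 2.725 K (constants rounded):

```python
# Energy and entropy densities of black-body radiation.

import math

k    = 1.380649e-23    # Boltzmann constant, J/K
hbar = 1.054572e-34    # reduced Planck constant, J s
c    = 2.997925e8      # speed of light, m/s
T    = 2.725           # K, cosmic microwave background temperature

E_V = (math.pi**2 / 15) * (k * T)**4 / (c * hbar)**3        # J/m^3
S_V = (4 * math.pi**2 / 45) * k**4 * T**3 / (c * hbar)**3   # J/(K m^3)

print(f"E/V = {E_V:.3e} J/m^3")      # about 4.2e-14 J/m^3
print(f"S/V = {S_V:.3e} J/(K m^3)")  # about 2.0e-14 J/(K m^3)
```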

In particular, sunlight contains more entropy than it took from the Sun: the additional amount is created by the process of light emission at the Sun's surface, which is irreversible due to the contrast of the emitted light with the surrounding darkness.

The heat capacity of matter starts being dominated by the heat capacity of the radiation inside it when the photon number density becomes comparable to that of other particles. In the center of the Sun, T = 15 million K gives x = cℏ/kT = 1.5 Å. This radiation thus has a heat capacity per volume similar to that of a medium at this "usual" inter-particle distance; its pressure is also similar to that of a plasma at this particle density and this temperature. However, a plasma at this temperature has no reason to stay close to this density anymore; the density at the Sun's center is 162.2 g/cm³, so that the heat capacity and pressure of electrons and nucleons still dominate there. Only in bigger stars with higher core temperatures can this radiation reach an importance comparable to ordinary particles, and thus play a crucial role in stellar stability.
At still higher temperatures, when the energy kT exceeds the mass energy mc² of some field with mass m, space starts to be filled with spontaneously created particle/antiparticle pairs. Then, both particles and antiparticles, having speeds near c, start behaving like black-body radiation in how their density and heat capacity increase with temperature.

In particular, the temperature above which space is filled with electrons and positrons is (from google calculator "electron mass/2*c^2/k=") about 3 billion K. Such temperatures correspond to times until a few seconds after the Big Bang (3); they are barely reached, as factors of pressure reduction, in exceptional stars and supernovae, and are only exceeded after such collapses, too late to still matter (a "newly formed neutron core has an initial temperature of about 100 billion K").
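
This threshold can be recomputed directly (a Python sketch with rounded constants):

```python
# Temperature at which kT reaches half the electron rest energy,
# i.e. the "electron mass/2*c^2/k" computation quoted above.

m_e = 9.109384e-31   # electron mass, kg
c   = 2.997925e8     # speed of light, m/s
k   = 1.380649e-23   # Boltzmann constant, J/K

T = m_e * c**2 / (2 * k)
print(f"T = {T:.3e} K")   # about 3.0e9 K, i.e. 3 billion K
```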

Low temperatures

One possible thing that can happen in crystals at low temperatures is that their remaining heat capacity corresponds to sound waves with wavelengths larger than the inter-atomic distance of the crystal. This can be described in terms of phonons ("particles of sound") in a pattern (the Debye model) very similar to black-body radiation, but much denser at each temperature, because the formula must replace the speed of light by the speed of sound in that crystal.
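
To get a feeling for how much denser this phonon gas is than black-body photons at the same temperature, here is a rough Python sketch; the sound speed is an arbitrary typical figure:

```python
# The distance scale x = c*hbar/kT shrinks by the ratio of the speed of
# light to the speed of sound, so particle densities grow by its cube.

c = 3.0e8           # speed of light, m/s
v_sound = 5.0e3     # typical speed of sound in a crystal, m/s (assumption)

factor = (c / v_sound)**3
print(f"phonon/photon density ratio ~ {factor:.1e}")   # about 2e14
```
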
However, other things can also contribute to heat capacity near absolute zero.

Helmholtz free energy

To define the Helmholtz free energy, there are two problems with the conventional choices I could find in the literature. One is whether it should be written A or F. The other is whether, in its defining term E − TS, the temperature T should be that of the considered object, or that of the environment. But both functions are interesting. So let us introduce them both: denoting T the (variable) temperature of the object and T₀ the (constant) temperature of the environment, let

F = E − T₀S
A = E − TS

Their variations are given by

dF = (T − T₀)dS − PdV
dA = −SdT − PdV

Another remarkable function is A/T = E/T − S, because its variations are

d(A/T) = d(E/T − S) = E d(1/T) + (dE − TdS)/T = E d(1/T) − PdV/T
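
These three differential identities can be verified symbolically, for instance with sympy (a sketch parametrizing the reversible path by a variable t and using the first law dE = TdS − PdV to eliminate dE):

```python
import sympy as sp

t, T0 = sp.symbols('t T0')
S = sp.Function('S')(t)   # entropy along the path
V = sp.Function('V')(t)   # volume
T = sp.Function('T')(t)   # temperature of the object
P = sp.Function('P')(t)   # pressure

dS, dV, dT = S.diff(t), V.diff(t), T.diff(t)
dE = T * dS - P * dV      # first law along the path

dF = dE - T0 * dS              # derivative of F = E - T0*S (T0 constant)
dA = dE - T * dS - S * dT      # derivative of A = E - T*S

print(sp.simplify(dF - ((T - T0) * dS - P * dV)))   # 0
print(sp.simplify(dA - (-S * dT - P * dV)))         # 0

# For d(A/T) = E d(1/T) - P dV / T, the energy E itself is needed:
E = sp.Function('E')(t)
dAT = (E / T - S).diff(t).subs(E.diff(t), T * dS - P * dV)
print(sp.simplify(dAT - (E * (1 / T).diff(t) - P * dV / T)))  # 0
```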

The special interest of these quantities will appear when explaining the nature of entropy.


(1) Precisely, the cosmic microwave energy density is about 10⁻⁴ times that of atomic matter. Since then, about as much energy was released by nuclear reactions: in mass proportion of atomic matter, about 1% of the matter in our galaxy underwent fusion from hydrogen to helium, lowering the proportion of hydrogen from 75% to 74% and releasing 7 MeV per nucleon, and 2% fused into other elements, lowering the helium proportion from 25% to 24% and releasing about 1 MeV per nucleon; the proton mass being 938 MeV. But the actually remaining light energy must be lower, for several reasons: this only counts the reactions in our galaxy (the rest of matter in the universe may have had fewer reactions); part of the energy was released as neutrinos; and the brightest galaxies were only brightest long ago, so their light has been redshifted since then.

(2) The comparison of the kinetic energy and of the pressure contributed in a 1-dimensional space by a particle at any speed between 0 and the speed of light goes this way (in units where c = 1):

Ec = E − m (kinetic energy)
v = p/E
pressure P = pv
E² − p² = m²
m² = E² + Ec² − 2E·Ec
p² + Ec² = 2E·Ec
P = pv = p²/E, thus E = p²/P
p² + Ec² = 2Ec·p²/P
Ec/P = (p² + Ec²)/2p² = (1 + (Ec/p)²)/2

For slow particles, Ec ≈ p²/2m ≪ p, so this ratio tends to 1/2, giving E = (3/2)PV in 3 dimensions; for light (Ec = p), it equals 1, giving E = 3PV.
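
A quick numeric check of this formula and of its two limits (a Python sketch in units where c = 1):

```python
# Check Ec/P = (1 + (Ec/p)^2)/2 from the relations above.

import math

def ratios(m, p):
    E = math.sqrt(m * m + p * p)   # total energy
    Ec = E - m                     # kinetic energy
    P = p * (p / E)                # pressure contribution p*v, with v = p/E
    return Ec / P, (1 + (Ec / p)**2) / 2

print(ratios(1.0, 0.01))    # slow particle: both values near 1/2
print(ratios(1.0, 100.0))   # ultra-relativistic: both values near 1
print(ratios(0.0, 1.0))     # massless (light): exactly (1.0, 1.0)
```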

(3) A time one can deduce from the energy density of that temperature's radiation and the gravitational constant (google calculator):
sqrt((hbar)^3/((electron mass/2)^4*(2.5*8*pi*G/3)*c^3))/2 = 13.45 seconds
where the /2 is because t = 1/2H when the energy density is dominated by radiation (particles with speeds near c), "electron mass/2" comes from assuming the main electron/positron annihilation to happen when 2kT = mc², and 2.5 (instead of 0.658) is a number I hazardously insert to reflect how much more the total energy density of the universe may have been than that of electromagnetic radiation alone, mainly because of electrons, positrons and neutrinos.


Next:
Entropy in information theory