where ΔS is the change in entropy, ΔQ is the heat flow into or out of a system, and T is the absolute temperature in degrees Kelvin (°K).
[NOTE: For a reversible flow of energy, such as occurs under equilibrium conditions, the equality sign applies. For irreversible energy flow, the inequality applies.]
A Driving Force
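To illustrate, consider heat ΔQ flowing irreversibly from the warm interior of a heated house to the colder outdoors. The total entropy change for this flow may be written
ΔSr = -ΔQ/T1 + ΔQ/T2 > 0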
where ΔSr is the total entropy change associated with
this irreversible heat flow, T1 is the temperature inside the
house, and T2 is the temperature outdoors. The negative sign
of the first term reflects the loss of heat from the house, while the positive
sign on the second term reflects the heat gained by the outdoors. Since it
is warmer in the house than outdoors (T1 > T2),
the total entropy will increase (ΔSr > 0) as a result
of this heat flow. If we turn off the heater in the house, it will gradually
cool until the temperature approaches that of the outdoors, i.e., T1
= T2. When this occurs, the entropy change (ΔS) associated
with heat flow (ΔQ) goes to zero. Since there is no further driving force
for heat flow to the outdoors, it ceases; equilibrium conditions have been
established.
As this simple example shows, energy flow occurs in a direction that causes
the total energy to be more uniformly distributed. If we think about it,
we can also see that the entropy increase associated with such energy flow
is proportional to the driving force for such energy flow to occur. The
second law of thermodynamics says that the entropy of the universe (or any
isolated system therein) is increasing; i.e., the energy of the universe
is becoming more uniformly distributed.
It is often noted that the second law indicates that nature tends to go
from order to disorder, from complexity to simplicity. If the most random
arrangement of energy is a uniform distribution, then the present arrangement
of the energy in the universe is nonrandom, since some matter is very rich
in chemical energy, some in thermal energy, etc., and other matter is very
poor in these kinds of energy. In a similar way, the arrangements of mass
in the universe tend to go from order to disorder due to the random motion
on an atomic scale produced by thermal energy. The diffusional processes
in the solid, liquid, or gaseous states are examples of increasing entropy
due to random atomic movements. Thus, increasing entropy in a system corresponds
to increasingly random arrangements of mass and/or energy.
Entropy and Probability
There is another way to view entropy. The entropy of a system is a measure
of the probability of a given arrangement of mass and energy within it.
A statistical thermodynamic approach can be used to further quantify the
system entropy. High entropy corresponds to high probability. As a random
arrangement is highly probable, it would also be characterized by a large
entropy. On the other hand, a highly ordered arrangement, being less probable,
would represent a lower entropy configuration. The second law would tell
us then that events which increase the entropy of the system require a change
from more order to less order, or from less-random states to more-random
states. We will find this concept helpful in Chapter 9 when we analyze condensation
reactions for DNA and protein.
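This connection between entropy and probability can be made quantitative through Boltzmann's relation, written here in its standard form with k denoting Boltzmann's constant and Ω the number of microscopic arrangements (microstates) consistent with a given macroscopic state:
S = k ln Ω
A highly ordered arrangement corresponds to few microstates (small Ω) and therefore low entropy, while a random arrangement corresponds to many microstates and high entropy.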
Clausius,2 who formulated the second law of thermodynamics, summarized
both laws in his famous concise statement: "The energy
of the universe is constant; the entropy of the universe tends toward a
maximum." The universe moves from its less probable current arrangement
(low entropy) toward its most probable arrangement in which the energy of
the universe will be more uniformly distributed.
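For an isolated system, one which exchanges neither energy nor mass with its surroundings, the first and second laws may be written
ΔE/Δt = 0 and ΔS/Δt ≥ 0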
where ΔE and ΔS are the changes in the system energy and system entropy,
respectively, for a time interval Δt. Clearly, the emergence of order
of any kind in an isolated system is not possible. The second law of thermodynamics
says that an isolated system always moves in the direction of maximum entropy
and, therefore, disorder.
It should be noted that the process just described is irreversible in the
sense that once the ice is melted, it will not reform in the thermos. As
a matter of fact, natural decay and the general tendency toward greater
disorder are so universal that the second law of thermodynamics has been
appropriately dubbed "time's arrow."5
Closed Systems near Equilibrium
A closed system is one in which the exchange of energy with the outside
world is permitted but the exchange of mass is not. Along the boundary between
the closed system and the surroundings, the temperature may be different
from the system temperature, allowing energy flow into or out of the system
as it moves toward equilibrium. If the temperature along the boundary is
variable (in position but not time), then energy will flow through the
system, maintaining it some distance from equilibrium. We will discuss closed
systems near equilibrium first, and then closed systems removed from
equilibrium.
If we combine the first and second laws as expressed in equations 7-1 and
7-2 and replace the mechanical work term ΔW by PΔV, where P is pressure
and ΔV is the volume change, we obtain,
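ΔS ≥ (ΔE + PΔV)/T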
[NOTE: Volume expansion (ΔV > 0) corresponds to the system doing work, and therefore losing energy. Volume contraction
(ΔV < 0) corresponds to work being done on the system.]
Algebraic manipulation gives
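ΔE + PΔV - TΔS ≤ 0    (7-6)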
where the temperature T and pressure P are held constant.
The term on the left side of the inequality in equation 7-6 is called
the change in the Gibbs free energy (ΔG). It may be thought of as a thermodynamic
potential which describes the tendency of a system to change---e.g., the
tendency for phase changes, heat conduction, etc. to occur. If a reaction
occurs spontaneously, it is because it brings a decrease in the Gibbs free
energy (ΔG < 0). This requirement is equivalent to the requirement that
the entropy of the universe increase. Thus, like an increase in entropy,
a decrease in Gibbs free energy simply means that a system and its surroundings
are changing in such a way that the energy of the universe is becoming more
uniformly distributed.
We may summarize then by noting that the second law of thermodynamics requires,
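ΔG/Δt ≤ 0    (7-7)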
where Δt indicates the time period during which the Gibbs free energy
changed.
The approach to equilibrium is characterized by,
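ΔG/Δt → 0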
The physical significance of equation 7-7 can be understood by rewriting
equations 7-6 and 7-7 in the following form:
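ΔS - (ΔE + PΔV)/T ≥ 0    (7-9)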
and noting that the first term represents the entropy change due to processes
going on within the system and the second term represents the entropy change
due to exchange of mechanical and/or thermal energy with the surroundings.
This simply guarantees that the sum of the entropy change in the system
and the entropy change in the surroundings will be greater than zero; i.e.,
the entropy of the universe must increase. For the isolated system, ΔE
+ PΔV = 0 and equation 7-9 reduces to equation 7-4.
A simple illustration of this principle is seen in phase changes such as
water transforming into ice. As ice forms, energy (80 calories/gm) is liberated
to the surroundings. The change in the entropy of the system as the liquid
water becomes crystalline ice is -0.293 entropy units (eu) per gram.
The entropy change is negative because the thermal and configurational
entropy (or disorder) of water is greater than that of ice, which is a highly
ordered crystal.
[NOTE: Configurational entropy measures randomness in the distribution of matter in much the same way that thermal entropy measures randomness in the distribution of energy.]
Thus, the thermodynamic conditions under which water will transform to ice are seen from equation 7-9 to be:
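ΔS - (ΔE + PΔV)/T = -0.293 + 80/T ≥ 0
[NOTE: The values here are per gram of water: ΔE + PΔV = -80 calories and ΔS = -0.293 eu. The inequality is satisfied only for T ≤ 273°K, with equality at the freezing point.]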
For the condition T < 273°K, energy is removed from water to produce
ice, and the aggregate disordering of the surroundings is greater than the
ordering of the water into ice crystals. This gives a net increase in the
entropy of the universe, as predicted by the second law of thermodynamics.
It has often been argued by analogy to water crystallizing to ice that simple
monomers may polymerize into complex molecules such as protein and DNA.
The analogy is clearly inappropriate, however. The ΔE + PΔV
term (equation 7-9) in the polymerization of important organic molecules
is generally positive (5 to 8 kcal/mole), indicating the reaction can never
spontaneously occur at or near equilibrium.
[NOTE: If ΔE + PΔV is positive, the second term in equation 7-9 must be negative due to the negative sign which precedes it. The inequality can only be satisfied by ΔS being sufficiently positive, which implies disordering.]
By contrast, the ΔE + PΔV term in water changing to ice is negative, -1.44 kcal/mole, indicating the phase change is spontaneous as long as T < 273°K, as previously noted. The atomic bonding forces draw water molecules into an orderly crystalline array when the thermal agitation (or entropy driving force, TΔS) is made sufficiently small by lowering the temperature. Organic monomers such as amino acids resist combining at all at any temperature, however, much less in some orderly arrangement.
The probability that at ordinary temperatures a macroscopic number of molecules is assembled to give rise to the highly ordered structures and to the coordinated functions characterizing living organisms is vanishingly small. The idea of spontaneous genesis of life in its present form is therefore highly improbable, even on the scale of billions of years during which prebiotic evolution occurred.8
It seems safe to conclude that systems near equilibrium (whether isolated or closed) can never produce the degree of complexity intrinsic in living systems. Instead, they will move spontaneously toward maximizing entropy, or randomness. Even the postulate of long time periods does not solve the problem, as "time's arrow" (the second law of thermodynamics) points in the wrong direction; i.e., toward equilibrium. In this regard, H.F. Blum has observed:
The second law of thermodynamics would have been a dominant directing factor in this case [of chemical evolution]; the reactions involved tending always toward equilibrium, that is, toward less free energy, and, in an inclusive sense, greater entropy. From this point of view the lavish amount of time available should only have provided opportunity for movement in the direction of equilibrium.9 (Emphasis added.)
Thus, reversing "time's arrow" is what chemical evolution is all about, and this will not occur in isolated or closed systems near equilibrium.
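Closed Systems Far from Equilibrium
When the constraints along the boundary maintain a closed system some distance from equilibrium, as with the position-dependent boundary temperature described earlier, the total entropy change in the system may be written as the sum of two contributions:
ΔS = ΔSe + ΔSi    (7-11)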
where ΔSe is the entropy flux due to energy flow through
the system, and ΔSi is the entropy production inside the system
due to irreversible processes such as diffusion, heat conduction, heat production,
and chemical reactions. We will note when we discuss open systems in the
next section that ΔSe includes the entropy flux due to mass flow
through the system as well. The second law of thermodynamics requires,
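ΔSi ≥ 0    (7-12)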
In an isolated system, ΔSe = 0 and equations 7-11 and 7-12 give,
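ΔS = ΔSi ≥ 0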
Unlike ΔSi, ΔSe in a closed system does not have a definite sign, but depends entirely on the boundary constraints imposed on the system. The total entropy change in the system can be negative (i.e., ordering within system) when,
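ΔSe < 0 and |ΔSe| > ΔSi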
Under such conditions a state that would normally be highly improbable
under equilibrium conditions can be maintained indefinitely. It would be
highly unlikely (i.e., statistically just short of impossible) for a disconnected
water heater to produce hot water. Yet when the gas is connected and the
burner lit, the system is constrained by energy flow and hot water is produced
and maintained indefinitely as long as energy flows through the system.
An open system offers an additional possibility for ordering---that of maintaining
a system far from equilibrium via mass flow through the system, as will
be discussed in the next section.
An open system is one which exchanges both energy and mass with the surroundings.
It is well illustrated by the familiar internal combustion engine. Gasoline
and oxygen are passed through the system, combusted, and then released as
carbon dioxide and water. The energy released by this mass flow through
the system is converted into useful work; namely, torque supplied to the
wheels of the automobile. A coupling mechanism is necessary, however, to
allow the released energy to be converted into a particular kind of work.
In an analogous way the dissipative (or disordering) processes within an
open system can be offset by a steady supply of energy to provide for ΔSe-type
work. Equation 7-11, applied earlier to closed systems far from equilibrium,
may also be applied to open systems. In this case, the ΔSe
term represents the negative entropy, or organizing work, done on the system
as a result of both energy and mass flow through the system. This work done
to the system can move it far from equilibrium, maintaining it there as
long as the mass and/or energy flow are not interrupted. This is an essential
characteristic of living systems as will be seen in what follows.
(2) The maintenance of living systems requires that the energy flow through
the system be of sufficient magnitude that the negative entropy production
rate (i.e., useful work rate) that results be greater than the rate of dissipation
that results from irreversible processes going on within the system; i.e.,
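|ΔSe/Δt| > ΔSi/Δt    (with ΔSe < 0)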
(3) The negative entropy generation must be coupled into the system in
such a way that the resultant work done is directed toward restoration of
the system from the disintegration that occurs naturally and is described
by the second law of thermodynamics; i.e.,
where ΔSe and ΔSi refer not only to
the magnitude of entropy change but also to the specific changes that occur
in the system associated with this change in entropy. The coupling must
produce not just any kind of ordering but the specific kind required by
the system.
While the maintenance of living systems is easily rationalized in terms
of thermodynamics, the origin of such living systems is quite another
matter. Though the earth is open to energy flow from the sun, the means
of converting this energy into the necessary work to build up living systems
from simple precursors remains at present unspecified (see equation 7-17).
The "evolution" from biomonomers of to fully functioning cells
is the issue. Can one make the incredible jump in energy and organization
from raw material and raw energy, apart from some means of directing the
energy flow through the system? In Chapters 8 and 9 we will consider this
question, limiting our discussion to two small but crucial steps in the
proposed evolutionary scheme namely, the formation of protein and DNA from
their precursors.
It is widely agreed that both protein and DNA are essential for living systems
and indispensable components of every living cell today.11 Yet
they are only produced by living cells. Both types of molecules are much
more energy and information rich than the biomonomers from which they form.
Can one reasonably predict their occurrence given the necessary biomonomers
and an energy source? Has this been verified experimentally? These questions
will be considered in Chapters 8 and 9.