
# Notes on Statistical Thermodynamics - Partition Functions

Many times we divide the study of physical chemistry into two broad classes of phenomena. There is the "macroscopic world," where we study the bulk properties of matter. That is, we study samples which contain on the order of $10^{23}$ molecules or particles. The main theoretical framework for the study of bulk properties in chemistry is thermodynamics (or kinetics for most nonequilibrium phenomena), and the fundamental equations are the first, second, and third laws of thermodynamics. On the other hand, we also study the "microscopic world," where we are concerned with the properties of individual molecules or particles. The usual theoretical framework of the microscopic world is quantum mechanics (or sometimes classical mechanics), and the fundamental equations are Schrödinger's equations (or Newton's laws). In the macroscopic world we deal with quantities such as internal energy, enthalpy, entropy, heat capacities, and so on. In the microscopic world we deal with wave functions, particle momenta, kinetic and potential energies, energy levels, and so on.

But the properties of bulk matter obviously depend on the properties of the particles of which it is composed. How are the microscopic properties of individual molecules related to the properties of a sample which contains $10^{23}$ molecules, or - more to the point - how can we find the properties of a bulk sample from the properties of the molecules? This is the question which statistical thermodynamics seeks to address. We can think of statistical thermodynamics as a process of model building. We construct a (theoretical) model of the particles, atoms, molecules, etc. which make up the sample, and statistical thermodynamics will tell us what the bulk properties will be. For example, if our model is a collection of molecules which do not interact with each other, we will get the bulk properties of an ideal gas. If we want to get the properties of a nonideal gas, we have to go back to the model and put in the properties of the molecules which will make the gas nonideal. In this case that amounts to including a potential energy of interaction between the molecules.

It would be nice if statistical thermodynamics could be derived entirely from the fundamental principles we already know, say quantum mechanics or classical mechanics. Unfortunately, this is not possible at present. In order to arrive at a theory which works we must introduce some new postulates. This path is followed in most books on statistical thermodynamics and is quite successful and largely satisfactory. However, in this discussion we will use a slightly different approach. Here I am going to ask you to believe that the "Boltzmann factor" - which I will describe below - is a correct description of some "probabilities" relevant to the system, and we will derive everything else from there.

We will assume that whatever system we are interested in satisfies the Schrödinger equation (even if it contains $10^{23}$ particles!), and that we know or can find the energies of the quantum states. For convenience we will label the energy states in order of increasing energy, $E_1 \le E_2 \le E_3 \le E_4 \le \cdots$. We use the ≤ sign rather than the < sign to allow for the possibility of degenerate states. The key assertion - the "Boltzmann factor" promised above - is that the probability of finding the system in the state with energy $E_i$ is proportional to

(1) $e^{-E_i/kT}$,

where *k* is the Boltzmann constant and *T* is the absolute temperature. You have seen factors like this before, for example in the Maxwell-Boltzmann distribution of molecular speeds, where $mv^2/2$ is a kinetic energy, and in the Arrhenius equation, where $E_a$ is the Arrhenius activation energy. In the latter case the assertion is usually made that the exponential factor is proportional to the number of molecules with sufficient energy to react.

It is cumbersome to keep writing 1/*kT* all the time, so it is customary to set 1/*kT* = *β*. Using this notation, the proportionality can be written:

(2) $P_i \propto e^{-\beta E_i}$

Assuming that we accept that the probability of finding the system in state *i* with energy *E _{i}* is proportional to $e^{-\beta E_i}$, the next natural question is: what is the proportionality constant? That's relatively easy to answer because we know that the probabilities must sum to unity - the system must be in __some__ state. So we can write

(3) $\sum_i P_i = 1$

Let's call the proportionality constant *c*. Then

(4) $P_i = c\,e^{-\beta E_i}$

Using Equation (3) we can solve for *c* by writing

(5) $c \sum_i e^{-\beta E_i} = 1$

so that

(6) $c = \frac{1}{\sum_i e^{-\beta E_i}}$

Again, it is cumbersome to keep writing $\sum_i e^{-\beta E_i}$ all the time, so we simplify things by writing

(7) $Q = \sum_i e^{-\beta E_i}$

It turns out that this quantity, *Q*, is so important in the theory that it is even given its own name. *Q*, so defined, is called a __partition function__. (Don't worry about why it is called that; it has something to do with how energy is __partitioned__ among the possible states of the system. In some books the partition function is given the symbol *z* or *Z*, which stands for the German word *Zustandssumme*, which means "sum over states.")
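As a concrete illustration, the sum in Equation 7 is easy to evaluate numerically. The sketch below uses a made-up three-level system (the energy values are arbitrary, not taken from any real molecule) to compute *Q* and the normalized Boltzmann probabilities:

```python
import math

k = 1.380649e-23                 # Boltzmann constant, J/K
E = [0.0, 1.0e-21, 2.5e-21]      # hypothetical energy levels, J (arbitrary)

T = 300.0                        # temperature, K
beta = 1.0 / (k * T)

# Partition function, Equation 7: Q = sum over states of exp(-beta * E_i)
Q = sum(math.exp(-beta * E_i) for E_i in E)

# Normalized probabilities of each state: exp(-beta * E_i) / Q
P = [math.exp(-beta * E_i) / Q for E_i in E]

print(Q)        # a dimensionless number
print(sum(P))   # the probabilities sum to unity
```

Notice that the lowest-energy state always gets the largest probability, and that as *T* grows the probabilities approach each other, exactly as the Boltzmann factor suggests.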

The reason why *Q* is so important is that it connects the mechanical properties of the system (through the quantized energies *E _{i}*) with thermodynamics (through the *T* in *β* = 1/*kT*). So this function has both thermodynamics and mechanics in it.

*Q* is a function of *T* through the *β* part, and it is a function of the mechanical variables in the model through the energies *E _{i}*. For example, if the quantized energies of the system depend on the volume, *V*, the system is contained in and on the number, *N*, of molecules in the system - and they generally do depend on these variables - then *Q* will be a function of *T*, *V*, and *N*.

(*Q* will also be a function of other things, like the mass of the individual molecules, but we don't generally indicate that explicitly because the mass of a molecule is not a thermodynamic variable.)

So *Q* is usually a function of *T*, *V*, and *N*, which we write as,

(8) $Q = Q(T, V, N)$

It is a function of *T* through the *β* and of *V* and *N* through the quantized energies *E _{i}*. (It is important to remember that the sum in Equation 7 is over all *states* of the system, not just over energy *levels*. If there is degeneracy, some of the terms in Equation 7 will be identical. For example, if there are four states with a particular energy *E*, then the term $e^{-\beta E}$ will occur in the summation four times.)

Utilizing the fact that the normalization constant is 1/*Q*, we can write the probability that the system is in state *i*, with energy *E _{i}*, as

(9) $P_i = \frac{e^{-\beta E_i}}{Q}$

Now the question arises: how do we use this to calculate quantities of interest? We'll start with internal energy, *U*. The best word definition of *U* is that it is the (average of the) sum of all the potential and kinetic energies of all the particles in the system. In other words, *U* is the total (average) mechanical energy of the system. Since we know what the possible energy states of the system are and we know the probability that the system is in each state, we can calculate the average energy. We will set this average energy equal to *U* (this is sort of a postulate, but we won't worry about that now),

(10) $U = \langle E \rangle = \sum_i P_i E_i$

Since we have an expression for *P _{i}* we can rewrite Equation 10 as

(11) $U = \frac{1}{Q} \sum_i E_i e^{-\beta E_i}$

So far so good, but we can simplify this by noticing that

(12) $\frac{\partial}{\partial \beta} e^{-\beta E_i} = -E_i e^{-\beta E_i}$

so that

(13) $\sum_i E_i e^{-\beta E_i} = -\frac{\partial Q}{\partial \beta}$

and

(14) $U = -\frac{1}{Q} \frac{\partial Q}{\partial \beta}$

So we see that

(15) $U = -\frac{\partial \ln Q}{\partial \beta}$

(We will find, as we go along, that all of the thermodynamic properties will depend on ln*Q* or derivatives of ln*Q*. *Q* itself usually is a very, very large dimensionless number, but its natural logarithm is much smaller and will be related to measurable properties.)
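Equation 15 is easy to check numerically. For the same kind of made-up level scheme used earlier (an illustration only, not any real system), a central finite difference of ln*Q* with respect to *β* should reproduce the directly averaged energy of Equation 10:

```python
import math

k = 1.380649e-23                 # Boltzmann constant, J/K
E = [0.0, 1.0e-21, 2.5e-21]      # hypothetical energy levels, J (arbitrary)

def lnQ(beta):
    return math.log(sum(math.exp(-beta * E_i) for E_i in E))

beta = 1.0 / (k * 300.0)

# Direct average, Equation 10: U = sum_i P_i E_i
Q = math.exp(lnQ(beta))
U_direct = sum(E_i * math.exp(-beta * E_i) / Q for E_i in E)

# Equation 15: U = -d(lnQ)/d(beta), here by central finite difference
h = 1e-4 * beta
U_deriv = -(lnQ(beta + h) - lnQ(beta - h)) / (2 * h)

print(U_direct, U_deriv)   # the two routes agree to many digits
```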

It is fair to ask what is being held constant in taking the partial derivative in Equation 15. If we recall the definition of *Q* it will be clear that the only things available to be held constant are the mechanical definitions of the model, such as *V*, *N*, and any other purely mechanical things that the *E _{i}* may depend on (but not pressure, for example).

Sometimes it is convenient to take derivatives with respect to temperature instead of *β*. Using elementary calculus we can change variables by setting

(16) $\frac{\partial}{\partial \beta} = \frac{dT}{d\beta}\,\frac{\partial}{\partial T}$

but *β* = 1/*kT*, or *T* = 1/*kβ*, so that

(17) $\frac{dT}{d\beta} = -\frac{1}{k\beta^2} = -kT^2$

then

(18) $U = kT^2 \left(\frac{\partial \ln Q}{\partial T}\right)_{V,N}$

We now have enough information to calculate the heat capacity at constant volume. We can't calculate the heat capacity at constant pressure yet because *Q* and *U* are functions of *V* and not *p*. (We will call the heat capacity at constant volume *C _{V}*, but in fact we are holding all of the mechanical parameters of the system constant.) The thermodynamic definition of *C _{V}* is,

(19) $C_V = \left(\frac{\partial U}{\partial T}\right)_{V,N}$

which can be calculated from our *Q* and *U* as,

(20) $C_V = \left(\frac{\partial}{\partial T}\left[kT^2 \frac{\partial \ln Q}{\partial T}\right]\right)_{V,N} = 2kT\left(\frac{\partial \ln Q}{\partial T}\right)_{V,N} + kT^2\left(\frac{\partial^2 \ln Q}{\partial T^2}\right)_{V,N}$

(In statistical thermodynamics it is common to omit the statement of what variables are being held constant, since we know that *Q* is a function of *T*, *V*, and *N*. Thus, it is not unusual to see Equation 20 written,

(21) $C_V = 2kT\frac{\partial \ln Q}{\partial T} + kT^2\frac{\partial^2 \ln Q}{\partial T^2}$ )
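Equation 21 can be sanity-checked the same way (again with an arbitrary, made-up set of levels): differentiate the *U* of Equation 18 numerically with respect to *T*, and compare with the combination of first and second derivatives of ln*Q*:

```python
import math

k = 1.380649e-23                 # Boltzmann constant, J/K
E = [0.0, 1.0e-21, 2.5e-21]      # hypothetical energy levels, J (arbitrary)

def lnQ(T):
    beta = 1.0 / (k * T)
    return math.log(sum(math.exp(-beta * E_i) for E_i in E))

def U(T):
    # Equation 18: U = k T^2 d(lnQ)/dT, by central finite difference
    h = 1e-3
    return k * T**2 * (lnQ(T + h) - lnQ(T - h)) / (2 * h)

T = 300.0
h = 1e-2

# C_V directly as dU/dT (Equation 19)
Cv_numeric = (U(T + h) - U(T - h)) / (2 * h)

# Equation 21: C_V = 2kT dlnQ/dT + kT^2 d2lnQ/dT2, by finite differences
d1 = (lnQ(T + h) - lnQ(T - h)) / (2 * h)
d2 = (lnQ(T + h) - 2 * lnQ(T) + lnQ(T - h)) / h**2
Cv_formula = 2 * k * T * d1 + k * T**2 * d2

print(Cv_numeric, Cv_formula)  # the two routes agree closely
```

Note that *C _{V}* comes out positive, as a heat capacity must for a system with a finite set of levels.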

We have two thermodynamic properties of our system, *U* and *C _{V}*, all calculated from *Q*. Can we get anything else? How about entropy? The third law of thermodynamics says that the entropy (of a "nice" system) is zero at the absolute zero of temperature. So all we should have to do is integrate *C _{V}*/*T* from 0 K up to some temperature, *T*,

(22) $S = \int_0^T \frac{C_V}{T}\,dT$

(I know that I've used *T* as both the variable and the limit of integration, but you know what I mean, so I won't worry about making it look right to the mathematicians. If this bothers you, put a prime on the *T* and *dT* inside the integral.) OK, so entropy becomes

(23) $S = \int_0^T \left[2k\frac{\partial \ln Q}{\partial T} + kT\frac{\partial^2 \ln Q}{\partial T^2}\right] dT$

(Notice that we have divided *C _{V}* by *T*, so that there is one less factor of *T* in each of the terms in the integrand than there was in Equation 21.) This integral is really the sum of two integrals. The first integral is easy and gives

(24) $\int_0^T 2k\frac{\partial \ln Q}{\partial T}\,dT = 2k \ln Q \Big|_0^T$

The second,

(25) $\int_0^T kT\frac{\partial^2 \ln Q}{\partial T^2}\,dT$,

can be integrated by parts.

In case you have forgotten how to integrate by parts, recall that

$u\,dv = d(uv) - v\,du$

Here, we are setting $u = T$ and $dv = \frac{\partial^2 \ln Q}{\partial T^2}\,dT$, so that $v = \frac{\partial \ln Q}{\partial T}$. The integration by parts then gives for Expression 25

(26) $k\left[T\frac{\partial \ln Q}{\partial T}\right]_0^T - k \ln Q \Big|_0^T$

Combining the two integrals we get

(27) $S = k \ln Q \Big|_0^T + k\left[T\frac{\partial \ln Q}{\partial T}\right]_0^T$

which, upon separation of the upper and lower limits, becomes

(28) $S = k \ln Q + kT\frac{\partial \ln Q}{\partial T} - k \ln Q \Big|_{T=0} - \left[kT\frac{\partial \ln Q}{\partial T}\right]_{T=0}$

(We have not bothered to explicitly indicate that the first two terms are evaluated at temperature *T*; they are.) The last two terms refer to 0 K and are presumably the entropy at absolute zero. In a more sophisticated treatment we would show that they are identically zero, but here we shall just assume that they are zero because of the third law. So our expression for entropy is just

(29) $S = k \ln Q + kT\frac{\partial \ln Q}{\partial T}$

So now we have three thermodynamic functions which we can calculate from ln*Q*; we have added entropy to the list. If we look carefully at the second term on the right in the last equation, and compare it to Equation (18), we will see that this last term is just *U*/*T*. So,

(30) $S = k \ln Q + \frac{U}{T}$

(31) $U - TS = -kT \ln Q$

But we already know that *U* − *TS* is just the Helmholtz free energy, *A*, so

(32) $A = -kT \ln Q$

Equation 32 is the fundamental equation connecting the partition function *Q* to thermodynamics. From this equation we can derive all the other equations which we have given above and more. For example, we can get *S* again from the usual relationships of thermodynamics as

(33) $S = -\left(\frac{\partial A}{\partial T}\right)_{V,N} = k \ln Q + kT\left(\frac{\partial \ln Q}{\partial T}\right)_{V,N}$

Knowing *A* and *S* we can get *U* as,

(34) $U = A + TS = kT^2\left(\frac{\partial \ln Q}{\partial T}\right)_{V,N}$
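These relationships are easy to verify numerically for a toy system (the same made-up levels as before). The sketch computes *A* from Equation 32, checks that *S* = −(∂*A*/∂*T*) (Equation 33) agrees with the Gibbs form $S = -k\sum_i P_i \ln P_i$ (a standard identity, algebraically equivalent to Equation 29), and confirms that *A* = *U* − *TS*:

```python
import math

k = 1.380649e-23                 # Boltzmann constant, J/K
E = [0.0, 1.0e-21, 2.5e-21]      # hypothetical energy levels, J (arbitrary)

def thermo(T):
    beta = 1.0 / (k * T)
    Q = sum(math.exp(-beta * E_i) for E_i in E)
    P = [math.exp(-beta * E_i) / Q for E_i in E]
    U = sum(p * E_i for p, E_i in zip(P, E))       # Equation 10
    A = -k * T * math.log(Q)                       # Equation 32
    S_gibbs = -k * sum(p * math.log(p) for p in P) # Gibbs form of S
    return Q, U, A, S_gibbs

T = 300.0
Q, U, A, S_gibbs = thermo(T)

# Equation 33: S = -(dA/dT), by central finite difference
h = 1e-2
S_deriv = -(thermo(T + h)[2] - thermo(T - h)[2]) / (2 * h)

print(S_gibbs, S_deriv)       # two routes to S agree
print(A, U - T * S_gibbs)     # and A = U - TS
```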

The heat capacity at constant volume can be calculated two ways,

(35) $C_V = \left(\frac{\partial U}{\partial T}\right)_{V,N}$

(36) $C_V = -T\left(\frac{\partial^2 A}{\partial T^2}\right)_{V,N}$

In addition, we can get pressure from

(37) $p = -\left(\frac{\partial A}{\partial V}\right)_{T,N} = kT\left(\frac{\partial \ln Q}{\partial V}\right)_{T,N}$

and chemical potential, *μ*. from

(38) $\mu = \left(\frac{\partial A}{\partial N}\right)_{T,V} = -kT\left(\frac{\partial \ln Q}{\partial N}\right)_{T,V}$
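As a preview of how Equation 37 is used in practice (the form of *Q* quoted here is the standard translational partition function of a monatomic ideal gas, which we have not yet derived, so take it on faith for the moment): if

$Q = \frac{1}{N!}\left(\frac{V}{\Lambda^3}\right)^N$,

where $\Lambda$ depends only on *T* and the molecular mass, then $\ln Q = N \ln V + (\text{terms independent of } V)$, and Equation 37 gives

$p = kT\left(\frac{\partial \ln Q}{\partial V}\right)_{T,N} = kT\,\frac{N}{V} = \frac{NkT}{V},$

which is just the ideal gas law.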

*C _{p}* can also be obtained from ln*Q* and its derivatives. It is a good test of your thermodynamic skills to derive the expression for *C _{p}* in terms of ln*Q* and its *T* and *V* derivatives.

The fact is, we can get every property of our system from ln*Q* that our model contains. (Anything that is not in the model will not show up in the thermodynamic properties. For example, if you want the properties of a nonideal gas you have to include interactions between molecules in your model.) We now have the basic equations; all that remains is to make the model and write *Q*. On the next page we will give some additional useful information and develop some simple models.
