Page 1:

The microcanonical ensemble

Finding the probability distribution

We consider an isolated system in the sense that the energy is a constant of motion.

$$E \le H(p,q) \le E + \Delta$$

with the particle number $N$ fixed.

We are not able to derive the distribution $\rho(p,q)$ from first principles.

Two typical alternative approaches:

- Postulate of equal a priori probability: construct the entropy expression from it and show that the result is consistent with thermodynamics.
- Use (information) entropy as the starting concept: derive $\rho$ from the maximum entropy principle.

Page 2:

Information entropy

Introduced by Shannon; see the textbook and E. T. Jaynes, Phys. Rev. 106, 620 (1957).

Idea: find a constructive, least-biased criterion for setting up probability distributions on the basis of only partial knowledge (missing information).

What do we mean by that? Let’s start with an old mathematical problem.

Consider a quantity $x$ with discrete random values $x_1, x_2, \ldots, x_n$.

Assume we do not know the probabilities $p_1, p_2, \ldots, p_n$.

We have some information, however, namely
$$\sum_{i=1}^{n} p_i = 1$$

and we know the average of the function $f(x)$ (we will also consider cases where we know more than one average):
$$\langle f(x) \rangle = \sum_{i=1}^{n} p_i f(x_i)$$

With this information, can we calculate an average of the function $g(x)$?

To do so we need all of $p_1, p_2, \ldots, p_n$, but we have only the two equations
$$\sum_{i=1}^{n} p_i = 1 \qquad \text{and} \qquad \langle f(x) \rangle = \sum_{i=1}^{n} p_i f(x_i)$$
We are lacking $(n-2)$ additional equations.

Page 3:

Shannon defined the information entropy, $S_i$:
$$S_i = -k \sum_n \rho_n \ln \rho_n \qquad \text{or} \qquad S_i = -k \int dp\,dq\; \rho(p,q)\ln\rho(p,q) \quad \text{for a continuous distribution,}$$
with the normalization
$$\sum_n \rho_n = 1 \qquad \text{or} \qquad \int dp\,dq\; \rho(p,q) = 1$$

We make plausible that:
- $S_i$ is a measure of our ignorance of the microstate of the system. More quantitatively:
- $S_i$ is a measure of the width of the distribution of the $\rho_n$.

What can we do with this underdetermined problem?

There may be more than one probability distribution creating
$$\langle f(x) \rangle = \sum_{i=1}^{n} p_i f(x_i)$$
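To make this concrete, here is a minimal numeric illustration; the values $x = (1, 2, 3)$, the choice $f(x) = x$, and the target average $\langle f \rangle = 2$ are invented for this sketch:

```python
import numpy as np

# Hypothetical setup: n = 3 outcomes, f(x) = x, and the two constraints
# sum(p) = 1 and <f(x)> = 2. Both distributions below satisfy them,
# so the p_i are not determined -- we lack n - 2 = 1 further equation.
x = np.array([1.0, 2.0, 3.0])
p_a = np.array([0.0, 1.0, 0.0])   # sharp at x = 2
p_b = np.array([0.5, 0.0, 0.5])   # split evenly between x = 1 and x = 3

for p in (p_a, p_b):
    print(p.sum(), np.dot(p, x))  # both print 1.0 2.0
```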

We want the one which requires no further assumptions. We do not want to “prefer” any $p_i$ if there is no reason to do so.

Information theory tells us how to find this unbiased distribution (we now call the probabilities $\rho_n$ rather than $p_i$).

Page 4:

Let’s consider an extreme case: an experiment with $N$ potential outcomes (such as rolling dice), where:
- Outcome 1 has probability $\rho_1 = 1$
- Outcomes $2, 3, \ldots, N$ have $\rho_n = 0$

$$S_i = -k \sum_n \rho_n \ln \rho_n = 0$$

- Our ignorance regarding the outcome of the experiment is zero.
- We know precisely what will happen.
- The probability distribution is sharp (a delta peak).

Let’s make the distribution broader:
- Outcome 1 has probability $\rho_1 = 1/2$
- Outcome 2 has probability $\rho_2 = 1/2$
- Outcomes $3, 4, \ldots, N$ have $\rho_n = 0$

$$S_i = -k \sum_n \rho_n \ln \rho_n = -k\left(\tfrac{1}{2}\ln\tfrac{1}{2} + \tfrac{1}{2}\ln\tfrac{1}{2} + 0 + \cdots + 0\right) = k\left(\tfrac{1}{2}\ln 2 + \tfrac{1}{2}\ln 2\right) = k\ln 2$$

Let’s consider the more general case:
- Outcome 1 has probability $\rho_1 = 1/M$
- Outcome 2 has probability $\rho_2 = 1/M$
- ...
- Outcome $M$ has probability $\rho_M = 1/M$
- Outcomes $M+1, \ldots, N$ have $\rho_n = 0$

$$S_i = -k \sum_n \rho_n \ln \rho_n = -k\left(\tfrac{1}{M}\ln\tfrac{1}{M} + \tfrac{1}{M}\ln\tfrac{1}{M} + \cdots + \tfrac{1}{M}\ln\tfrac{1}{M} + 0 + \cdots + 0\right) = k\ln M$$
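A short numerical check of the three cases above; the helper name `S_info`, the choice $k = 1$, and the convention $0\ln 0 = 0$ are ours:

```python
import numpy as np

def S_info(rho, k=1.0):
    """Information entropy S_i = -k * sum_n rho_n ln rho_n, with 0 ln 0 := 0."""
    rho = np.asarray(rho, dtype=float)
    nz = rho[rho > 0]              # outcomes with rho_n = 0 contribute nothing
    return -k * np.sum(nz * np.log(nz))

N, M = 10, 4
print(S_info([1.0] + [0.0] * (N - 1)))          # delta peak: 0.0
print(S_info([0.5, 0.5] + [0.0] * (N - 2)))     # two outcomes: ln 2 = 0.693...
print(S_info([1.0 / M] * M + [0.0] * (N - M)))  # M outcomes: ln M = ln 4 = 1.386...
```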

Page 5:

So far our considerations suggest:

Ignorance/information entropy increases with increasing width of the distribution

Which distribution brings the information entropy to a maximum?

For simplicity let’s start with an experiment with 2 outcomes

Binary distribution with $\rho_1$, $\rho_2$ and $\rho_1 + \rho_2 = 1$:

$$S_i = -k\left(\rho_1\ln\rho_1 + \rho_2\ln\rho_2\right) \qquad \text{with} \qquad \rho_2 = 1 - \rho_1$$

$$S_i = -k\left[\rho_1\ln\rho_1 + (1-\rho_1)\ln(1-\rho_1)\right]$$

$$\frac{dS_i}{d\rho_1} = -k\left[\ln\rho_1 + 1 - \ln(1-\rho_1) - 1\right] = -k\,\ln\frac{\rho_1}{1-\rho_1}$$

At the maximum:
$$\frac{dS_i}{d\rho_1} = -k\,\ln\frac{\rho_1}{1-\rho_1} = 0 \qquad\Rightarrow\qquad \rho_1 = 1-\rho_1 \qquad\Rightarrow\qquad \rho_1 = \rho_2 = 1/2$$
the uniform distribution.

[Figure: $S_i$ (in units of $k$) as a function of $\rho_1$, rising from 0 at $\rho_1 = 0$ to its maximum $\ln 2 \approx 0.69$ at $\rho_1 = 1/2$ and falling back to 0 at $\rho_1 = 1$]
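The plotted curve and the location of its maximum can be reproduced by a brute-force scan over $\rho_1$ (with $k = 1$; the grid resolution is arbitrary):

```python
import numpy as np

rho1 = np.linspace(1e-6, 1.0 - 1e-6, 100001)   # avoid ln 0 at the endpoints
S = -(rho1 * np.log(rho1) + (1.0 - rho1) * np.log(1.0 - rho1))

imax = np.argmax(S)
print(rho1[imax], S[imax])   # ~0.5 and ~ln 2 = 0.6931..., as in the figure
```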

Page 6:

Lagrange multiplier technique

$$S_i = -k\left(\rho_1\ln\rho_1 + \rho_2\ln\rho_2\right) \qquad \text{with} \qquad \rho_1 + \rho_2 = 1$$

$$F(\rho_1, \rho_2, \lambda) = -k\left(\rho_1\ln\rho_1 + \rho_2\ln\rho_2\right) - \lambda\left(\rho_1 + \rho_2 - 1\right)$$

[Figure: finding an extremum of $f(x,y)$ under the constraint $g(x,y) = c$; from http://en.wikipedia.org/wiki/File:LagrangeMultipliers2D.svg]

$$\frac{\partial F}{\partial \rho_1} = -k\left(\ln\rho_1 + 1\right) - \lambda = 0$$
$$\frac{\partial F}{\partial \rho_2} = -k\left(\ln\rho_2 + 1\right) - \lambda = 0$$
$$\frac{\partial F}{\partial \lambda} = -\left(\rho_1 + \rho_2 - 1\right) = 0$$

$$\Rightarrow\quad \ln\rho_1 = \ln\rho_2 \quad\Rightarrow\quad \rho_1 = \rho_2, \quad \text{and with } \rho_1 + \rho_2 = 1: \quad \rho_1 = \rho_2 = 1/2$$
the uniform distribution.

Let’s use the Lagrange multiplier technique to find the distribution that maximizes
$$S_i = -k \sum_{n=1}^{M} \rho_n \ln \rho_n$$
Once again: at the maximum, the constraint $\sum_{n=1}^{M} \rho_n = 1$ must hold.

Page 7:

$$F(\rho_1, \rho_2, \ldots, \rho_M, \lambda) = -k \sum_{n=1}^{M} \rho_n \ln \rho_n - \lambda\left(\sum_{n=1}^{M} \rho_n - 1\right)$$

$$\frac{\partial F}{\partial \rho_j} = -k\left(\ln\rho_j + 1\right) - \lambda = 0 \qquad\Rightarrow\qquad \rho_1 = \rho_2 = \cdots = \rho_M = e^{-1-\lambda/k}$$

With $\sum_{n=1}^{M} \rho_n = 1$:
$$\rho_1 = \rho_2 = \cdots = \rho_M = \frac{1}{M} \qquad\Rightarrow\qquad S_i^{\max} = k\ln M$$

The uniform distribution maximizes the entropy.
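As a cross-check, the same constrained maximum can be found numerically. This is only a sketch: it uses `scipy.optimize.minimize` (which applies SLSQP when an equality constraint is given) instead of the analytic Lagrange solution, and $M = 5$ is an arbitrary choice:

```python
import numpy as np
from scipy.optimize import minimize

M = 5

def neg_entropy(rho):                        # minimizing -S_i maximizes S_i (k = 1)
    return np.sum(rho * np.log(rho))

res = minimize(
    neg_entropy,
    x0=np.random.dirichlet(np.ones(M)),      # random normalized starting point
    constraints=[{"type": "eq", "fun": lambda rho: rho.sum() - 1.0}],
    bounds=[(1e-9, 1.0)] * M,                # keep rho_n > 0 so ln is defined
)
print(res.x)                  # ~[0.2 0.2 0.2 0.2 0.2], i.e. rho_n = 1/M
print(-res.fun, np.log(M))    # both ~ln 5 = 1.609...
```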

Distribution function

In a microcanonical ensemble, where each system has $N$ particles, volume $V$, and fixed energy between $E$ and $E + \Delta$, the entropy is at a maximum in equilibrium. When identifying information entropy with thermodynamic entropy:

$$\rho(p,q) = \begin{cases} \dfrac{1}{Z(E)} = \text{const} & \text{if } E \le H(p,q) \le E + \Delta \\[4pt] 0 & \text{otherwise} \end{cases}$$

where $Z(E)$ = the number of microstates with energy in $[E, E+\Delta]$, called the partition function of the microcanonical ensemble.

Page 8:

Information entropy and thermodynamic entropy

When identifying $k = k_B$, the entropy
$$S = -k_B \sum_n \rho_n \ln \rho_n$$
has all the properties we expect from the thermodynamic entropy (for details see the textbook).

We show here that $S$ is additive:
$$S_{1+2} = S_1 + S_2$$
with $\rho_n^{(1)}$ the probability distribution for system 1 and $\rho_m^{(2)}$ the probability distribution for system 2.

Statistical independence of systems 1 and 2 means the probability of finding system 1 in state $n$ and system 2 in state $m$ is
$$\rho_{n,m}^{(1+2)} = \rho_n^{(1)} \rho_m^{(2)}$$

$$\begin{aligned}
S_{1+2} &= -k_B \sum_{n,m} \rho_n^{(1)} \rho_m^{(2)} \ln\!\left(\rho_n^{(1)} \rho_m^{(2)}\right)
= -k_B \sum_{n,m} \rho_n^{(1)} \rho_m^{(2)} \left(\ln \rho_n^{(1)} + \ln \rho_m^{(2)}\right) \\
&= -k_B \sum_m \rho_m^{(2)} \sum_n \rho_n^{(1)} \ln \rho_n^{(1)} - k_B \sum_n \rho_n^{(1)} \sum_m \rho_m^{(2)} \ln \rho_m^{(2)} \\
&= -k_B \sum_n \rho_n^{(1)} \ln \rho_n^{(1)} - k_B \sum_m \rho_m^{(2)} \ln \rho_m^{(2)} = S_1 + S_2
\end{aligned}$$

using $\sum_n \rho_n^{(1)} = \sum_m \rho_m^{(2)} = 1$.
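A quick numerical sanity check of this additivity; the distribution sizes (4 and 6 states) are arbitrary and $k_B = 1$:

```python
import numpy as np

def S(rho, kB=1.0):
    """S = -k_B * sum rho ln rho over the nonzero entries."""
    rho = np.asarray(rho, dtype=float)
    nz = rho[rho > 0]
    return -kB * np.sum(nz * np.log(nz))

rng = np.random.default_rng(0)
rho1 = rng.dirichlet(np.ones(4))    # arbitrary normalized distribution, system 1
rho2 = rng.dirichlet(np.ones(6))    # arbitrary normalized distribution, system 2
rho12 = np.outer(rho1, rho2)        # independence: rho_nm = rho_n^(1) * rho_m^(2)

print(S(rho12), S(rho1) + S(rho2))  # equal up to float rounding
```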

Page 9:

Relation between the entropy and the partition function $Z(E)$: inserting $\rho_n = \dfrac{1}{Z(E)}$,

$$S = -k_B \sum_n \rho_n \ln \rho_n = -k_B \sum_n \frac{1}{Z(E)} \ln \frac{1}{Z(E)} = -k_B \ln \frac{1}{Z(E)}$$

(the sum over the $Z(E)$ microstates cancels the factor $1/Z(E)$), so

$$S = k_B \ln Z(E)$$

Derivation of thermodynamics in the microcanonical ensemble

Where is the temperature?

In the microcanonical ensemble the energy $E$ is fixed:
$$E = E(S,V) = U(S,V) \qquad \text{with} \qquad dU = T\,dS - P\,dV$$
$$\Rightarrow \qquad \frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_V \qquad \text{and} \qquad \frac{P}{T} = \left(\frac{\partial S}{\partial V}\right)_U$$

Page 10:

Let’s derive the ideal gas equation of state from the microcanonical ensemble, despite the fact that there are easier ways to do so.

$$Z(U) = \int_{U \,\le\, H(p,q) \,\le\, U+\Delta} d^{3N}p \; d^{3N}q$$

Major task: find $Z(U)$ := the number of states in the energy shell $[U, U+\Delta]$.

[Figure: phase-space sketch (axes $p$, $q$) showing the energy shell between $U$ and $U+\Delta$]

In addition, $Z(U)$ carries the prefactor
$$\frac{1}{N!\,h^{3N}}$$
The $1/N!$ is the “correct Boltzmann counting”; it requires the quantum-mechanical origin of the indistinguishability of atoms (we derive it when discussing the classical limit of the quantum gas) and solves Gibbs’ paradox. The $h^{3N}$ is another leftover from quantum mechanics: phase-space quantization, which makes $Z$ a dimensionless number.

For a gas of $N$ non-interacting particles we have
$$H = \sum_{i=1}^{N} \frac{\vec{p}_i^{\,2}}{2m}$$

$$Z(U) = \frac{1}{N!\,h^{3N}} \int_{U \,\le\, \sum_i \vec{p}_i^{\,2}/2m \,\le\, U+\Delta} d^{3N}p \; d^{3N}q = \frac{V^N}{N!\,h^{3N}} \int_{U \,\le\, \sum_i \vec{p}_i^{\,2}/2m \,\le\, U+\Delta} d^3p_1 \, d^3p_2 \cdots d^3p_N$$

since the integrand does not depend on the positions, the $q$-integration simply gives $V^N$.

Page 11:

Remember:
$$\sum_{i=1}^{N} \vec{p}_i^{\,2} \le 2mU$$
defines a $3N$-dimensional sphere of radius $\sqrt{2mU}$ in momentum space; the energy shell lies between the radii $\sqrt{2mU}$ and $\sqrt{2m(U+\Delta)}$.

$$Z(U) = \frac{V^N}{N!\,h^{3N}} \int_{U \,\le\, \sum_i \vec{p}_i^{\,2}/2m \,\le\, U+\Delta} d^3p_1 \cdots d^3p_N = \frac{V^N}{N!\,h^{3N}} \left[ V_{3N}\!\left(p = \sqrt{2m(U+\Delta)}\right) - V_{3N}\!\left(p = \sqrt{2mU}\right) \right]$$

$$= \frac{V^N\, C_{3N}}{N!\,h^{3N}} \left[ \left(2m(U+\Delta)\right)^{3N/2} - \left(2mU\right)^{3N/2} \right]$$

where $V_{3N}(p)$ is the volume of a $3N$-dimensional sphere of radius $p$ and $C_{3N}$ is a dimension-dependent constant.

The scaling $V_{3N} \propto p^{3N}$ follows by analogy with the low-dimensional cases:
- 2 dim: $V = \int S\,dr = \int 2\pi r\,dr \propto r^2$
- 3 dim: $V = \int S\,dr = \int 4\pi r^2\,dr \propto r^3$
- $3N$ dim: $V = \int S\,dr = \int C\,r^{3N-1}\,dr = C_{3N}\,r^{3N}$
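This $r^d$ scaling can be checked against the exact volume of a $d$-dimensional ball, $V_d(r) = \pi^{d/2}\, r^d / \Gamma(d/2 + 1)$ (a standard result not derived on the slide). The sketch works with logarithms via `scipy.special.gammaln`, since the raw volumes overflow for $d = 3N$:

```python
import numpy as np
from scipy.special import gammaln

def log_ball_volume(d, r):
    """ln V_d(r) for V_d(r) = pi^(d/2) / Gamma(d/2 + 1) * r^d."""
    return 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0) + d * np.log(r)

d = 3 * 100                          # 3N momentum dimensions for N = 100 particles
# Doubling the radius must multiply the volume by 2^d, i.e. V ~ r^d:
print(log_ball_volume(d, 2.0) - log_ball_volume(d, 1.0))   # 207.94...
print(d * np.log(2.0))                                     # 207.94... (= d ln 2)
```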

$$Z(U) = \text{const} \cdot V^N\, U^{3N/2} \left[ \left(1 + \frac{\Delta}{U}\right)^{3N/2} - 1 \right]$$

$$S = k_B \ln Z(U) = k_B \left[ N \ln V + \frac{3N}{2} \ln U + \ln\!\left( \left(1 + \frac{\Delta}{U}\right)^{3N/2} - 1 \right) \right] + \text{const}$$

Page 12:

In the thermodynamic limit of $N \to \infty$, $V \to \infty$, $U \to \infty$ with $N/V = \text{const}$ (and $U/N$ fixed):

$$S = k_B \left[ N \ln V + \frac{3N}{2} \ln U + \ln\!\left( \left(1 + \frac{\Delta}{U}\right)^{3N/2} - 1 \right) \right] + \text{const}$$

The $\Delta$-dependent term is not extensive and can be dropped: writing the bracket as $\left(1+\frac{\Delta}{U}\right)^{3N/2}\left[1 - \left(1+\frac{\Delta}{U}\right)^{-3N/2}\right]$, the second factor tends to 1 (since $\lim_{n\to\infty} a^n = 0$ for $0 \le a < 1$; see the note at the end), so its logarithm is $\ln 1 = 0$, while the first factor contributes only a non-extensive constant. This leaves

$$S = k_B \left[ N \ln V + \frac{3N}{2} \ln U \right] + \text{const}$$

With
$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_V = \frac{3Nk_B}{2U} \qquad\Rightarrow\qquad U = \frac{3}{2} N k_B T$$
and
$$\frac{P}{T} = \left(\frac{\partial S}{\partial V}\right)_U = \frac{N k_B}{V} \qquad\Rightarrow\qquad PV = N k_B T$$

Note (from http://en.wikipedia.org/wiki/Exponentiation): $\lim_{n\to\infty} a^n = 0$ for $0 \le a < 1$.

for 0 1a