This lecture covers real statistical mechanics: an ensemble is defined and an average is calculated.
Microcanonical & Canonical Ensembles
It has been demonstrated that there is a huge number of microstates. It is possible to connect thermodynamics with the complexity of the microscopic world. Below are definitions.
Microstate
A microstate is a particular state of a system specified at the atomic level; it can be described by the many-body wavefunction. A system fluctuates over time between different microstates. An example is the gas from last time, which has immense complexity. Fix the variables <math>T, V,</math> and <math>N</math>. With only these variables known, there is no way to tell which microstate the system is in at the many-body wavefunction level.
Consider a solid. It can access different configurational states, and two are shown below. Diffusion results in changes of configurational state over time; X-ray images may be blurred because atoms diffuse and the system accesses different microstates. There may also be vibrational excitations, and electronic excitations correspond to exciting electrons to different levels. Any combination of these excitations specifies the microstate of the system.
Summary
A particular state of a system specified at the atomic level (the many-body wavefunction level):
\[ \Psi_{\mbox{many-body}} \]
The system over time fluctuates between different microstates.
- There is immense complexity in a gas
- Solid
- Configurational states
- Vibrational excitations
- Electronic excitations
- There is usually a combination of these excitations, which results in immense complexity
- Excitations specify the microstate of the system
Why Ensembles?
A goal is to find the average energy <math>\overline{E} = \sum_v E_v P_v</math>. Thermodynamic variables are time averages. The energy of each state comes from the Schrödinger equation; each state's energy is multiplied by the probability of that state and summed. To facilitate such averages, ensembles are introduced. An ensemble is a collection of systems; it is very large, and the systems are macroscopically identical. Look at the whole collection and see what states the system could be in. Below is a diagram of systems and ensembles: each box represents a system, and the collection of systems is an ensemble. Each box could represent a student in a class, and <math>v</math> could represent the sleep state. Look at the properties of the ensemble. There are <math>A</math> macroscopically identical systems. Eventually <math>A</math> is taken to infinity. Each system evolves over time.
Deriving P_v
The probability, <math>P_v</math>, is defined for different kinds of boundary conditions. When looking at the probability that students in a class are asleep, it is possible to take an instantaneous snapshot. The term <math>a_v</math> is the occupation number, defined as the number of systems that are in the state <math>v</math> at the time of the snapshot. The fraction of systems in state <math>v</math> is <math>a_v / A</math>. This is one approximation to the probability, and it could be a bad approximation.
\[ P_v \approx \frac {a_v}{A} \]
There is an additional definition of <math>P_v</math>: it is the probability of finding a system in state <math>v</math> at time <math>t</math>, or equivalently, the fraction of time spent by the system in state <math>v</math>. In the sleep example, it is the fraction of time that anyone is in the sleep state.
The time average corresponds to watching one class and asking for what fraction of the time someone in the class is asleep. The ensemble average corresponds to looking at a set of identical classes and seeing how many classes have at least one student asleep. The ensemble average and the time average are connected. Boundary conditions are needed. Take a picture of a large number of systems, look at every one, and average.
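As a concrete illustration of the two kinds of averaging, here is a minimal sketch of a hypothetical two-state "awake/asleep" toy model; the transition probabilities are made up and the model is not from the lecture. It estimates the probability both from a snapshot of many systems and from the time trace of a single system.

```python
import random

# Hypothetical two-state toy model: each system (a "student") is either
# awake (0) or asleep (1) and may switch state at every time step.
P_FALL_ASLEEP = 0.1   # assumed probability awake -> asleep per step
P_WAKE_UP = 0.3       # assumed probability asleep -> awake per step

def step(state):
    """Advance one system by a single time step."""
    if state == 0:
        return 1 if random.random() < P_FALL_ASLEEP else 0
    return 0 if random.random() < P_WAKE_UP else 1

A, T_EQUILIBRATE, T_TRACE = 10_000, 200, 100_000

# Ensemble estimate: evolve A independent systems, then take one snapshot.
systems = [0] * A
for _ in range(T_EQUILIBRATE):
    systems = [step(s) for s in systems]
p_snapshot = sum(systems) / A                 # a_v / A

# Time-average estimate: follow one system for a long time.
s, asleep = 0, 0
for _ in range(T_TRACE):
    s = step(s)
    asleep += s
p_time = asleep / T_TRACE

# Both estimates should approach 0.1 / (0.1 + 0.3) = 0.25 for this toy model.
print(f"snapshot: {p_snapshot:.3f}   time average: {p_time:.3f}")
```

For this simple model the two estimates agree, which is the sense in which the ensemble average stands in for the time average.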
Summary
Recap:
\[ \overline{E} = \sum_v E_v P_v \]
To facilitate averages, we introduce "ensembles" that we average over
- Averaging over many bodies rather than averaging over time
- Example: student = system, v = sleep state
Ensemble of systems:
- 'A' (a very large number) macroscopically identical systems
- Each system evolves over time
Probability:
- Take an instantaneous snapshot
- Define <math>a_v</math> = number of systems that are in state v at the time of the snapshot
- Fraction of systems in state v is
\[ \frac{a_v}{A} \simeq P_v \]
- Probability to find a system in state v at time t
- Fraction of time spent in state v
Microcanonical Ensemble
The boundary conditions of all systems of the microcanonical ensemble are the same. The variables N, V, and E cannot fluctuate. Each system can only fluctuate between states with fixed energy, E.
It is possible to get the degeneracy from the Schrödinger equation.
\[ \hat H \Psi = E \Psi \rightarrow \underbrace{ (\Psi_1, \Psi_2, \ldots, \Psi_{\Omega}) }_{\Omega(E)} \]
Consider the example of the hydrogen atom. Below are the energy proportionality and the degeneracies for <math>n=1</math> and <math>n=2</math>.
\[ E \propto \frac {1}{n^2} \]
\[ \Omega (n=1) = 1 \]
\[ \Omega (n=2) = 4 \]
What is P_v in the microcanonical ensemble?
All states should be equally probable with the variables <math>N, V,</math> and <math>E</math> fixed. The term <math>P_v</math> is the probability of a system being in any particular state of energy <math>E</math>, and it should be equal to a constant. An expression is below. Each state can be accessed, and none is favored over another. This is related to the principle of equal a priori probabilities: there is no information suggesting that the states should be accessed with different probabilities.
\[ P_v = \frac{1}{\Omega (E)} \]
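A minimal numerical sketch of this statement, using the hydrogen-like degeneracies quoted above (spin is ignored, so <math>\Omega(n) = n^2</math>; the loop bound is an arbitrary choice):

```python
# Hydrogen-like degeneracy, ignoring spin: Omega(n) = n^2 states share the
# energy of level n (E_n is proportional to -1/n^2).
def omega(n: int) -> int:
    return n * n

for n in (1, 2, 3):
    # Equal a priori probabilities: each of the Omega(E) degenerate states
    # is occupied with the same probability 1 / Omega(E).
    print(f"n = {n}: Omega = {omega(n)}, P_v = {1 / omega(n):.4f}")
```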
Example
Consider an example of a gas in a box. In the second configuration, all the atoms are in one corner of the box. Adding up the states gives the complete degeneracy. The value of <math>\Omega_1(E)</math> is large; there is enormous degeneracy.
\[ \Omega_1(E) \gg \Omega_2(E) \]
Consider a poker hand. There are many equivalent bad hands; these are dealt most of the time and correspond to <math>\Omega_1(E)</math>. The royal flush corresponds to <math>\Omega_2(E)</math>. Any particular hand is equally probable, but there are many fewer ways to get a royal flush. The boundary conditions are the same. An isolated system, in which <math>N</math>, <math>V</math>, and <math>E</math> are fixed, is equally probable to be in any of its <math>\Omega(E)</math> possible quantum states.
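A rough numerical version of the poker analogy (standard 52-card counting, added here only as an illustration): every specific five-card hand is equally likely, but there are vastly more "junk" hands than royal flushes.

```python
from math import comb

total_hands = comb(52, 5)                    # 2,598,960 equally likely hands

royal_flushes = 4                            # one per suit
# "High card" (junk) hands: rank sets that form no pair and no straight,
# with suit assignments that do not form a flush.
high_card = (comb(13, 5) - 10) * (4**5 - 4)  # 1,302,540

print(f"P(one specific hand) = {1 / total_hands:.2e}")
print(f"P(royal flush)       = {royal_flushes / total_hands:.2e} ({royal_flushes} ways)")
print(f"P(high-card hand)    = {high_card / total_hands:.2f} ({high_card:,} ways)")
```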
Summary
The variables <math>N, V,</math> and <math>E</math> are fixed
- Each system can only fluctuate between states with fixed energy (like from Schrodinger's equation)
\[ \hat H \Psi = E \Psi \rightarrow \underbrace{ (\Psi_1, \Psi_2, \ldots, \Psi_{\Omega}) }_{\Omega(E)} \]
- All states are equally probable, and are given equal weight.
Hydrogen atom
\[ E \propto \frac{1}{n^2} \]
\[ \Omega (n=1) =1 \]
\[ \Omega (n=2) =4 \]
Probability of being in any E state for a system
- employed the principle of equal a priori probabilities
\[ P_v = \mbox{constant} \]
\[ P_v = \frac{1}{\Omega(E)} \]
Systems
- Each state is equally probable
- Some configurations are observed more often because of their large degeneracy
- There are just a lot more ways to get the configuration on the left (like a bad hand) than the one on the right (like a straight flush)
\[ \Omega_1 (E) \gg \Omega_2 (E) \]
An isolated system (<math>N</math>, <math>V</math>, <math>E</math> fixed) is equally probable to be in any of its <math>\Omega(E)</math> possible quantum states.
Canonical Ensemble
There is a different set of boundary conditions in the canonical ensemble. The walls or boundaries of each system are heat conducting. Each of the <math>A</math> members of the ensemble finds itself in the heat bath formed by the other members. Each system can fluctuate between different microstates, but an energy far from the average is unlikely. In the picture below, the energy of the ensemble as a whole is fixed, while the energy of a particular system is not fixed and can fluctuate.
Take another snapshot. There is interest in the distribution. The term
\[ \overline {a} = \{ a_v \} \]
is the distribution of occupation numbers, where <math>a_v</math> is the number of systems in state <math>v</math>.
Below is a table of microstates, energies, and occurrences, and a graph. In the graph, equilibrium has been reached, but all states can still be accessed; it is possible to access states some distance from the average energy. The total energy of the ensemble, <math>\epsilon</math>, is fixed and is equal to the integral of the curve. As the number of systems increases, the curve becomes sharper.
(Table: microstates 1, 2, 3 with their energies and occurrences.)
Constraints
Below are the constraints. The first fixes the sum of the occupation numbers. The second is possible because the ensemble as a whole is isolated.
\[ \sum_v a_v = A \]
\[ \sum_v a_v E_v = \epsilon \]
The term <math>P_v</math> is the probability of finding the system in state <math>v</math>. It is possible to use the snapshot probability, but there are many distributions that satisfy the boundary conditions. There is a better way to find <math>P_v</math>, and a relation is below: it uses the average distribution. This is associated with a crucial insight.
\[ P_v = \frac{\overline{a_v}}{A} \]
Crucial Insight
An assumption is that the entire canonical ensemble is isolated: no energy can escape, and the total energy <math>\epsilon</math> is constant. Every distribution of <math>\overline{a}</math> that satisfies the boundary conditions is equally probable. It is possible to write a many-body wavefunction for the whole ensemble because the energy of the entire ensemble is fixed. The principle of equal a priori probabilities is applied: looking at all the distributions that satisfy the boundary conditions, each distribution of occupation numbers must be given equal weight.
Summary
The variables <math>N, V,</math> and <math>T</math> are fixed
- There are heat conducting boundaries of each system
- Each of the <math>A</math> (= a large number) members finds itself in a heat bath formed by the other members
- Take snapshot; get distribution
\[ \overline{a} = \{ a_v \} \]
(<math>a_v</math> = number of systems in state <math>v</math>)
Constraints
The total energy, <math>\epsilon</math>, is fixed
\[ \sum_v a_v = A \qquad \sum_v a_v E_v = \epsilon \]
(isolated!)
Probability
\[ P_v \simeq \frac{a_v}{A} \]
is an approximation.
- Better to use
\[ P_v = \frac{\overline{a_v}}{A} \]
the averaged distribution
- There is an assumption that the whole canonical ensemble is isolated and that its energy is constant. Every distribution of <math>\overline{a}</math> that satisfies the boundary conditions is equally probable.
- We are applying the principle of equal a priori probabilities, and each distribution of occupation numbers must be given equal weight.
Some Math
Consider every possible distribution <math>\overline{a} = \{a_1, a_2, \ldots\}</math> consistent with the boundary conditions, and for each distribution consider every possible permutation. The term <math>w(\overline{a})</math> is the number of ways to obtain the distribution <math>\overline{a}</math>, where <math>a_v</math> is the number of systems in state <math>v</math>. A bad hand in poker corresponds to a distribution with a large number of permutations. Use the multinomial coefficient.
\[ w (\overline{a}) = \frac{ A! }{ a_1! \, a_2! \, a_3! \cdots a_v! } = \frac{ A! }{ \prod_v a_v!} \]
Here <math>a_1</math> is the number of systems in state 1 and <math>a_v</math> is the number of systems in state <math>v</math>.
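A minimal sketch of this count for a few made-up occupation vectors (the choice of ten systems and four states is arbitrary):

```python
from math import factorial

def w(occupations):
    """Multinomial count w(a) = A! / (a_1! a_2! ... a_v!)."""
    A = sum(occupations)
    denom = 1
    for a in occupations:
        denom *= factorial(a)
    return factorial(A) // denom

# All ten systems piled into one state (a "royal flush"-like distribution)
# versus the same ten systems spread out (a "bad hand"-like distribution).
print(w((10, 0, 0, 0)))   # 1
print(w((4, 3, 2, 1)))    # 12600
```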
Below are expressions for the probability of being in a certain state. The term <math>a_v</math> is averaged over all possible distributions. Every distribution is given equal weight, and the one with the most permutations is the most favored.
\[ P_v = \frac{\overline{a_v}}{A} \]
\[ P_v = \frac{1}{A} \frac{ \sum_{\overline {a}} w (\overline{a}) \, a_v (\overline{a} ) }{ \sum_{\overline a} w (\overline{a}) } \]
Example
Below is an example of four systems in an ensemble. The term <math>P_1</math> is the probability of any system being in state 1.
(Table: energy, occupation, and distribution for the four-system example.)
Distribution A:
\[ w(A) = \frac{4!}{0! \, 4!} = 1 \]
Distribution B:
\[ w(B) = \frac{4!}{1! \, 3!} = 4 \]
Weighting every distribution equally and averaging <math>a_1</math> over the distributions:
\[ P_1 = \frac{1}{4} \left ( \frac{1 \cdot 0 + 4 \cdot 1 + 6 \cdot 2 + 4 \cdot 3 + 1 \cdot 4}{1 + 4 + 6 + 4 + 1} \right ) \]
\[ P_1 = \frac{1}{2} \]
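A sketch reproducing this example numerically; it assumes, as the weights 1, 4, 6, 4, 1 suggest, that each of the four systems can sit in one of two states and that every allowed distribution is given equal weight:

```python
from math import comb

A = 4   # four systems, each in state 1 or state 2

num = 0   # sum over distributions of w(a) * a_1(a)
den = 0   # sum over distributions of w(a)
for a1 in range(A + 1):           # a_1 runs over 0..4, with a_2 = A - a_1
    weight = comb(A, a1)          # w(a) = A!/(a_1! a_2!) -> 1, 4, 6, 4, 1
    num += weight * a1
    den += weight

P_1 = num / (A * den)             # P_1 = (1/A) <a_1>
print(P_1)                        # 0.5, as in the worked example
```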
Distribution of <math>w(\overline{a})</math>
The term <math>w(\overline{a})</math> is the number of permutations for a particular distribution. As the number of systems increases, that is, as <math>A</math> increases, the distribution becomes more peaked.
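A sketch of this peaking for a hypothetical two-state case: the fraction of all permutations carried by distributions near the most probable one grows quickly with <math>A</math>.

```python
from math import comb

# Two equally weighted states, so w(a) = C(A, a_1) and the peak is at a_1/A = 1/2.
for A in (4, 40, 400, 4000):
    weights = [comb(A, a1) for a1 in range(A + 1)]
    total = sum(weights)
    # Weight carried by distributions within 5% of the most probable fraction.
    near_peak = sum(wt for a1, wt in enumerate(weights) if abs(a1 / A - 0.5) <= 0.05)
    print(f"A = {A:5d}: fraction of permutations within 5% of the peak = {near_peak / total:.3f}")
```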
Consider the probability.
\[ P_{\nu} = \frac{1}{A} \frac{ \sum_{\overline {a}} w (\overline{a}) \, a_{\nu} (\overline{a} ) }{ \sum_{\overline {a}} w (\overline{a}) } \]
\[ P_{\nu} \approx \frac{ \frac{1}{A} w ( \overline{a}^*) \, a_{\nu}^*}{w ( \overline{a}^*) } \]
\[ P_{\nu} \approx \frac{a_{\nu}^*}{A} \]
(Equation 1)
Look at the distribution that maximizes <math>w(\overline{a})</math>, the permutation number. To get <math>\overline{a}^*</math>, maximize <math>w(\overline{a})</math> subject to the constraints below.
\[ \sum_{\nu} a_{\nu} - A = 0 \]
\[ \sum_{\nu} a_{\nu}E_{\nu} - \epsilon = 0 \]
Use Lagrange multipliers, and maximize <math>\ln w ( \overline{a} )</math> in order to be able to use Stirling's approximation.
\[ \frac{\partial}{\partial a_{\nu}} \left ( \ln w ( \overline{a}) - \alpha \sum_k a_k - \beta \sum_k a_k E_k \right ) = 0 \]
\[ w ( \overline{a} ) = \frac{A!}{\prod_k a_k!} \]
\[ \ln w ( \overline{a}) = \ln A! - \ln \prod_k a_k! = \ln A! - \sum_k \ln a_k! \]
Use Stirling's approximation as <math>A</math> and the occupation numbers <math>a_k</math> go to infinity.
\[ \sum_k \ln a_k! \simeq \sum_k \left( a_k \ln a_k - a_k \right ) = \sum_k a_k \ln a_k - \sum_k a_k = \sum_k a_k \ln a_k - A \]
Substituting into the maximization condition:
\[ \frac{\partial}{\partial a_{\nu}} \left ( \ln A! - \sum_k a_k \ln a_k + A - \alpha \sum_k a_k - \beta \sum_k a_k E_k \right ) = 0 \]
Take the derivative term by term: <math>\ln A!</math> and <math>A</math> are constants, <math>\partial \left( \sum_k a_k \ln a_k \right)/\partial a_{\nu} = \ln a_{\nu} + 1</math>, <math>\partial \left( \alpha \sum_k a_k \right)/\partial a_{\nu} = \alpha</math>, and <math>\partial \left( \beta \sum_k a_k E_k \right)/\partial a_{\nu} = \beta E_{\nu}</math>. The condition becomes
\[ -\ln a_{\nu} - 1 - \alpha - \beta E_{\nu} = 0 \]
The term <math>a_{\nu}^*</math> is the occupation number that maximizes the expression; it has the form <math>e^{-\alpha'} \cdot e^{-\beta E_{\nu}}</math>, where <math>\alpha' = \alpha + 1</math>. Use the constraints to determine the Lagrange multipliers and then the probability.
\[ a_{\nu}^* = e^{-\alpha '} \cdot e^{-\beta E_{\nu}} \]
\[ \sum_{\nu} a_{\nu} = A \]
\[ \sum_{\nu} e^{-\alpha '} \cdot e^{-\beta E_{\nu}} = A \]
\[ e^{\alpha '} = \frac{1}{A} \sum_{\nu} e^{-\beta E_{\nu}} \]
The probability of being in a certain state <math>\nu</math> can now be calculated, though it is still in terms of the second Lagrange multiplier, <math>\beta</math>. Plugging back into Equation 1:
\[ P_{\nu} = \frac{a_{\nu}^*}{A} = \frac{A}{\sum_{\nu} e^{-\beta E_{\nu}}} \cdot \frac{e^{-\beta E_{\nu}}}{A} = \frac{e^{-\beta E_{\nu}}}{\sum_{\nu} e^{-\beta E_{\nu}}} \]
Partition Function
The denominator is the partition function,
\[ Q = \sum_{\nu}e^{-\beta E_{\nu}} \]
It tells us how many states are accessible by the system. Determine <math>Q</math>, a measure of the thermally accessible states. Look at how the partition function connects to macroscopic thermodynamic variables. Find <math>Q</math> and find the average energy <math>\overline{E}</math>:
\[ \overline{E} = \sum_{\nu} P_{\nu} E_{\nu} = \frac{\sum_{\nu} E_{\nu} e^{-\beta E_{\nu}}}{Q} \]
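A minimal numerical sketch of these formulas; the energy levels and the value of <math>\beta</math> below are made-up numbers, chosen only to show the mechanics.

```python
import math

# Hypothetical energy levels (arbitrary units): a ground state, a doubly
# degenerate level listed twice, and one higher level.
energies = [0.0, 1.0, 1.0, 2.5]
beta = 1.0   # inverse temperature, arbitrary

# Partition function Q = sum_v exp(-beta E_v)
Q = sum(math.exp(-beta * E) for E in energies)

# Boltzmann probabilities and the average energy <E> = sum_v P_v E_v
P = [math.exp(-beta * E) / Q for E in energies]
E_avg = sum(p * E for p, E in zip(P, energies))

print(f"Q   = {Q:.4f}")
print("P_v =", [round(p, 4) for p in P])
print(f"<E> = {E_avg:.4f}")
```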
Consider the average pressure. The pressure of one microstate, <math>p_{\nu}</math>, is:
\[ p_{\nu} = -\frac{\partial E_{\nu}}{\partial V} \]
\[ \overline{p} = \sum_{\nu} P_{\nu} \, p_{\nu} \]
\[ \overline{p} = \frac{ -\sum_{\nu} \left ( \frac{\partial E_{\nu}}{\partial V} \right ) e^{-\beta E_{\nu}}}{Q} \]
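A sketch of the pressure average for a hypothetical one-dimensional particle-in-a-box spectrum, with the box length standing in for the volume and all physical constants set to 1; this toy spectrum is an illustration, not the lecture's system.

```python
import math

BETA = 1.0       # inverse temperature (arbitrary)
N_LEVELS = 200   # truncation of the spectrum; plenty for these parameters

def energy(n, L):
    """1D particle-in-a-box level with hbar^2 pi^2 / (2m) set to 1: E_n = n^2 / L^2."""
    return n * n / (L * L)

def average_pressure(L):
    """<p> = sum_n p_n exp(-beta E_n) / Q with p_n = -dE_n/dL = 2 n^2 / L^3."""
    boltzmann = [math.exp(-BETA * energy(n, L)) for n in range(1, N_LEVELS + 1)]
    Q = sum(boltzmann)
    p_levels = [2 * n * n / L**3 for n in range(1, N_LEVELS + 1)]
    return sum(p * b for p, b in zip(p_levels, boltzmann)) / Q

print(f"<p> at L = 3.0: {average_pressure(3.0):.5f}")
```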
Summary
In the case of a canonical ensemble, the energy of the entire ensemble is fixed. Each state of the whole ensemble is equally probable, and there is degeneracy. The probability of a distribution is a function of how many ways there are to get that distribution; the distribution with the most permutations is the most probable, and the graph can become very peaked. Once the form of the distribution is known, do a maximization of <math>w(\overline{a})</math>. Use Lagrange multipliers and the two constraints. The term <math>\overline{a}^*</math> is the distribution that maximizes the expression; it is what is most often found. Go back to the probability, get an expression, and give part of it a name: the term <math>Q</math> is a measure of how many states are thermally accessible.
Consider every possible distribution <math>\overline{a} = \{ a_i \}</math> (consistent with the boundary conditions)
- For each distribution, consider every possible permutation
- The number of ways to obtain <math>\overline{a}</math> is
\[ w (\overline{a}) = \frac{ A! }{ a_1! \, a_2! \, a_3! \cdots a_v! } = \frac{ A! }{ \prod_v a_v!} \]
where <math>a_v</math> is the number of systems in state <math>v</math>.
Probability
\[ P_v = \frac{\overline{a_v}}{A} = \frac{1}{A}\frac{ \sum_{\overline a} w (\overline{a}) \, a_v (\overline{a} ) }{ \sum_{\overline a} w (\overline{a}) } \]
Averaging <math>a_v</math> over all possible distributions.
- <math>w(\overline{a})</math> is very peaked around a specific distribution
- Increase <math>A</math>, and <math>w(\overline{a})</math> becomes more peaked
To get <math>\overline{a}^*</math>, maximize <math>w (\overline{a})</math> subject to the constraints.
Find the partition function, <math>Q</math>.