Entropy is a feature of probability distributions; it can be taken as a quantification of uncertainty.

Standard quantum mechanics takes its fundamental object to be the wave function – an amplitude distribution. And from an amplitude distribution Ψ you can obtain a probability distribution Ψ^{*}Ψ.

So it is very natural to think about the entropy of a given quantum state. For some reason, it looks like this concept of wave function entropy is not used much in physics. The quantum-mechanical version of entropy that is typically referred to is the von Neumann entropy, which involves uncertainty over which quantum state a system is in (rather than uncertainty intrinsic to a single quantum state).

I’ve been looking into some of the implications of the concept of wave function entropy, and found a few interesting things.

Firstly, let’s just go over what precisely wave function entropy is.

Quantum mechanics is primarily concerned with calculating the wave function Ψ(x), which distributes complex amplitudes over configuration space. The physical meaning of these amplitudes comes from taking their absolute square Ψ^{*}Ψ, which is a probability distribution.

Thus, the entropy of the wave function is given by:

S = – ∫ Ψ^{*}Ψ ln(Ψ^{*}Ψ) dx

As an example, I’ll write out some of the wave functions for the basic hydrogen atom:

(Ψ^{*}Ψ)_{1s} = e^{-2r} / π

(Ψ^{*}Ψ)_{2s} = (2 – r)^{2} e^{-r} / 32π

(Ψ^{*}Ψ)_{2p} = r^{2} e^{-r} cos^{2}(θ) / 32π

(Ψ^{*}Ψ)_{3s} = (2r^{2} – 18r + 27)^{2} e^{-2r/3} / 19683π

With these wave functions in hand, we can go ahead and calculate the entropies! Some of the integrals are analytically intractable, so evaluating them numerically, we get:

S_{1s} ≈ 70

S_{2s} ≈ 470

S_{2p} ≈ 326

S_{3s} ≈ 1320

The increasing values for (1s, 2s, 3s) make sense – higher energy wave functions are more dispersed, meaning that there is greater uncertainty in the electron’s spatial distribution.
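The 1s case can also be done by hand: with r measured in Bohr radii, S_{1s} = 2⟨r⟩ + ln π = 3 + ln π. (Absolute differential entropies shift by an additive constant when you change the length unit, so the magnitudes depend on the convention used.) A minimal numerical sketch, working in units where the Bohr radius is 1:

```python
import numpy as np
from scipy.integrate import quad

# Probability density of the hydrogen 1s state (r in Bohr radii).
def p_1s(r):
    return np.exp(-2.0 * r) / np.pi

# Differential entropy S = -∫ p ln p d³x, reduced to a radial integral:
# the 1s density is spherically symmetric, so d³x → 4π r² dr.
def integrand(r):
    p = p_1s(r)
    return -4.0 * np.pi * r**2 * p * np.log(p)

S_1s, _ = quad(integrand, 0.0, 50.0)  # tail beyond r = 50 is negligible
print(S_1s)  # analytic value: 3 + ln(π) ≈ 4.1447
```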

Let’s go into something a bit more theoretically interesting.

We’ll be interested in a generalization of entropy – relative entropy. Rather than pure uncertainty, this quantifies the change in uncertainty from a prior probability distribution ρ to our new distribution Ψ^{*}Ψ (up to sign, it is the Kullback–Leibler divergence of Ψ^{*}Ψ from ρ). This will be the quantity we’ll denote S from now on.

S = – ∫ Ψ^{*}Ψ ln(Ψ^{*}Ψ/ρ) dx

Now, suppose we’re interested in calculating the wave functions Ψ that are local maxima of entropy. A necessary condition is that Ψ be a critical point, i.e. that δS = 0. Of course, we also want to ensure that a few basic constraints are satisfied. Namely,

∫ Ψ^{*}Ψ dx = 1

∫ Ψ^{*}HΨ dx = E

These constraints are chosen by analogy with the constraints in ordinary statistical mechanics – normalization and average energy. H is the Hamiltonian operator, which corresponds to the energy observable.

We can find the critical points of entropy that satisfy the constraint by using the method of Lagrange multipliers. Our two Lagrange multipliers will be α (for normalization) and β (for energy). This gives us the following equation for Ψ:

Ψ ln(Ψ^{*}Ψ/ρ) + (α + 1)Ψ + βHΨ = 0
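To spell out the variational step: treating Ψ and Ψ^{*} as independent, and varying the constrained functional S – α(∫ Ψ^{*}Ψ dx – 1) – β(∫ Ψ^{*}HΨ dx – E) with respect to Ψ^{*}, gives

δS/δΨ^{*} = – Ψ ln(Ψ^{*}Ψ/ρ) – Ψ

δ/δΨ^{*} ∫ Ψ^{*}Ψ dx = Ψ,  δ/δΨ^{*} ∫ Ψ^{*}HΨ dx = HΨ

and setting the total variation to zero reproduces the equation above.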

We can rewrite this as an operator equation, which gives us

ln(Ψ^{*}Ψ/ρ) + (α + 1) + βH = 0

Ψ^{*}Ψ = ρ/Z e^{-βH}

Here we’ve renamed our constants so that Z = e^{α+1} is a normalization constant.
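This solution has a direct discrete analogue, where H is diagonal with energies E_i and the prior is a distribution ρ_i. A sketch using scipy.optimize (the energies, prior, and β = 1 below are illustrative, not taken from anything above): numerically maximizing –Σ p_i ln(p_i/ρ_i) under the normalization and mean-energy constraints should recover p_i = ρ_i e^{-βE_i}/Z.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Discrete analogue: n states with energies E_i and a prior ρ_i.
n = 6
E = np.linspace(0.0, 2.5, n)           # "eigenvalues" of a diagonal H
rho = rng.random(n)
rho /= rho.sum()                        # prior distribution

# Closed-form answer at inverse temperature β: p_i = ρ_i e^{-βE_i} / Z.
beta = 1.0
gibbs = rho * np.exp(-beta * E)
gibbs /= gibbs.sum()
E0 = gibbs @ E                          # mean energy this β implies

def neg_rel_entropy(p):                 # -S, to be minimized
    return np.sum(p * np.log(p / rho))

cons = ({'type': 'eq', 'fun': lambda p: p.sum() - 1.0},   # normalization
        {'type': 'eq', 'fun': lambda p: p @ E - E0})      # mean energy

res = minimize(neg_rel_entropy, np.full(n, 1.0 / n),
               method='SLSQP', bounds=[(1e-9, 1.0)] * n,
               constraints=cons, options={'ftol': 1e-12, 'maxiter': 500})

print(np.max(np.abs(res.x - gibbs)))    # deviation from the closed form
```

Because the objective is convex and the constraints are linear, the constrained optimum is unique, so the numerical maximizer and the closed-form Gibbs-weighted prior should coincide.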

So we’ve solved the wave function equation… but what does this tell us? If you’re familiar with some basic quantum mechanics, our expression should look somewhat familiar to you. Let’s backtrack a few steps to see where this familiarity leads us.

Ψ ln(Ψ^{*}Ψ/ρ) + (α + 1)Ψ + βHΨ = 0

HΨ + 1/β ln(Ψ^{*}Ψ/ρ) Ψ = – (α + 1)/β Ψ

Let’s rename – (α + 1)/β to a new constant λ. And we’ll take a hint from statistical mechanics and call 1/β the *temperature* T of the state. Now our equation looks like

HΨ + T ln(Ψ^{*}Ψ/ρ) Ψ = λΨ

This equation is *almost* the Schrödinger equation. In particular, the time-independent Schrödinger equation pops out as the zero-temperature limit of this equation:

As T → 0,

our equation becomes…

HΨ = λΨ

The obvious interpretation of the constant λ in the zero temperature limit is E, the energy of the state.

What about in the infinite-temperature limit?

As T → ∞,

our equation becomes…

Ψ^{*}Ψ = ρ

Why is this? Because the term T ln(Ψ^{*}Ψ/ρ) can only stay finite as T → ∞ if ln(Ψ^{*}Ψ/ρ) → 0, or in other words Ψ^{*}Ψ/ρ → 1.

And what this *means* is that in the infinite temperature limit, the critical entropy wave function is just that which gives the prior distribution.
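Both limits are easy to check in the same discrete analogue (a diagonal H with energies E_i and a prior ρ_i – the numbers below are illustrative), where the solution is p_i ∝ ρ_i e^{-E_i/T}:

```python
import numpy as np

E = np.array([0.3, 1.0, 1.7, 2.4])     # energies, unique minimum at index 0
rho = np.array([0.1, 0.4, 0.3, 0.2])   # prior distribution

def p_of_T(T):
    w = rho * np.exp(-E / T)            # discrete analogue of (ρ/Z) e^{-H/T}
    return w / w.sum()

print(p_of_T(1e-3))  # T → 0: all weight collapses onto the lowest-energy state
print(p_of_T(1e6))   # T → ∞: the distribution returns to the prior ρ
```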

We can interpret this result as a generalization of the Schrödinger equation. Rather than a linear equation, we now have an additional logarithmic nonlinearity. I’d be interested to see how the general solutions to this equation differ from those of the standard equation, but that’s for another post.

HΨ + T ln(Ψ^{*}Ψ/ρ) Ψ = λΨ