
Von Neumann Entropy

Measure of the quantum information content in a state.



Core idea

Overview

The Von Neumann entropy generalizes the classical Shannon entropy to quantum mechanical systems by using the density operator to represent state uncertainty. It measures the amount of quantum information or mixedness present in a state, where a value of zero signifies a pure state and higher values indicate a mixture.

When to use: This formula is applied when evaluating the purity of a quantum state or calculating the entanglement between subsystems in a bipartite system. It is also essential in quantum thermodynamics and channel capacity calculations when dealing with mixed states.
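The entanglement use case can be illustrated concretely: for a pure bipartite state, the entropy of either reduced state measures the entanglement between the subsystems. The sketch below (assuming NumPy; the helper name `von_neumann_entropy` is ours, not a library function) computes the entropy of a Bell state and of its single-qubit reduced state obtained by partial trace:

```python
import numpy as np

def von_neumann_entropy(rho, base=2):
    """Entropy from the eigenvalues of a density matrix, with 0*log(0) := 0."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]                    # drop numerically zero eigenvalues
    return float(-np.sum(lam * np.log(lam)) / np.log(base))

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): a pure two-qubit state
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi.conj())

# Reduced state of qubit A: partial trace over qubit B
rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# The global pure state has entropy ~0; the reduced state is maximally
# mixed, so its entropy is 1 bit (maximal entanglement for a qubit pair).
```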

Why it matters: It provides a rigorous framework for understanding information loss in decoherence and sets the fundamental limits for quantum data compression. Its behavior under various transformations helps define the laws of quantum information processing and the second law of thermodynamics in quantum systems.

Symbols

Variables

S = Entropy, P_1 = Prob (State 1), P_2 = Prob (State 2)

Walkthrough

Derivation

Formula: Von Neumann Entropy

S(ρ) = −Tr(ρ log₂ ρ) = −Σᵢ λᵢ log₂(λᵢ)

Measures the quantum information content (mixedness) of a quantum state, analogous to Shannon entropy for classical systems.

  • ρ is the density matrix of the quantum state.
  • Eigenvalues λᵢ of ρ satisfy Σᵢ λᵢ = 1.

Step 1. Von Neumann Entropy:

Compute the eigenvalues λᵢ of the density matrix. The entropy is the sum of −λᵢ log₂(λᵢ) over all eigenvalues, using the convention 0 · log₂(0) = 0.

Step 2. Interpretation:

A pure state has exactly one nonzero eigenvalue (equal to 1), so its entropy is zero. A maximally mixed d-dimensional state has eigenvalues 1/d and entropy log₂(d) bits.
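The two steps above can be sketched numerically (assuming NumPy; `entropy_bits` is an illustrative helper, not a library function):

```python
import numpy as np

def entropy_bits(rho):
    """Step 1: S(rho) = -sum_i lambda_i * log2(lambda_i) over the eigenvalues,
    with the convention 0 * log2(0) = 0."""
    lam = np.linalg.eigvalsh(rho)   # eigenvalues of the Hermitian density matrix
    lam = lam[lam > 1e-12]          # drop (numerically) zero eigenvalues
    return float(-np.sum(lam * np.log2(lam)))

# Step 2: a pure state has zero entropy; a maximally mixed qubit has 1 bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|
mixed = np.eye(2) / 2                       # I/2, maximally mixed qubit
```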

Result

Source: University Quantum Computing — Quantum Information Theory

Free formulas

Rearrangements

Solve for: Von Neumann Entropy (S)

Start from the Von Neumann Entropy formula for a quantum state and transform it into the Shannon entropy formula for a classical probability distribution, including a change of logarithm base.
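The target of this rearrangement can be checked numerically: for a density matrix that is diagonal in some basis, the Von Neumann entropy is exactly the Shannon entropy of its eigenvalues, and the change of logarithm base is a division by ln 2. A minimal sketch (assuming NumPy):

```python
import numpy as np

p = np.array([0.7, 0.2, 0.1])            # eigenvalues of a diagonal density matrix

shannon_bits = -np.sum(p * np.log2(p))   # classical Shannon entropy, in bits
vn_nats = -np.sum(p * np.log(p))         # same state via the natural log, in nats

# Change of base: S_bits = S_nats / ln 2
assert abs(shannon_bits - vn_nats / np.log(2)) < 1e-12
```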

Difficulty: 2/5


Visual intuition

Graph


The graph depicts the Von Neumann entropy $S(\lambda) = -\lambda \ln(\lambda)$ for a single eigenvalue $\lambda$ in the range $[0, 1]$, forming a concave 'hump' shape that begins and ends at zero. The curve features a global maximum at $\lambda = 1/e \approx 0.368$, reflecting the state of maximum uncertainty or mixedness. This shape illustrates that entropy is minimal for pure states (where $\lambda$ is 0 or 1) and maximal for states with intermediate, balanced probability distributions.
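The maximum described above follows from setting the derivative d/dλ of −λ ln λ, namely −ln λ − 1, to zero, which gives λ = 1/e. A quick numerical check (assuming NumPy):

```python
import numpy as np

lam = np.linspace(1e-6, 1.0, 200_001)   # fine grid over (0, 1]
s = -lam * np.log(lam)                  # single-eigenvalue entropy term S(lam)

lam_star = lam[np.argmax(s)]            # location of the maximum on the grid
assert abs(lam_star - 1 / np.e) < 1e-3  # close to 1/e ~ 0.368
```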

Graph type: concave (single hump)

Why it behaves this way

Intuition

Envision a quantum state as a distribution of probabilities over a set of orthogonal pure states; the Von Neumann entropy quantifies the 'spread' or 'fuzziness' of this distribution, with a perfectly localized (pure) state having zero entropy and a uniformly spread (maximally mixed) state having the largest possible entropy.

  • S(ρ): A measure of the uncertainty or mixedness of a quantum state ρ. It quantifies how 'pure' or 'mixed' the state is; a pure state (fully known) has zero entropy, while a mixed state (a probabilistic combination of pure states) has positive entropy, indicating greater uncertainty.
  • ρ: The density operator (or density matrix), which completely describes the statistical state of a quantum system, encompassing both pure and mixed states. It is a quantum generalization of a classical probability distribution, providing the probabilities of different measurement outcomes even when the system is not in a single pure state.
  • log ρ: The operator logarithm of the density operator. When ρ is diagonalized with eigenvalues λᵢ, log ρ has eigenvalues log λᵢ. As in Shannon entropy, −log λᵢ represents the 'surprise' or information content associated with finding the system in a particular eigenstate.
  • Tr: The trace operation, which sums the diagonal elements of an operator. For a function f(ρ), Tr(f(ρ)) = Σᵢ f(λᵢ), where λᵢ are the eigenvalues of ρ. It sums the contributions over all outcomes, effectively computing the expectation value of −log ρ with respect to ρ.
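The relationship Tr(f(ρ)) = Σᵢ f(λᵢ) can be checked directly by building the operator logarithm from an eigendecomposition. A sketch assuming NumPy (the example density matrix is ours):

```python
import numpy as np

# An example mixed single-qubit density matrix (Hermitian, positive, trace 1)
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])

lam, U = np.linalg.eigh(rho)                      # rho = U diag(lam) U^dagger
log_rho = U @ np.diag(np.log(lam)) @ U.conj().T   # operator logarithm of rho

# -Tr(rho log rho) computed two ways: as a matrix trace and as a sum over eigenvalues
matrix_form = -np.trace(rho @ log_rho)
eigen_form = -np.sum(lam * np.log(lam))
assert abs(matrix_form - eigen_form) < 1e-12
```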

Signs and relationships

  • Sign: The eigenvalues λᵢ of the density operator are probabilities (0 ≤ λᵢ ≤ 1). Consequently, log₂(λᵢ) ≤ 0, making each term −λᵢ log₂(λᵢ) ≥ 0. Thus, S(ρ) ≥ 0.

Free study cues

Insight

Canonical usage

The Von Neumann entropy is a dimensionless quantity, typically expressed in 'nats' when using the natural logarithm or 'bits' when using the base-2 logarithm, to quantify information content.

Common confusion

Students often confuse the base of the logarithm (natural log vs. base-2 log), which leads to different numerical values for entropy. Another common mistake is to assign physical units (like J/K) to the Von Neumann entropy; those units belong to thermodynamic entropy, which carries an additional factor of the Boltzmann constant k_B.

Dimension note

The Von Neumann entropy is inherently dimensionless, representing a measure of information, uncertainty, or mixedness in a quantum state. Its 'units' (nats or bits) simply record the base of the logarithm used and carry no physical dimension.

Unit systems

nats or bits · The unit depends on the base of the logarithm used. If `\log` denotes the natural logarithm (`\ln`), the unit is 'nats'. If `\log` denotes the base-2 logarithm (`\log_2`), the unit is 'bits'.

Ballpark figures

  • Quantity: a single qubit has entropy between 0 bits (pure state) and 1 bit (maximally mixed); an n-qubit maximally mixed state has entropy n bits.

One free problem

Practice Problem

Calculate the Von Neumann entropy S for a maximally mixed qubit whose density-matrix eigenvalues are P₁ = 0.5 and P₂ = 0.5.

Prob (State 1) = 0.5
Prob (State 2) = 0.5

Solve for: S

Hint: For a mixed state, use the formula S = −(P₁ × log₂(P₁) + P₂ × log₂(P₂)).
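Plugging the hint's numbers in (a quick check assuming NumPy):

```python
import numpy as np

p1, p2 = 0.5, 0.5
S = -(p1 * np.log2(p1) + p2 * np.log2(p2))
print(S)   # → 1.0, the entropy in bits of the maximally mixed qubit
```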


Study smarter

Tips

  • Always use the eigenvalues of the density matrix for calculation.
  • Recall that for pure states, the entropy is always zero.
  • Use log base 2 to express results in bits.

Common questions

Frequently Asked Questions

What does the Von Neumann entropy measure?
Measures the quantum information content (mixedness) of a quantum state, analogous to Shannon entropy for classical systems.

When is the formula used?
This formula is applied when evaluating the purity of a quantum state or calculating the entanglement between subsystems in a bipartite system. It is also essential in quantum thermodynamics and channel capacity calculations when dealing with mixed states.

Why does it matter?
It provides a rigorous framework for understanding information loss in decoherence and sets the fundamental limits for quantum data compression. Its behavior under various transformations helps define the laws of quantum information processing and the second law of thermodynamics in quantum systems.

Any quick study tips?
Always use the eigenvalues of the density matrix for calculation. Recall that for pure states, the entropy is always zero. Use log base 2 to express results in bits.

References

Sources

  1. Nielsen, Michael A., and Isaac L. Chuang. Quantum Computation and Quantum Information. Cambridge University Press, 2010.
  2. Sakurai, J. J. Modern Quantum Mechanics.
  3. Wehrl, Alfred. "General Properties of Entropy." Reviews of Modern Physics, 1978.
  4. Wikipedia: Von Neumann entropy
  5. University Quantum Computing — Quantum Information Theory