Von Neumann Entropy
Measure of the quantum information content in a state.
Core idea
Overview
The Von Neumann entropy generalizes the classical Shannon entropy to quantum mechanical systems by using the density operator to represent state uncertainty. It measures the amount of quantum information or mixedness present in a state, where a value of zero signifies a pure state and higher values indicate a mixture.
When to use: This formula is applied when evaluating the purity of a quantum state or calculating the entanglement between subsystems in a bipartite system. It is also essential in quantum thermodynamics and channel capacity calculations when dealing with mixed states.
Why it matters: It provides a rigorous framework for understanding information loss in decoherence and sets the fundamental limits for quantum data compression. Its behavior under various transformations helps define the laws of quantum information processing and the second law of thermodynamics in quantum systems.
Symbols
Variables
S = Von Neumann entropy, ρ = density matrix, λᵢ = eigenvalues of ρ, P_1 = Prob (State 1), P_2 = Prob (State 2)
Walkthrough
Derivation
Formula: Von Neumann Entropy
Measures the quantum information content (mixedness) of a quantum state, analogous to Shannon entropy for classical systems.
- ρ is the density matrix of the quantum state.
- Eigenvalues λᵢ of ρ satisfy Σλᵢ = 1.
Von Neumann Entropy: S(ρ) = −Tr(ρ log₂ ρ) = −Σᵢ λᵢ log₂(λᵢ)
Compute the eigenvalues λᵢ of the density matrix; the entropy is the sum of −λᵢ log₂(λᵢ) over all eigenvalues, with the convention 0 log₂ 0 = 0.
Interpretation:
A pure state has zero entropy. A maximally mixed d-dimensional state has entropy log₂(d) bits.
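The eigenvalue recipe above can be sketched in NumPy (a minimal sketch; the helper name `von_neumann_entropy` is illustrative, not a library function):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy in bits: S(rho) = -sum_i lambda_i * log2(lambda_i)."""
    eigvals = np.linalg.eigvalsh(rho)    # real eigenvalues of the Hermitian density matrix
    eigvals = eigvals[eigvals > 1e-12]   # drop (near-)zero eigenvalues: 0 * log2(0) := 0
    return float(-np.sum(eigvals * np.log2(eigvals)) + 0.0)  # +0.0 normalizes IEEE -0.0

# Pure state |0><0|: zero entropy.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
# Maximally mixed qubit I/2: log2(2) = 1 bit.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```

The eigenvalue filter also makes the function numerically safe for rank-deficient density matrices, where `log2` would otherwise return −inf.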
Result
Source: University Quantum Computing — Quantum Information Theory
Free formulas
Rearrangements
Solve for
Von Neumann Entropy
Start from the Von Neumann Entropy formula for a quantum state and transform it into the Shannon entropy formula for a classical probability distribution, including a change of logarithm base.
Difficulty: 2/5
The static page shows the finished rearrangements. The app keeps the full worked algebra walkthrough.
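In outline, the rearrangement rests on writing ρ in its eigenbasis, so that the eigenvalues λᵢ form a classical probability distribution pᵢ (a sketch; the app keeps the step-by-step algebra):

```latex
S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho)
        = -\sum_i \lambda_i \log_2 \lambda_i
        = -\sum_i p_i \log_2 p_i
        = H(p),
\qquad
\log_2 x = \frac{\ln x}{\ln 2}
\;\Rightarrow\;
S_{\text{bits}} = \frac{S_{\text{nats}}}{\ln 2}.
```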
Visual intuition
Graph
The graph depicts the Von Neumann entropy $S(\lambda) = -\lambda \ln(\lambda)$ for a single eigenvalue $\lambda$ in the range $[0, 1]$, forming a concave 'hump' shape that begins and ends at zero. The curve features a global maximum at $\lambda = 1/e \approx 0.368$, reflecting the state of maximum uncertainty or mixedness. This shape illustrates that entropy is minimal for pure states (where $\lambda$ is 0 or 1) and maximal for states with intermediate, balanced probability distributions.
Graph type: concave hump
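The hump shape described above can be checked numerically (a sketch of the single-eigenvalue term $S(\lambda) = -\lambda \ln \lambda$; setting its derivative $-\ln\lambda - 1$ to zero locates the maximum at $\lambda = 1/e$):

```python
import math

def term(lam: float) -> float:
    """Single-eigenvalue contribution -lambda * ln(lambda), with term(0) := 0."""
    if lam == 0.0:
        return 0.0
    return -lam * math.log(lam) + 0.0  # +0.0 normalizes IEEE -0.0 at lam = 1

# Endpoints (pure-state limits) are zero.
print(term(0.0), term(1.0))  # 0.0 0.0

# The peak sits at lambda = 1/e, where -ln(lambda) - 1 = 0.
grid = [i / 10000 for i in range(1, 10000)]
peak = max(grid, key=term)
print(peak)  # close to 1/e ~ 0.3679
```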
Why it behaves this way
Intuition
Envision a quantum state as a distribution of probabilities over a set of orthogonal pure states; the Von Neumann entropy quantifies the 'spread' or 'fuzziness' of this distribution, with a perfectly localized distribution (a pure state) giving zero entropy and an evenly spread one (a maximally mixed state) giving the maximum, log₂(d) bits.
Signs and relationships
- Minus sign: the eigenvalues of the density operator are probabilities (0 ≤ λᵢ ≤ 1). Consequently log₂(λᵢ) ≤ 0, making each term −λᵢ log₂(λᵢ) ≥ 0. Thus S(ρ) ≥ 0.
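The sign argument can be verified directly: for probabilities $0 < \lambda \le 1$ we have $\log_2(\lambda) \le 0$, so every term in the sum is nonnegative (a quick check):

```python
import math

# Each term -p * log2(p) is >= 0 for p in (0, 1], since log2(p) <= 0 there.
for p in (0.01, 0.1, 0.25, 0.5, 0.9, 1.0):
    assert -p * math.log2(p) >= 0.0
print("all terms nonnegative")
```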
Free study cues
Insight
Canonical usage
The Von Neumann entropy is a dimensionless quantity, typically expressed in 'nats' when using the natural logarithm or 'bits' when using the base-2 logarithm, to quantify information content.
Common confusion
Students often confuse the base of the logarithm (natural log vs. base-2 log), which leads to different numerical values for entropy. Another common mistake is to assign physical units (like J/K), which belong to thermodynamic entropy; the Von Neumann entropy itself is dimensionless.
Dimension note
The Von Neumann entropy is inherently dimensionless, representing a measure of information, uncertainty, or mixedness in a quantum state. Its 'units' (nats or bits) simply record the choice of logarithm base (natural or base 2), not a physical dimension.
One free problem
Practice Problem
Calculate the Von Neumann entropy S for a maximally mixed qubit whose density-matrix eigenvalues are P_1 = 0.5 and P_2 = 0.5.
Solve for: S
Hint: For a mixed state, use the formula S = −(P_1 × log₂(P_1) + P_2 × log₂(P_2)).
The full worked solution stays in the interactive walkthrough.
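As a quick numeric check of the hint formula (the step-by-step reasoning stays in the walkthrough):

```python
import math

# Maximally mixed qubit: both eigenvalues equal 1/2.
p1, p2 = 0.5, 0.5
S = -(p1 * math.log2(p1) + p2 * math.log2(p2))
print(S)  # 1.0 (one full bit of mixedness)
```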
Study smarter
Tips
- Always use the eigenvalues of the density matrix for calculation.
- Recall that for pure states, the entropy is always zero.
- Use log base 2 to measure results in bits.
Common questions
Frequently Asked Questions
What does the Von Neumann entropy measure?
It measures the quantum information content (mixedness) of a quantum state, analogous to Shannon entropy for classical systems.
When is the formula used?
It is applied when evaluating the purity of a quantum state or calculating the entanglement between subsystems in a bipartite system. It is also essential in quantum thermodynamics and channel-capacity calculations when dealing with mixed states.
Why does it matter?
It provides a rigorous framework for understanding information loss in decoherence and sets the fundamental limits for quantum data compression. Its behavior under various transformations helps define the laws of quantum information processing and the second law of thermodynamics in quantum systems.
How should it be calculated?
Always use the eigenvalues of the density matrix, recall that the entropy of a pure state is zero, and use log base 2 to measure results in bits.
References
Sources
- Nielsen, Michael A., and Isaac L. Chuang. Quantum Computation and Quantum Information. Cambridge University Press, 2010.
- Sakurai, J. J. Modern Quantum Mechanics.
- Wehrl, Alfred. "General Properties of Entropy." Reviews of Modern Physics.
- Wikipedia: Von Neumann entropy
- IUPAC Gold Book
- University Quantum Computing — Quantum Information Theory