The thermal interpretation of quantum physics

Arnold Neumaier (Vienna)




The official description of the thermal interpretation of quantum physics can be found in Section 9.2 of my recent book


This book introduces mathematicians, physicists, and philosophers to a new, coherent approach to theory and interpretation of quantum physics, in which classical and quantum thinking live peacefully side by side and jointly fertilize the intuition. The formal, mathematical core of quantum physics is cleanly separated from the interpretation issues.

The book demonstrates that the universe can be rationally and objectively understood from the smallest to the largest levels of modeling. The thermal interpretation featured in this book succeeds without any change in the theory. It involves one radical step, the reinterpretation of an assumption that was virtually never questioned before - the traditional eigenvalue link between theory and observation is replaced by a q-expectation link:

Objective properties are given by q-expectations of products of quantum fields and what is computable from these. Averaging over macroscopic spacetime regions produces macroscopic quantities with negligible uncertainty, and leads to classical physics.

  • Reflects the actual practice of quantum physics.
  • Models the quantum-classical interface through coherent spaces.
  • Interprets both quantum mechanics and quantum field theory.
  • Eliminates probability and measurement from the foundations.
  • Proposes a novel solution of the measurement problem.


  • The book is based on the following series of preprints:
    Introduction to coherent spaces,
    Foundations of quantum physics I. A critique of the tradition,
    Foundations of quantum physics II. The thermal interpretation,
    Foundations of quantum physics III. Measurement,
    Foundations of quantum physics IV. More on the thermal interpretation,
    Foundations of quantum physics V. Coherent foundations.


    There is a discussion of these preprints at PhysicsForums. It contains, for example, the Thermal Interpretation Summary Redux given by DarMM, of which I give here a slightly modified, LaTeX-free variant.

    1. q-expectations and q-correlators are physical properties of quantum systems, not predicted averages. This makes these objects highly "rich" in properties: a pair correlator is not merely a statistic of the field value but a property in its own right, and so on for higher correlators.
    2. To emphasize the point above, recall that probability theory is merely a formal construct whose elements are interpreted according to the context in which they are used. Thus we are not required to view things like Tr(ρA) as fundamentally statistical statements. In the Thermal Interpretation we view them as property assignments: they are merely properties whose values stand in the same relations to each other as the formal relations of probability theory.
    3. This richness of physical properties is not compatible with the notion of a system being decomposable into its subsystems in all cases. There are many properties, such as correlators, that belong to the total system and do not arise from properties of subsystems.
    4. Since some properties are assigned to the system as a whole, which can be quite extended, they provide the nonlocal beables required by Bell's theorem. This is a combination of points above. Consider an extended two photon system. This has correlator properties like <AB> that are assigned to the whole system, no matter how extended it is and by the above these properties are not merely a property or combination of properties of any of the subsystems.
      Note that in the Thermal Interpretation, once we have these nonlocal properties, violation of the CHSH inequalities, for example, has a simple explanation. There are simply four nonlocal total-system properties
      <AB>, <BC>, <CD>, <AD>
      for which the sum
      <AB> + <BC> + <CD> - <AD>
      can be greater than 2.
    5. Properties in the Thermal Interpretation are intrinsically "fuzzy". For example, for a particle its position property <q> has an associated property Δq, in orthodox terminology called the uncertainty, that indicates the delocalised nature of the particle's position. Rather than being located at points along a world line q(t), the particle in fact constitutes a world tube centered on <q>(t) and with width Δq(t).
      All properties are blurred/fuzzy like this, not just position. Note in particular that spin, for example, is not discrete at the fundamental ontological level, since <Sz> may take a continuous range of values; thus we have a "spin tube" of width ΔSz.
      For this reason there is a more basic notion of uncertainty for quantum properties, akin to asking "What is the position of Paris?". Since Paris is an extended object, there is no precise answer to this question.
    6. Nonetheless as per standard scientific practice even this basic notion of uncertainty may be treated statistically.
    7. In measurements our devices (for reasons given in the next point) unfortunately only become correlated with a single (blurred) point within the world tube or blurred range of a property. This gives measurements on quantum systems discrete results that don't faithfully represent the "tubes". Thus we must reconstruct them from multiple observations.
    8. The origin of this discreteness is the metastability of the system-device-environment interaction. The device as a physical system can be partitioned into slow large-scale and fast small-scale modes. The space of slow large-scale modes has the form of a set of disconnected manifolds. After interaction with the system, the total state of the device initially faithfully records the uncertain quantity <A>; however, this record is metastable. Noise from the environment causes it to quickly decay into one of the slow-mode manifolds, giving a discrete outcome not fully reflective of <A>. From our perspective, ignorant of the environmental dynamics, this constitutes a stochastically driven discrete outcome of a measurement.
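The CHSH violation mentioned in point 4 can be checked numerically. The following sketch (my own illustration, not part of the summary) computes the four correlator properties as q-expectations Tr(ρ(A⊗B)) for a spin singlet, with the standard choice of detector angles 45° apart:

```python
import numpy as np

# Pauli matrices for spin measurements in the x-z plane
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin observable along the direction at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2) as a density operator rho
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def E(ta, tb):
    """Nonlocal total-system correlator property <AB> = Tr(rho (A x B))."""
    return np.real(np.trace(rho @ np.kron(spin(ta), spin(tb))))

A, B, C, D = 0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4  # four detector settings
S = E(A, B) + E(B, C) + E(C, D) - E(A, D)
print(abs(S))  # 2.828... = 2*sqrt(2), exceeding the classical bound 2
```

Since the four correlators are properties of the whole extended system, no combination of local subsystem properties needs to reproduce them, which is why the bound 2 can be exceeded.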


    Below you can find

  • some slightly older (in parts less accurate) expository material, and
  • some links to even older stuff.


    The thermal interpretation of quantum physics (including quantum mechanics, quantum field theory, quantum statistical mechanics, and application) allows a consistent and deterministic relativistic quantum description of the universe from the smallest to the largest levels of modeling, including its classical aspects, without having to introduce any change in the formal apparatus of quantum physics.


    Essential use is made of the fact that everything physicists measure is measured in a thermal environment for which statistical thermodynamics is relevant. This is reflected in the characterizing adjective ''thermal'' for the interpretation.

    For microscopic experiments, the thermal interpretation claims that particles (photons, electrons, alpha particles, etc.) are not fundamental but only convenient simplifications appropriate under special circumstances. From a more fundamental point of view one has instead beams (states of the electron field, an effective alpha particle field, etc., concentrated along a small neighborhood of a mathematical curve) with approximately known properties (charge densities and currents, spin densities, energy densities, etc.). If one places a detector into the path of a beam, one measures some of these densities - accurately if the densities are high, erratically and inaccurately when they are very low.

    When one performs on a quantum system a measurement of an operator A with a physical meaning, one obtains an approximation for its value. The thermal interpretation treats this value as an approximation not of an eigenvalue of A but of the q-expectation of A, the formal expectation value defined as the trace of the product of A with a density operator describing the state of the system. This radical deviation from the tradition has important theoretical implications.
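The difference between the eigenvalue link and the q-expectation link is already visible in the simplest example. The following sketch (my own toy illustration, with ħ = 1) evaluates Tr(ρSz) for a qubit polarized along x: the resulting uncertain value 0 is not among the eigenvalues ±1/2 at all.

```python
import numpy as np

Sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)  # spin observable, hbar = 1

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)     # qubit polarized along +x
rho = np.outer(psi, psi.conj())                        # density operator of this state

Sz_bar = np.real(np.trace(rho @ Sz))                   # uncertain value <Sz> = Tr(rho Sz)
sigma = np.sqrt(np.real(np.trace(rho @ Sz @ Sz)) - Sz_bar**2)  # its uncertainty

print(Sz_bar, sigma)  # 0.0 0.5: the value 0 lies between the eigenvalues +-1/2
```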

    The deterministic dynamics of the complete collection of q-expectations constructible from quantum fields, when restricted to the set of measurable ones, gives rise to all the stochastic features observed in practice.


    The thermal foundations are easily stated and motivated since they are essentially the foundations used everywhere for uncertainty quantification, just slightly extended to accommodate quantum effects by not requiring that observables commute. The state of our solar system, when the latter is modeled by quantum fields, completely specifies what happens in any small space-time region within the planetary system - namely through the n-point correlation functions with arguments restricted to this neighborhood. There is nothing else in quantum field theory; what we can observe is contained in the least oscillating contributions to these correlations. The spatial and temporal high frequency part is unobservable due to the limited resolution of our instruments.

    Unlike the thermal interpretation, statistical interpretations and interpretations of Copenhagen flavor apply only to the results of infinite-time few-particle scattering calculations derived from quantum field theory. They cannot apply to the finite time quantum field theory of our solar system since there is no external classical apparatus for measuring this system, and only a single realization of the solar system is experimentally accessible to us.


    The thermal interpretation of quantum mechanics says that, consistent with statistical thermodynamics, an expectation (ensemble mean) should not be interpreted as a statistical average over many realizations (except when the statistical context is immediate). Instead, it should be interpreted as an in principle approximately measurable quantity. The notion of ensemble is therefore to be understood not necessarily (and, in the case of a quantum field theory of our solar system, never) as an actual repetition by repeated preparation. It should be understood instead in the original sense used by Gibbs, who coined the notion of an ensemble as a collection of imagined copies of which only one is actually realized, giving him an intuitive excuse to use the statistical formalism to describe a single thermodynamic system such as a single piece of metal. What is conventionally called an expectation becomes in the thermal interpretation simply the uncertain value.


    According to the thermal interpretation of quantum mechanics, a description of the universe requires a mathematical framework consisting of: a Hilbert space carrying a unitary representation of the Poincaré group, to account for conservative dynamics and relativity; a representation of the standard model plus some form of gravity (not yet fully known), to describe the fundamental field content; density operators ρ encoding Heisenberg states; and the formula Ã = <A> := Tr(ρA), defining the uncertain value (generally called the expectation value) of the operator A. For its interpretation one adds the following simple rule generalizing statistical intuition:

    Uncertainty principle: A Hermitian quantity A whose uncertainty σA, the square root of <(A-Ã)²>, is much less than |Ã| has the value à within an uncertainty of σA.

    From this rule one can derive under appropriate conditions the following

    Measurement rule: Upon measuring a Hermitian operator A, the measured result will be approximately Ã, with an uncertainty at least of the order of σA, the square root of <(A-Ã)²>. If the measurement can be sufficiently often repeated (on a system with the same or a sufficiently similar state) then σA will be a lower bound on the standard deviation of the measurement results.
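The statistical content of the measurement rule is easy to simulate. In this sketch (my own illustration) repeated Sz measurements with outcomes ±1/2 and equal Born weights, as for a qubit polarized along x, have sample mean close to the uncertain value <Sz> = 0 and sample standard deviation close to σ = 1/2, so the lower bound of the rule is attained:

```python
import numpy as np

rng = np.random.default_rng(0)

# Repeated Sz measurements (outcomes +-1/2, hbar = 1) on identically prepared
# qubits polarized along x: uncertain value <Sz> = 0, uncertainty sigma = 1/2.
outcomes = rng.choice([0.5, -0.5], size=100_000, p=[0.5, 0.5])

print(outcomes.mean())  # close to the uncertain value <Sz> = 0
print(outcomes.std())   # close to sigma = 0.5; here the lower bound is attained
```

For a less symmetric state the sample standard deviation can exceed σA (e.g. when instrument noise adds to the intrinsic uncertainty), which is why the rule states only a lower bound.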

    Physicists doing quantum mechanics (even those adhering to the shut-up-and-calculate mode of working) use this rule routinely and usually without further justification. The rule applies universally. No probabilistic interpretation is needed, so it applies also to single systems. Born's famous rule turns out to be derivable under special circumstances only, namely those where the Born rule is indeed valid in practice. (Though usually invoked as universally valid, Born's rule has severe limitations. It neither applies to position measurements nor to photodetection, nor to measurement of energies, just to mention the most conspicuous misfits.)

    Actually, the above measurement rule should be considered as a definition of what it means to have a device measuring A. As such it creates the foundation of measurement theory. For a macroscopic quantum device to qualify for the description ''it measures A'', it must be either derivable from quantum mechanics or checkable by experiment that the property claimed in the above measurement rule is in fact valid. Thus there is no circularity in the foundations.


    Most descriptions in physics are either very coarse-grained or of very small objects. The detailed state can be found with a good approximation only for fairly stationary sources of very small objects, that prepare sufficiently many of these in essentially the same quantum state. In this case, one can calculate sufficiently many expectations by averaging over the results of multiple experiments on these objects, and use these to determine the state via some version of quantum state tomography. Except in very simple situations, the result is a mixed state described by a density operator. Thus in the thermal interpretation, any realistic state is fully described by a density operator, not by a state vector as in conventional interpretations.
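For a single qubit, the tomography step mentioned above is particularly transparent: the three Pauli q-expectations determine the density operator completely. A minimal sketch (my own illustration, with an arbitrarily chosen mixed state):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# An "unknown" qubit state (a mixed one, chosen arbitrarily for the demo)
rho_true = 0.7 * np.diag([1.0, 0.0]) + 0.3 * 0.5 * np.ones((2, 2))
rho_true = rho_true.astype(complex)

# "Measure" the three Pauli q-expectations (in the lab: averages over many runs)
ex, ey, ez = (np.real(np.trace(rho_true @ s)) for s in (sx, sy, sz))

# Reconstruct: any qubit density operator is rho = (I + <sx>sx + <sy>sy + <sz>sz)/2
rho_rec = 0.5 * (I2 + ex * sx + ey * sy + ez * sz)
print(np.allclose(rho_rec, rho_true))  # True
```

For larger systems the same idea applies, but the number of required expectations grows exponentially with the number of constituents, which is why full state determination is feasible only for very small objects.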

    For macroscopic systems, one must necessarily use a coarse-grained description in terms of a limited number of parameters. In the quantum field theory of macroscopic objects, the averaging is always done inside the definition of the macroscopic operator to be measured; this is sufficient to guarantee very small uncertainties of macroscopic observables. Thus one does not need an additional averaging in terms of multiple experiments on similarly prepared copies of the system. This is the deeper reason why quantum field theory can make accurate predictions for single macroscopic systems.

    Everything deduced in quantum field theory about macroscopic properties follows, and one has a completely self-consistent setting. The transition to classicality is automatic and needs no deep investigations - the classical situation is simply the limit of a huge number of particles. On the microscopic level, in contrast, uncertainties of single events are large, so that state determination must be based on the statistics of multiple events with a similar preparation. (In this case, one can derive Born's traditional rule for perfect binary measurements in pure states; see Chapter 10.5 in my online book Classical and Quantum Mechanics via Lie algebras.)

    Although only a coarse-grained description of a macroscopic system can be explicitly known, this doesn't mean that the detailed state doesn't exist. The existence of an exact state for large objects has always been a metaphysical but unquestioned assumption. Even in classical mechanics, it is impossible to know a highly accurate state of a many-particle system (not even of the solar system with sun, planets, planetoids, and comets treated as rigid bodies). But its existence is never questioned.


    In quantum optics experiments, both sources and beams are extended macroscopic objects describable by quantum field theory and statistical mechanics, and hence have (according to the thermal interpretation) associated nearly classical observables - densities, intensities, correlation functions - computable from quantum mechanics in terms of expectations.

    Both the sources and the beams have properties independent of measurement; they are objects described by quantum field theory. For example, the output of a laser (before or after parametric down conversion or any other optical processing) is a laser beam, or an arrangement of highly correlated beams. These are in a well-defined state that can be probed by experiment. If this is done, they are always found to have the properties ascribed to them by the preparation procedure. One just needs sufficient time to collect the information needed for a quantum state tomography. The complete state is measurable in this way, reproducibly. Neither the state of the laser nor that of the beam is changed by a measurement at the end of the beam. Thus these properties exist independent of any measurement - just as the moon exists even when nobody is looking at it!

    It is a historical accident that one continues to use the name particle in the many microscopic situations where it is grossly inappropriate to think of it in terms of a tiny bullet moving through space. If one restricts the use of the particle concept to situations where it is appropriate, or if one does not think of particles as ''objects'' - in both cases all mystery is gone, and the foundations become fully rational and intelligible.


    Unlike in conventional single-world interpretations of quantum mechanics, nothing in the thermal interpretation depends on the existence of measurement devices (which were not available in the very far past of the universe). Thus the thermal interpretation allows one to consider the single universe we live in as a quantum system, the smallest closed physical system containing us, hence strictly speaking the only system to which unitary quantum mechanics applies rigorously.

    There is no longer a physical reason to question the existence of the state of the whole universe, even though all its details may be unknown for ever. Measuring all observables or finding its exact state is already out of the question for a small macroscopic quantum system such as a piece of metal. Thus, as for a metal, one must be content with describing the state of the universe approximately.

    What matters for a successful physics of the universe is only that we can model (and then predict) the observables that are accessible to measurement. Since all quantities of interest in a study of the universe as a whole are macroscopic, they have a tiny uncertainty and are well-determined even by an approximate state. For example, one could compute from a proposed model of the universe the (expectation) values of the electromagnetic field at points where we can measure it, and (if the computations could be done) should get excellent agreement with the measurements.

    Since every observable of a subsystem is also an observable of the whole system, the state of the universe must be compatible with everything we have ever empirically observed in the universe! This is a very stringent test of adequacy - the state of the universe is highly constrained, since knowing this state amounts to having represented all physics accessible to us by the study of its subsystems. Cosmology studies this state in a very coarse (and partly conjectured) approximation in which even details at the level of galaxies are averaged over. Only for observables localized in the solar system do we have much more detailed knowledge.


    Earlier (pre 2019) discussions of the thermal interpretation can be found in the following topics from my theoretical physics FAQ:

    I have also written some technical documents about the thermal interpretation, see and Chapter 10 of my online book

    Themes related to the thermal interpretation can also be found in parts of the following discussions on PhysicsForums:

    In a preliminary German version, the thermal interpretation dates back to Spring 2004; see the beginnings of the thermal interpretation.

    Older, partially outdated but in some respects more detailed discussions

    Happy Reading!


    A theoretical physics FAQ

    Arnold Neumaier (Arnold.Neumaier@univie.ac.at)