Talks

Wave-Functionalism

Valia Allori

Northern Illinois University

In this paper I present a new perspective for interpreting the wave function as a non-material, non-epistemic, non-representational entity. I endorse a functional view according to which the wave function is defined by its roles in the theory. I argue that this approach shares some similarities with the nomological account of the wave function as well as with the pragmatist and epistemic approaches to quantum theory, while avoiding the major objections raised against these alternatives.

 

Two stories in quantum gravity

Erik Aurell

KTH, Stockholm

Is it necessary to quantize the gravitational field? The analogous question for the electrodynamic field was settled in a famous paper by Bohr and Rosenfeld, now almost 90 years ago. The answer is yes, as otherwise there would be a way to bypass the Heisenberg uncertainty relations for a body obeying quantum mechanics and interacting with the field.

The gravitational question has been discussed for a long time, with opposite opinions appearing in the literature until very recently. I will argue that one class of recently proposed Gedankenexperiments does not settle the issue, as a putative paradox (to be described in the talk) can be resolved by the simpler assumption that length scales below the Planck length cannot be resolved. That the Gedankenexperiment does not imply quantization of the gravitational field does not, of course, imply the opposite, i.e. that gravitation is not quantized (i). It could also be that the gravitational field is quantized anyway (ii), or that gravitation is somehow quantized but is not a quantum field theory (iii).
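For reference, the Planck length invoked above is the standard combination of fundamental constants (a textbook expression, not taken from the talk):

\[
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\,\mathrm{m}.
\]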

In the second part of the talk I will discuss an idea related to option (iii), motivated by Hawking’s paper “Information Preservation and Weather Forecasting for Black Holes” (2014).

The talk is based on joint work with Erik Rydving and Igor Pikovski [first part, arXiv:2107.07514] and Michal Eckstein and Pawel Horodecki [second part, Foundations of Physics 51 (2), 1-13 (2021)].

 

Of Weighting and Counting: Statistics and Ontology from StatMech to OQT

Massimiliano Badino

University of Verona

Planck’s 1900 theory of blackbody radiation hid more than it revealed. One half of it looked like a clever piece of Maxwellian physics, but the other half did not square with anything seen before. Planck had crucially twisted Boltzmann’s classical combinatorial approach for his purposes, and he had introduced the mysterious quantum of action. It is no exaggeration to claim that a large part of the ensuing Old Quantum Theory was concerned with the arduous task of making sense of Planck’s theory from the point of view of Boltzmann’s statistical mechanics. This process eventually paved the way for Schrödinger’s wave mechanics.

In my talk, I reconstruct the intricate interconnections between statistical mechanics and old quantum theory. Initially confined to the limited realm of the blackbody problem and the clarification of Planck’s mysterious formula, these interconnections progressively expanded to include stubborn problems of thermodynamics, the entire radiation theory, and the role of statistics in physics. Perhaps more interestingly, the quest for an interpretation of Planck’s radiation law brought to light the deep cleavage separating the ontological assumptions of classical mechanics and quantum theory.

 

Present and future precision tests of spontaneous wave function collapse models

Angelo Bassi

University of Trieste

Quantum mechanics is grounded on the superposition principle, which is the source both of its tremendous success and technological power and of the problems in understanding it. The reason why superpositions do not propagate from the microscopic to the macroscopic world is subject to debate. Spontaneous wave function collapse models have been formulated to take into account a progressive breakdown of quantum superpositions when systems are large enough; they do so by modifying the Schrödinger dynamics, and therefore they are empirically testable. Deviations are tiny and require precision measurements. I will review the most recent tests of such models, with a focus on gravity-related ones.
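As a schematic illustration of how such models modify the dynamics (a generic single-operator form, not the specific models reviewed in the talk), the wave function obeys a stochastic, nonlinear equation of the type

\[
d\psi_t = \Big[ -\tfrac{i}{\hbar} H\,dt + \sqrt{\lambda}\,\big(A - \langle A\rangle_t\big)\,dW_t - \tfrac{\lambda}{2}\,\big(A - \langle A\rangle_t\big)^{2}\,dt \Big]\,\psi_t,
\]

where $A$ is a collapse operator (in concrete models, a smeared position operator), $\langle A\rangle_t = \langle\psi_t|A|\psi_t\rangle$, $W_t$ is a Wiener process, and $\lambda$ sets the collapse rate. The extra terms are negligible for microscopic systems but rapidly suppress macroscopic superpositions, which is what makes such models empirically distinguishable from standard quantum mechanics.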

 

Mesoscale Models and Many-Body Physics

Robert Batterman

University of Pittsburgh

What is the best way to study the bulk behavior of many-body systems? A natural, common sentiment among philosophers and physicists is to take a foundational perspective. Examine the theory that characterizes the interactions among the components of such many-body systems and derive the continuum scale behaviors. This approach serves also to reduce, in effect, the continuum theories (Navier-Stokes, Navier-Cauchy) to more fundamental lower scale theories. The hope would be that in so doing we would also be able to explain the relative autonomy of those continuum theories from the lower scale more fundamental details. After all, the continuum theories do not recognize any structure whatsoever below continuum scales.

I argue that this reductionist approach is not fruitful and cannot explain the relative autonomy/universality of the continuum theories—theories that continue to be used in engineering contexts. Instead, I describe an approach that appeals to order parameters and material parameters understood to be defined at mesoscales. That is, we need to treat order parameters and material parameters as natural kinds that live at mesoscales in between the continuum and the atomic scales. This approach is natural from the perspective of condensed matter physics and materials science. It is not new and has its origins in (quantum) field theoretic approaches to many-body systems developed by Schwinger, Martin, and Kadanoff, among others. It reflects a widespread methodology that has almost completely been ignored by philosophers of science. A key aspect of arguing for the naturalness of such mesoscale parameters is provided by the Fluctuation-Dissipation theorem of statistical mechanics.
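As a reminder of the theorem invoked here (a standard statement in one common convention, not taken from the talk), the classical fluctuation-dissipation theorem ties the equilibrium fluctuations of a mesoscale variable $A$ to its dissipative linear response:

\[
S_{AA}(\omega) = \frac{2 k_B T}{\omega}\,\chi''_{AA}(\omega),
\]

where $S_{AA}(\omega)$ is the equilibrium power spectrum of $A$ and $\chi''_{AA}(\omega)$ is the imaginary (dissipative) part of its response function, linking measurable fluctuations to material response coefficients.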

 

Geometric phases old and new

Michael Berry

University of Bristol

The waves that describe systems in quantum physics can carry information about how their environment has been altered, for example by forces acting on them. This effect is the geometric phase. It occurs in the optics of polarised light, where it goes back to the 1830s; it influences wave interference; and it provides insight into the spin-statistics relation for identical quantum particles. The underlying mathematics is geometric: parallel transport, explaining how falling cats turn, and how to park a car. A central concept is the curvature, whose statistics are calculated for random Hamiltonians. Incorporating the back-reaction of the geometric phase on the dynamics of the changing environment exposes an unsolved problem: how can a system be separated from a slowly-varying environment? The concept has a tangled history.
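For orientation (a textbook formula, not part of the abstract), the geometric phase acquired by an eigenstate $|n(\mathbf{R})\rangle$ transported around a closed circuit $C$ in parameter space is

\[
\gamma_n(C) = i \oint_C \langle n(\mathbf{R}) \,|\, \nabla_{\mathbf{R}}\, n(\mathbf{R}) \rangle \cdot d\mathbf{R},
\]

and the curvature mentioned above is the curl of the Berry connection $\mathbf{A}_n(\mathbf{R}) = i\langle n(\mathbf{R})|\nabla_{\mathbf{R}} n(\mathbf{R})\rangle$, whose flux through a surface spanning $C$ gives the same phase.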

 

Contextual wave function collapse: Steps of a measurement process

Barbara Drossel

Technical University of Darmstadt

This talk presents a theoretical treatment of the measurement process that takes into account all the steps between the initial arrival of the particle at the detector and the final pointer deflection. By using methods from the fields of physics relevant to each step, it becomes clear that only the first step, the initial creation of a linear superposition, follows unitary time evolution. All the later steps involve classical elements, elements from statistical physics, and nonlinearities. I will argue that the coupling to the internal heat bath of the detector puts an end to unitary time evolution, since a heat bath cannot be described by a many-particle wave function. This is because its state is specified only to a limited number of bits (as implemented in the Boltzmann formula for entropy), and because its microstate cannot be controlled even in principle.
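To make the limited-number-of-bits point concrete (a standard back-of-the-envelope relation, not taken from the talk): a macrostate compatible with $W$ microstates carries

\[
S = k_B \ln W \quad\Longleftrightarrow\quad \log_2 W = \frac{S}{k_B \ln 2}\ \text{bits},
\]

so the thermodynamic entropy of the detector's heat bath bounds how many bits any specification of its state can contain.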

Overall, while our approach benefits a lot from the Copenhagen interpretation, from decoherence theory, and from collapse models, it goes a step further by taking into account top-down influences from the classical and thermal environments in order to identify where the 'collapse' happens.

 

The Arrow of Time

Jürg Fröhlich

ETH Zurich

A mathematical analysis of irreversible behavior in various physical systems is presented. Often irreversibility manifests itself in the form of “entropy production”. This motivates beginning the lecture with a brief review of relative entropy and with a sketch of some of the sources of irreversible behavior.
Subsequently, the Second Law of thermodynamics in the forms given to it by Clausius and Carnot is derived from quantum statistical mechanics.
In a third part, a derivation of Brownian motion of a quantum particle interacting with a quantum-mechanical thermal reservoir is sketched. This is followed by an outline of a theory of Hamiltonian Friction. To conclude, the fundamental arrow of time inherent in the evolution of states in quantum mechanics is highlighted.
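For reference (the standard definition, not taken from the lecture), the relative entropy of a state $\rho$ with respect to a state $\sigma$ is

\[
S(\rho \,\|\, \sigma) = \mathrm{tr}\big(\rho\,(\ln\rho - \ln\sigma)\big) \ \ge\ 0,
\]

which vanishes if and only if $\rho = \sigma$ and is the basic quantity in terms of which entropy production is commonly quantified.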

 

Indirect Measurements of a Harmonic Oscillator

Gian Michele Graf

ETH Zurich

The measurement of a quantum system becomes itself a quantum-mechanical process once the apparatus is internalized. That shift of perspective may result in different physical predictions about measurement outcomes for a variety of reasons. In fact, whereas the ideal measurement, as described by Born's rule, is instantaneous, the apparatus produces an outcome over time. In contrast to the often purported view that perfect measurement emerges in the long-time limit, because decoherence supposedly improves with time, it is found that the operation may be of transient character. Following an initial time interval, during which the system under observation and the apparatus remain uncorrelated, there is a “window of opportunity” during which suitable observables of the two systems are witnesses to each other. After that time window, however, the apparatus is dominated by noise, making it useless.

 

Statistical Mechanical Ensembles and Typical Behavior of

Macroscopic Systems

Joel L. Lebowitz

Rutgers University

In this talk I will focus on describing, in a quantitative way, the reason why statistical mechanics is able to predict, with great certainty, the behavior of macroscopic systems, both in equilibrium and out of it. I will relate this to the fact that this behavior is typical for systems represented by the usual Gibbs ensembles or those derived from them. These take small phase space volume to indicate small probability. I will not try to justify this.
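As a bare-bones illustration of the small-volume-means-small-probability reading (my paraphrase, not the speaker's formulation): in the microcanonical ensemble on the energy shell $\Gamma_E$, a set of microstates $A$ is assigned probability proportional to its phase-space volume,

\[
\mathbb{P}(A) = \frac{|A|}{|\Gamma_E|},
\]

so a behavior is typical when the microstates exhibiting it fill an overwhelmingly large fraction of $\Gamma_E$.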

 

Teaching and learning nonequilibrium statistical physics

Roberto Livi

University of Florence

Nonequilibrium statistical physics is a discipline of growing interest, also for interdisciplinary applications. It has recently become a curricular topic in many university courses, mainly at the master's and PhD level. A recent survey among students and teachers from different countries yielded a clear indication in favour of teaching this subject by starting from simple examples and models, which also fosters a better learning of its fundamental aspects. This inspired the way the book “Nonequilibrium Statistical Physics: a modern perspective” by R. Livi and P. Politi (Cambridge University Press, 2017) was written. An extended edition of this book is going to appear in 2023.

 

Frenesy

Christian Maes

KU Leuven

While Boltzmann’s picture of the emergence of irreversibility provides key elements for understanding relaxation to equilibrium, it also misses several key points, even in the heuristics. For example, what determines relaxation times, or what is the role of fluctuations, so important on the mesoscopic level of description? The obsession with entropy and its production tells only half the story. Energy-entropy considerations for deciding the preferred state or response fail far enough from equilibrium, and there is a good reason: the lifetime and accessibility of a condition start to matter crucially there, which we summarize under the concept of frenesy.

 

S = k ln(B(W)): Boltzmann entropy, the Second Law,

and the Architecture of Hell

Tim Maudlin

New York University

The so-called Boltzmann entropy, in its modern formulation, is defined in terms of the volume of phase space associated with different macrostates or generic states. I will argue that the proper definition of this sort—the one that makes contact with the explanatory challenges of statistical mechanical explanation—should actually be in terms of the boundaries of those generic states rather than their volumes. This change in focus brings out the essential nature of the statistical independence assumption that underlies these explanations, and sheds light on how the direction of time comes into the explanation.
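For comparison (the standard volume-based definition; the boundary-based alternative is the subject of the talk), the usual Boltzmann entropy of a microstate $X$ belonging to the generic state (macrostate) $W$ is

\[
S_B(X) = k \ln |\Gamma_W|,
\]

with $|\Gamma_W|$ the phase-space volume of $W$; the title's $S = k \ln(B(W))$ presumably replaces this volume by a measure $B(W)$ of the boundary of $W$.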

 

The cloud chamber problem in the analysis of Mott (1929)

Alessandro Teta

Sapienza, University of Rome

An alpha-particle emitted as a spherical wave in a cloud chamber produces a visible track, interpreted as the trajectory of the particle. The theoretical explanation of this fact was one of the problems discussed in the early days of Quantum Mechanics. In the seminar we recall the historical context and describe in some detail the seminal contribution by Mott (1929), which is relevant not only for the debate on the interpretation of Quantum Mechanics but also as a pioneering anticipation of decoherence theory. We also mention some modern developments of Mott’s approach.

 

Boltzmann entropy in quantum mechanics

Roderich Tumulka

Eberhard Karls University Tübingen

I begin with the definitions of Boltzmann entropy and Gibbs entropy in classical statistical mechanics. While they agree in thermal equilibrium, Boltzmann’s is the more fundamental definition of entropy away from thermal equilibrium. After reviewing Boltzmann’s explanation of the increase of Boltzmann entropy with time and the relevance of a so-called “past hypothesis” about the initial conditions of the universe for it, I turn to the question of how to transfer Boltzmann’s reasoning to the quantum case. The analog of the Gibbs entropy is the von Neumann entropy -tr rho log rho, and the analog of the Boltzmann entropy of a macro state is the log of the dimension of the subspace corresponding to this macro state. But a quantum state is in general a superposition of contributions from several such subspaces. As a consequence, it is not obvious what it would even mean to say that quantum Boltzmann entropy increases with time; I describe and discuss some proposals.
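In symbols (restating the definitions given in the abstract, in units with $k_B = 1$ and writing $\mathcal{H} = \bigoplus_\nu \mathcal{H}_\nu$ for the decomposition of Hilbert space into macro-spaces):

\[
S_{\mathrm{vN}}(\rho) = -\mathrm{tr}\,(\rho \log \rho), \qquad S_{\mathrm{qB}}(\nu) = \log \dim \mathcal{H}_\nu,
\]

and the difficulty is that a generic wave function $\psi$ has nonzero components $P_\nu \psi$ (with $P_\nu$ the projection onto $\mathcal{H}_\nu$) in several macro-spaces at once.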

 

Learning through atypical “phase transitions” in overparametrized

neural networks

Riccardo Zecchina

Bocconi University, Milan

Among the most surprising aspects of deep learning models are their high degree of overparametrization and their non-convexity. Both of these aspects are a common trait of all deep learning models and have led to unexpected results for classical statistical learning theory and non-convex optimization. Current deep neural networks (DNNs) are composed of millions (or even billions) of connection weights, and the learning process seeks to minimize a loss function that measures the number of classification errors made by the DNN over a training set. We are facing two conceptually stimulating facts: (i) highly expressive neural network models can fit the training data via simple variants of algorithms originally designed for convex optimization; (ii) even if trained with little control over their statistical complexity, these models achieve unparalleled levels of prediction accuracy, contrary to what would be expected from the uniform convergence framework of classical statistical inference. In other words, these models exhibit a form of benign overfitting that does not degrade generalization. In this talk, we focus on the computational fallout of overparameterization. As the number of parameters increases, we study the changes in the geometric structure of the different minima of the error loss function and we relate this to learning performance. From an intuitive viewpoint, one might be tempted to think that non-convex neural network models become effectively convex when the number of weights becomes sufficiently large relative to the number of data to be classified.

We will see that this is not the case even in simple non-convex over-parameterized models. More importantly, we show that as the number of weights increases, the learning algorithms begin to find solutions that almost perfectly fit the data beyond a phase transition point that does not coincide with the so-called interpolation threshold of the model. For a given set of samples drawn from a natural data distribution, the interpolation threshold is defined as the minimum number of parameters (weights) such that solutions that almost perfectly fit the samples start to exist. Contrary to what happens in convex models, we show that for non-convex neural networks there exists a gap between the interpolation threshold and the threshold at which the learning algorithms start to find solutions. This phase transition coincides with the appearance of atypical solutions that are locally extremely entropic, namely flat regions of the space of weights that are particularly dense with solutions.
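As a toy numerical illustration of the interpolation threshold (my own minimal sketch, assuming a one-hidden-layer tanh network trained by gradient descent on random labels; it is not the models analysed in the talk), one can train networks of growing width on a fixed dataset and record the smallest width at which the training error reaches (almost) zero:

# Minimal sketch: locate an (approximate) interpolation threshold by training
# one-hidden-layer tanh networks of growing width on a fixed random dataset.
# Hyperparameters are illustrative; convergence is only approximate.
import numpy as np

rng = np.random.default_rng(0)
P, d = 200, 20                        # number of samples, input dimension
X = rng.standard_normal((P, d))
y = rng.choice([-1.0, 1.0], size=P)   # random labels: a pure fitting task

def train_error(width, steps=5000, lr=0.05):
    """Full-batch gradient descent on the mean squared loss; returns training error."""
    W = rng.standard_normal((d, width)) / np.sqrt(d)   # input -> hidden weights
    a = rng.standard_normal(width) / np.sqrt(width)    # hidden -> output weights
    for _ in range(steps):
        H = np.tanh(X @ W)                 # hidden activations, shape (P, width)
        out = H @ a                        # network outputs
        g_out = 2.0 * (out - y) / P        # gradient of the loss w.r.t. the outputs
        g_a = H.T @ g_out                  # gradient w.r.t. output weights
        g_W = X.T @ (np.outer(g_out, a) * (1.0 - H**2))  # gradient w.r.t. input weights
        a -= lr * g_a
        W -= lr * g_W
    out = np.tanh(X @ W) @ a
    return np.mean(np.sign(out) != y)      # fraction of misclassified training points

for width in (1, 2, 5, 10, 20, 50, 100):
    err = train_error(width)
    print(f"width={width:4d}  #weights={width * (d + 1):5d}  train error={err:.3f}")

The width at which the printed training error first drops to (approximately) zero gives a rough estimate of the interpolation threshold for this toy setup; the algorithmic threshold discussed in the talk is instead the point at which gradient-based algorithms actually start finding such solutions, and in non-convex models it generally lies beyond the interpolation threshold.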