Abstracts for Thursday Philosophy of Physics seminars

9th March Michael Hicks (Physics, Oxford), “Explanatory (a)symmetries and Humean laws”

Abstract: Recently, Lange (2009) has argued that some physical principles are explanatorily prior to others. Lange’s main examples are symmetry principles, which he argues explain both conservation laws (through Noether’s Theorem) and features of dynamical laws (for example, the Lorentz invariance of QFT). Lange calls these principles “meta-laws” and claims that his account of laws, which is built around the counterfactual stability of groups of statements, can capture the fact that they govern or constrain first-order laws, whereas other views, principally Humean views, cannot. After reviewing the problem Lange presents, I’ll show how the explanatory asymmetry between laws that he describes follows naturally on a Humean understanding of what laws are: particularly informative summaries. The Humean should agree with Lange that symmetry principles are explanatorily prior to both conservation laws and dynamical theories like QFT; however, I’ll argue that Lange is wrong to consider these principles “meta-laws” which in some way govern first-order laws, and I’ll show that on the Humean view the explanation of these two sorts of laws from symmetry principles is importantly different.
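
For reference, the textbook form of the connection Lange appeals to: Noether’s theorem associates to every continuous symmetry of the action a conserved current and charge, schematically

\[ \partial_\mu j^\mu = 0, \qquad Q = \int \mathrm{d}^3x \, j^0 = \text{const.}, \]

so that, for example, time-translation invariance yields conservation of energy and rotational invariance yields conservation of angular momentum.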

9th February Alastair Wilson (Philosophy, Birmingham), “How multiverses might undercut the fine-tuning argument”.

Abstract: In the context of the probabilistic fine-tuning argument that moves from the fragility of cosmological parameters with respect to life to the existence of a divine designer, appealing to the existence of a multiverse has in general seemed problematically ad hoc. The situation looks rather different, though, if there is independent evidence from physics for a multiverse. I will argue that independently motivated multiverses can be undercutting defeaters for the fine-tuning argument; but whether the argument is indeed undercut still depends on open questions in fundamental physics and cosmology. I will also argue that Everettian quantum mechanics opens up new routes to undercutting the fine-tuning argument, although by itself it is insufficient to do so.

26th January Antony Eagle (Philosophy, Adelaide), “Quantum location”

Abstract: Many metaphysicians are committed to the existence of a location relation between material objects and spacetime, useful in characterising debates in the metaphysics of persistence and time, particularly in the context of trying to map ordinary objects into models of relativity theory. Relatively little attention has been paid to location in quantum mechanics, despite the fact that the existence of a position observable is one of the few things metaphysicians know about QM. I want to explore how the location relation(s) postulated by metaphysicians might be mapped onto the framework of QM, with particular reference to the idea that there might be such a thing as being indeterminately located.

19th January Emily Adlam (DAMTP, Cambridge), “Quantum mechanics and global determinism”

Abstract: We propose that the information-theoretic features of quantum mechanics are perspectival effects which arise because experiments on local variables can only uncover a certain subset of the correlations exhibited by an underlying deterministic theory. We show that the no-signalling principle, information causality, and strong subadditivity can be derived in this way; we then use our approach to propose a new resolution of the black hole information paradox.
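
Of the information-theoretic features listed, strong subadditivity is the most compact to state: for any tripartite quantum state \(\rho_{ABC}\), the von Neumann entropy \(S\) obeys

\[ S(\rho_{ABC}) + S(\rho_B) \le S(\rho_{AB}) + S(\rho_{BC}). \]

On the view proposed here, constraints of this kind would reflect the restricted access that experiments on local variables have to the underlying deterministic correlations.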
 
24 Nov 2016 David Glick (Philosophy, Oxford), “Swapping Something Real: Entanglement Swapping and Entanglement Realism”

Abstract: Experiments demonstrating entanglement swapping have been alleged to challenge realism about entanglement. Seevinck (2006) claims that entanglement “cannot be considered ontologically robust” while Healey (2012) claims that entanglement swapping “undermines the idea that ascribing an entangled state to quantum systems is a way of representing some new, non-classical, physical relation between them.” My aim in this paper is to show that realism is not threatened by the possibility of entanglement swapping, but rather, should be informed by the phenomenon. I argue—expanding the argument of Timpson and Brown (2010)—that ordinary entanglement swapping cases present no new challenges for the realist. With respect to the delayed-choice variant discussed by Healey, I claim that there are two options available to the realist: (a) deny these are cases of genuine swapping (following Egg (2013)) or (b) allow for the existence of entanglement relations between timelike separated regions. This latter option, while radical, is not incoherent and has been suggested in quite different contexts. While I stop short of claiming that the realist must take this option, doing so allows one to avoid certain costs associated with Egg’s account. I conclude by noting several important implications of entanglement swapping for how one thinks of entanglement relations more generally.

17 Nov 2016 Jim Weatherall (UC Irvine), “On Stuff: The Field Concept in Classical Physics”

Abstract: Discussions of physical ontology often come down to two basic options. Either the basic physical entities are particles, or else they are fields. I will argue that, in fact, it is not at all clear what it would mean to say that the world consists of fields. Speaking classically (i.e., non-quantum-ly), there are many different sorts of thing that go by the name “field”, each with different representational roles. Even among those that have some claim to being “fundamental” in the appropriate sense, it does not seem that a single interpretational strategy could apply in all cases. I will end by suggesting that standard strategies for constructing quantum theories of fields are not sensitive to the different roles that “fields” can play in classical physics, which adds a further difficulty to interpreting quantum field theory. Along the way, I will say something about an old debate in the foundations of relativity theory, concerning whether the spacetime metric is a “geometrical” or “physical” field. The view I will defend is that the metric is much like the electromagnetic field: geometrical!

10 Nov 2016 Lina Jansson (Nottingham), ‘Newton’s Methodology Meets Humean Supervenience about Laws of Nature’

Abstract

Earman and Roberts [2005a,b] have argued for Humean supervenience about laws of nature based on an argument from epistemic access. In rough outline, their argument relies on the claim that if Humean supervenience is false, then we cannot have any empirical evidence in favour of taking a proposition to be a law of nature as opposed to merely accidentally true. I argue that Newton’s methodology in the Principia provides a counterexample to their claim. In particular, I argue that the success or failure of chains of subjunctive reasoning is empirically accessible, and that this provides a way of gaining empirical evidence for or against a proposition being a law of nature (even under the assumption that Humean supervenience fails).

27 Oct 2016 Ryan Samaroo (Bristol), ‘The Principle of Equivalence is a Criterion of Identity’

Abstract

In 1907 Einstein had an insight into gravitation that he would later refer to as ‘the happiest thought of my life’. This is the hypothesis, roughly speaking, that bodies in free fall do not ‘feel’ their own weight. This is what is formalized in ‘the equivalence principle’. The principle motivated a critical analysis of the Newtonian and 1905 inertial frame concepts, and it was indispensable to Einstein’s argument for a new concept of inertial motion. A great deal has been written about the equivalence principle. Nearly all of this work has focused on the content of the principle, but its methodological role has been largely neglected. A methodological analysis asks the following questions: what kind of principle is the equivalence principle? What is its role in the conceptual framework of gravitation theory? I maintain that the existing answers are unsatisfactory and I offer new answers.

20 Oct 2016 Niels Martens (Oxford, Philosophy), ‘Comparativism about Mass in Newtonian Gravity’

Abstract

Absolutism about mass asserts that facts about mass ratios are true in virtue of intrinsic masses. Comparativism about mass denies this. I present and dismiss Dasgupta’s (2013) recent empirical adequacy argument in favour of comparativism, in the context of Newtonian Gravity. I then develop and criticise two new versions of comparativism. Regularity Comparativism is a liberalisation of Huggett’s Regularity Relationalism (2006), which uses the Mill-Ramsey-Lewis Best Systems Account to respond to Newton’s bucket argument in the analogous relationalism-substantivalism debate. To the extent that this approach works at all, I argue that it works too well: it throws away the massive baby with the bathwater. A Machian-flavoured version of comparativism is more promising: although it faces no knock-down objection, it is not without its own problems.

13 Oct 2016 David Wallace (USC, Philosophy), ‘Fundamental and emergent geometry in Newtonian gravity’

Abstract

Using as a starting point recent and apparently incompatible conclusions by Simon Saunders (Philosophy of Science 80 (2013) pp.22-48) and Eleanor Knox (British Journal for the Philosophy of Science 65 (2014) pp.863-880), I revisit the question of the correct spacetime setting for Newtonian physics. I argue that understood correctly, these two theories make the same claims both about the background geometry required to define the theory, and about the inertial structure of the theory. In doing so I illustrate and explore in detail the view — espoused by Knox, and also by Harvey Brown (Physical Relativity, OUP 2005) — that inertial structure is defined by the dynamics governing subsystems of a larger system. This clarifies some interesting features of Newtonian physics, notably (i) the distinction between using the theory to model subsystems of a larger whole and using it to model complete Universes, and (ii) the scale-relativity of spacetime structure.

19 May 2016 Eleanor Knox (KCL, Philosophy), ‘Novel Explanation and the Emergence of Phonons’

Abstract

Discussions of emergence in the philosophy of physics literature often emphasise the role of asymptotic limits in understanding the novelty of emergent phenomena while leaving the nature of the novelty in question unexplored. I’ll put forward an account of explanatory novelty that can accommodate examples involving asymptotic limits, but also applies in other cases. The emergence of phonons in a crystal lattice will provide an example of a description with novel explanatory power that does not depend on asymptotic limits for its novelty. The talk is based on joint work with Alex Franklin.

12th May 2016 Yvonne Geyer (Oxford, Maths), ‘Rethinking Quantum Field Theory: Traces of String Theory in Yang-Mills and Gravity’

Abstract

A multitude of recent developments points towards the need for a different understanding of Quantum Field Theories. After a general introduction, I will focus on one specific example involving one of the most natural and fundamental observables: the scattering amplitude. In Yang-Mills theory and Einstein gravity, scattering amplitudes exhibit a simplicity that is completely obscured by the traditional approach to Quantum Field Theories, and that is remarkably reminiscent of the worldsheet models describing string theory. In particular, this implies that – without additional input – the theories describing our universe, Yang-Mills theory and gravity, exhibit traces of string theory.

28th April 2016 Roman Frigg (LSE, Philosophy), ‘Further Rethinking Equilibrium’

Abstract

In a recent paper we proposed a new definition of Boltzmannian equilibrium and showed that in the case of deterministic dynamical systems the new definition implies the standard characterisation but without suffering from its well-known problems and limitations. We now generalise this result to stochastic systems and show that the same implication holds. We then discuss an existence theorem for equilibrium states and illustrate with a number of examples how the theorem works. Finally, first steps towards understanding the relation between Boltzmannian and Gibbsian equilibrium are made.

25 Feb 2016 Stephen J. Blundell (Oxford, Physics), ‘Emergence, causation and storytelling: condensed matter physics and the limitations of the human mind’

Abstract

The physics of matter in the condensed state is concerned with problems in which the number of constituent particles is vastly greater than can be comprehended by the human mind. The physical limitations of the human mind are fundamental and restrict the way in which we can interact with and learn about the universe. This presents challenges for developing scientific explanations that are met by emergent narratives, concepts and arguments that have a non-trivial relationship to the underlying microphysics. By examining examples within condensed matter physics, and also from cellular automata, I show how such emergent narratives efficiently describe elements of reality.

18 Feb 2016 Jean-Pierre Llored (University of Clermont-Ferrand), ‘From quantum physics to quantum chemistry’.

Abstract:

The first part, which is mainly anthropological, summarizes the results of a survey that we carried out in several research laboratories in 2010. Our aims were to understand what quantum chemists currently do, what kind of questions they ask, and what kind of problems they have to face when creating new theoretical tools both for understanding chemical reactivity and predicting chemical transformations.

The second part, which is mainly historical, highlights the philosophical underpinnings that structure the development of quantum chemistry from 1920 to the present day. In so doing, we will discuss chemical modeling in quantum chemistry, and the different strategies used to define molecular features in terms of atomic ones and the molecular surroundings at the same time. We will show how computers and new laboratories emerged simultaneously, and reshaped the culture of quantum chemistry. This part goes on to describe how the debate between ab initio and semi-empirical methods turned out to be highly controversial because of underlying scientific and metaphysical assumptions about, for instance, the nature of science and the possibility for human knowledge to reach a complete description of the world.

The third and last part is about the philosophical implications for the study of quantum chemistry and that of ‘quantum sciences’ at large. It insists on the fact that the history of quantum chemistry is also a history of the attempts of chemists to establish the autonomy of their theories and methods with respect to physical, mathematical, and biological theories. According to this line of argument, chemists gradually proposed new concepts in order to circumvent the impossibility of performing full analytical calculations and to make the language of classical structural chemistry compatible with that of quantum chemistry. Among other topics, we will query the meaning of a chemical bond, the impossibility of deducing a molecular shape from the Schrödinger equation, the way quantum chemistry is invoked to explain the periodic table, and the possibility of going beyond the Born-Oppenheimer approximation. We would like to show that quantum chemistry is neither physics nor chemistry nor applied mathematics, and that philosophical debates which turned out to be relevant in quantum physics are not necessarily so in quantum chemistry, whereas other philosophical questions arise…

11th Feb 2016 David Wallace (Oxford, Philosophy), ‘Who’s afraid of coordinate systems?’.

Abstract:

Coordinate-based approaches to physical theories remain standard in mainstream physics but are largely eschewed in foundational discussion in favour of coordinate-free differential-geometric approaches. I defend the conceptual and mathematical legitimacy of the coordinate-based approach for foundational work. In doing so, I provide an account of the Kleinian conception of geometry as a theory of invariance under symmetry groups; I argue that this conception continues to play a very substantial role in contemporary mathematical physics and indeed that supposedly ‘coordinate-free’ differential geometry relies centrally on this conception of geometry. I discuss some foundational and pedagogical advantages of the coordinate-based formulation and briefly connect it to some remarks of Norton on the historical development of geometry in physics during the establishment of the general theory of relativity.

21 Jan 2016 Philipp Roser (Clemson), ‘Time and York time in quantum theory’.

Abstract:

Classical general relativity has no notion of a physically meaningful time parameter and one is free to choose one’s coordinates at will. However, when attempting to quantise the theory this freedom leads to difficulties, the notorious `problem of time’ of canonical quantum gravity. One way to overcome this obstacle is the identification of a physically fundamental time parameter. Interestingly, although purely aesthetic at the classical level, different choices of time parameter may in principle lead to different quantum phenomenologies, as I will illustrate with a simple model. This means that an underlying physically fundamental notion of time may (to some extent) be detectable via quantum effects.

For various theoretical reasons one promising candidate for a physical time parameter is `York time’, named after James York and his work on the initial-value problem of general relativity, where its importance first became apparent. I will derive the classical and quantum dynamics with respect to York time for certain cosmological models and discuss some of the unconventional structural features of the resulting quantum theory.
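
For orientation (standard usage, not spelled out in the abstract): York time is, up to a convention-dependent constant, the trace of the extrinsic curvature of a spatial slice,

\[ \tau \propto K = g^{ab} K_{ab}, \]

so adopting York time amounts to foliating spacetime by slices of constant mean curvature, on each of which \(K\) is spatially constant.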

3 Dec 2015 Thomas Moller-Nielsen (Oxford), “Symmetry and the Interpretation of Physical Theories”

Abstract:

In this talk I examine two (putative) ways in which symmetries can be used as tools for physical theory interpretation. First, I examine the extent to which symmetries can be used as a guide to a theory’s ideology: that is, as a means of determining which quantities are real, according to the theory. Second, I examine the extent to which symmetries can be used as a guide to a theory’s ontology: that is, as a means of determining which objects are real, according to the theory. I argue that symmetries can only legitimately be used in the first, but not the second, sense.

26 Nov 2015 Ellen Clarke (All Souls), “Biological Ontology”.

Abstract:

All sciences invent kind concepts: names for categories that gather particulars together according to their possession of some scientifically interesting properties. But kind concepts must be well-motivated: they need to do some sort of work for us. I show how to define one sort of scientific concept – that of the biological individual, or organism – so that it does plenty of work for biology. My view understands biological individuals as defined by the process of evolution by natural selection. I will engage in some speculation about how the situation compares in regard to other items of scientific ontology.

19 November 2015 Dan Bedingham (Oxford) “Dynamical Collapse of the Wavefunction and Relativity”.

Abstract:

When a collapse of the wave function takes place it has an instantaneous effect over all space. One might then assume that a covariant description is not possible since a collapse whose effects are simultaneous in one frame of reference would not have simultaneous effects in a boosted frame. I will show, however, that in fact a consistent covariant picture emerges in which the collapsing wave function depends on the choice of foliation of space time, but that suitably defined local properties are unaffected by this choice. The formulation of a covariant description is important for models attempting to describe the collapse of wave function as a dynamical process. This is a very direct approach to solving the quantum measurement problem. It involves simply giving the wave function the stochastic dynamics that it has in practice. We present some proposals for relativistic versions of dynamical collapse models.

12 November 2015 Karim Thébault (Bristol) “Regarding the ‘Hole Argument’ and the ‘Problem of Time’”

Abstract:

The canonical formalism of general relativity affords a particularly interesting characterisation of the infamous hole argument. It also provides a natural formalism in which to relate the hole argument to the problem of time in classical and quantum gravity. In this paper I will examine the connection between these two much discussed problems in the foundations of spacetime theory along two interrelated lines. First, from a formal perspective, I will consider the extent to which the two problems can and cannot be precisely and distinctly characterised. Second, from a philosophical perspective, I will consider the implications of various responses to the problems, with a particular focus upon the viability of a ‘deflationary’ attitude to the relationalist/substantivalist debate regarding the ontology of space-time. Conceptual and formal inadequacies within the representative language of canonical gravity will be shown to be at the heart of both the canonical hole argument and the problem of time. Interesting and fruitful work at the interface of physics and philosophy relates to the challenge of resolving such inadequacies.

5 November 2015 Joseph Melia (Oxford) “Haecceitism, Identity and Indiscernibility: (Mis-)Uses of Modality in the Philosophy of Physics”

Abstract:

I examine a number of arguments involving modality and identity in the Philosophy of Physics. In particular, (a) Wilson’s use of Leibniz’ law to argue for emergent entities; (b) the implications of anti-haecceitism for the Hole argument in GR and QM; (c) the proposal to “define” or “ground” or “account” for identity via some version of Principle of the Identity of Indiscernibles or the Hilbert-Bernays formula.

Against (a) I argue that familiar problems with applications of Leibniz’ law in modal contexts block the argument for the existence of emergent entities;

On (b), I argue that (i) there are multiple and incompatible definitions of haecceitism at play in the literature; (ii) that, properly understood, haecceitism *is* a plausible position; indeed, even supposedly mysterious haecceities do not warrant the criticism of obscurity they have received; (iii) we do better to solve the Hole argument by other means than a thesis about the range and variety of possibilities.

On (c), I argue that recent attempts to formulate a principle of PII fit to serve as a definition of identity are either trivially true, or must draw distinctions between different kinds of properties that are problematic: better to accept identity as primitive.

Some relevant papers/helpful reading (I will not, of course, assume familiarity with these papers)
J. Ladyman, ‘On the Identity and Diversity of Objects in a Structure’, Proceedings of the Aristotelian Society, Supplementary Volume (2007).
D. Lewis, On the Plurality of Worlds, ch. 4 (1986).
O. Pooley, ‘Points, Particles and Structural Realism’, in Rickles, French and Saatsi (eds), The Structural Foundations of Quantum Gravity (2006).
S. Saunders, ‘Are Quantum Particles Objects?’, Analysis (2006).
J. Wilson, ‘Non-Reductive Physicalism and Degrees of Freedom’, BJPS (2010).

29 October 2015 Chiara Marletto (Oxford, Materials), “Constructor theory of information (and its implications for our understanding of quantum theory)”.

Abstract:

Constructor Theory is a radically new mode of explanation in fundamental physics. It demands a local, deterministic description of physical reality – expressed exclusively in terms of statements about what tasks are possible, what are impossible, and why. This mode of explanation has recently been applied to provide physical foundations for the theory of information – expressing, as conjectured physical principles, the regularities of the laws of physics necessary for there to be what has been so far informally called ‘information’. In constructor theory, one also expresses exactly the relation between classical information and the so-called ‘quantum information’ – showing how properties of the latter arise from a single, constructor-theoretic constraint. This provides a unified conceptual basis for the quantum theory of information (which was previously lacking one qua theory of information). Moreover, the arising of quantum-information like properties in a deterministic, local framework also has implications for the understanding of quantum theory, and of its successors.

22 October 2015 Bryan Roberts (LSE) “The future of the weakly interacting arrow of time”.

Abstract:

This talk discusses the evidence for time asymmetry in fundamental physics. The main aim is to propose some general templates characterising how time asymmetry can be detected among weakly interacting particles. We will then step back and evaluate how this evidence bears on time asymmetry in future physical theories beyond the standard model.

15 October 2015 Oscar Dahlsten (Oxford Physics) “The role of information in work extraction”.

Abstract:

Since Maxwell’s demon it has been known that extra information can give more work. I will discuss how this can be made concrete and quantified. I will focus on so-called single-shot statistical mechanics. There one can derive expressions for the maximum work one can extract from a system given one’s information. Only one property of the state one assigns to the system matters: the entropy. There are subtleties, including which entropy to use. I will also discuss the relation to fluctuation theorems, and our recent paper on realising a photonic Maxwell’s demon.
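
A representative result from this literature (stated schematically; compare the first reference below): the work extractable from an n-qubit system in state \(\rho\), using a heat bath at temperature \(T\), is governed not by the von Neumann entropy but by the smooth max-entropy,

\[ W^{\varepsilon} \approx kT \ln 2 \, \big( n - H_{\max}^{\varepsilon}(\rho) \big), \]

up to corrections controlled by the allowed failure probability \(\varepsilon\). This is one precise sense in which “which entropy to use” matters in the single-shot regime.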

Some references, I will certainly not assume you have looked at them:
arXiv:0908.0424 The work value of information, Dahlsten, Renner, Rieper and Vedral
arXiv:1009.1630 The thermodynamic meaning of negative entropy, del Rio, Aaberg, Renner, Dahlsten and Vedral
arXiv:1207.0434 A measure of majorisation emerging from single-shot statistical mechanics, Egloff, Dahlsten, Renner, Vedral
arXiv:1409.3878 Introducing one-shot work into fluctuation relations, Yunger Halpern, Garner, Dahlsten, Vedral
arXiv:1504.05152 Equality for worst-case work at any protocol speed, Dahlsten, Choi, Braun, Garner, Yunger Halpern, Vedral
arXiv:1510.02164 Photonic Maxwell’s demon, Vidrighin, Dahlsten, Barbieri, Kim, Vedral and Walmsley

11 June 2015 Tim Pashby (University of Southern California)
‘Schroedinger’s Cat: It’s About Time (Not Measurement)’

Abstract: I argue for a novel resolution of Schroedinger’s cat paradox by paying particular attention to the role of time and tense in setting up the problem. The quantum system at the heart of the paradoxical situation is an unstable atom, primed for indeterministic decay at some unknown time. The conventional account gives probabilities for the result of instantaneous measurements and leads to the unacceptable conclusion that the cat can neither be considered alive nor dead until the moment the box is opened (at a time of the experimenter’s choosing). To resolve the paradox I reject the status of the instantaneous quantum state as `truthmaker’ and show how a quantum description of the situation can be given instead in terms of time-dependent chance propositions concerning the time of decay, without reference to measurement.

The conclusions reached in the case of Schroedinger’s cat may be generalized throughout quantum mechanics with the means of event time observables (interpreted as conditional probabilities), which play the role of the time of decay for an arbitrary system. Conventional quantum logic restricts its attention to the lattice of projections, taken to represent possible properties of the system. I argue that event time observables provide a compelling reason to look beyond the lattice of projections to the algebra of effects, and suggest an interpretation in which propositions are made true by events rather than properties. This provides the means to resolve the Wigner’s friend paradox along similar lines.

4th June 2015 Neil Dewar (Oxford)
‘Symmetry and Interpretation: or, Translations and Translations’

Abstract: There has been much discussion of whether we should take (exact) symmetries of a physical theory to relate physically equivalent states of affairs, and – if so – what it is that justifies us in so doing. I argue that we can understand the propriety of this move in essentially semantic terms: namely, by thinking of a symmetry transformation as a means of translating a physical theory into itself. To explain why symmetry transformations have this character, I’ll first look at how notions of translation and definition are dealt with in model theory. Then, I’ll set up some analogies between the model-theoretic formalism and the formalism of differential equations, and show how the relevant analogue of self-translation is a symmetry transformation. I conclude with some remarks on how this argument bears on debates over theoretical equivalence.

28th May 2015 George Ellis (Cape Town)
‘On the crucial role of top-down causation in complex systems’

Abstract: It will be suggested that causal influences in the real world occurring on evolutionary, developmental, and functional timescales are characterized by a combination of bottom-up and top-down effects. Digital computers give very clear exemplars of how this happens. There are five distinct classes of top-down effects, the key one leading to the existence of complex systems being adaptive selection. The issue of how there can be causal openness at the bottom allowing this to occur will be discussed. The case will be made that while bottom-up self-assembly can attain a certain degree of complexity, truly complex systems such as life can only come into being if top-down processes come into play in addition to bottom-up processes. They allow genuine emergence to occur, based in the multiple realisability at lower levels of higher-level structures and functions.

21 May 2015
Francesca Vidotto (Radboud University, Nijmegen)

‘Relational ontology from General Relativity and Quantum Mechanics’

Abstract: Our current most reliable physical theories, General Relativity and Quantum Mechanics, both point towards a relational description of reality. General Relativity builds up the spacetime structure from the notion of contiguity between dynamical objects. Quantum Mechanics describes how physical systems affect one another in the course of interactions. Only local interactions define what exists, and there is no meaning in talking about entities except in terms of local interactions.

14 May 2015
Harvey Brown (Oxford) and Chris Timpson (Oxford)

‘Bell on Bell’s theorem: the changing face of nonlocality’

Between 1964 and 1990, the notion of nonlocality in Bell’s papers underwent a profound change as his nonlocality theorem gradually became detached from quantum mechanics, and referred to wider probabilistic theories involving correlations between separated beables. The proposition that standard quantum mechanics is itself nonlocal (more precisely, that it violates ‘local causality’) became divorced from the Bell theorem per se from 1976 on, although this important point is widely overlooked in the literature. In 1990, the year of his death, Bell would express serious misgivings about the mathematical form of the local causality condition, and leave ill-defined the issue of the consistency between special relativity and violation of the Bell-type inequality. In our view, the significance of the Bell theorem, both in its deterministic and stochastic forms, can only be fully understood by taking into account the fact that a fully Lorentz-covariant version of quantum theory, free of action-at-a-distance, can be articulated in the Everett interpretation.

7 May 2015 Mauro Dorato (Rome)
‘The passage of time between physics and psychology ‘

Abstract: The three main aims of my paper are:

1. To defend a minimalistic theory of objective becoming that takes STR and GTR at face value;
2. To bring to bear relevant neuro-psychological data in support of 1;
3. To combine 1 and 2 to try to explain, with as little metaphysics as possible, three key features of our experience of passage, namely:

1. Our untutored belief in a cosmic extension of the now (leading to the postulation of privileged frames and presentism);
2. The becoming more past of the past (leading to Skow’s 2009 moving spotlight, branching spacetimes);
3. The fact that our actions clearly seem to bring new events into being (Broad 1923, Tooley 1997, Ellis 2014)

26 February 2015 James Ladyman (Bristol)

“Do local symmetries have ‘direct empirical consequences’?”

Abstract: Hilary Greaves and David Wallace argue that, contrary to the widespread view of philosophers of physics, local symmetries have direct empirical consequences. They do this by showing that there are `Galileo’s Ship Scenarios’ in theories with local symmetries. In this paper I will argue that the notion of `direct empirical consequences’ is ambiguous and admits of two kinds of precisification. Greaves and Wallace do not purport to show that local symmetries have empirical consequences in the stronger of the two senses, but I will argue that it is the salient one. I will then argue that they are right to focus on Galileo’s Ship Scenarios, and offer a characterisation of the form of such arguments from symmetries to empirical consequences. I will discuss how various examples relate to this template, and finally offer a new argument in defence of the orthodoxy that direct empirical consequences do not depend on local symmetries.

19 February 2015 David Wallace (Oxford):

“Fields as Bodies: a unified treatment of spacetime and gauge symmetry”

Abstract: Using the parametrised representation of field theory (in which the location in spacetime of a part of a field is itself represented by a map from the base manifold to Minkowski spacetime) I demonstrate that in both local and global cases, internal (Yang-Mills-type) and spacetime (Poincare) symmetries can be treated precisely on a par, so that gravitational theories may be regarded as gauge theories in a completely standard sense.

12 February 2015 Erik Curiel (Munich)

“Problems with the interpretation of energy conditions in general relativity”

An energy condition, in the context of a wide class of spacetime theories (including general relativity), is, crudely speaking, a relation one demands the stress-energy tensor of matter satisfy in order to try to capture the idea that “energy should be positive”. The remarkable fact I will discuss is that such simple, general, almost trivial-seeming propositions have profound and far-reaching import for our understanding of the structure of relativistic spacetimes. It is therefore especially surprising when one also learns that we have no clear understanding of the nature of these conditions, what theoretical status they have with respect to fundamental physics, what epistemic status they may have, when we should and should not expect them to be satisfied, and even in many cases how they and their consequences should be interpreted physically. Or so I shall argue, by a detailed analysis of the technical and conceptual character of all the standard conditions used in physics today, including examination of their consequences and the circumstances in which they are believed to be violated in the actual universe.
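
The simplest of these conditions illustrates the pattern: the weak energy condition demands

\[ T_{ab} \, \xi^a \xi^b \ge 0 \quad \text{for all timelike } \xi^a, \]

i.e. that no observer measures a negative local energy density; the null, strong and dominant energy conditions are variations on this form.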

22nd January 2015 Jonathan Halliwell (Imperial College London):

“Negative Probabilities, Fine’s Theorem and Quantum Histories”

Abstract: Many situations in quantum theory and other areas of physics lead to quasi-probabilities which seem to be physically useful but can be negative. The interpretation of such objects is not at all clear. I argue that quasi-probabilities naturally fall into two qualitatively different types, according to whether their non-negative marginals can or cannot be matched to a non-negative probability. The former type, which we call viable, are qualitatively similar to true probabilities, but the latter type, which we call non-viable, may not have a sensible interpretation. Determining the existence of a probability matching given marginals is a non-trivial question in general. In simple examples, Fine’s theorem indicates that inequalities of the Bell and CHSH type provide criteria for its existence. A simple proof of Fine’s theorem is given. The results have consequences for the linear positivity condition of Goldstein and Page in the context of the histories approach to quantum theory. Although it is a very weak condition for the assignment of probabilities it fails in some important cases where our results indicate that probabilities clearly exist. Some implications for the histories approach to quantum theory are discussed.
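
For reference, the CHSH inequality in question bounds a combination of correlations \(E\) between measurement settings \(a, a'\) and \(b, b'\) on two separated systems:

\[ |E(a,b) + E(a,b') + E(a',b) - E(a',b')| \le 2. \]

Fine’s theorem states that satisfaction of the full set of such inequalities is necessary and sufficient for the existence of a joint probability distribution matching the given marginals; quantum mechanics violates the bound, up to \(2\sqrt{2}\).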

4 December 2014: Tony Sudbery

“The logic of the future in the Everett-Wheeler understanding of quantum theory”

Abstract: I discuss the problems of probability and the future in the Everett-Wheeler understanding of quantum theory. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. I construct a lattice of tensed propositions, with truth values in the interval [0, 1], and derive logical properties of the truth values given by the usual quantum-mechanical formula for the probability of histories. I argue that with this understanding, Everett-Wheeler quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
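
The “usual quantum-mechanical formula” here is presumably the standard histories expression: a history \(\alpha\), given by projectors \(P_{\alpha_1}, \dots, P_{\alpha_n}\) at times \(t_1 < \dots < t_n\) (Heisenberg picture), receives probability

\[ p(\alpha) = \big\| P_{\alpha_n}(t_n) \cdots P_{\alpha_1}(t_1) \, |\psi\rangle \big\|^2. \]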

27 November 2014 : Owen Maroney

“How epistemic can a quantum state be?”

Abstract: The “psi-epistemic” view is that the quantum state does not represent a state of the world, but a state of knowledge about the world. It draws its motivation, in part, from the observation of qualitative similarities between the characteristic properties of non-orthogonal quantum wavefunctions and those of overlapping classical probability distributions. It might be suggested that this view gives a natural explanation for these properties, which seem puzzling on the alternative “psi-ontic” view. However, for two key similarities, quantum state overlap and quantum state discrimination, it turns out that the psi-epistemic view cannot account for the values shown by quantum theory, and for a wide range of quantum states it must rely on the same supposedly puzzling explanations as the “psi-ontic” view.

20 November 2014 : Boris Zilber

“The semantics of the canonical commutation relations”

Abstract: I will argue that the canonical commutation relations, and the way of calculating with them discovered in the 1920s, are in essence a syntactic reflection of a world whose semantics is still to be reconstructed. The same can be said about the calculus of Feynman integrals. Similar developments have been taking place in pure mathematics since the 1950s, in the form of Grothendieck’s schemes and the formalism of non-commutative geometry. I will report on some progress in reconstructing the missing semantics. In particular, for the canonical commutation relations it leads to a theory of representation in finite-dimensional “algebraic Hilbert spaces” which in the limit look rather similar to, although not the same as, conventional Hilbert spaces.
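
The relations in question are, for each canonical pair of position and momentum operators,

\[ [\hat{q}, \hat{p}] = \hat{q}\hat{p} - \hat{p}\hat{q} = i\hbar \, \mathbb{1}, \]

which admit no finite-dimensional Hilbert-space representation in the conventional sense (taking the trace of both sides gives \(0 = i\hbar\, d\) in dimension \(d\)); this is what makes finite-dimensional “algebraic Hilbert spaces” a genuine departure.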

13 November 2014 1st BLOC Seminar, KCL, London : Huw Price

“Two Paths to the Paris Interpretation”

Abstract: In 1953 de Broglie’s student, Olivier Costa de Beauregard, raised what he took to be an objection to the EPR argument. He pointed out that the EPR assumption of Locality might fail, without action-at-a-distance, so long as the influence in question is allowed to take a zigzag path, via the past lightcones of the particles concerned. (He argued that considerations of time-symmetry counted in favour of this proposal.) As later writers pointed out, the same idea provides a loophole in Bell’s Theorem, allowing a hidden variable theory to account for the Bell correlations, without irreducible spacelike influence. (The trick depends on the fact that retrocausal models reject an independence assumption on which Bell’s Theorem depends, thereby blocking the derivation of Bell’s Inequality.) Until recently, however, it seems to have gone unnoticed that there is a simple argument that shows that the quantum world must be retrocausal, if we accept three assumptions (one of them time-symmetry) that would have all seemed independently plausible to many physicists in the years following Einstein’s 1905 discovery of the quantisation of light. While it is true that later developments in quantum theory provide ways of challenging these assumptions – different ways of challenging them, for different views of the ontology of the quantum world – it is interesting to ask whether this new argument provides a reason to re-examine Costa de Beauregard’s ‘Paris interpretation’.

6 November 2014 : Vlatko Vedral

“Macroscopicity”

ABSTRACT: We have a good framework for quantifying entanglement based, broadly speaking, on two different ideas. One is the fact that local operations and classical communication (LOCC) do not increase entanglement and hence introduce a natural ordering on the set of entangled states. The other is inspired by mean-field theory and quantifies the entanglement of a state by how difficult it is to approximate it with disentangled states (the two, while not identical, frequently lead to the same measures). Interestingly, neither of these captures the notion of “macroscopicity”, which asks which states are very quantum and macroscopic at the same time. Here the GHZ states win as the ones with the highest macroscopicity; however, they are not highly entangled from either the LOCC or the mean-field point of view. I discuss different ways of quantifying macroscopicity and exemplify them with a range of quantum experiments producing different many-body states (GHZ and generalised GHZ states, cluster states, topological states). And the winner for producing the highest degree of macroscopicity is…
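
For concreteness, the N-party GHZ state referred to is

\[ |\mathrm{GHZ}_N\rangle = \tfrac{1}{\sqrt{2}} \big( |0\rangle^{\otimes N} + |1\rangle^{\otimes N} \big), \]

a superposition of two macroscopically distinct branches; yet tracing out even one party leaves a merely classically correlated (separable) state, which is one reason standard entanglement measures fail to track macroscopicity.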

30 October 2014 : David Wallace

“How not to do the metaphysics of quantum mechanics”

Abstract: Recent years have seen an increasing interest in the metaphysics of quantum theory. While welcome, this trend has an unwelcome side effect: an inappropriate (and often unknowing) identification of quantum theory in general with one particular brand of quantum theory, namely the nonrelativistic mechanics of finitely many point particles. In this talk I’ll explain just why this is problematic, partly by analogy with questions about the metaphysics of classical mechanics.

23 October 2014 : Daniel Bedingham

“Time reversal symmetry and collapse models”

Abstract: Collapse models are modifications of quantum theory where the wave function is treated as physically real and collapse of the wave function is a physical process. This introduces a time reversal asymmetry into the dynamics of the wave function since the collapses affect only the future state. However, it is shown that if the physically real part of the model is reduced to the set of points in space and time about which the collapses occur then a collapsing wave function picture can be given both forward and backward in time, in each case satisfying the Born rule (under certain conditions). This implies that if the collapse locations can serve as an ontology then these models can in fact have time reversal symmetry.

16 October 2014 : Dennis Lehmkuhl

“Einstein, Cartan, Weyl, Jordan: The neighborhood of General Relativity in the space of spacetime theories”

Abstract: Recent years have seen a renewed interest in Newton-Cartan theory (NCT), i.e. Newtonian gravitation theory reformulated in the language of differential geometry. The comparison of this theory with the general theory of relativity (GR) has been particularly interesting, among other reasons, because it allows us to ask how `special’ GR really is, as compared to other theories of gravity. Indeed, the literature so far has focused on the similarities between the two theories, for example on the fact that both theories describe gravity in terms of curvature, and the paths of free particles as geodesics. However, the question of how `special’ GR is can only be properly answered if we highlight differences as much as similarities, and there are plenty of differences between NCT and GR. Furthermore, I will argue that it is not enough to compare GR to simpler theories like NCT; we also have to compare it to more complicated theories, more complicated in terms of geometrical structure and gravitational degrees of freedom. While NCT is the most natural degenerate limit of GR, gravitational theory defined on a Weyl geometry (to be distinguished from a unified field theory based on Weyl geometry) and gravitational scalar-tensor theories (like Jordan-Brans-Dicke theory) are two of the most natural generalisations of GR. Thus, in this talk I will compare Newton-Cartan, GR, Weyl and Jordan-Brans-Dicke theory, to see how special GR really is as compared to its immediate neighborhood in the `space of spacetime theories’.

19 June 2014 : Antony Valentini

“Hidden variables in the early universe II: towards an explanation for large-scale cosmic anomalies”

Abstract: Following on from Part I, we discuss the large-scale anomalies that have been reported in measurements of the cosmic microwave background (CMB) by the Planck satellite. We consider how the anomalies might be explained as the result of incomplete relaxation to quantum equilibrium at long wavelengths on expanding space (during a ‘pre-inflationary phase’) in the de Broglie-Bohm formulation of quantum theory. The first anomaly we consider is the reported large-scale power deficit. This could arise from incomplete relaxation for the amplitudes of the primordial perturbations. It is shown, by numerical simulations, that if the pre-inflationary era is radiation dominated then the deficit in the emerging power spectrum will have a characteristic shape (a specific dependence on wavelength). It is also shown that our scenario is able to produce a power deficit in the observed region and of the observed magnitude, for an appropriate choice of cosmological parameters. The second anomaly we consider is the reported large-scale anisotropy. This could arise from incomplete relaxation for the phases of the primordial perturbations. We report on recent numerical simulations for phase relaxation, and we show how to define characteristic scales for amplitude and phase nonequilibrium. While difficult questions remain concerning the extent to which the data might support our scenario, we argue that we have an (at least) viable model that is able to explain two apparently independent cosmological anomalies at a single stroke.

12 June 2014 : Antony Valentini

“Hidden variables in the early universe I: quantum nonequilibrium and the cosmic microwave background”

Abstract: Assuming inflationary cosmology to be broadly correct, we discuss recent work showing that the Born probability rule for primordial quantum fluctuations can be tested (and indeed is being tested) by measurements of the cosmic microwave background (CMB). We consider in particular the hypothesis of ‘quantum nonequilibrium’ — the idea that the universe began with an anomalous distribution of hidden variables that violates the Born rule — in the context of the de Broglie-Bohm pilot-wave formulation of quantum field theory. An analysis of the de Broglie-Bohm field dynamics on expanding space shows that relaxation to quantum equilibrium is generally retarded (and can be suppressed) for long-wavelength field modes. If the initial probability distribution is assumed to have a less-than-quantum variance, we may expect a large-scale power deficit in the CMB — as appears to be observed by the Planck satellite. Particular attention is paid to conceptual questions concerning the use of probabilities ‘for the universe’ in modern theoretical and observational cosmology.
[Key references: A. Valentini, ‘Inflationary Cosmology as a Probe of Primordial Quantum Mechanics’, Phys. Rev. D 82, 063513 (2010) [arXiv:0805.0163]; S. Colin and A. Valentini, ‘Mechanism for the suppression of quantum noise at large scales on expanding space’, Phys. Rev. D 88, 103515 (2013) [arXiv:1306.1579].]
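
In pilot-wave terms the hypothesis is simply stated: quantum equilibrium is the condition that the distribution \(\rho\) of the hidden configuration matches the Born rule,

\[ \rho(q,t) = |\psi(q,t)|^2, \]

and ‘quantum nonequilibrium’ is any initial distribution with \(\rho \neq |\psi|^2\), which the de Broglie-Bohm dynamics then tends to relax towards equilibrium at a coarse-grained level.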

5 June 2014 : Mike Cuffaro

“Reconsidering quantum no-go theorems from a computational perspective”

Bell’s and related inequalities are thought of as “no-go” theorems, but this is misleading except in a highly qualified sense. More properly, they should be understood as imposing constraints on locally causal models which aim to recover quantum mechanical predictions. Thinking of them as no-go theorems is nevertheless mostly harmless in typical circumstances; i.e., the necessary qualifications are, in typical discussions of the foundations of quantum mechanics, understood as holding unproblematically. But the situation can change once we leave the traditional context. In the context of a discussion of quantum computation and information, for example, our judgements regarding which locally causal models are to be ruled out as implausible will differ from the corresponding judgements in the traditional context. In particular, the “all-or-nothing” GHZ inequality, which is traditionally considered to be a more powerful refutation of local causality than statistical inequalities like Bell’s, has very little force in the context of a discussion of quantum computation and information. In this context it is only the statistical inequalities which can legitimately be thought of as no-go theorems. Considering this situation serves to emphasise, I argue, that there is a difference in aim between practical sciences like quantum computation and information, and the foundations of quantum mechanics traditionally construed: describing physical systems as they exist and interact with one another in the natural world is different from describing what one can do with physical systems.

22 May 2014 Elise Crull

“Whence Physical Significance in Bimetric Theories?”

Recently there has been lively discussion regarding a certain class of alternative theories to general relativity called bimetric theories. Such theories are meant to resolve certain physical problems (e.g. the existence of ghost fields and dark matter) as well as philosophical problems (e.g. the apparent experimental violation of relativistic causality and assigning physical significance to metrics).

In this talk, I suggest that a new type of bimetric theory wherein matter couples to both metrics may yield further insights regarding those same philosophical questions, while at the same time addressing (perhaps to greater satisfaction!) the physical worries motivating standard bimetric theories.

15 May 2014: Julian Barbour

“A Gravitational Arrow of Time”

My talk (based on arXiv:1310.5167 [gr-qc]) will draw attention to a hitherto unnoticed way in which scale-invariant notions of complexity and information can be defined in the problem of N point particles interacting through Newtonian gravity. In accordance with these definitions, all typical solutions of the problem with nonnegative energy divide at a uniquely defined point into two halves that are effectively separate histories. They have a common ‘past’ at the point of division but separate ‘futures’. In each half, the arrow from past to future is defined by growth of the complexity and information. All previous attempts to explain how time-symmetric laws can give rise to the various arrows of time have invoked special boundary conditions. In contrast, the complexity and information arrows are inevitable consequences of the form of the gravitational law and nothing else. General relativity shares key structural features with Newtonian gravity, so it may be possible to obtain similar results for Einsteinian gravity.
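
The scale-invariant complexity in question (stated schematically, following this research programme; conventions vary) is the ratio of the two natural length scales of the N-body configuration,

\[ C_S = \frac{\ell_{\mathrm{rms}}}{\ell_{\mathrm{mhl}}}, \qquad \ell_{\mathrm{rms}}^2 = \frac{1}{M^2} \sum_{a<b} m_a m_b \, r_{ab}^2, \qquad \frac{1}{\ell_{\mathrm{mhl}}} = \frac{1}{M^2} \sum_{a<b} \frac{m_a m_b}{r_{ab}}, \]

where \(M\) is the total mass; \(C_S\) grows as the particles cluster, and its growth away from the point of division defines the arrow in each half-history.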

8 May 2014 : Simon Saunders

“Reference to indistinguishables, and other paradoxes”

Abstract: There is a seeming paradox about indistinguishables: if described only by totally symmetric properties and relations, or by totally (anti)-symmetrized states, then how is reference to them possible? And we surely do refer to subsets of indistinguishable particles, and sometimes individual elementary particles (as in: the electrons, protons, and neutrons of which your computer screen is composed). Call it the paradox of composition.
The paradox can be framed in the predicate calculus as well, in application to everyday things: indistinguishability goes over to weak discernibility. It connects with two other paradoxes: the Gibbs paradox and Putnam’s paradox. It also connects with the hole argument in General Relativity. They are none of them the same, but they have a common solution.

This solution centres on the way that mathematical representations, including the set-theoretical constructions of model theory, connect with the world. The connection is by structural similarity, not by coordinates or particle labels (in physics), or (in model theory) by elements of sets. The appearance to the contrary is fostered by the simplicity of ostensive reference, on the one hand, and the assignment to structures of particle labels, coordinates, and elements of sets, on the other.

1 May 2014 : Oliver Pooley

“New work on the problem of time”

Abstract: One aspect of the “Problem of Time” in canonical general relativity results from applying to the theory Dirac’s seemingly well-established method of identifying gauge transformations in constrained Hamiltonian theories. This “orthodox” move identifies transformations generated by the first-class constraints as mere gauge. Applied to GR, the strategy yields the paradoxical result that no genuine physical magnitude takes on different values at different times. This orthodoxy is also what underwrites the derivation of the timeless Wheeler–DeWitt equation. It is thus intimately connected to one of the central interpretative puzzles of the canonical approach to quantum gravity, namely, how to make sense of a profoundly timeless quantum formalism.

This talk reviews some recent challenges to the technical underpinning of the orthodox view. Brian Pitts has argued that, in general, first-class constraints generate “not a gauge transformation, but a bad physical change”, even for theories like electromagnetism that are standardly taken to illustrate the correctness of orthodoxy. I argue that Pitts’ results are largely orthogonal to resolving the Problem of Time, and that they leave the orthodox interpretation of phase space untouched. Instead, I will endorse a very different criticism of Dirac’s position, due to Barbour and Foster. As Thébault has stressed, one moral is that a Hamiltonian theory can be manifestly deterministic even if physical magnitudes do not commute with some of the first-class constraints, namely, those that generate reparameterizations of histories. Unfortunately, due to its foliation (in distinction to reparameterization) invariance, Hamiltonian GR suffers from a residual apparent indeterminism. Replacing GR by shape dynamics is one “solution”. I will consider the prospects of finding an alternative.

13 March 2014: Philip Goyal

“An Informational Approach to Identical Particles in Quantum Theory”

Abstract: A remarkable feature of quantum theory is that particles with identical intrinsic properties must be treated as indistinguishable if the theory is to give valid predictions. In the quantum formalism, indistinguishability is expressed via the symmetrization postulate, which restricts a system of identical particles to the set of symmetric states (`bosons’) or the set of antisymmetric states (`fermions’).
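
In standard notation (not spelled out in the abstract), the postulate requires the allowed states of N identical particles to satisfy

\[ \psi(x_1, \dots, x_i, \dots, x_j, \dots, x_N) = \pm\, \psi(x_1, \dots, x_j, \dots, x_i, \dots, x_N) \]

under exchange of any pair, with + for bosons and − for fermions.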

However, the precise connection between particle indistinguishability and the symmetrization postulate has not been clearly established.  There exist a number of variants of the postulate that appear to be compatible with particle indistinguishability.  In particular, the widely influential topological approach due to Laidlaw & DeWitt and Leinaas & Myrheim implies that its validity depends on the dimensionality of space.  This variant leaves open the possibility that identical particles are generically able to exhibit so-called anyonic behavior in two spatial dimensions.

Here we show that the symmetrization postulate can be derived on the basis of a simple novel postulate.  This postulate establishes a functional relationship between the amplitude of a process involving indistinguishable particles and the amplitudes of all possible transitions when the particles are treated as distinguishable.  The symmetrization postulate follows by requiring consistency with the rest of the quantum formalism.  The key to the derivation is a strictly informational treatment of indistinguishability which prohibits the labelling of particles that cannot be experimentally distinguished from one another.  The derivation implies that the symmetrization postulate admits no natural variants.  In particular, the possibility that identical particles generically exhibit anyonic behaviour is excluded.

[1] “Informational Approach to Identical Particles in Quantum Theory”, http://arxiv.org/abs/1309.0478
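
To illustrate the shape of the proposed postulate in the two-particle case (a schematic reconstruction in illustrative notation, not the paper’s exact statement): writing A(12) and A(21) for the amplitudes of the two transitions obtained by treating the particles as distinguishable, the indistinguishable-particle amplitude is postulated to be a function of these, and consistency with the rest of the quantum formalism forces

\[
A_{\mathrm{indist}} = \alpha\,A(12) + \beta\,A(21), \qquad |\alpha| = |\beta| = 1, \qquad \beta/\alpha = \pm 1,
\]

so that only the symmetric (+) and antisymmetric (−) combinations survive, in any spatial dimension.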

6 March 2014: Sean Gryb

“Symmetry and Evolution in Quantum Gravity”

Abstract: A key obstruction to obtaining a non-perturbative definition of quantum gravity is the absence of a sensible quantum representation of spacetime refoliations, including global refoliations – or reparametrizations. We propose that these difficulties can be avoided by following a procedure for defining a degree of freedom due to Poincaré, where emphasis is put on the independently specifiable initial data of the system, and a proposal for the decomposition of these degrees of freedom due to York (both ideas were later advocated by Barbour). In our proposal, local refoliations are replaced by local scale transformations using a symmetry trading procedure developed in the “Shape Dynamics” approach to classical gravity. Global refoliations are then dealt with using a technique similar to that used in unimodular gravity. We will first provide the philosophical motivation for our procedure and then propose a set of formal equations which represent the quantization of a theory that is classically equivalent to General Relativity. The quantum theory we propose, however, has both a well-defined notion of local symmetry and global time evolution. Time permitting, we will also discuss explicit symmetry-reduced toy models exhibiting some of the key features of our proposal.
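
Schematically (a hedged gloss on the shape dynamics construction, not necessarily the speaker’s own formulation): the trade replaces the local Hamiltonian constraints of canonical GR with constraints generating volume-preserving conformal transformations of the spatial metric,

\[
\mathcal{H}(x) \approx 0 \quad \longrightarrow \quad C(x) = \pi(x) - \langle \pi \rangle \sqrt{g}(x) \approx 0,
\]

where \(\pi = g_{ab}\pi^{ab}\) and \(\langle \pi \rangle\) is its spatial average. A single global Hamiltonian survives to generate evolution, and the unimodular-style step then supplies a global time conjugate to it.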

20 February 2014: Tessa Baker

“Cosmological Tests of Gravity”

Abstract: The past decade has witnessed a surge of interest in extensions of Einstein’s theory of General Relativity. It is hoped that such theories of ‘modified gravity’ might account for the observed accelerating expansion rate of the universe, providing a more satisfactory and physical explanation than that of a simple cosmological constant.
I will give an overview of current attempts to extend GR, and how to test them observationally. I’ll describe a formalism that has been constructed to carry out these tests, a cosmological analogue of the Parameterised Post-Newtonian framework (PPN) that is used to test gravity in the Solar System. I’ll show how this new formalism acts as a bridge between the (sometimes disparate!) worlds of theory and observation, allowing us to make real progress in our understanding of gravity.
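
To give the flavour of such a parameterization (one common two-function convention from the literature, offered as an illustration rather than as the speaker’s own formalism): deviations from GR in the growth of structure can be encoded as

\[
k^{2}\Psi = -4\pi G\,\mu(a,k)\,a^{2}\bar{\rho}\,\Delta, \qquad \Phi/\Psi = \gamma(a,k),
\]

where \(\Phi\) and \(\Psi\) are the metric potentials and \(\Delta\) the comoving density contrast (conventions for which potential enters the Poisson equation vary). GR is recovered when \(\mu = \gamma = 1\), so observational constraints on \(\mu\) and \(\gamma\) directly test gravity on cosmological scales.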

13 February 2014: Joerg Schmiedmayer

“How does the classical world emerge from microscopic quantum evolution?”

Abstract: The world around us follows the laws of classical physics: processes are irreversible, and there exists an ‘arrow of time’.  On the microscopic scale our world is governed by quantum physics, whose evolution is ‘unitary’ and reversible. How does the classical world emerge from the microscopic quantum world?  We conjecture that the classical world emerges naturally from the microscopic quantum world through the complexity of large quantum systems.  For the first time, we now have the ability to probe this conjecture in the laboratory. Modern experimental techniques allow us to monitor the evolution of isolated quantum systems in detail. First experiments using ensembles of ultracold atoms allow us to test fundamental aspects of the quantum-to-classical transition in a quantum system completely isolated from its environment. I will present the concepts behind the emergence conjecture and show first experiments that probe some of its fundamental aspects.

November 14 2013: Jeff Bub, University of Maryland

“Quantum Interactions with Closed Timelike Curves and Superluminal Signaling”

Abstract: There is now a significant body of results on quantum interactions with closed timelike curves (CTCs) in the quantum information literature, for both the Deutsch model of CTC interactions (D-CTCs) and the projective model (P-CTCs). As a consequence, there is a prima facie argument exploiting entanglement that CTC interactions would enable superluminal and, indeed, effectively instantaneous signaling. In cases of spacelike separation between the sender of a signal and the receiver, whether a receiver measures the local part of an entangled state or a disentangled state to receive the signal can depend on the reference frame. A proposed consistency condition gives priority to either an entangled perspective or a disentangled perspective in spacelike separated scenarios. For D-CTC interactions, the consistency condition gives priority to frames of reference in which the state is disentangled, while for P-CTC interactions the condition selects the entangled state. It follows that there is a procedure that allows Bob to signal to Alice in the past via relayed superluminal communications between spacelike separated Bob and Clio, and spacelike separated Clio and Alice. This opens the door to time travel paradoxes in the classical domain. Ralph (arXiv:1107.4675) first pointed this out for P-CTCs, but Ralph’s procedure for a ‘radio to the past’ is flawed. Since both D-CTCs and P-CTCs allow classical information to be sent around a spacetime loop, it follows from a result by Aaronson and Watrous (Proc. Roy. Soc. A, 465:631–647, 2009) for CTC-enhanced classical computation that a quantum computer with access to P-CTCs would have the power of PSPACE, equivalent to a D-CTC-enhanced quantum computer. (The talk represents joint work with Allen Stairs.)
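
For reference, the two models impose different consistency conditions (standard formulations from the CTC literature, stated in illustrative notation): Deutsch requires the CTC state to be a fixed point of its induced evolution, while the projective model amounts to a nonlinear map built from a partial trace of the interaction unitary,

\[
\rho_{\mathrm{CTC}} = \mathrm{Tr}_{\mathrm{sys}}\!\left[\,U(\rho_{\mathrm{in}} \otimes \rho_{\mathrm{CTC}})U^{\dagger}\right] \;\; \text{(D-CTC)}, \qquad
\rho_{\mathrm{out}} = \frac{C\rho_{\mathrm{in}}C^{\dagger}}{\mathrm{Tr}[\,C\rho_{\mathrm{in}}C^{\dagger}]}, \;\; C = \mathrm{Tr}_{\mathrm{CTC}}\,U \;\; \text{(P-CTC)}.
\]

The nonlinearity of both induced maps is what opens the door to the entanglement-based signaling described above.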

November 7 2013: Sam Fletcher, University of California, Irvine

“On the Reduction of General Relativity to Newtonian Gravitation”

Abstract: Accounts of the reduction of general relativity (GR) to Newtonian gravitation (NG) usually take one of two approaches. One considers the limit as the speed of light c → ∞, while the other focuses on the behaviour of particular formulae (e.g., for three-momentum) in the low-velocity limit, i.e., as v/c ≈ 0.  Although the first approach treats the reduction of relativistic spacetimes globally, many have argued that ‘c → ∞’ can at best be interpreted counterfactually, which is of limited value in explaining the past empirical success of NG.  The second, on the other hand, while more applicable to explaining this success, only treats a small fragment of GR.  Further, it usually applies only locally, hence is unable to account for the reduction of global structure.  Building on work by Ehlers, I propose a different account of the reduction relation that offers the global applicability of the c → ∞ limit while maintaining the explanatory utility of the v/c ≈ 0 approximation.  In doing so, I highlight the role that a topology on the collection of all spacetimes plays in defining the relation, and how the choice of topology corresponds with broader or narrower classes of observables that one demands be well-approximated in the limit.
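
In Ehlers-style frame theory (standard background in one common sign convention, not a claim about the talk’s precise formulation), relativistic and Newtonian spacetimes sit in a single family of models with temporal metric \(t_{ab}\) and inverse spatial metric \(s^{ab}\) satisfying

\[
s^{ab}t_{bc} = -\lambda\,\delta^{a}{}_{c}, \qquad \lambda = 1/c^{2},
\]

with λ > 0 giving relativistic models and λ = 0 Newton–Cartan ones. The reduction question then becomes in which topology on this collection a curve of models converges as λ → 0, which is exactly where the choice of observables enters.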

October 31 2013: Paul Hoyningen-Huene, Leibniz University of Hannover.

“The dead end objection against convergent realisms.”

Abstract: The target of the dead end objection is any kind of scientific realism that bases its plausibility on the stable presence of some X in a sequence of succeeding theories. For instance, if X is a set of theoretical entities that remains stable even over some scientific revolutions, this may be taken as support for convergent scientific realism about entities. Likewise, if X is a similarly stable set of structures of theories, this may be taken as support for (convergent) structural realism. The dead end objection states that the conceded stability of X could also be due to the existence of an empirically extremely successful though ontologically significantly false theory. In this case, the inference from the stability of X to the probable reality of X would become invalid. Three examples from the history of science illustrate how the stability of some X over an extended period of time was indeed erroneously taken to indicate the finality of X.

October 24 2013: Basil Hiley, Birkbeck College, University of London.

“Bohmian Non-commutative Dynamics: Local Conditional Expectation Values are Weak Values.”

Abstract: Quantum dynamics can be described by two non-commutative geometric Clifford algebras, one of which describes the properties of the covering space of the symplectic manifold [1]. This gives rise to a non-commutative probability theory with conditional expectation values that correspond to local quantum properties which appear as weak values [3]. Examples of these are the T^{0μ}(x) components of the energy-momentum tensor which, in turn, correspond to the Bohm momentum and Bohm energy for Schrödinger, Pauli and Dirac particles [2]. In the case of photons, the Bohm momentum has already been measured by Kocsis [4]. I will explain the theoretical background and discuss some new experiments involving weak measurements on non-zero rest mass particles that are being developed by Robert Flack (UCL) and myself to explore these ideas further.
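
For orientation (standard definitions, stated in illustrative notation): the weak value of an observable \(\hat{A}\) with preselected state \(\psi\) and postselection \(\phi\), and the identification of the Bohm momentum as the real part of the position-postselected weak value of momentum, read

\[
\langle \hat{A} \rangle_{w} = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle}, \qquad
p_{B}(x) = \mathrm{Re}\,\frac{\langle x | \hat{P} | \psi \rangle}{\langle x | \psi \rangle} = \partial_{x}S(x) \;\; \text{for } \psi = R\,e^{iS/\hbar}.
\]

The second identity is what makes the Bohm momentum experimentally accessible via weak measurements.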

October 17 2013: Edward Anderson, DAMTP, Cambridge University.

“Background independence”

Abstract: This talk concerns what background independence itself is (as opposed to some particular physical theory that is background independent). This notion mostly arises from a layer-by-layer analysis of the facets of the Problem of Time in Quantum Gravity. Part of this notion consists of relational postulates. These are identified as classical precursors of two of the facets, and are tied to the forms of the GR Hamiltonian and momentum constraints respectively. Other aspects of Background Independence include the algebraic closure of these constraints, the expression of physics in terms of beables, foliation-independence, and the reconstruction of spacetime from space. The final picture is that background independence – a philosophically desirable and physically implementable feature for a theory to have – has the facets of the Problem of Time among its consequences. These facets thus arise naturally and are problems to be resolved, as opposed to avoided ‘by making one’s physics background-dependent in order not to have these problems’. This serves as a selection criterion that limits the use of a number of model arenas and physical theories.
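
The ‘algebraic closure’ referred to here is that of the hypersurface-deformation (Dirac) algebra of the momentum and Hamiltonian constraints; in its standard smeared form (quoted as background, notation illustrative):

\[
\{D[\xi], D[\eta]\} = D[\,[\xi,\eta]\,], \qquad
\{D[\xi], H[N]\} = H[\xi^{a}\partial_{a}N], \qquad
\{H[N], H[M]\} = D[\,h^{ab}(N\partial_{b}M - M\partial_{b}N)\,],
\]

where \(D[\xi]\) smears the momentum constraints, \(H[N]\) the Hamiltonian constraint, and \(h^{ab}\) is the inverse spatial metric. The metric-dependent structure functions in the last bracket are why closure is a substantive condition rather than a Lie-algebraic triviality.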