**MICHAELMAS TERM 2017**

Week 2 (October 19) Henrique Gomes, Perimeter Institute, Waterloo.

“New vistas from the many-instant landscape”

Abstract: Quantum gravity has many conceptual problems. Amongst the best known is the “Problem of Time”: gravitational observables are global in time, while we would really like to obtain probabilities for processes taking us from an observable at one time to another, later one. Tackling these questions using relationalism will be the preferred strategy during this talk. The ‘relationalist’ approach leads us to shed much redundant information and enables us to identify a reduced configuration space as the arena on which physics unfolds, a goal still beyond our reach in general relativity. Moreover, basing our ontology on this space has far-reaching consequences. One is that it suggests a natural interpretation of quantum mechanics: a form of ‘Many-Worlds’ which I have called Many-Instant Bayesianism. Another is that the gravitational reduced configuration space has a rich, highly asymmetric structure which singles out preferred, non-singular and homogeneous initial conditions for a wave-function of the universe, a structure which is yet to be explored.

Week 3 (October 26) Jonathan Halliwell, Imperial College, London

“Comparing conditions for macrorealism: Leggett-Garg inequalities vs no-signalling in time”

Abstract: Macrorealism is the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state.

I discuss two different types of conditions which have been proposed to test macrorealism in the context of a system described by a single dichotomic variable Q. The Leggett-Garg (LG) inequalities, the most commonly studied test, are only necessary conditions for macrorealism, but I show that when the four three-time LG inequalities are augmented with a certain set of two-time inequalities also of the LG form, Fine’s theorem applies and these augmented conditions are then both necessary and sufficient. A comparison is carried out with a very different set of necessary and sufficient conditions for macrorealism, namely the no-signalling in time (NSIT) conditions proposed by Brukner, Clemente, Kofler and others, which ensure that all probabilities for Q at one and two times are independent of whether earlier or intermediate measurements are made in a given run, and do not involve (but imply) the LG inequalities. I argue that tests based on the LG inequalities have the form of very weak classicality conditions and can be satisfied, in quantum mechanics, in the face of moderate interference effects, but those based on NSIT conditions have the form of much stronger coherence witness conditions, satisfied only for zero interference. The two tests differ in their implementation of non-invasive measurability so are testing different notions of macrorealism. The augmented LG tests are indirect, entailing a combination of the results of different experiments with only compatible quantities measured in each experimental run, in close analogy with Bell tests, and are primarily tests for macrorealism per se. By contrast the NSIT tests entail sequential measurements of incompatible quantities and are primarily tests for non-invasiveness.
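For reference, the conditions discussed above can be written out explicitly. With a dichotomic variable $Q_i = \pm 1$ measured at times $t_1 < t_2 < t_3$ and two-time correlators $C_{ij} = \langle Q_i Q_j \rangle$, the four three-time LG inequalities, together with two-time inequalities of the same form, are standardly written as follows (the notation is a reconstruction for the reader’s convenience, not taken from the talk):

```latex
% The four three-time Leggett-Garg inequalities for Q_i = ±1,
% with correlators C_ij = <Q_i Q_j>:
\begin{align}
  1 + C_{12} + C_{13} + C_{23} &\geq 0, \\
  1 + C_{12} - C_{13} - C_{23} &\geq 0, \\
  1 - C_{12} + C_{13} - C_{23} &\geq 0, \\
  1 - C_{12} - C_{13} + C_{23} &\geq 0.
\end{align}
% Two-time inequalities of LG form, one for each pair (i,j)
% and each choice of signs s_i, s_j = ±1:
\begin{equation}
  1 + s_i \langle Q_i \rangle + s_j \langle Q_j \rangle + s_i s_j C_{ij} \geq 0.
\end{equation}
```

By Fine’s theorem, joint satisfaction of such a set of pairwise conditions is equivalent to the existence of a single joint probability distribution for $Q_1, Q_2, Q_3$ matching the measured one- and two-time marginals.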

Based on the two papers J. J. Halliwell, Phys. Rev. A 93, 022123 (2016) and Phys. Rev. A 96, 012121 (2017).

Week 4 (November 2) Sam Fletcher, Dept of Philosophy, University of Minnesota.

“Emergence and scale’s labyrinth”

Abstract: I give precise formal definitions of a hierarchy of emergence concepts for properties described in models of physical theories, showing how some of these concepts are compatible with reductive (but not strictly deductive) relationships between these theories. Besides applying fruitfully to a variety of physical examples, these concepts do not in general track autonomy or novelty along a single simple dimensional scale such as energy, length, or time, but can instead involve labyrinthine balancing relationships between these scales. This complicates the usual view of emergence as relating linearly (or even partially) ordered levels.

Week 5 (November 9) James Ladyman, Dept of Philosophy, University of Bristol

“Why interpret quantum mechanics?”

Abstract: I discuss recent arguments that QM needs no interpretation, and that it should be understood as not representational. I consider how the interpretation of quantum mechanics relates to various kinds of realism, and the fact that the theory is known not to be a complete theory of the world. I tentatively suggest a position that is sceptical about the way the interpretation of quantum mechanics is often undertaken, in particular of the idea of the ontology of the wavefunction, but stops short of regarding quantum states as not representational.

Week 6 (November 16) Hasok Chang, Department of History and Philosophy of Science, University of Cambridge.

“Beyond truth-as-correspondence: Realism for realistic people”

Abstract: In this paper I present arguments against the epistemological ideal of “correspondence”, namely the deeply entrenched notion that empirical truth consists in the match between our theories and the world. The correspondence ideal of knowledge is not something we can actually pursue, for two reasons: it is difficult to discern a coherent sense in which statements correspond to language-independent facts, and we do not have the kind of independent access to the “external world” that would allow us to check the alleged statement–world correspondence. The widespread intuition that correspondence is a pursuable ideal is based on an indefensible kind of externalist referential semantics. The idea that a scientific theory “represents” or “corresponds to” the external world is a metaphor grounded in other human epistemic activities that are actually representational. This metaphor constitutes a serious and well-entrenched obstacle in our attempt to understand scientific practices, and overcoming it will require some disciplined thinking and hard work. On the one hand, we need to continue with real practices of representation in which correspondence can actually be judged; on the other hand, we should stop the illegitimate transfer of intuitions from those practices over to realms in which there are no representations being made and no correspondence to check.

Week 7 (November 23) Alison Fernandes, Department of Philosophy, University of Warwick.

“The temporal asymmetry of chance”

Abstract: The Second Law of Thermodynamics can be derived from the fact that an isolated system at non-maximal entropy is overwhelmingly likely to increase in entropy over time. Such derivations seem to make ineliminable use of objective worldly probabilities (chances). But some have argued that if the fundamental laws are deterministic, there can be no non-trivial chances (Popper, Lewis, Schaffer). Statistical-mechanical probabilities are merely epistemic, or otherwise less real than ‘dynamical’ chances. Many have also thought that chance is intrinsically temporally asymmetric. It is part of the nature of chance that the past is ‘fixed’, and that all non-trivial chances must concern future events. I’ll argue that it is no coincidence that many have held both views: the rejection of deterministic chance is driven by an asymmetric picture of chance in which the past produces the future. I’ll articulate a more deflationary view, according to which more limited temporal asymmetries of chance reflect contingent asymmetries of precisely the kind reflected in the Second Law. The past can be chancy after all.

Week 8 (November 30) Nancy Cartwright, Department of Philosophy, University of Durham and University of California, San Diego

TBC

**TRINITY TERM 2017**

**15th June 2017** Harvey Brown (Oxford), “QBism: the ineffable reality behind ‘participatory realism’”

Abstract: The recent philosophy of Quantum Bayesianism, or QBism, represents an attempt to solve the traditional puzzles in the foundations of quantum theory by denying the objective reality of the quantum state. Einstein had hoped to remove the spectre of nonlocality in the theory by also assigning an epistemic status to the quantum state, but his version of this doctrine was recently proved to be inconsistent with the predictions of quantum mechanics. In this talk, I present plausibility arguments, old and new, for the reality of the quantum state, and expose what I think are weaknesses in QBism as a philosophy of science.

**8th June 2017** David Jackson (Independent), “How to build a unified field theory from one dimension”

Abstract: Motivated in part by Kant’s work on the a priori nature of space and time, and in part by the conceptual basis of general relativity, a physical theory deriving from a single temporal dimension will be presented. We describe how the basic arithmetic composition of the real line, representing the one dimension of time, itself incorporates structures that can be interpreted as underpinning both the geometrical form of space and the physical form of matter. This unification scheme has a number of features in common with a range of physical theories based on ‘extra dimensions’ of space, while being heavily constrained in deriving from a single dimension of time. A proposal for combining general relativity with quantum theory in the context of this approach will be summarised, along with the connections made with empirical observations. In addition to extracts from Kant, further references to sources in the philosophical literature will be cited, in particular with regard to the relation between mathematical objects and physical structures.

**1st June 2017** Jo E. Wolff (KCL), “Quantities – Metaphysical Choicepoints”

Abstract: Beginning from the assumption that quantities are (rich) relational structures, I ask what kind of ontology arises from attributing this sort of structure to physical attributes. There are three natural questions to ask about relational structures: What are the relations, what are the relata, and what is the relationship between relata and relations? I argue that for quantities, the choicepoints available in response to these questions are:

1) intrinsicalism vs. structuralism

2) substantivalism vs. anti-substantivalism

3) absolutism vs. comparativism

In the remainder of the talk I sketch which of these choices make for coherent candidate ontologies for quantities.

**18th May 2017** Paul Tappenden (Independent), “Quantum fission”.

Abstract: Sixty years on there is still deep division about Everett’s proposal. Some very well informed critics take the whole idea to be unintelligible whilst there are important disagreements amongst supporters. I argue that Everett’s fundamental and radical idea is to do with metaphysics rather than physics: it is to abolish the physically possible/actual dichotomy. I show that the idea is intelligible via a thought experiment involving a novel version of the mind-body relation which I have already used in the defence of semantic internalism.

The argument leads to a fission interpretation of branching rather than a “divergence” interpretation of the sort first suggested by David Deutsch in 1985 and more recently developed in different ways by Simon Saunders, David Wallace and Alastair Wilson. I discuss the two metaphysical problems which fission faces: transtemporal identity and the identification of probability with relative branch measure. And I claim that the Born rule applies transparently if the alternative mind-body relation is accepted. The upshot is that what Wallace calls the Radical View replaces his preferred Conservative View, with the result that there are some disturbing consequences such as inevitable personal survival in quantum Russian roulette scenarios and David Lewis’s suggestion that Everettians should “shake in their shoes”.

**11th May 2017** Michela Massimi (Edinburgh), “Perspectival models in contemporary high-energy physics”.

Abstract: In recent times perspectivism has come under attack. Critics have argued that when it comes to modelling, perspectivism is either redundant, or, worse, it leads to a plurality of incompatible or even inconsistent models about the same target system. In this paper, I attend to two tasks. First, I try to get clear about the charge of metaphysical inconsistency that has been levelled against perspectivism and identify some key assumptions behind it. Second, I propose a more positive role for perspectivism in some modelling practices by identifying a class of models, which I call “perspectival models”. I illustrate this class of models with examples from contemporary LHC physics.

**4th May 2017** Tushar Menon (Oxford), “Affine Balance: Algebraic functionalism and the ontology of spacetime”.

Abstract: Our two most empirically successful theories, quantum mechanics and general relativity, are at odds with each other when it comes to several foundational issues. The deepest of these issues is also, perhaps, the easiest to grasp intuitively: what is spacetime? Most attempts at theories of quantum gravity do not make it obvious which degrees of freedom are spatiotemporal. In non-general relativistic theories, the matter/spacetime distinction is adequately tracked by the dynamical/non-dynamical object distinction. General relativity is different, because spacetime, if taken to be jointly, but with some redundancy, represented by a smooth manifold and a metric tensor field, is not an immutable, inert, external spectator. Our dynamical/non-dynamical distinction appears no longer to do the work for us; we appear to need something else. In the first part of this talk, I push back against the idea that the dynamical/non-dynamical distinction is doomed. I motivate a more general algebraic characterisation of spacetime based on Eleanor Knox’s spacetime functionalism, and the Helmholtzian notion of free mobility. I argue that spacetime is most usefully characterised by its (local) affine structure.

In the second part of this talk, I consider the debate between Brown and Pooley on the one hand and Janssen and Balashov on the other, about the direction of the arrow of explanation in special relativity. Characterising spacetime using algebraic functionalism, I demonstrate that only Brown’s position is neutral on the substantivalism–relationalism debate. This neutrality may prove to be highly desirable in an interpretation of spacetime that one hopes will generalise to theories of quantum gravity—it seems like poor practice to impose restrictions on an acceptable quantum theory of spacetime based on metaphysical prejudices or approximately true effective field theories. The flexibility of Brown’s approach affords us a theory-dependent a posteriori identification of spacetime, and arguably counts in its favour. I conclude by gesturing towards how this construction might be useful in extending Brown’s view to theories of quantum gravity.

**27th April 2017** Peter Hylton (UIC), “Analyticity, yet again”.

Abstract: Although Quine became famous for having rejected the analytic-synthetic distinction, he actually accepted it for the last quarter century of his philosophical career. Yet his doing so makes no difference to his other views. In this talk, I press the question ‘Why not?’, in the hope of gaining insight into Quine’s views, and especially his differences with Carnap. I contrast Quine’s position not only with Carnap’s but also with those of Putnam, as represented in his paper ‘The Analytic and the Synthetic’. Putnam there puts forward an answer to the ‘Why not?’ question which is, I think, fairly widely accepted, and perhaps taken to be Quine’s answer as well—wrongly so taken, I claim.

**9th March 2017** Michael Hicks (Physics, Oxford), “Explanatory (a)symmetries and Humean laws”.

Abstract: Recently, Lange (2009) has argued that some physical principles are explanatorily prior to others. Lange’s main examples are symmetry principles, which he argues explain both conservation laws (through Noether’s theorem) and features of dynamical laws (for example, the Lorentz invariance of QFT). Lange calls these “meta-laws” and claims that his account of laws, built around the counterfactual stability of groups of statements, can capture the fact that they govern or constrain first-order laws, whereas other views, principally Humean views, cannot. After reviewing the problem Lange presents, I’ll show how the explanatory asymmetry between laws that he describes follows naturally from a Humean understanding of laws as particularly informative summaries. The Humean should agree with Lange that symmetry principles are explanatorily prior to both conservation laws and dynamical theories like QFT; however, I’ll argue that Lange is wrong to consider these principles “meta-laws” which in some way govern first-order laws, and I’ll show that on the Humean view, the explanation of these two sorts of laws from symmetry principles is importantly different.

**2nd March 2017** Ronnie Hermens (Philosophy, Groningen), “How ψ-ontic are ψ-ontic models?”.

Abstract: Ψ-ontology theorems show that in any ontic model that is able to reproduce the predictions of quantum mechanics, the quantum state must be encoded by the ontic state. Since the ontic state determines what is real, and it determines the quantum state, the quantum state must be real. But how precisely does this work, and what does the result imply for the status of the quantum state in ψ-ontic models? As a test case I will look at the ontic models of Meyer, Kent and Clifton. Since these models are able to reproduce the predictions of quantum mechanics, they must be ψ-ontic. On the other hand, quantum states play no role whatsoever in the construction of these models. Thus finding out which ontic state belongs to which quantum state is a non-trivial task. But once that is done, we can ask: does the quantum state play any explanatory role in these models, or is the fact that they are ψ-ontic a mere mathematical nicety?

**23rd February 2017** Simon Saunders (Philosophy, Oxford), “Quantum monads”.

Abstract: The notion of object (and with it ontology) in the foundations of quantum mechanics has been made both too easy and too hard: too easy, because particle distinguishability, and with it the use of proper names, is routinely assumed; too hard, because a number of metaphysical demands have been made of it (for example, in the notion of ‘primitive ontology’ in the writings of Sheldon Goldstein and his collaborators). The measurement problem is also wrapped up with it. I shall first give an account of quantum objects adequate to the thin sense required of quantification theory (in the tradition of Frege and Quine); I then consider an alternative, much thicker notion that is strongly reminiscent of Leibniz’s monadology. Both apply to the Everett interpretation and to dynamical collapse theories (sans primitive ontology).

**16th February 2017** Steven Balbus (Physics, Oxford), “An anthropic explanation for the nearly equal angular diameters of the Sun and Moon”.

Abstract: The very similar angular sizes of the Sun and Moon as subtended at the Earth are generally portrayed as coincidental. In fact, close angular size agreement is a direct and inevitable mathematical consequence of even roughly comparable lunar and solar tidal amplitudes. I will argue that the latter was a biological imperative for the evolution of land vertebrates and can be understood on the basis of anthropic arguments. Comparable tidal amplitudes from two astronomical sources with close but distinct frequencies lead to strongly modulated forcing: in essence, spring and neap tides. The appearance of this surely very rare tidal pattern must be understood in the context of the paleogeography and biology of the Late Devonian period. Two great land masses were separated by a broad opening tapering to a very narrow, shallow-sea strait. The combination of this geography and modulated tidal forces would have been conducive to forming a rich inland network of shallow but transient (and therefore isolating) tidal pools at an epoch when fishy tetrapods were evolving and acquiring land navigational skills. I will discuss the recent fossil evidence showing that important transitional species lived in habitats strongly influenced by intermittent tides. It may be that any planet capable of harbouring a contemplative species displays a moon in its sky very close in angular diameter to that of its sun.
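The step from comparable tidal amplitudes to comparable angular sizes can be sketched in a few lines (a back-of-envelope Newtonian reconstruction, not text from the talk; $\rho$ denotes mean density, $\theta = R/d$ the angular radius):

```latex
% Leading-order tidal acceleration at Earth from a body of mass M,
% radius R, at distance d: a_tide ≈ 2 G M R_E / d^3.
% Writing M = (4π/3) ρ R^3 and angular radius θ = R/d:
\begin{equation}
  a_{\mathrm{tide}} \approx \frac{2 G M R_{\oplus}}{d^{3}}
  = \frac{8\pi}{3}\, G \rho\, R_{\oplus} \left(\frac{R}{d}\right)^{3}
  = \frac{8\pi}{3}\, G \rho\, R_{\oplus}\, \theta^{3},
\end{equation}
% so the ratio of lunar to solar tidal amplitudes depends only on
% mean densities and angular radii:
\begin{equation}
  \frac{a_{\mathrm{Moon}}}{a_{\mathrm{Sun}}}
  = \frac{\rho_{\mathrm{Moon}}}{\rho_{\mathrm{Sun}}}
    \left(\frac{\theta_{\mathrm{Moon}}}{\theta_{\mathrm{Sun}}}\right)^{3}.
\end{equation}
```

Because the angular-size ratio enters cubed, tidal amplitudes of the same order force the angular sizes to agree to within the cube root of the density ratio, a factor close to unity.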

**9th February 2017** Alastair Wilson (Philosophy, Birmingham), “How multiverses might undercut the fine-tuning argument”.

Abstract: In the context of the probabilistic fine-tuning argument that moves from the fragility of cosmological parameters with respect to life to the existence of a divine designer, appealing to the existence of a multiverse has in general seemed problematically ad hoc. The situation looks rather different, though, if there is independent evidence from physics for a multiverse. I will argue that independently-motivated multiverses can be undercutting defeaters for the fine-tuning argument; but whether the argument is indeed undercut still depends on open questions in fundamental physics and cosmology. I will also argue that Everettian quantum mechanics opens up new routes to undercutting the fine-tuning argument, although by itself it is insufficient to do so.

**26th January 2017** Antony Eagle (Philosophy, Adelaide), “Quantum location”.

Abstract: Many metaphysicians are committed to the existence of a location relation between material objects and spacetime, useful in characterising debates in the metaphysics of persistence and time, particularly in the context of trying to map ordinary objects into models of relativity theory. Relatively little attention has been paid to location in quantum mechanics, despite the existence of a position observable in QM being one of the few things metaphysicians know about it. I want to explore how the location relation(s) postulated by metaphysicians might be mapped onto the framework of QM, with particular reference to the idea that there might be such a thing as being indeterminately located.

**19th January 2017** Emily Adlam (DAMTP, Cambridge), “Quantum mechanics and global determinism”.

Abstract: We propose that the information-theoretic features of quantum mechanics are perspectival effects which arise because experiments on local variables can only uncover a certain subset of the correlations exhibited by an underlying deterministic theory. We show that the no-signalling principle, information causality, and strong subadditivity can be derived in this way; we then use our approach to propose a new resolution of the black hole information paradox.

**24 Nov 2016** David Glick (Philosophy, Oxford), “Swapping Something Real: Entanglement Swapping and Entanglement Realism”.

Abstract: Experiments demonstrating entanglement swapping have been alleged to challenge realism about entanglement. Seevinck (2006) claims that entanglement “cannot be considered ontologically robust” while Healey (2012) claims that entanglement swapping “undermines the idea that ascribing an entangled state to quantum systems is a way of representing some new, non-classical, physical relation between them.” My aim in this paper is to show that realism is not threatened by the possibility of entanglement swapping, but rather, should be informed by the phenomenon. I argue—expanding the argument of Timpson and Brown (2010)—that ordinary entanglement swapping cases present no new challenges for the realist. With respect to the delayed-choice variant discussed by Healey, I claim that there are two options available to the realist: (a) deny these are cases of genuine swapping (following Egg (2013)) or (b) allow for the existence of entanglement relations between timelike separated regions. This latter option, while radical, is not incoherent and has been suggested in quite different contexts. While I stop short of claiming that the realist must take this option, doing so allows one to avoid certain costs associated with Egg’s account. I conclude by noting several important implications of entanglement swapping for how one thinks of entanglement relations more generally.

**17 Nov 2016** Jim Weatherall (UC Irvine), “On Stuff: The Field Concept in Classical Physics”.

Abstract: Discussions of physical ontology often come down to two basic options. Either the basic physical entities are particles, or else they are fields. I will argue that, in fact, it is not at all clear what it would mean to say that the world consists of fields. Speaking classically (i.e., non-quantum-ly), there are many different sorts of thing that go by the name “field”, each with different representational roles. Even among those that have some claim to being “fundamental” in the appropriate sense, it does not seem that a single interpretational strategy could apply in all cases. I will end by suggesting that standard strategies for constructing quantum theories of fields are not sensitive to the different roles that “fields” can play in classical physics, which adds a further difficulty to interpreting quantum field theory. Along the way, I will say something about an old debate in the foundations of relativity theory, concerning whether the spacetime metric is a “geometrical” or “physical” field. The view I will defend is that the metric is much like the electromagnetic field: geometrical!

**10 Nov 2016** Lina Jansson (Nottingham), ‘Newton’s Methodology Meets Humean Supervenience about Laws of Nature’.

Abstract: Earman and Roberts [2005a,b] have argued for Humean supervenience about laws of nature based on an argument from epistemic access. In rough outline, their argument relies on the claim that if Humean supervenience is false, then we cannot have any empirical evidence in favour of taking a proposition to be a law of nature as opposed to merely accidentally true. I argue that Newton’s methodology in the Principia provides a counterexample to their claim. In particular, I argue that the success or failure of chains of subjunctive reasoning is empirically accessible, and that this provides a way of gaining empirical evidence for or against a proposition being a law of nature (even under the assumption that Humean supervenience fails).

**27 Oct 2016** Ryan Samaroo (Bristol), “The Principle of Equivalence is a Criterion of Identity”.

Abstract: In 1907 Einstein had an insight into gravitation that he would later refer to as ‘the happiest thought of my life’. This is the hypothesis, roughly speaking, that bodies in free fall do not ‘feel’ their own weight. This is what is formalized in ‘the equivalence principle’. The principle motivated a critical analysis of the Newtonian and 1905 inertial frame concepts, and it was indispensable to Einstein’s argument for a new concept of inertial motion. A great deal has been written about the equivalence principle. Nearly all of this work has focused on the content of the principle, but its methodological role has been largely neglected. A methodological analysis asks the following questions: what kind of principle is the equivalence principle? What is its role in the conceptual framework of gravitation theory? I maintain that the existing answers are unsatisfactory and I offer new answers.

**20 Oct 2016** Niels Martens (Oxford, Philosophy), “Comparativism about Mass in Newtonian Gravity”.

Abstract: Absolutism about mass asserts that facts about mass ratios are true in virtue of intrinsic masses. Comparativism about mass denies this. I present and dismiss Dasgupta’s (2013) analysis of his recent empirical adequacy argument in favour of comparativism—in the context of Newtonian Gravity. I develop and criticise two new versions of comparativism. Regularity Comparativism forms a liberalisation of Huggett’s Regularity Relationalism (2006), which uses the Mill-Ramsey-Lewis Best System’s Account to respond to Newton’s bucket argument in the analogous relationalism-substantivalism debate. To the extent that this approach works at all, I argue that it works too well: it throws away the massive baby with the bathwater. A Machian flavoured version of comparativism is more promising. Although it faces no knock-down objection, it is not without its own problems.

**13 Oct 2016** David Wallace (USC, Philosophy), “Fundamental and emergent geometry in Newtonian gravity”.

Abstract: Using as a starting point recent and apparently incompatible conclusions by Simon Saunders (Philosophy of Science 80 (2013) pp. 22-48) and Eleanor Knox (British Journal for the Philosophy of Science 65 (2014) pp. 863-880), I revisit the question of the correct spacetime setting for Newtonian physics. I argue that understood correctly, these two theories make the same claims both about the background geometry required to define the theory, and about the inertial structure of the theory. In doing so I illustrate and explore in detail the view — espoused by Knox, and also by Harvey Brown (Physical Relativity, OUP 2005) — that inertial structure is defined by the dynamics governing subsystems of a larger system. This clarifies some interesting features of Newtonian physics, notably (i) the distinction between using the theory to model subsystems of a larger whole and using it to model complete Universes, and (ii) the scale-relativity of spacetime structure.

**19 May 2016** Eleanor Knox (KCL, Philosophy), “Novel Explanation and the Emergence of Phonons”.

Abstract: Discussions of emergence in the philosophy of physics literature often emphasise the role of asymptotic limits in understanding the novelty of emergent phenomena while leaving the nature of the novelty in question unexplored. I’ll put forward an account of explanatory novelty that can accommodate examples involving asymptotic limits, but also applies in other cases. The emergence of phonons in a crystal lattice will provide an example of a description with novel explanatory power that does not depend on asymptotic limits for its novelty. The talk is based on joint work with Alex Franklin.

**12th May 2016** Yvonne Geyer (Oxford, Maths), “Rethinking Quantum Field Theory: Traces of String Theory in Yang-Mills and Gravity”.

Abstract: A multitude of recent developments point towards the need for a different understanding of Quantum Field Theories. After a general introduction, I will focus on one specific example involving one of the most natural and fundamental observables: the scattering amplitude. In Yang-Mills theory and Einstein gravity, scattering amplitudes exhibit a simplicity that is completely obscured by the traditional approach to Quantum Field Theories, and that is remarkably reminiscent of the worldsheet models describing string theory. In particular, this implies that – without additional input – the theories describing our universe, Yang-Mills theory and gravity, exhibit traces of string theory.

**28th April 2016** Roman Frigg (LSE, Philosophy), “Further Rethinking Equilibrium”.

Abstract: In a recent paper we proposed a new definition of Boltzmannian equilibrium and showed that in the case of deterministic dynamical systems the new definition implies the standard characterisation but without suffering from its well-known problems and limitations. We now generalise this result to stochastic systems and show that the same implication holds. We then discuss an existence theorem for equilibrium states and illustrate with a number of examples how the theorem works. Finally, first steps towards understanding the relation between Boltzmannian and Gibbsian equilibrium are made.

**25 Feb 2016** Stephen J. Blundell (Oxford, Physics), ‘Emergence, causation and storytelling: condensed matter physics and the limitations of the human mind’

Abstract: The physics of matter in the condensed state is concerned with problems in which the number of constituent particles is vastly greater than can be comprehended by the human mind. The physical limitations of the human mind are fundamental and restrict the way in which we can interact with and learn about the universe. This presents challenges for developing scientific explanations that are met by emergent narratives, concepts and arguments that have a non-trivial relationship to the underlying microphysics. By examining examples within condensed matter physics, and also from cellular automata, I show how such emergent narratives efficiently describe elements of reality.

**18 Feb 2016** Jean-Pierre Llored (University of Clermont-Ferrand), ‘From quantum physics to quantum chemistry’.

Abstract: The first part, which is mainly anthropological, summarizes the results of a survey that we carried out in several research laboratories in 2010. Our aims were to understand what quantum chemists currently do, what kind of questions they ask, and what kind of problems they have to face when creating new theoretical tools both for understanding chemical reactivity and predicting chemical transformations.

The second part, which is mainly historical, highlights the philosophical underpinnings that structure the development of quantum chemistry from the 1920s to the present day. In so doing, we will discuss chemical modeling in quantum chemistry, and the different strategies used in order to define molecular features using atomic ones and the molecular surroundings at the same time. We will show how computers and new laboratories emerged simultaneously, and reshaped the culture of quantum chemistry. This part goes on to describe how the debate between ab initio and semi-empirical methods turned out to be highly controversial because of underlying scientific and metaphysical assumptions about, for instance, the nature of the relationships between science and the possibility for human knowledge to reach a complete description of the world.

The third and last part is about the philosophical implications for the study of quantum chemistry and that of ‘quantum sciences’ at large. It emphasises that the history of quantum chemistry is also a history of the attempts of chemists to establish the autonomy of their theories and methods with respect to physical, mathematical, and biological theories. According to this line of argument, chemists gradually proposed new concepts in order to circumvent the impossibility of performing full analytical calculations and to make the language of classical structural chemistry and that of quantum chemistry compatible. Among other topics, we will query the meaning of a chemical bond, the impossibility of deducing a molecular shape from the Schrödinger equation, the way quantum chemistry is invoked to explain the periodic table, and the possibility of going beyond the Born-Oppenheimer approximation. We would like to show that quantum chemistry is neither physics nor chemistry nor applied mathematics, and that philosophical debates which turned out to be relevant in quantum physics are not necessarily so in quantum chemistry, whereas other philosophical questions arise…

**11th Feb 2016** David Wallace (Oxford, Philosophy), ‘Who’s afraid of coordinate systems?’.

Abstract: Coordinate-based approaches to physical theories remain standard in mainstream physics but are largely eschewed in foundational discussion in favour of coordinate-free differential-geometric approaches. I defend the conceptual and mathematical legitimacy of the coordinate-based approach for foundational work. In doing so, I provide an account of the Kleinian conception of geometry as a theory of invariance under symmetry groups; I argue that this conception continues to play a very substantial role in contemporary mathematical physics and indeed that supposedly ‘coordinate-free’ differential geometry relies centrally on this conception of geometry. I discuss some foundational and pedagogical advantages of the coordinate-based formulation and briefly connect it to some remarks of Norton on the historical development of geometry in physics during the establishment of the general theory of relativity.

**21 Jan 2016** Philipp Roser (Clemson), ‘Time and York time in quantum theory’.

Abstract: Classical general relativity has no notion of a physically meaningful time parameter and one is free to choose one’s coordinates at will. However, when attempting to quantise the theory this freedom leads to difficulties, the notorious `problem of time’ of canonical quantum gravity. One way to overcome this obstacle is the identification of a physically fundamental time parameter. Interestingly, although purely aesthetic at the classical level, different choices of time parameter may in principle lead to different quantum phenomenologies, as I will illustrate with a simple model. This means that an underlying physically fundamental notion of time may (to some extent) be detectable via quantum effects.

For various theoretical reasons one promising candidate for a physical time parameter is `York time’, named after James York and his work on the initial-value problem of general relativity, where its importance first became apparent. I will derive the classical and quantum dynamics with respect to York time for certain cosmological models and discuss some of the unconventional structural features of the resulting quantum theory.

**3 Dec 2015** Thomas Moller-Nielsen (Oxford), “Symmetry and the Interpretation of Physical Theories”

Abstract: In this talk I examine two (putative) ways in which symmetries can be used as tools for physical theory interpretation. First, I examine the extent to which symmetries can be used as a guide to a theory’s ideology: that is, as a means of determining which quantities are real, according to the theory. Second, I examine the extent to which symmetries can be used as a guide to a theory’s ontology: that is, as a means of determining which objects are real, according to the theory. I argue that symmetries can only legitimately be used in the first, but not the second, sense.

**26 Nov 2015** Ellen Clarke (All Souls), “Biological Ontology”.

Abstract: All sciences invent kind concepts: names for categories that gather particulars together according to their possession of some scientifically interesting properties. But kind concepts must be well-motivated: they need to do some sort of work for us. I show how to define one sort of scientific concept – that of the biological individual, or organism – so that it does plenty of work for biology. My view understands biological individuals as defined by the process of evolution by natural selection. I will engage in some speculation about how the situation compares in regard to other items of scientific ontology.

**19 November 2015** Dan Bedingham (Oxford), “Dynamical Collapse of the Wavefunction and Relativity”.

Abstract: When a collapse of the wave function takes place it has an instantaneous effect over all space. One might then assume that a covariant description is not possible since a collapse whose effects are simultaneous in one frame of reference would not have simultaneous effects in a boosted frame. I will show, however, that in fact a consistent covariant picture emerges in which the collapsing wave function depends on the choice of foliation of spacetime, but that suitably defined local properties are unaffected by this choice. The formulation of a covariant description is important for models attempting to describe the collapse of the wave function as a dynamical process. This is a very direct approach to solving the quantum measurement problem: it involves simply giving the wave function the stochastic dynamics that it has in practice. I will present some proposals for relativistic versions of dynamical collapse models.

**12 November 2015** Karim Thébault (Bristol) “Regarding the ‘Hole Argument’ and the ‘Problem of Time’”

Abstract: The canonical formalism of general relativity affords a particularly interesting characterisation of the infamous hole argument. It also provides a natural formalism in which to relate the hole argument to the problem of time in classical and quantum gravity. In this paper I will examine the connection between these two much discussed problems in the foundations of spacetime theory along two interrelated lines. First, from a formal perspective, I will consider the extent to which the two problems can and cannot be precisely and distinctly characterised. Second, from a philosophical perspective, I will consider the implications of various responses to the problems, with a particular focus upon the viability of a ‘deflationary’ attitude to the relationalist/substantivalist debate regarding the ontology of space-time. Conceptual and formal inadequacies within the representative language of canonical gravity will be shown to be at the heart of both the canonical hole argument and the problem of time. Interesting and fruitful work at the interface of physics and philosophy relates to the challenge of resolving such inadequacies.

**5 November 2015** Joseph Melia (Oxford) “Haecceitism, Identity and Indiscernibility: (Mis-)Uses of Modality in the Philosophy of Physics”

Abstract: I examine a number of arguments involving modality and identity in the Philosophy of Physics. In particular, (a) Wilson’s use of Leibniz’ law to argue for emergent entities; (b) the implications of anti-haecceitism for the Hole argument in GR and QM; (c) the proposal to “define” or “ground” or “account” for identity via some version of Principle of the Identity of Indiscernibles or the Hilbert-Bernays formula.

Against (a) I argue that familiar problems with applications of Leibniz’ law in modal contexts block the argument for the existence of emergent entities;

On (b), I argue that (i) there are multiple and incompatible definitions of haecceitism at play in the literature; (ii) that, properly understood, haecceitism *is* a plausible position; indeed, even supposedly mysterious haecceities do not warrant the criticism of obscurity they have received; (iii) we do better to solve the Hole argument by other means than a thesis about the range and variety of possibilities.

On (c), I argue that recent attempts to formulate a principle of PII fit to serve as a definition of identity are either trivially true, or must draw distinctions between different kinds of properties that are problematic: better to accept identity as primitive.

Some relevant papers/helpful reading (I will not, of course, assume familiarity with these papers)

J. Ladyman: `On the Identity and Diversity of Objects in a Structure.’ Proc. Aristotelian Supp Soc. (2007).

D. Lewis: `On the Plurality of Worlds’, Chp.4. (1986)

O. Pooley: `Points, Particles and Structural Realism’, in Rickles, French and Saatsi, `The Structural Foundations of Quantum Gravity.’ (2006)

S. Saunders: `Are Quantum Particles Objects?’ Analysis (2006)

J. Wilson: `Non-Reductive Physicalism and Degrees of Freedom’, BJPS (2010)

**29 October 2015** Chiara Marletto (Oxford, Materials), “Constructor theory of information (and its implications for our understanding of quantum theory)”.

Abstract: Constructor Theory is a radically new mode of explanation in fundamental physics. It demands a local, deterministic description of physical reality – expressed exclusively in terms of statements about what tasks are possible, what are impossible, and why. This mode of explanation has recently been applied to provide physical foundations for the theory of information – expressing, as conjectured physical principles, the regularities of the laws of physics necessary for there to be what has been so far informally called ‘information’. In constructor theory, one also expresses exactly the relation between classical information and the so-called ‘quantum information’ – showing how properties of the latter arise from a single, constructor-theoretic constraint. This provides a unified conceptual basis for the quantum theory of information (which was previously lacking one qua theory of information). Moreover, the emergence of quantum-information-like properties in a deterministic, local framework also has implications for the understanding of quantum theory, and of its successors.

**22 October 2015** Bryan Roberts (LSE), “The future of the weakly interacting arrow of time”.

Abstract: This talk discusses the evidence for time asymmetry in fundamental physics. The main aim is to propose some general templates characterising how time asymmetry can be detected among weakly interacting particles. We will then step back and evaluate how this evidence bears on time asymmetry in future physical theories beyond the standard model.

**15 October 2015** Oscar Dahlsten (Oxford Physics) “The role of information in work extraction”.

Abstract: Since Maxwell’s daemon it has been known that extra information can give more work. I will discuss how this can be made concrete and quantified. I will focus on so-called single-shot statistical mechanics. There one can derive expressions for the maximum work one can extract from a system given one’s information. Only one property of the state one assigns to the system matters: the entropy. There are subtleties, including which entropy to use. I will also discuss the relation to fluctuation theorems, and our recent paper on realising a photonic Maxwell’s daemon.
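As a rough illustration of the single-shot work bound referred to above (a simplified sketch only: the actual results use a smoothed max-entropy, and the function and variable names here are mine, not from the talk):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def extractable_work(n_bits, h_max, temperature):
    """Sketch of a single-shot work bound: W = k_B T ln2 * (n - H_max),
    for an n-bit system about which our information leaves a max-entropy
    of h_max bits. (The rigorous statement uses a smoothed max-entropy.)"""
    return K_B * temperature * math.log(2) * (n_bits - h_max)


# A fully known bit (H_max = 0) gives the Szilard-engine value k_B T ln 2;
# a completely unknown bit (H_max = 1) gives no work at all.
w_known = extractable_work(1, 0.0, 300.0)
w_unknown = extractable_work(1, 1.0, 300.0)
```

This is only meant to convey why "the entropy" of one's state assignment is the quantity that matters, as the abstract says.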

Some references (I will certainly not assume you have looked at them):

arXiv:0908.0424 The work value of information, Dahlsten, Renner, Rieper and Vedral

arXiv:1009.1630 The thermodynamic meaning of negative entropy, del Rio, Aaberg, Renner, Dahlsten and Vedral

arXiv:1207.0434 A measure of majorisation emerging from single-shot statistical mechanics, Egloff, Dahlsten, Renner, Vedral

arXiv:1409.3878 Introducing one-shot work into fluctuation relations, Yunger Halpern, Garner, Dahlsten, Vedral

arXiv:1504.05152 Equality for worst-case work at any protocol speed, Dahlsten, Choi, Braun, Garner, Yunger Halpern, Vedral

arXiv:1510.02164 Photonic Maxwell’s demon, Vidrighin, Dahlsten, Barbieri, Kim, Vedral and Walmsley

**11 June 2015** Tim Pashby (University of Southern California), ‘Schroedinger’s Cat: It’s About Time (Not Measurement)’

Abstract: I argue for a novel resolution of Schroedinger’s cat paradox by paying particular attention to the role of time and tense in setting up the problem. The quantum system at the heart of the paradoxical situation is an unstable atom, primed for indeterministic decay at some unknown time. The conventional account gives probabilities for the results of instantaneous measurements and leads to the unacceptable conclusion that the cat can be considered neither alive nor dead until the moment the box is opened (at a time of the experimenter’s choosing). To resolve the paradox I reject the status of the instantaneous quantum state as `truthmaker’ and show how a quantum description of the situation can be given instead in terms of time-dependent chance propositions concerning the time of decay, without reference to measurement.

The conclusions reached in the case of Schroedinger’s cat may be generalized throughout quantum mechanics with the means of event time observables (interpreted as conditional probabilities), which play the role of the time of decay for an arbitrary system. Conventional quantum logic restricts its attention to the lattice of projections, taken to represent possible properties of the system. I argue that event time observables provide a compelling reason to look beyond the lattice of projections to the algebra of effects, and suggest an interpretation in which propositions are made true by events rather than properties. This provides the means to resolve the Wigner’s friend paradox along similar lines.

**4th June 2015** Neil Dewar (Oxford), ‘Symmetry and Interpretation: or, Translations and Translations’

Abstract: There has been much discussion of whether we should take (exact) symmetries of a physical theory to relate physically equivalent states of affairs, and – if so – what it is that justifies us in so doing. I argue that we can understand the propriety of this move in essentially semantic terms: namely, by thinking of a symmetry transformation as a means of translating a physical theory into itself. To explain why symmetry transformations have this character, I’ll first look at how notions of translation and definition are dealt with in model theory. Then, I’ll set up some analogies between the model-theoretic formalism and the formalism of differential equations, and show how the relevant analogue of self-translation is a symmetry transformation. I conclude with some remarks on how this argument bears on debates over theoretical equivalence.

**28th May 2015** George Ellis (Cape Town), ‘On the crucial role of top-down causation in complex systems’

Abstract: It will be suggested that causal influences in the real world occurring on evolutionary, developmental, and functional timescales are characterized by a combination of bottom-up and top-down effects. Digital computers give very clear exemplars of how this happens. There are five distinct classes of top-down effects, the key one leading to the existence of complex systems being adaptive selection. The issue of how there can be causal openness at the bottom allowing this to occur will be discussed. The case will be made that while bottom-up self-assembly can attain a certain degree of complexity, truly complex systems such as life can only come into being if top-down processes come into play in addition to bottom-up processes. They allow genuine emergence to occur, based on the multiple realisability at lower levels of higher-level structures and functions.

**21 May 2015** Francesca Vidotto (Radboud University, Nijmegen), “Relational ontology from General Relativity and Quantum Mechanics”.

Abstract: Our current most reliable physical theories, General Relativity and Quantum Mechanics, both point towards a relational description of reality. General Relativity builds up the spacetime structure from the notion of contiguity between dynamical objects. Quantum Mechanics describes how physical systems affect one another in the course of interactions. Only local interactions define what exists, and there is no meaning in talking about entities except in terms of local interactions.

**14 May 2015** Harvey Brown (Philosophy, Oxford) and Chris Timpson (Philosophy, Oxford), “Bell on Bell’s theorem: the changing face of nonlocality”.

Abstract: Between 1964 and 1990, the notion of nonlocality in Bell’s papers underwent a profound change as his nonlocality theorem gradually became detached from quantum mechanics, and referred to wider probabilistic theories involving correlations between separated beables. The proposition that standard quantum mechanics is itself nonlocal (more precisely, that it violates ‘local causality’) became divorced from the Bell theorem per se from 1976 on, although this important point is widely overlooked in the literature. In 1990, the year of his death, Bell would express serious misgivings about the mathematical form of the local causality condition, and leave ill-defined the issue of the consistency between special relativity and violation of the Bell-type inequality. In our view, the significance of the Bell theorem, both in its deterministic and stochastic forms, can only be fully understood by taking into account the fact that a fully Lorentz-covariant version of quantum theory, free of action-at-a-distance, can be articulated in the Everett interpretation.

**7 May 2015** Mauro Dorato (Rome) “The passage of time between physics and psychology”.

Abstract: The three main aims of my paper are: (1) to defend a minimalistic theory of objective becoming that takes STR and GTR at face value; (2) to bring to bear relevant neuro-psychological data in support of (1); and (3) to combine (1) and (2) to try to explain, with as little metaphysics as possible, three key features of our experience of passage, namely:

1. Our untutored belief in a cosmic extension of the now (leading to the postulation of privileged frames and presentism);

2. The becoming more past of the past (leading to Skow’s 2009 moving spotlight, branching spacetimes);

3. The fact that our actions clearly seem to bring new events into being (Broad 1923, Tooley 1997, Ellis 2014).

**26 February 2015** James Ladyman (Bristol), “Do local symmetries have ‘direct empirical consequences’?”

Abstract: Hilary Greaves and David Wallace argue that, contrary to the widespread view of philosophers of physics, local symmetries have direct empirical consequences. They do this by showing that there are `Galileo’s Ship Scenarios’ in theories with local symmetries. In this paper I will argue that the notion of `direct empirical consequences’ is ambiguous and admits of two kinds of precisification. Greaves and Wallace do not purport to show that local symmetries have empirical consequences in the stronger of the two senses, but I will argue that it is the salient one. I will then argue that they are right to focus on Galileo’s Ship Scenarios, and I will offer a characterisation of the form of such arguments from symmetries to empirical consequences. I will then discuss how various examples relate to this template. I will then offer a new argument in defence of the orthodoxy that direct empirical consequences do not depend on local symmetries.

**19 February 2015** David Wallace (Oxford): “Fields as Bodies: a unified treatment of spacetime and gauge symmetry”.

Abstract: Using the parametrised representation of field theory (in which the location in spacetime of a part of a field is itself represented by a map from the base manifold to Minkowski spacetime) I demonstrate that in both local and global cases, internal (Yang-Mills-type) and spacetime (Poincare) symmetries can be treated precisely on a par, so that gravitational theories may be regarded as gauge theories in a completely standard sense.

**12 February 2015** Erik Curiel (Munich), “Problems with the interpretation of energy conditions in general relativity”.

Abstract: An energy condition, in the context of a wide class of spacetime theories (including general relativity), is, crudely speaking, a relation one demands the stress-energy tensor of matter satisfy in order to try to capture the idea that “energy should be positive”. The remarkable fact I will discuss is that such simple, general, almost trivial seeming propositions have profound and far-reaching import for our understanding of the structure of relativistic spacetimes. It is therefore especially surprising when one also learns that we have no clear understanding of the nature of these conditions, what theoretical status they have with respect to fundamental physics, what epistemic status they may have, when we should and should not expect them to be satisfied, and even in many cases how they and their consequences should be interpreted physically. Or so I shall argue, by a detailed analysis of the technical and conceptual character of all the standard conditions used in physics today, including examination of their consequences and the circumstances in which they are believed to be violated in the actual universe.

**22nd January 2015** Jonathan Halliwell (Imperial College London), “Negative Probabilities, Fine’s Theorem and Quantum Histories”.

Abstract: Many situations in quantum theory and other areas of physics lead to quasi-probabilities which seem to be physically useful but can be negative. The interpretation of such objects is not at all clear. I argue that quasi-probabilities naturally fall into two qualitatively different types, according to whether their non-negative marginals can or cannot be matched to a non-negative probability. The former type, which we call viable, are qualitatively similar to true probabilities, but the latter type, which we call non-viable, may not have a sensible interpretation. Determining the existence of a probability matching given marginals is a non-trivial question in general. In simple examples, Fine’s theorem indicates that inequalities of the Bell and CHSH type provide criteria for its existence. A simple proof of Fine’s theorem is given. The results have consequences for the linear positivity condition of Goldstein and Page in the context of the histories approach to quantum theory. Although it is a very weak condition for the assignment of probabilities, it fails in some important cases where our results indicate that probabilities clearly exist. Some implications for the histories approach to quantum theory are discussed.
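For orientation, the CHSH inequalities in question take the standard form (a textbook reminder, not material from the paper):

```latex
% CHSH inequality for correlators E(a,b), with settings a,a' on one
% side and b,b' on the other:
\[
  \bigl| E(a,b) + E(a,b') + E(a',b) - E(a',b') \bigr| \;\le\; 2 .
\]
% Fine's theorem: these inequalities (over all choices of settings) are
% satisfied if and only if there exists a joint probability distribution
% over all four outcomes whose pairwise marginals reproduce the E's.
```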

**4 December 2014**: Tony Sudbery (Maths, York), “The logic of the future in the Everett-Wheeler understanding of quantum theory”

Abstract: I discuss the problems of probability and the future in the Everett-Wheeler understanding of quantum theory. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. I construct a lattice of tensed propositions, with truth values in the interval [0, 1], and derive logical properties of the truth values given by the usual quantum-mechanical formula for the probability of histories. I argue that with this understanding, Everett-Wheeler quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
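The “usual quantum-mechanical formula for the probability of histories” mentioned above is, in the standard decoherent-histories notation (added for reference, not part of the abstract):

```latex
\[
  p(\alpha_1,\dots,\alpha_n)
  \;=\;
  \operatorname{Tr}\!\left[ C_{\alpha}\,\rho\, C_{\alpha}^{\dagger} \right],
  \qquad
  C_{\alpha} \;=\; P_{\alpha_n}(t_n)\cdots P_{\alpha_1}(t_1),
\]
% where the P_{\alpha_k}(t_k) are Heisenberg-picture projectors onto the
% alternatives at times t_1 < ... < t_n, and \rho is the initial state.
```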

**27 November 2014**: Owen Maroney (Philosophy, Oxford), “How epistemic can a quantum state be?”

Abstract: The “psi-epistemic” view is that the quantum state does not represent a state of the world, but a state of knowledge about the world. It draws its motivation, in part, from the observation of qualitative similarities between characteristic properties of non-orthogonal quantum wavefunctions and those of overlapping classical probability distributions. It might be suggested that it gives a natural explanation for these properties, which seem puzzling on the alternative “psi-ontic” view. However, for two key similarities, quantum state overlap and quantum state discrimination, it turns out that the psi-epistemic view cannot account for the values shown by quantum theory, and for a wide range of quantum states it must rely on the same supposedly puzzling explanations as the “psi-ontic” view.

**20 November 2014**: Boris Zilber (Maths, Oxford), “The semantics of the canonical commutation relations”

Abstract: I will argue that the canonical commutation relations, and the way of calculating with them discovered in the 1920s, are in essence a syntactic reflection of a world whose semantics is still to be reconstructed. The same can be said about the calculus of Feynman integrals. Similar developments have been taking place in pure mathematics since the 1950s in the form of Grothendieck’s schemes and the formalism of non-commutative geometry. I will report on some progress in reconstructing the missing semantics. In particular, for the canonical commutation relations it leads to a theory of representation in finite-dimensional “algebraic Hilbert spaces” which in the limit look rather similar to, although not the same as, conventional Hilbert spaces.

**13 November 2014 (1st BLOC Seminar, KCL, London)**: Huw Price (Philosophy, Cambridge), “Two Paths to the Paris Interpretation”

Abstract: In 1953 de Broglie’s student, Olivier Costa de Beauregard, raised what he took to be an objection to the EPR argument. He pointed out that the EPR assumption of Locality might fail, without action-at-a-distance, so long as the influence in question is allowed to take a zigzag path, via the past lightcones of the particles concerned. (He argued that considerations of time-symmetry counted in favour of this proposal.) As later writers pointed out, the same idea provides a loophole in Bell’s Theorem, allowing a hidden variable theory to account for the Bell correlations, without irreducible spacelike influence. (The trick depends on the fact that retrocausal models reject an independence assumption on which Bell’s Theorem depends, thereby blocking the derivation of Bell’s Inequality.) Until recently, however, it seems to have gone unnoticed that there is a simple argument that shows that the quantum world must be retrocausal, if we accept three assumptions (one of them time-symmetry) that would have all seemed independently plausible to many physicists in the years following Einstein’s 1905 discovery of the quantisation of light. While it is true that later developments in quantum theory provide ways of challenging these assumptions – different ways of challenging them, for different views of the ontology of the quantum world – it is interesting to ask whether this new argument provides a reason to re-examine Costa de Beauregard’s ‘Paris interpretation’.

**6 November 2014**: Vlatko Vedral (Physics, Oxford), “Macroscopicity”

Abstract: We have a good framework for how to quantify entanglement based, broadly speaking, on two different ideas. One is the fact that local operations and classical communication (LOCC) do not increase entanglement and hence introduce a natural ordering on the set of entangled states. The other is inspired by mean-field theory and quantifies the entanglement of a state by how difficult it is to approximate it with disentangled states (the two, while not identical, frequently lead to the same measures). Interestingly, neither of these captures the notion of “macroscopicity”, which asks which states are very quantum and macroscopic at the same time. Here the GHZ states win as the ones with the highest macroscopicity; however, they are not highly entangled from either the LOCC or the mean-field point of view. I discuss different ways of quantifying macroscopicity and exemplify them with a range of quantum experiments producing different many-body states (GHZ and general GHZ states, cluster states, topological states). And the winner for producing the highest degree of macroscopicity is…
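For reference, the N-qubit GHZ state that comes out on top here is the standard

```latex
\[
  |\mathrm{GHZ}_N\rangle
  \;=\;
  \tfrac{1}{\sqrt{2}}\left( |0\rangle^{\otimes N} + |1\rangle^{\otimes N} \right),
\]
% maximally "macroscopic" in the sense discussed, yet only modestly
% entangled by the LOCC ordering or mean-field-style measures.
```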

**30 October 2014**: David Wallace (Philosophy, Oxford), “How not to do the metaphysics of quantum mechanics”

Abstract: Recent years have seen an increasing interest in the metaphysics of quantum theory. While welcome, this trend has an unwelcome side effect: an inappropriate (and often unknowing) identification of quantum theory in general with one particular brand of quantum theory, namely the nonrelativistic mechanics of finitely many point particles. In this talk I’ll explain just why this is problematic, partly by analogy with questions about the metaphysics of classical mechanics.

**23 October 2014**: Daniel Bedingham (Philosophy, Oxford), “Time reversal symmetry and collapse models”

Abstract: Collapse models are modifications of quantum theory where the wave function is treated as physically real and collapse of the wave function is a physical process. This introduces a time reversal asymmetry into the dynamics of the wave function since the collapses affect only the future state. However, it is shown that if the physically real part of the model is reduced to the set of points in space and time about which the collapses occur then a collapsing wave function picture can be given both forward and backward in time, in each case satisfying the Born rule (under certain conditions). This implies that if the collapse locations can serve as an ontology then these models can in fact have time reversal symmetry.

**16 October 2014**: Dennis Lehmkuhl, “Einstein, Cartan, Weyl, Jordan: The neighborhood of General Relativity in the space of spacetime theories”.

Abstract: Recent years have seen a renewed interest in Newton-Cartan theory (NCT), i.e. Newtonian gravitation theory reformulated in the language of differential geometry. The comparison of this theory with the general theory of relativity (GR) has been particularly interesting, among other reasons, because it allows us to ask how `special’ GR really is, as compared to other theories of gravity. Indeed, the literature so far has focused on the similarities between the two theories, for example on the fact that both theories describe gravity in terms of curvature, and the paths of free particles as geodesics. However, the question of how `special’ GR is can only be properly answered if we highlight differences as much as similarities, and there are plenty of differences between NCT and GR. Furthermore, I will argue that it is not enough to compare GR to simpler theories like NCT; we also have to compare it to more complicated theories, more complicated in terms of geometrical structure and gravitational degrees of freedom. While NCT is the most natural degenerate limit of GR, gravitational theory defined on a Weyl geometry (to be distinguished from a unified field theory based on Weyl geometry) and gravitational scalar-tensor theories (like Jordan-Brans-Dicke theory) are two of the most natural generalisations of GR. Thus, in this talk I will compare Newton-Cartan, GR, Weyl and Jordan-Brans-Dicke theory, to see how special GR really is as compared to its immediate neighborhood in the `space of spacetime theories’.

**19 June 2014**: Antony Valentini (Physics, Clemson), “Hidden variables in the early universe II: towards an explanation for large-scale cosmic anomalies”

Abstract: Following on from Part I, we discuss the large-scale anomalies that have been reported in measurements of the cosmic microwave background (CMB) by the Planck satellite. We consider how the anomalies might be explained as the result of incomplete relaxation to quantum equilibrium at long wavelengths on expanding space (during a ‘pre-inflationary phase’) in the de Broglie-Bohm formulation of quantum theory. The first anomaly we consider is the reported large-scale power deficit. This could arise from incomplete relaxation for the amplitudes of the primordial perturbations. It is shown, by numerical simulations, that if the pre-inflationary era is radiation dominated then the deficit in the emerging power spectrum will have a characteristic shape (a specific dependence on wavelength). It is also shown that our scenario is able to produce a power deficit in the observed region and of the observed magnitude, for an appropriate choice of cosmological parameters. The second anomaly we consider is the reported large-scale anisotropy. This could arise from incomplete relaxation for the phases of the primordial perturbations. We report on recent numerical simulations for phase relaxation, and we show how to define characteristic scales for amplitude and phase nonequilibrium. While difficult questions remain concerning the extent to which the data might support our scenario, we argue that we have an (at least) viable model that is able to explain two apparently independent cosmological anomalies at a single stroke.

**12 June 2014**: Antony Valentini (Physics, Clemson), “Hidden variables in the early universe I: quantum nonequilibrium and the cosmic microwave background”.

Abstract: Assuming inflationary cosmology to be broadly correct, we discuss recent work showing that the Born probability rule for primordial quantum fluctuations can be tested (and indeed is being tested) by measurements of the cosmic microwave background (CMB). We consider in particular the hypothesis of ‘quantum nonequilibrium’ — the idea that the universe began with an anomalous distribution of hidden variables that violates the Born rule — in the context of the de Broglie-Bohm pilot-wave formulation of quantum field theory. An analysis of the de Broglie-Bohm field dynamics on expanding space shows that relaxation to quantum equilibrium is generally retarded (and can be suppressed) for long-wavelength field modes. If the initial probability distribution is assumed to have a less-than-quantum variance, we may expect a large-scale power deficit in the CMB — as appears to be observed by the Planck satellite. Particular attention is paid to conceptual questions concerning the use of probabilities ‘for the universe’ in modern theoretical and observational cosmology.

[Key references: A. Valentini, ‘Inflationary Cosmology as a Probe of Primordial Quantum Mechanics’, Phys. Rev. D 82, 063513 (2010) [arXiv:0805.0163]; S. Colin and A. Valentini, ‘Mechanism for the suppression of quantum noise at large scales on expanding space’, Phys. Rev. D 88, 103515 (2013) [arXiv:1306.1579].]

**5 June 2014**: Mike Cuffaro, “Reconsidering quantum no-go theorems from a computational perspective”

Abstract: Bell’s and related inequalities are misleadingly thought of as “no-go” theorems, except in a highly qualified sense. More properly, they should be understood as imposing constraints on locally causal models which aim to recover quantum mechanical predictions. Thinking of them as no-go theorems is nevertheless harmless in most circumstances; i.e., the necessary qualifications are, in typical discussions of the foundations of quantum mechanics, understood as holding unproblematically. But the situation can change once we leave the traditional context. In the context of a discussion of quantum computation and information, for example, our judgements regarding which locally causal models are to be ruled out as implausible will be different than our similar judgements in the traditional context. In particular, the “all-or-nothing” GHZ inequality, which is traditionally considered to be a more powerful refutation of local causality than statistical inequalities like Bell’s, has very little force in the context of a discussion of quantum computation and information. In this context it is only the statistical inequalities which can legitimately be thought of as no-go theorems. Considering this situation serves to emphasise, I argue, that there is a difference in aim between practical sciences like quantum computation and information, and the foundations of quantum mechanics traditionally construed: describing physical systems as they exist and interact with one another in the natural world is different from describing what one can do with physical systems.

**22 May 2014**: Elise Crull, “Whence Physical Significance in Bimetric Theories?”

Abstract: Recently there has been lively discussion regarding a certain class of alternative theories to general relativity called bimetric theories. Such theories are meant to resolve certain physical problems (e.g. the existence of ghost fields and dark matter) as well as philosophical problems (e.g. the apparent experimental violation of relativistic causality and assigning physical significance to metrics).

In this talk, I suggest that a new type of bimetric theory wherein matter couples to both metrics may yield further insights regarding those same philosophical questions, while at the same time addressing (perhaps to greater satisfaction!) the physical worries motivating standard bimetric theories.

**15 May 2014**: Julian Barbour (Independent), “A Gravitational Arrow of Time”.

Abstract: My talk (based on arXiv:1310.5167 [gr-qc]) will draw attention to a hitherto unnoticed way in which scale-invariant notions of complexity and information can be defined in the problem of N point particles interacting through Newtonian gravity. In accordance with these definitions, all typical solutions of the problem with nonnegative energy divide at a uniquely defined point into two halves that are effectively separate histories. They have a common ‘past’ at the point of division but separate ‘futures’. In each half, the arrow from past to future is defined by growth of the complexity and information. All previous attempts to explain how time-symmetric laws can give rise to the various arrows of time have invoked special boundary conditions. In contrast, the complexity and information arrows are inevitable consequences of the form of the gravitational law and nothing else. General relativity shares key structural features with Newtonian gravity, so it may be possible to obtain similar results for Einsteinian gravity.
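For readers who want the measure in symbols: the scale-invariant complexity used in arXiv:1310.5167 is, up to normalization conventions, the ratio of two mass-weighted length scales of the N-body configuration. The following is a sketch of the published definition, not a verbatim quotation of it:

```latex
% Shape complexity of an N-body configuration (sketch of the
% definition in arXiv:1310.5167; normalization conventions vary).
% \ell_rms grows with the overall spread of the particles, while
% \ell_mhl (the mean harmonic length) is dominated by the shortest
% separations, so C_S grows as the system clusters into
% well-separated subsystems.
C_S \;=\; \frac{\ell_{\mathrm{rms}}}{\ell_{\mathrm{mhl}}},
\qquad
\ell_{\mathrm{rms}} \;=\; \frac{1}{m_{\mathrm{tot}}}
  \sqrt{\sum_{a<b} m_a m_b\, r_{ab}^{2}},
\qquad
\frac{1}{\ell_{\mathrm{mhl}}} \;=\; \frac{1}{m_{\mathrm{tot}}^{2}}
  \sum_{a<b} \frac{m_a m_b}{r_{ab}}
```

Because both lengths scale linearly under dilatations of the configuration, their ratio depends only on the shape of the configuration, which is what makes the resulting arrow of time scale-invariant.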

**8 May 2014**: Simon Saunders (Philosophy, Oxford), “Reference to indistinguishables, and other paradoxes”.

Abstract: There is a seeming paradox about indistinguishables: if described only by totally symmetric properties and relations, or by totally (anti)-symmetrized states, then how is reference to them possible? And we surely do refer to subsets of indistinguishable particles, and sometimes individual elementary particles (as in: the electrons, protons, and neutrons of which your computer screen is composed). Call it the paradox of composition.

The paradox can be framed in the predicate calculus as well, in application to everyday things: indistinguishability goes over to weak discernibility. It connects with two other paradoxes: the Gibbs paradox and Putnam’s paradox. It also connects with the hole argument in General Relativity. They are none of them the same, but they have a common solution.

This solution centres on the way that mathematical representations, including the set-theoretical constructions of model theory, connect with the world. The connection is by structural similarity, not by coordinates or particle labels (in physics), or (in model theory) by elements of sets. The appearance to the contrary is fostered by the simplicity of ostensive reference, on the one hand, and the assignment to structures of particle labels, coordinates, and elements of sets, on the other.

**1 May 2014**: Oliver Pooley (Philosophy, Oxford), “New work on the problem of time”

Abstract: One aspect of the “Problem of Time” in canonical general relativity results from applying to the theory Dirac’s seemingly well-established method of identifying gauge transformations in constrained Hamiltonian theories. This “orthodox” move identifies transformations generated by the first-class constraints as mere gauge. Applied to GR, the strategy yields the paradoxical result that no genuine physical magnitude takes on different values at different times. This orthodoxy is also what underwrites the derivation of the timeless Wheeler–DeWitt equation. It is thus intimately connected to one of the central interpretative puzzles of the canonical approach to quantum gravity, namely, how to make sense of a profoundly timeless quantum formalism.

This talk reviews some recent challenges to the technical underpinning of the orthodox view. Brian Pitts has argued that, in general, first-class constraints generate “not a gauge transformation, but a bad physical change”, even for theories like electromagnetism that are standardly taken to illustrate the correctness of orthodoxy. I argue that Pitts’ results are largely orthogonal to resolving the Problem of Time, and that they leave the orthodox interpretation of phase space untouched. Instead, I will endorse a very different criticism of Dirac’s position, due to Barbour and Foster. As Thébault has stressed, one moral is that a Hamiltonian theory can be manifestly deterministic even if physical magnitudes do not commute with some of the first-class constraints, namely, those that generate reparameterizations of histories. Unfortunately, due to its foliation (in distinction to reparameterization) invariance, Hamiltonian GR suffers from a residual apparent indeterminism. Replacing GR by shape dynamics is one “solution”. I will consider the prospects of finding an alternative.

**13 March 2014**: Philip Goyal (Physics, Cambridge), “An Informational Approach to Identical Particles in Quantum Theory”.

Abstract: A remarkable feature of quantum theory is that particles with identical intrinsic properties must be treated as indistinguishable if the theory is to give valid predictions. In the quantum formalism, indistinguishability is expressed via the symmetrization postulate, which restricts a system of identical particles to the set of symmetric states (`bosons’) or the set of antisymmetric states (`fermions’).

However, the precise connection between particle indistinguishability and the symmetrization postulate has not been clearly established. There exist a number of variants of the postulate that appear to be compatible with particle indistinguishability. In particular, the widely influential topological approach due to Laidlaw & DeWitt and Leinaas & Myrheim implies that its validity depends on the dimensionality of space. This variant leaves open the possibility that identical particles are generically able to exhibit so-called anyonic behavior in two spatial dimensions.

Here we show that the symmetrization postulate can be derived on the basis of a simple novel postulate. This postulate establishes a functional relationship between the amplitude of a process involving indistinguishable particles and the amplitudes of all possible transitions when the particles are treated as distinguishable. The symmetrization postulate follows by requiring consistency with the rest of the quantum formalism. The key to the derivation is a strictly informational treatment of indistinguishability which prohibits the labelling of particles that cannot be experimentally distinguished from one another. The derivation implies that the symmetrization postulate admits no natural variants. In particular, the possibility that identical particles generically exhibit anyonic behaviour is excluded.
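For concreteness, the postulate being derived can be stated in its standard textbook form (this is the general statement, not the specific derivation of [1]): for N identical particles, the physically allowed states are the totally symmetric or totally antisymmetric combinations of product states.

```latex
% Symmetrization postulate: allowed N-particle states are the
% totally symmetric (+, bosons) or totally antisymmetric
% (-, fermions) superpositions over all permutations sigma in S_N.
% The 1/sqrt(N!) normalization assumes the single-particle states
% |phi_i> are orthonormal and distinct.
\lvert \psi_{\pm} \rangle \;=\; \frac{1}{\sqrt{N!}}
  \sum_{\sigma \in S_N} (\pm 1)^{\operatorname{sgn}\sigma}\,
  \lvert \phi_{\sigma(1)} \rangle \otimes \cdots \otimes
  \lvert \phi_{\sigma(N)} \rangle
```

The anyonic alternatives mentioned above would replace the factor $(\pm 1)^{\operatorname{sgn}\sigma}$ by a general phase $e^{i\theta}$ acquired under particle exchange; the claimed result is that consistency with the rest of the quantum formalism excludes all values other than $\theta = 0$ and $\theta = \pi$.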

[1] “Informational Approach to Identical Particles in Quantum Theory”, http://arxiv.org/abs/1309.0478

**6 March 2014**: Sean Gryb, “Symmetry and Evolution in Quantum Gravity”.

Abstract: A key obstruction for obtaining a non-perturbative definition of quantum gravity is the absence of a sensible quantum representation of spacetime refoliations, including global refoliations – or reparametrizations. We propose that these difficulties can be avoided by following a procedure for defining a degree of freedom due to Poincaré, where emphasis is put on the independently specifiable initial data of the system, and a proposal for the decomposition of these degrees of freedom due to York (both of these ideas were later advocated by Barbour). In our proposal, local refoliations are replaced by local scale transformations using a symmetry trading procedure developed in the “Shape Dynamics” approach to classical gravity. Then, global refoliations are dealt with using a technique similar to that used in unimodular gravity. We will first try to provide the philosophical motivation for our procedure, then propose a set of formal equations which represent the quantization of a theory that is classically equivalent to General Relativity. However, the quantum theory we will propose has both a well-defined notion of local symmetry and global time evolution. Time permitting, we will also discuss explicit symmetry-reduced toy models exhibiting some of the key features of our proposal.

**20 February 2014**: Tessa Baker, “Cosmological Tests of Gravity”.

Abstract: The past decade has witnessed a surge of interest in extensions of Einstein’s theory of General Relativity. It is hoped that such theories of `modified gravity’ might account for the observed accelerating expansion rate of the universe, providing a more satisfactory and physical explanation than that of a simple cosmological constant.

I will give an overview of current attempts to extend GR, and how to test them observationally. I’ll describe a formalism that has been constructed to carry out these tests, a cosmological analogue of the Parameterised Post-Newtonian framework (PPN) that is used to test gravity in the Solar System. I’ll show how this new formalism acts as a bridge between the (sometimes disparate!) worlds of theory and observation, allowing us to make real progress in our understanding of gravity.

**13 February 2014**: Joerg Schmiedmayer, “How does the classical world emerge from microscopic quantum evolution?”.

Abstract: The world around us follows the laws of classical physics: processes are irreversible, and there exists an ‘arrow of time’. On the microscopic scale our world is governed by quantum physics; its evolution is ‘unitary’ and reversible. How does the classical world emerge from the microscopic quantum world? We conjecture that the classical world naturally emerges from the microscopic quantum world through the complexity of large quantum systems. For the first time, we now have the ability to probe this conjecture in the laboratory. Modern experimental techniques allow us to monitor the evolution of isolated quantum systems in detail. First experiments using ensembles of ultracold atoms allow us to test fundamental aspects of the quantum-to-classical transition in a quantum system completely isolated from its environment. I will present the concepts behind the emergence conjecture and show first experiments that probe some of its fundamental aspects.

**November 14 2013**: Jeff Bub (Philosophy, Maryland), “Quantum Interactions with Closed Timelike Curves and Superluminal Signaling”.

Abstract: There is now a significant body of results on quantum interactions with closed timelike curves (CTCs) in the quantum information literature, for both the Deutsch model of CTC interactions (D-CTCs) and the projective model (P-CTCs). As a consequence, there is a prima facie argument exploiting entanglement that CTC interactions would enable superluminal and, indeed, effectively instantaneous signaling. In cases of spacelike separation between the sender of a signal and the receiver, whether a receiver measures the local part of an entangled state or a disentangled state to receive the signal can depend on the reference frame. A proposed consistency condition gives priority to either an entangled perspective or a disentangled perspective in spacelike separated scenarios. For D-CTC interactions, the consistency condition gives priority to frames of reference in which the state is disentangled, while for P-CTC interactions the condition selects the entangled state. It follows that there is a procedure that allows Bob to signal to Alice in the past via relayed superluminal communications between spacelike separated Bob and Clio, and spacelike separated Clio and Alice. This opens the door to time travel paradoxes in the classical domain. Ralph (arXiv:1107.4675) first pointed this out for P-CTCs, but Ralph’s procedure for a ‘radio to the past’ is flawed. Since both D-CTCs and P-CTCs allow classical information to be sent around a spacetime loop, it follows from a result by Aaronson and Watrous (Proc. Roy. Soc. A 465, 631–647 (2009)) for CTC-enhanced classical computation that a quantum computer with access to P-CTCs would have the power of PSPACE, equivalent to a D-CTC-enhanced quantum computer. (The talk represents joint work with Allen Stairs.)

**November 7 2013**: Sam Fletcher (Irvine), “On the Reduction of General Relativity to Newtonian Gravitation”

Abstract: Accounts of the reduction of general relativity (GR) to Newtonian gravitation (NG) usually take one of two approaches. One considers the limit as the speed of light c → ∞, while the other focuses on the limit of formulae (e.g., three-momentum) in the low-velocity limit, i.e., as v/c ≈ 0. Although the first approach treats the reduction of relativistic spacetimes globally, many have argued that ‘c → ∞’ can at best be interpreted counterfactually, which is of limited value in explaining the past empirical success of NG. The second, on the other hand, while more applicable to explaining this success, only treats a small fragment of GR. Further, it usually applies only locally, hence is unable to account for the reduction of global structure. Building on work by Ehlers, I propose a different account of the reduction relation that offers the global applicability of the c → ∞ limit while maintaining the explanatory utility of the v/c ≈ 0 approximation. In doing so, I highlight the role that a topology on the collection of all spacetimes plays in defining the relation, and how the choice of topology corresponds with broader or narrower classes of observables that one demands be well-approximated in the limit.

**October 31 2013**: Paul Hoyningen-Huene (Leibniz University of Hannover), “The dead end objection against convergent realisms”.

Abstract: The target of the dead end objection is any kind of scientific realism that bases its plausibility on the stable presence of some X in a sequence of succeeding theories. For instance, if X is a set of theoretical entities that remains stable even over some scientific revolutions, this may be taken as support for convergent scientific realism about entities. Likewise, if X is a similarly stable set of structures of theories, this may be taken as support for (convergent) structural realism. The dead end objection states that the conceded stability of X could also be due to the existence of an empirically extremely successful though ontologically significantly false theory. In this case, the inference from the stability of X to the probable reality of X would become invalid. Three examples from the history of science illustrate how the stability of some X over an extended period of time was indeed erroneously taken to indicate the finality of X.

**October 24 2013**: Basil Hiley (Physics, Birkbeck), “Bohmian Non-commutative Dynamics: Local Conditional Expectation Values are Weak Values.”

Abstract: Quantum dynamics can be described by two non-commutative geometric Clifford algebras, one of which describes the properties of the covering space of the symplectic manifold [1]. This gives rise to a non-commutative probability theory with conditional expectation values that correspond to local quantum properties which appear as weak values [3]. Examples of these are the T^{0μ}(x) components of the energy-momentum tensor which, in turn, correspond to the Bohm momentum and Bohm energy for Schrödinger, Pauli and Dirac particles [2]. In the case of photons, the Bohm momentum has already been measured by Kocsis [4]. I will explain the theoretical background and discuss some new experiments involving weak measurements on non-zero rest mass particles that are being developed by Robert Flack [UCL] and myself to explore these ideas further.
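For context, the standard definitions at play here (not specific to references [1]–[4]) are the weak value of an observable, and the Bohm momentum as the real part of a local momentum weak value:

```latex
% Weak value of an observable A with pre-selected state |psi>
% and post-selected state |phi> (Aharonov-Albert-Vaidman):
A_{w} \;=\; \frac{\langle \phi \rvert \hat{A} \lvert \psi \rangle}
                 {\langle \phi \vert \psi \rangle}

% Post-selecting on position x and taking A to be momentum gives
% a local momentum whose real part is the Bohm (guidance)
% momentum: writing psi(x) = R(x) e^{iS(x)/\hbar},
p_{B}(x) \;=\; \operatorname{Re}
  \frac{\langle x \rvert \hat{p} \lvert \psi \rangle}
       {\langle x \vert \psi \rangle}
  \;=\; \nabla S(x)
```

The identification of the Bohm momentum with a conditional expectation value of this form is what makes it, in principle, accessible to weak-measurement experiments of the kind described in the talk.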

**October 17 2013**: Edward Anderson (DAMTP, Cambridge), “Background independence”.

Abstract: This talk concerns what background independence itself is (as opposed to some particular physical theory that is background independent). This notion mostly arises from a layer-by-layer analysis of the facets of the Problem of Time in Quantum Gravity. Part of this notion consists of relational postulates. These are identified as classical precursors of two of the facets, and are tied to the forms of the GR Hamiltonian and momentum constraints respectively. Other aspects of Background Independence include the algebraic closure of these constraints, the expression of physics in terms of beables, foliation-independence, and the reconstruction of spacetime from space. The final picture is that background independence – a philosophically desirable and physically implementable feature for a theory to have – has the facets of the Problem of Time among its consequences. Thus these arise naturally and are problems to be resolved, as opposed to avoided `by making one’s physics background-dependent in order not to have these problems’. This serves as a selection criterion that limits the use of a number of model arenas and physical theories.