PDF version of the program (published 20 July 2015)
The teaching will take place in Room 1 of the Metsätalo Building (address: Unioninkatu 40; main entrance on the Fabianinkatu side of the building).
(updated 15 June 2015)
Samson Abramsky
Oxford
We will present an introduction to some ideas in quantum information and foundations, emphasising logical and structural aspects. The aim is to understand how quantum mechanics requires a radical revision to our view of the nature of physical reality, while at the same time opening up new possibilities in the informatic realm. The main emphasis will be the concepts of nonlocality, contextuality and entanglement.
We will attempt to keep the prerequisites to a minimum. For the most part, these amount to basic knowledge of discrete mathematics, discrete probability and complex numbers.
In part 3, some very basic familiarity with category theory and topology would be useful.
The material will be in four main parts.
Part 1: Observational scenarios.
Basic notions of contextuality and non-locality.
A logical approach to Bell inequalities.
Logical forms of contextuality and non-locality, and a hierarchy of these notions.
Part 2: Quantum resources.
From bits to qubits. Basic features of finite-dimensional quantum mechanics.
Quantum realisations of empirical systems.
Examples where quantum resources exceed the capabilities of classical systems.
The broader world of no-signalling theories.
Part 3: The mathematical structure of non-locality and contextuality.
We shall explore the rich mathematical structures underlying these concepts. The study of non-locality and contextuality can be expressed in a unified and generalised topological form in the language of sheaves or bundles, in terms of obstructions to global sections. These obstructions can, in many cases, be witnessed by cohomology invariants.
Part 4: Contextuality in the classical world.
Links between the contextual structures developed in the previous parts of the course and classical computation, including relational databases, constraints, and natural language semantics.
LINKS to supplementary material:
An overview of the material to be presented is in
http://arxiv.org/abs/1406.7386
The basic approach is described in
http://arxiv.org/abs/1102.0264
The logical view of Bell inequalities is described in
http://arxiv.org/abs/1203.1352
The cohomological ideas are in
http://arxiv.org/abs/1111.3620
The connections with databases are in
http://arxiv.org/abs/1208.6416
Jeremy Avigad
Carnegie Mellon
There is a tension in modern mathematics between the desire for powerful conceptual methods and the desire for methods that deliver explicit quantitative and computational information. Proof theory was born of this tension, and offers results that help explain how it is that abstract methods always have concrete content.
In this series of lectures, we will consider "conservation results," old and new, that help bridge the gap between the abstract and the concrete. We will also explore methods and results in "proof mining," a field which uses proof-theoretic methods to uncover additional information hidden in ordinary mathematical proofs.
Laura Fontanella
Hebrew University
Research in contemporary set theory has led to the investigation of particular features of large cardinals, namely their 'reflection properties'. These correspond to general statements for a cardinal kappa establishing, roughly, that for a given structure S of size kappa with some particular property (e.g. a stationary subset of kappa), there exists an ordinal alpha below kappa such that the intersection of S with alpha preserves the property (e.g. S intersection alpha is stationary in alpha).
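The stationary-set example can be written out as follows (a standard formulation in our notation; the cofinality clause, which makes stationarity below alpha non-trivial, is the usual side condition and is our addition):

```latex
% Stationary reflection at $\kappa$: every stationary subset of $\kappa$
% is already stationary below some smaller ordinal.
\[
  \forall S \subseteq \kappa\;\bigl(\,S \text{ stationary in } \kappa
  \;\Longrightarrow\;
  \exists \alpha < \kappa\;(\,\mathrm{cf}(\alpha) > \omega \;\wedge\;
  S \cap \alpha \text{ stationary in } \alpha\,)\bigr).
\]
```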
Reflection principles are known to have many important applications not only in set theory, but also for the study of various infinite algebraic and topological structures. Despite their strength, those principles can hold even at small cardinals, e.g. assuming the consistency of large cardinals it is possible to force models of ZFC in which even a very small cardinal such as aleph_2 satisfies very strong properties of reflection.
We will discuss several applications of these principles.
Depending on students' familiarity with forcing, we will examine some methods for forcing such properties at small cardinals (this will lead us to explore an additional topic, namely the general technique for combining forcing with large cardinals).
Familiarity with elementary set theory (i.e. the basic theory of cardinals and ordinals) is assumed.
Reading material
- J. Cummings, Notes on singular cardinal combinatorics, Notre Dame Journal of Formal Logic 46 (2005), 251--282
- J. Cummings, Large cardinals, forcing and reflection, Proc. of the RIMS symposium on "Reflection principles and the set theory of large cardinals" (2013)
- T. Eisworth, Successors of singular cardinals, in the Handbook of set theory, M. Foreman and A. Kanamori eds. 1229--1350, Springer, 2010.
- M. Foreman, M. Magidor and S. Shelah, Martin's maximum, saturated ideals, and nonregular ultrafilters I, Annals of Mathematics 127 (1988), 1--47.
- A. Kanamori, The higher infinite, Perspectives in Mathematical Logic, Springer-Verlag, Berlin, 1994.
- K. Kunen, Set theory, an introduction to independence proofs, North-Holland, 1980.
Curtis Franks
Notre Dame
We will trace the origins and evolution of the notion of logical completeness from antiquity to the present. Our focus will be on how the tenor of mathematical research has continually informed how meaningfully to ask of a logical system whether it is complete. Reflecting on the several coherent notions of logical completeness that we will extract from the history of mathematics helps us to see that the prevailing conception of completeness is highly contingent and that the centrality of this conception in our thinking about mathematics is an achievement in its own right. We also will see the value of rival conceptions of logical completeness and the ways of thinking about mathematics that led to them. From a technical point of view, we will consider the precise relationships among the several notions of completeness that have appeared in our history. Our focus will be on the work of Aristotle, Bolzano, Hilbert, Gödel, Gentzen, and Henkin.
Åsa Hirvonen
Helsinki
Classification theory studies classes of structures and aims at finding sets of invariants that will determine the structures up to isomorphism, or at proving that no such invariants exist. The first crude dividing line is the stability spectrum, where classes are graded according to the number of types in the models. Finer dividing lines are offered by various order properties.
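One standard way to make the type-counting grading precise (notation ours, not taken from the abstract):

```latex
% $T$ is $\lambda$-stable if over every parameter set of size at most
% $\lambda$ there are at most $\lambda$ complete types:
\[
  T \text{ is } \lambda\text{-stable}
  \;\iff\;
  |S_n(A)| \le \lambda
  \ \text{ for every } n < \omega \text{ and every } A \subseteq M \models T
  \text{ with } |A| \le \lambda .
\]
```

The stability spectrum of T is then the class of infinite cardinals lambda at which T is lambda-stable.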
In this tutorial we will present methods developed for classification. We will also see how the basic methods have been extended to classes outside the original framework of stable first order theories.
SLIDES, DAY 1
SLIDES, DAY 2
SLIDES, DAY 3
SLIDES, DAY 4
SLIDES, DAY 5
Nicole Schweikardt
Berlin
In this tutorial, we will consider formulas where, apart from the symbols in the given vocabulary, also predicates for linear order and arithmetic may be used.
In particular, order-invariant formulas are formulas for which the following holds: if a structure satisfies the formula with one particular linear order of the structure's universe, then it satisfies the formula with any linear order of the structure's universe. Arithmetic-invariant formulas are defined analogously, where apart from the linear order other arithmetic predicates may be used in an invariant way.
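Spelled out in standard notation (ours), order-invariance of a formula phi over a vocabulary sigma, on a finite sigma-structure A, says:

```latex
% The truth of $\varphi$ does not depend on which linear order
% interprets the order predicate:
\[
  (A, <_1) \models \varphi \;\iff\; (A, <_2) \models \varphi
  \qquad\text{for all linear orders } <_1,\ <_2
  \text{ on the universe of } A .
\]
```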
When restricting attention to finite structures, it is known that order-invariant first-order logic (FO, for short) is strictly more expressive than plain FO, and arithmetic-invariant FO can express exactly the properties that belong to the circuit complexity class AC0. On the other hand, by Trakhtenbrot's theorem we know that order-invariance (on the class of finite structures) is undecidable. The aim of this tutorial is to give an overview of the state of the art concerning the expressive power of order-invariant and arithmetic-invariant logics.
"A short tutorial on order-invariant logics" [preliminary version]
[The paper has appeared in the Proceedings of CSR 2013, 8th International Computer Science Symposium in Russia, 2013]
Moshe Y. Vardi
Rice University
Algorithmic verification is one of the most successful applications of automated reasoning in computer science. In algorithmic verification one uses algorithmic techniques to establish the correctness of the system under verification with respect to a given property. Automata-theoretic model checking is an algorithmic-verification technique that is based on a small number of key ideas, tying together graph theory, automata theory, and logic. In this self-contained tutorial I will describe how this "holy trinity" gave rise to algorithmic-verification tools, and discuss its applicability to both finite-state and infinite-state systems.
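The "graph theory + automata + logic" recipe can be illustrated for a simple safety property: take the product of a finite-state system with an automaton recognising the property's bad prefixes, then check reachability in the product graph. The toy system, its labels, and the property below are our own invented illustration, not material from the tutorial.

```python
# Minimal sketch of automata-theoretic model checking for a safety
# property: breadth-first reachability in the product of a labelled
# transition system and a "bad prefix" automaton.
from collections import deque

# Toy system: states with atomic-proposition labels and transitions.
system_trans = {"s0": ["s1"], "s1": ["s0", "s2"], "s2": ["s2"]}
label = {"s0": "idle", "s1": "busy", "s2": "error"}

# Property "never reach an 'error' label", given as a two-state
# automaton whose state "viol" flags a violation.
def bad_step(q, letter):
    return "viol" if (q == "ok" and letter == "error") else q

def violates(init_state, init_q="ok"):
    """Is a product state whose automaton component is 'viol' reachable?"""
    start = (init_state, bad_step(init_q, label[init_state]))
    seen, queue = set(), deque([start])
    while queue:
        s, q = queue.popleft()
        if q == "viol":
            return True          # counterexample path exists
        if (s, q) in seen:
            continue
        seen.add((s, q))
        for t in system_trans[s]:
            queue.append((t, bad_step(q, label[t])))
    return False

print(violates("s0"))  # prints True: s0 -> s1 -> s2 reaches an 'error' state
```

For liveness properties the same recipe uses Büchi automata and a search for reachable accepting cycles instead of plain reachability.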