# Programming Logic Seminar

Unless otherwise stated, talks take place at 13:15-14:15 in room 8103 (by the lunch room).

(There is a standing booking of room 8103 for Proglog meetings, every Wednesday 13:15-14:30, starting 2013-07-24.)

## UPCOMING TALKS

**2015**

Jesper Cockx is visiting June 9th to July 18th 2015.

## PAST TALKS

**2015**

**Wednesday April 15, 13:15-14:15**: Pierre Hyvernat, *Continuous functions between streams (et al.) as datatypes*

Brouwer already knew that continuous functions between streams (equipped with the usual product topology) have a representation in terms of infinite trees. Peter Hancock implemented this idea in dependent type theory and described it figuratively as "eating". He tried generalizing it to more general coinductive types, and it is only recently that we succeeded, using the notion of *interaction structure* (also called *indexed container*, or *polynomial functor* by others). I will present this work, introducing the necessary concepts on a "by need" basis: coinductive types and continuous functions, polynomial functors, simulations, and inductive/recursive definitions. This is joint work with Peter Hancock.

**Wednesday March 18, 13:15-14:15**: Jean-Philippe Bernardy, *Focusing by Evaluation*

We present an evaluator for Girard's LL, written in Haskell. Based on this evaluator, we construct a focusing algorithm for LL. Remarkably, focusing does not require any modification of the evaluator (one just calls it). We have used focusing to generate efficient code from LL proofs (resting on the usual proofs-as-programs interpretation). In fact, focusing and code generation proper can be fused to obtain a shorter, and perhaps clearer, code generator. Joint work with Víctor López Juan.

**Wednesday March 4, 13:15-14:15**: Peter Dybjer, *Undecidability of Equality in the Free Locally Cartesian Closed Category* (joint work with Simon Castellan and Pierre Clairambault)

We show that a version of Martin-Löf type theory with extensional identity, a unit type N1, Sigma, Pi, and a base type is the free locally cartesian closed category on one object, in a 2-categorical sense. We also show that equality in this category is undecidable, by reduction from the undecidable convertibility problem of combinatory logic.

**Wednesday February 25, 13:15-14:15**: Magnus Myreen, *CakeML: a formally verified compiler for ML*

I will present the CakeML project. This project centers around a subset of ML that is carefully chosen to be easy to write programs in and convenient to reason about formally. I will talk about results so far, in particular our formally verified implementation of CakeML (a compiler and a read-eval-print loop) in 64-bit x86 machine code. The construction and verification of this implementation required proofs about both high-level programs (the parser, type inferencer, and compiler) and low-level programs (the runtime in x86 machine code, its bignum library, and garbage collector). We used a novel technique to produce much of the verified low-level code: we bootstrapped the verified compiler algorithm, within higher-order logic, to produce the verified low-level implementation. I will also explain how divergence preservation was proved using simple clocks. This work was carried out using the HOL4 theorem prover.

**Wednesday February 18, 13:15-14:15**: Víctor López Juan, *A classical approach to compositional array programming*

Linear logics treat propositions as resources which must be produced or consumed a fixed number of times. This makes them potentially well-suited for modelling performance-sensitive programs. We use the classical variant of linear logic (CLL, Girard 1987) as the basis for a language in which programs can be composed in a functional style. At the same time, duality and linearity provide a straightforward fusion scheme and guarantee that its application will not duplicate work. The result is more general than current approaches, which rely either on compiler heuristics or on a fixed set of combinators. In the talk I will briefly describe the logic and our extensions to it. I will then discuss the core ideas behind the compilation strategy. Finally, I will show some examples of array kernels written in the language, and benchmarks thereof.

**Wednesday January 14, 13:15-14:15**: Andrea Vezzosi, *Representation Independent Existential Types in the Presheaf Model of Parametricity*

In the "existential" view of parametricity, existential types are the tool for defining abstract data types, where two different internal representations are still equal if they cannot be told apart through the provided interface. I will present a truncation operation that, when applied to Sigma types, gives the expected existentials. In their presheaf model of parametricity, Atkey et al. define a universe of discrete reflexive graphs to recover the identity extension lemma. That universe happens to be a reflective subcategory of sheaves, and the sheafification functor is the truncation from the general universe of reflexive graphs. I will then use such existentials to define inductive types à la Guarded Recursive Types.
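The representation-independence idea in the abstract above can be illustrated with a small Haskell sketch. The names and setup are my own, and a Haskell existential only loosely approximates the type-theoretic truncation of Sigma types, but it shows the phenomenon: two counters with different internal representations cannot be told apart through the packaged interface.

```haskell
{-# LANGUAGE ExistentialQuantification #-}

-- An abstract counter: the representation type r is hidden by the
-- existential, so clients can only observe it through the interface
-- packaged with it (an initial state, a step, and a read-out).
data Counter = forall r. Counter r (r -> r) (r -> Int)

-- Two counters with different internal representations...
intCounter :: Counter
intCounter = Counter (0 :: Int) (+ 1) id

listCounter :: Counter
listCounter = Counter ([] :: [()]) (() :) length

-- ...are indistinguishable through the interface: step n times, read.
observe :: Int -> Counter -> Int
observe n (Counter z step readOut) = readOut (iterate step z !! n)
```

For any `n`, `observe n intCounter` and `observe n listCounter` agree, even though one hides an `Int` and the other a list.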

**2014**

**Friday December 12**: Anders Mörtberg, PhD thesis defense: *Formalizing Refinements and Constructive Algebra in Type Theory*

**Thursday December 11**: Type theory and formalization of mathematics (in kårhuset, Ascom room)

**9:30-10:00**: Fika

**10:00-11:00**: Martin-Löf, *Making sense of normalization by evaluation*

**11:15-12:15**: Palmgren, *Variants of categories with families and their formalization*

Abstract: Categories with families (cwfs) are among the semantic structures for Martin-Löf type theory (MLTT) that are closest to the syntax of the theory. Formalizing the notion in type theory itself presents some choices: one can consider cwfs whose equality is interpreted as definitional equality of types and terms, which restricts the number of models, or as extensional equality, which presents some problems of coherence. We also consider cwfs with named projections and their relation to Martin-Löf's Substitution Calculus (1992). This talk presents work in progress.

**12:15-13:45**: Lunch

**13:45-14:45**: Gonthier, *Constructing Algebraic Numbers*

While algebraic numbers are ordinarily introduced as a subset of the (classical) complex numbers, it is common folklore that they can also be obtained constructively. However, actually constructing the algebraics is not so easy, and doing so elegantly and rigorously in a formal proof system is even harder. With R. O'Connor and C. Cohen, we had to constructively revisit several topics in number theory to arrive at a satisfactory resolution. In the process we also got an answer of sorts to another ancient conundrum: isn't the Fundamental Theorem of Algebra really a theorem in Analysis?

**15:00-16:00**: Lumsdaine, *Characterising identity types of inductive types: from 1 ≠ 0 to Blakers–Massey*

The “code-decode” method has been one of the most fruitful discoveries in the type-theoretic ("synthetic") development of homotopy theory. It extends traditional type-theoretic characterisations of identity types of inductive types (e.g. decidable equality of booleans and naturals) to similarly characterise or approximate the identity types of *higher* inductive types, recovering close analogues of various classical homotopy-theoretic results.

**16:00-16:30**: Fika

**16:30-17:30**: Rasmus Møgelberg, *Topos of trees*

**Wednesday December 3, 13:15-14:15**: Jean-Philippe Bernardy, *Controlled Array Fusion Using Linear Types*

One issue with today's functional languages is that they exhibit unpredictable efficiency: a seemingly harmless refactoring may significantly decrease the efficiency of the compiled code. We attribute this unpredictability to the following tension: on the one hand, removal of intermediate data structures (deforestation, or fusion) is critical to the performance of functional programs; on the other hand, fusion is unreliable. Based on classical linear logic, we present a calculus with reliable fusion, sufficient to express many array-based computations. An additional benefit of our design is that parallelism opportunities are made explicit, allowing efficient compilation to parallel architectures.

**Wednesday November 26, 13:15-14:15**: Simon Huber, *Implementing a variation of the cubical set model*

I will report on a new implementation of a type checker based on a variation of the cubical set model of type theory. It features function extensionality, the univalence axiom, and higher inductive types like the circle or propositional truncation, and the equality for J is definitional.

**Wednesday November 12, 13:15-14:15**: Andrea Vezzosi, *A Formalized Proof of Strong Normalization for Guarded Recursive Types*

Nakano's "later" modality allows negative recursive types and guarded general recursion without introducing inconsistencies.
The modality has found applications in synthetic step-indexing, enforcing causality in Functional Reactive Programming, and ensuring productivity of coinductive definitions. We present a strong normalization proof, and its formalization in Agda, for a small-step reduction parametrized by a natural number "depth" that expresses under how many guards we may step during evaluation.

**Wednesday November 5, 13:15-14:15**: Benjamin Pierce, *Linguistic Foundations for Bidirectional Transformations* (note: room changed to **EA**!)

Computing is full of situations where two different structures must be "connected" in such a way that updates to each can be propagated to the other. This is a generalization of the classical view update problem, which has been studied for decades in the database community; more recently, related problems have attracted considerable interest in other areas, including programming languages, software model transformation, user interfaces, and system configuration. Among the fruits of this cross-pollination has been the development of a linguistic perspective on the problem. Rather than taking some view definition language as fixed (e.g., choosing some subset of relational algebra) and looking for tractable ways of "inverting" view definitions to propagate updates from view to source, we can directly design new bidirectional programming languages in which every expression denotes a pair of functions mapping updates on one structure to updates on the other. Such structures are often called lenses. The foundational theory of lenses has been studied extensively, and lens-based language designs have been developed in several domains, including strings, trees, relations, graphs, and software models. These languages share some common elements with modern functional languages -- in particular, they come with very expressive type systems. In other respects, they are rather novel and surprising.
This talk surveys recent developments in the theory of lenses and the practice of bidirectional programming languages.

**Wednesday October 15, 13:15-14:15**: Francesco Mazzoli, *Type checking in the presence of meta-variables (reprise)*

When treating meta-variables in dependently typed languages, the literature usually presents the unification algorithm as a standalone component. However, the need to check definitional equality of terms while type checking gives rise to a tight interplay between type checking and unification. We propose an algorithm that expresses type checking entirely in the form of unification constraints, making the whole process significantly more modular and understandable.

**Wednesday October 8, 13:15-14:15**: Bas Spitters, *Formalizing Mathematics in the Univalent Foundations*

We have been formalizing mathematics in type theory for many years. Although great progress has been made, a number of issues remain hard to tackle, especially when leaving the realm of discrete mathematics. We mention the lack of quotient types and the absence of a clear mathematical semantics for the Coq type theory. Homotopy type theory promises a solution to these challenges. I will discuss how to develop some mathematics in the univalent foundations and report on the design of our Coq library. Basic familiarity with type theory and some categorical semantics is assumed.

**Wednesday September 24**: Koen Claessen, *Automated reasoning for intuitionistic propositional logic using a SAT-solver*

*ABSTRACT:* I have built an automated theorem prover for intuitionistic propositional logic, using an existing SAT-solver and ~150 lines of Haskell code. The prover is based on applying the idea of Satisfiability Modulo Theories (SMT) *recursively*.
SMT is a technique for implementing automated theorem provers that expresses the logic in which we want to prove things as an extension of propositional logic, so that we can use a SAT-solver to do the heavy lifting in the propositional part of the logic. My prover constructs proofs as well as counter-models. It beats all existing automated provers on most (existing and new) benchmarks. The main reasons for giving this presentation are: (1) it is a cool idea that may actually improve your understanding of intuitionistic logic (it certainly improved mine); (2) I want to ask if anyone has applications for this logic; (3) it may be more useful to extend the ideas to intuitionistic first-order logic, but I don't know how to do this yet and may need your help.

**Wednesday September 17**: Pierre Boutry (University of Strasbourg), *On decidability of Tarski's system of geometry*

*ABSTRACT:* In 1948 Alfred Tarski established the decidability of the full first-order theory of any instance of a real closed field with decidable comparison, or of any instance of Tarski's system of geometry with decidable equality. He achieved this by proving the completeness of his quantifier elimination procedure for the first-order theory of real closed fields. This procedure was programmed, and proved correct and complete, in Coq by Cyril Cohen and Assia Mahboubi. Our goal is to prove that this procedure can be applied to Tarski's system of geometry. To do so we first need to prove two statements: first, that the first ten axioms of Tarski's system of geometry form a model of a plane over an ordered field, which was recently formalized in Coq by Gabriel Braun; second, that any plane over an ordered field is a model of the first ten axioms of Tarski's system of geometry. We will first recall the axioms of both theories. We will then explain the modifications we made to Gabriel Braun's formal proof in order to obtain a proof that does not rely on the use of the excluded middle.
Next we will present our approach to proving the second statement. Finally, we will state what remains to be done to formally prove the completeness of Tarski's system of geometry.

**Wednesday September 10**: Florian Rabe (Jacobs University, Bremen), *MMT: A Foundation-Independent Approach to Declarative Languages*

*ABSTRACT:* MMT is a framework for representing declarative languages such as logics, type theories, or set theories. MMT achieves a high level of generality by systematically avoiding a commitment to a particular syntax or semantics. Instead, individual language features (e.g., lambda-abstraction, conjunction, etc.) are represented as reusable modules, which are composed into concrete languages. These modules can be declarative, specifying features as MMT theories, or programmatic, providing individual rules as plugins. Despite this high degree of abstraction, it is possible to implement advanced algorithms generically at the MMT level. These include knowledge management algorithms (e.g., IDE, search, change management) as well as logical algorithms (e.g., parsing, type reconstruction, module system). Thus, we can use MMT to obtain advanced implementations of declarative languages at extremely low cost.

**Wednesday August 20**: Damien Rouhling (ENS Lyon), *Universes in dependent type theory: let terms climb the ladder*

*ABSTRACT:* In 2011, Conor McBride wrote a blog post about the possibility of combining cumulativity and universe polymorphism in a dependent type system. He suggested obtaining cumulativity through a subtyping judgment and achieving universe polymorphism by using a lifting operator. This operator simply increments every universe level in a term and thus sends terms of one universe to terms of the next one. One can declare a term of a certain type and then use a lifted version of it when needed. I will show a proof of normalization by evaluation for a system in which the lifting operator is a term constructor.
I won't assume any knowledge of normalization by evaluation; only some familiarity with dependent types will be required from the audience. This will be a 20-minute talk.

**Wednesday June 25**: Damien Rouhling (ENS Lyon), *Congruence Closure Modulo a Theory*

*ABSTRACT:* The theory of Equality on Uninterpreted Functions can be combined with another theory; this combination is called the congruence closure of the given theory. In his PhD thesis, Stéphane Lescuyer described an algorithm to decide the consistency of a set of equalities and disequalities in such a combination. His approach gets its genericity from the use of a congruence on terms, hence the name congruence closure *modulo*. I will explain how I have exploited this algorithm to build a decision procedure for an automated prover called PSYCHE.

**Wednesday June 18**: Andrea Vezzosi, *A Categorical Perspective on Pattern Unification*

*ABSTRACT:* In 1991 Miller described a subset of the higher-order unification problem for the Simply Typed Lambda Calculus which admits most general unifiers, called the pattern fragment. This subset has been extended to more complex type theories and is still used as the basis of modern unification algorithms in applications like proof search and type inference. I will give a new presentation of the original unification algorithm that focuses on the abstract properties of the operations involved, using category theory as a structuring principle.

**Monday April 7**: Rasmus Møgelberg (ITU Copenhagen), *Guarded recursion in type theory*

*ABSTRACT:* This talk is about extensions of type theory with guarded recursive types, i.e., recursively defined types in which the recursion variable only appears underneath a modal operator.
One motivation for studying these type theories is that guarded recursive types can be used to construct models of programming languages with advanced notions of state. These can be seen as synthetic variants of models constructed using step-indexing techniques, but unlike those they offer a type system for expressing the model constructions. Guarded recursive variants of coinductive types have also been suggested as a way of expressing productivity requirements of coinductive definitions using types. In the talk I will describe the topos-of-trees model of guarded recursion and, if time permits, discuss intensional models of guarded recursion and the relations between guarded recursive types and coinductive types.

**Wednesday April 2**: Fabien Renaud (INRIA and École Polytechnique), *The ProofCert project*

*ABSTRACT:* In the ProofCert project, we are exploring to what extent we might be able to accommodate a wide range of proof evidence from various provers and then have independent checkers certify such proof evidence. In this talk I will present the various aspects of ProofCert as well as the design of the independent checkers, which rely on focused sequent calculi.

**Wednesday March 19**: Per Martin-Löf (Stockholm University), *Dependence and causality*

**Wednesday March 12**: Thierry Coquand, *(Pre)sheaf models of type theory and parametricity*

*ABSTRACT:* We explain what a presheaf model of type theory is and illustrate it with the example of the presheaf model over the category of partial bijections. This gives a semantics to the notion of "colors" of Jean-Philippe and Guilhem, and we can also give a semantics to unary parametricity. The model of cubical sets (without the Kan condition) is a variation and gives a semantics to binary parametricity.

**Wednesday February 19**: Andrea Vezzosi, *Mechanizing Pattern Unification*

*ABSTRACT:* In 1991 Miller described a subset of the higher-order unification problem which admits most general unifiers, called the pattern fragment.
This subset is still used as the basis of modern unification algorithms in applications like proof search and type inference. Starting from an intrinsically typed syntax for the Simply Typed Lambda Calculus, I will present a formalization of the original algorithm as an Agda program whose type ensures both soundness and completeness. In the implementation we make use of basic abstractions from category theory to structure our proofs and let the type constraints guide us to the unifier.

**Wednesday February 12**: Fredrik Lindblad, *A sequent calculus style of case expressions*

*ABSTRACT:* How to deal with unification combined with iota-reduction in Agsy was a long-standing issue. A couple of problem examples finally made it clear that the previous hack was not only unappealing but could also lead to performance problems. I will present an alternative way of representing case expressions and recursive functions, analogous to the way normal application is represented in Agsy and Agda. With a sequent calculus style of applications (a variable head plus a list of arguments), the syntax prohibits beta-redexes. Correspondingly, a sequent calculus style of case expressions prohibits iota-redexes, making unification a cleaner business.

**Wednesday January 29**: Thorsten Altenkirch (University of Nottingham), *Containers in Homotopy Type Theory*

*ABSTRACT:* Homotopy Type Theory (HoTT) is a new extension of Martin-Löf Type Theory which has been proposed as a new foundation of Mathematics. One of its main applications so far has been to use type theory to model constructions in homotopy theory. In this talk I will look at a potential application of HoTT in computer science, more specifically using containers to model datatypes. It turns out that containers give rise to a much richer notion of datatype in this setting, including multisets and cycles (a list without a beginning).
There is an interesting interaction with the notion of the derivative of a datatype, in particular the existence of antiderivatives, which was investigated by Gylterud.

**Wednesday January 22**: Bassel Mannaa, *Sheaf model of algebraically closed field extension*

*ABSTRACT:* I'll briefly present dynamical computation of algebraic numbers. I'll then present a model of this computation as a sheaf object over the site of finitely presented regular algebras. We also show that the axiom of choice fails in this model.

**Wednesday January 15**: Jean-Philippe Bernardy, *A sequent-calculus presentation of type-theory*

*ABSTRACT:* We aim to construct a type-theory which can be used as a backend for dependently-typed languages such as Agda or Coq. Such a language should be:

- a type-theory: correctness should be expressible via types
- low-level: one should be able to translate various high-level languages into this language while retaining properties such as run-time behaviour, etc.
- minimal: the type-checking program should be amenable to verification.

The talk will cover:

- syntax
- translation from a 'natural syntax'
- evaluation
- 'definitional' equality
- type-checking
- examples
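The "sequent calculus style of applications" mentioned in the abstracts above (an application is always a variable head plus a list of arguments) can be illustrated with a toy Haskell datatype. The names are my own and this is not the syntax of the proposed theory, but it shows the key point: in spine form, a beta-redex simply cannot be written, because the head of an application must be a variable rather than a lambda.

```haskell
-- Spine-form ("sequent calculus style") lambda terms: the head of an
-- application is always a variable, so beta-redexes are unrepresentable.
-- Hypothetical sketch, not the actual syntax of the talks above.
type Name = String

data Term
  = Lam Name Term         -- lambda abstraction
  | Neutral Name [Term]   -- variable applied to a spine of arguments
  deriving (Eq, Show)

-- A bare variable is just an empty spine.
var :: Name -> Term
var x = Neutral x []

-- Example: the K combinator, \x. \y. x
k :: Term
k = Lam "x" (Lam "y" (var "x"))
```

By contrast, a term like `(\x. x) y` has no direct representation here; it could only enter the system already reduced, which is exactly the invariant the spine discipline enforces.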

**2013**

**Wednesday December 18**: Thierry Coquand, *An implementation of univalence*

*ABSTRACT:* Together with Anders, Cyril and Simon, we have implemented computations for type theory with univalence (and propositional truncation). This implementation takes a closed term of this extension of type theory and computes its value in the cubical set model. If this term is of type Nat we get a numeral as a result (though the computation may use higher-dimensional objects along the way). In particular we can transport structures along isomorphisms. I will do a demo of the system.

**Wednesday December 4**: Fabien Renaud, *Some Insights into Focusing*

*ABSTRACT:* Focusing was originally presented as an efficient search procedure for linear logic. I will first introduce focusing as a way to constrain the shape of proofs in different logics and discuss the different presentations found in the literature. Then I will show how focusing can be useful in logic programming (to specify the chaining method used), in functional programming (to specify evaluation orders), and, if I have time, in some other recent applications.

**Wednesday November 27**: Cyril Cohen, *Refinements for free!* (j.w.w. Maxime Dénès and Anders Mörtberg)

*ABSTRACT:* Formal verification of algorithms often requires a choice between definitions that are easy to reason about and definitions that are computationally efficient. One way to reconcile both consists in adopting a high-level view when proving correctness and then refining stepwise down to an efficient low-level implementation. Some refinement steps are interesting, in the sense that they improve the algorithms involved, while others only express a switch from data representations geared towards proofs to more efficient ones geared towards computations. We relieve the user of these tedious refinements by introducing a framework where correctness is established in a proof-oriented context and automatically transported to computation-oriented data structures.
Our design is general enough to encompass a variety of mathematical objects, such as rational numbers, polynomials and matrices over refinable structures. Moreover, the rich formalism of the Coq proof assistant enables us to develop this within Coq, without having to maintain an external tool.

**Wednesday November 20**: Anders Mörtberg, *A Coq formalization of finitely presented modules* (j.w.w. Cyril Cohen)

*ABSTRACT:* The use of vector spaces in homology theory does not give the same information as R-modules. A generalization to finitely presented modules covers computable Z-homology, which gives more fine-grained topological invariants. We formalize a construction of finitely presented modules over coherent and strongly discrete rings and discuss its properties. This formalization provides basic building blocks that lead to a clean definition of homology groups.

**Wednesday November 13**: Andreas Abel, *Pattern inductive families need not store their indices, ever. Or: On polymorphic type theory, with an application to proof term compression*

*ABSTRACT:* We review monomorphic type theory with uni-directional type inference and contrast it with bidirectional type checking, a technique somewhat related to polymorphic type theory. We present a few ideas for term compression stemming from bidirectional type checking, in particular ideas from Edwin Brady's "Inductive families need not store their indices" (TYPES 2003) and Jason Reed's "Redundancy Elimination for LF" (LFMTP 2004), the latter a clarification of George Necula and Peter Lee's "Efficient Representation and Validation of Proofs" (LICS 1998). We review the current partial implementation of these ideas in Agda's internal syntax and outline opportunities to put them to full use (with the potential to reduce term size by the square root). We give a demonstration of their implementation in MiniAgda (under construction). This is an informal talk with half-baked results.

**Wednesday November 6**: Dan Rosén, *Write and type check Agda code in your browser*

*ABSTRACT:* I will show my prototype implementation of a web browser editor for Agda. It supports some of the most important features of the emacs interface: syntax highlighting, type checking, refining, case split and auto. This is a shorter talk, open for discussion about future directions.

**Wednesday October 30**: Åse Fahlander, *The correspondence between the universe à la Russell and the universe à la Tarski in Martin-Löf type theory*

*ABSTRACT:* It is generally assumed that type theory with universe(s) formulated à la Russell and type theory with universe(s) formulated à la Tarski are closely connected: rather two different ways of writing things than two different theories. I will present a proof that there is indeed a one-to-one correspondence between the derivable judgements in a version of type theory with one universe formulated à la Russell, and those derivable judgements in a corresponding version formulated à la Tarski that are irreducible with respect to a weak form of reduction. It also follows that there is a one-to-one correspondence between those derivable judgements in each theory that are irreducible with respect to ordinary reduction. The theories concerned are intensional and polymorphic, with untyped conversion in place of equality. The proof could probably be extended to cover other versions of type theory with one universe, but does not readily extend to multiple universes.

**Wednesday October 16**: Iliano Cervesato, *Towards Meta-Reasoning in the Concurrent Logical Framework CLF*

*ABSTRACT:* The concurrent logical framework CLF is an extension of the logical framework LF designed to specify concurrent and distributed languages. While it can be used to define a variety of formalisms, reasoning about such languages within CLF has proved elusive.
In this talk, we describe ongoing work towards defining an extension of LF that allows us to express properties of CLF specifications. We illustrate the approach with a proof of safety for a small language with a parallel semantics.

**Wednesday September 11**: Roy Crole, *Nominal Lambda Calculus*

*ABSTRACT:* Reasoning about atoms (names) is difficult. The last decade has seen the development of numerous novel foundations, theories and techniques. Our work is founded on nominal sets and the slightly more general notion of FM-sets. In the first part of the talk we shall give a brief introduction to FM-sets, expressing our ideas in the language of category theory. We shall try to give a range of concrete examples and to motivate the main part of the talk, which concerns what we term the Nominal Lambda Calculus. Theories in equational logic (also known as (simple) algebraic theories, or theories for equational reasoning) have been studied for many years. Clouston and Pitts introduced Nominal Equational Logic (NEL), which provides judgements both of expression equality and of freshness of atoms. Just as equational logic can be enriched with function types to yield the lambda-calculus, we have developed the Nominal Lambda Calculus (NLC) by enriching NEL with (atom-dependent) function types and abstraction types. In the second part of the talk we introduce pure NLC and look in detail at some of the rules of the NLC formal system. We show that NLC is sound by giving a categorical semantics, explaining some of the complexities that arise from a dependent type system. We then consider completeness and show that we cannot define a complete categorical semantics for pure NLC. However, by adding a novel form of dependently typed atom abstraction, we obtain a system that is complete. We demonstrate some of the rules for such atom abstraction, and outline how the problem with completeness can be solved in a neat manner.

**Wednesday September 4**: Simon Huber, *A Model of Type Theory in Cubical Sets*

*ABSTRACT:* I will present a model of type theory in a constructive metatheory. The model is based on cubical sets and can be seen as a constructive version of the Kan simplicial set model of type theory; the main difference from the simplicial set model is a strengthening of the Kan filling conditions.

**Thursday June 20** at **14:15** in **6128**: Koen Claessen will talk about SMT solvers. Koen will say something about "the implementation of SMT solvers, how they are used, what their relationship is with full first-order logic provers, and so on".

**Wednesday June 12**: Cyril Cohen, *A theory of contractible contexts*

**Wednesday May 7**: Cyril Cohen, *Formalization of algebraic numbers in Coq*

*ABSTRACT:* In this talk I show a formalization in Coq of real and complex algebraic numbers, together with their structures of real closed field and algebraically closed field. These numbers are useful in the recently completed formalization of the proof of the odd order theorem (a.k.a. the Feit-Thompson theorem), where they substitute for the real and complex numbers. In this proof, it is mandatory to know the partial order structure on complex numbers. This leads to building the complex algebraics on top of the real algebraic numbers. The construction of real algebraic numbers is based on a construction of Cauchy reals, in particular to define arithmetic operations. I will show the methodology we used to do epsilon-delta proofs in a comfortable way. The final step of this construction involves a quotient, which we show how to perform.
Finally, the construction of complex algebraic numbers and their order structure is done by extending the real algebraic numbers with the imaginary element i, and the proof of algebraic closure is the fundamental theorem of algebra.**Wednesday April 22**at**15:15**in**6128**: Nicolas Pouillard:*Constructive Probabilistic Reasoning*Joint work with Daniel Gustafsson.*ABSTRACT:*We present how to constructively reason about probabilistic functions in the type theory of Agda. We aim at being able to reason about security in, for example, e-voting protocols. Here, a lightweight model is constructed where probabilistic functions are presented as deterministic functions with an extra argument for the randomness. We build a generalised form of summation functions for a large collection of finite types. These summation functions exhibit a large number of derived properties, arising from two fundamental properties that these functions should have.**Wednesday March 27**: Peter Ljunglöf:*Type-based Human-Computer Interaction**ABSTRACT:*(This is an idea that I will present at the TYPES meeting in April. Since I've never been there before, I hope that you can give me lots of tips on how I can improve my presentation.) A dialogue system is a computer system that can engage in dialogue with a human in order to complete tasks such as answering questions or performing actions. Aarne Ranta and Robin Cooper described how a dialogue system can be implemented in a type-theoretic proof assistant. Metavariables in the proof tree represent questions that need to be answered by the user. However, their theory is rather simple and yields a form-based dialogue system that cannot handle underspecified answers, anaphoric expressions, or ambiguous utterances. We extend their theory with ideas from linguistic theories of dialogue and syntax, to make the system handle more flexible dialogues. 
Ranta and Cooper use focus nodes to keep track of which node in the tree is the current question under discussion. To this we add the idea of unfixed nodes from Dynamic Syntax. An unfixed node is a subtree which we know should be attached somewhere below a given node, but we do not yet know exactly where. We use unfixed nodes for representing underspecified answers, a very common phenomenon occurring in everyday dialogue between human participants. The general idea of the dialogue manager is to ask the user to refine the current focus node. User utterances are then translated to (possibly incomplete) subtrees, which the system tries to incorporate into the dialogue tree. If the user utterance is of the same type as the focused metavariable, the tree can be extended directly. Otherwise the system will try to add the utterance as an unfixed node below the focus, or it will try to change focus to another metavariable that has a compatible type. There are (at least) three possible refinement strategies for deciding which node to select as the next focus, corresponding to different dialogue behaviours: I call them top-down, bottom-up and bottom-down refinement. The ideas of metavariables and focus nodes are borrowed from the Agda proof assistant. 
Hopefully the new additions of unfixed nodes and refinement strategies can be borrowed back, and be used for making proof assistants more intuitive to use.**Wednesday Feb 20**in**8103**: Johan Glimming:*EU Project Proposal on Constructive Programming, http://www.functor.se/research/seminar-feb-2013/***Wednesday Jan 30**in**5128**: Cyril Cohen and Peter Dybjer:*What's going on in Princeton?***Friday Jan 18**: 1300 - 1400 Stevan Andjelkovic, Strathclyde University, will give a talk in 5128 about his PhD work.**Friday Jan 11**: Henrik Nilsson, University of Nottingham, UK:*Functional Programming Gets Physical**ABSTRACT:*Equational models of physical phenomena and systems are essentially non-causal in the sense that the equations state relations among the involved entities: there are no a priori assumptions about which entities are "known" and which are "unknown", needing to be computed from the known ones. However, making use of such models, for example predicting the behaviour of a system through simulation, has traditionally required a manual re-formulation into causal form to make the computation of unknowns from knowns explicit. This causal re-formulation can be very cumbersome, especially for large models. This has led to the development of modelling and simulation languages that directly support a non-causal formulation, obviating the need for manual causalisation. This also turns out to give additional important methodological and modularity benefits. However, the current industrial-strength non-causal modelling and simulation languages have a number of limitations, notably when it comes to describing hybrid systems: systems that exhibit both continuous-time and discrete-time characteristics. In this talk, I'll outline a novel approach to non-causal modelling and simulation of hybrid systems, called Functional Hybrid Modelling (FHM), aimed at mitigating some of these limitations. 
As the name suggests, it draws from functional programming; in particular, it can be seen as a generalisation of Functional Reactive Programming (FRP), which allows very general hybrid systems to be described, except only causally. I will also briefly discuss our ongoing work on formalising parts of the static and dynamic semantics of FHM-type languages using Agda as a meta-language.
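To make the causal/non-causal distinction above concrete, here is a hedged toy sketch in Haskell (not FHM or FRP proper; the names `eulerDecay`, `k`, `h` are illustrative). A non-causal language could accept the undirected equation `der x = -k * x` as-is; a causal simulator must orient it, e.g. as an explicit Euler update computing the next value from the known current one:

```haskell
-- Toy causal simulation of the model "der x = -k * x":
-- the equation has been oriented by hand into the update
-- x_{i+1} = x_i + h * (-k * x_i).
eulerDecay :: Double  -- decay constant k
           -> Double  -- step size h
           -> Double  -- initial value x0
           -> Int     -- number of steps
           -> [Double]
eulerDecay k h x0 n = take (n + 1) (iterate step x0)
  where
    step x = x + h * (negate k * x)  -- the unknown is computed from the known x
```

The manual orientation done in `step` is exactly the "causalisation" that non-causal modelling languages aim to make unnecessary.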

**2012**

**Wednesday Nov 7**: Cyril Cohen:*Meta-programming in Coq using Canonical Structures and Type Classes**ABSTRACT:*In Coq there are two different mechanisms for inferring more than standard type inference could do, through hints given by the user: Canonical Structures and Type Classes. Their implementations differ a lot, and the ways to use them are also quite different. I will try to give some insight on how the mathematical components team [1] instrumented canonical structures to build a hierarchy of algebraic structures [2,3] and compare their approach to the math classes [4,5] approach.

- [1] http://www.msr-inria.inria.fr/Projects/math-components/
- [2] *Packaging Mathematical Structures*, François Garillot, Georges Gonthier, Assia Mahboubi and Laurence Rideau
- [3] *Generic Proof Tools And Finite Group Theory*, François Garillot
- [4] http://math-classes.org/
- [5] *Type classes for mathematics in type theory*, Bas Spitters and Eelis van der Weegen
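For readers more at home in Haskell than Coq, the kind of layered hierarchy discussed above can be mimicked (very roughly; this is a hedged analogue with illustrative names, and Coq's canonical structures solve a richer inference problem) with ordinary type classes:

```haskell
-- A tiny hierarchy of algebraic structures via Haskell type classes,
-- loosely in the spirit of math-classes. All names are illustrative.
class MySemigroup a where
  op :: a -> a -> a          -- an associative binary operation (by convention)

class MySemigroup a => MyMonoid a where
  unit :: a                  -- a neutral element for op (by convention)

instance MySemigroup [x] where
  op = (++)

instance MyMonoid [x] where
  unit = []

-- A generic operation available at the monoid level of the hierarchy.
mconcat' :: MyMonoid a => [a] -> a
mconcat' = foldr op unit
```

The instance-resolution hints that Haskell performs silently here are, in Coq, what the canonical-structure and type-class machineries make programmable.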

**Wednesday Oct 31**: Jean-Philippe Bernardy:*Type-theory in color**ABSTRACT:*For a long time, there has been support for erasure in the Coq proof assistant: erasing proofs/propositions yields mere programs/types. So far, programs obtained like this have remained completely separated from the world of proofs: one cannot reason about erased programs within Coq. I propose a design to remedy this issue. An immediate consequence is improved potential for reuse in dependently-typed programs and libraries. Depending on interest/time I will move on to advanced applications:
- Increased logical power (parametricity)
- higher-dimensional aspects
- connections with equality and isomorphism
- non-computational function space (and equality variant based on it)
- other aspects that might arise in the discussion
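As a loose, hedged illustration of erasure in a non-dependent setting (a Haskell sketch, not the proposed Coq design; all names are illustrative): the type-level index below exists only at type-checking time and has no runtime representation, yet it still lets us state and enforce properties of the compiled program:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- The length index n is a compile-time artefact that is erased from the
-- running program, loosely analogous to erased proofs/propositions.
data Nat = Z | S Nat

data Vec (n :: Nat) a where
  Nil  :: Vec 'Z a
  Cons :: a -> Vec n a -> Vec ('S n) a

-- Total head: the erased index rules out the empty-vector case entirely,
-- so no runtime check is needed.
safeHead :: Vec ('S n) a -> a
safeHead (Cons x _) = x
```

The point of the talk's design, going beyond this, is to be able to reason *inside the system* about such erased programs.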

**Wednesday Oct 24**: Nils Anders Danielsson:*Isomorphism implies equality**ABSTRACT:*I will talk about an application of the univalence axiom. Take two monoids whose underlying types are "sets" (which means, roughly, that the corresponding identity types are proof-irrelevant). If these monoids are isomorphic—meaning that there is a bijective homomorphism from one to the other—then the univalence axiom implies that the monoids are*equal*. Note that the situation is different in set theory, in which isomorphic monoids are not (in general) equal. The result can be generalised to a large class of algebraic structures. For details, see http://homotopytypetheory.org/2012/09/23/isomorphism-implies-equality/. This is joint work with Thierry Coquand.**Wednesday Oct 17**: Chien-Chung Huang, Max Planck Institute for Computer Science, Saarbrücken:*Near-Popular Matchings in the Roommates Problem**ABSTRACT:*Our input is a graph G = (V, E) where each vertex ranks its neighbors in a strict order of preference. The problem is to compute a matching in G that captures the preferences of the vertices in a popular way. Matching M is more popular than matching M' if the number of vertices that prefer M to M' is more than those that prefer M' to M. The unpopularity factor of M measures by what factor any matching can be more popular than M. We show that G always admits a matching whose unpopularity factor is O(log |V|), and such a matching can be computed in linear time. In our problem the optimal matching would be a least unpopularity factor matching. We will show that computing such a matching is NP-hard. In fact, for any ε, it is NP-hard to compute a matching whose unpopularity factor is at most 4/3 - ε of the optimal.**Wednesday Sep 19**: Nils Anders Danielsson:*Half-baked talk: Correct-by-construction pretty-printing**ABSTRACT:*I'll describe a correct-by-construction framework for pretty-printing. 
The idea is to start with a grammar, and to see pretty-printer "documents" as annotated parse trees.**Wednesday Sep 5**: Laurent Théry:*Geometric Algebras: a formalisation**ABSTRACT:*We present a formalisation of geometric algebras carried out in the Coq proof assistant. For this, we have built a computational model of geometric algebras and shown that it has all the required properties. From this model, we have derived an automatic procedure to prove problems in plane projective geometry.**Wednesday May 30**: Fredrik Lindblad:*A Deductive System Designed for Proof Search in HOL**ABSTRACT:*We have developed a variant of Agsy aimed at higher-order logic, called Agsy-HOL. I will present the deductive system on which the theorem prover is based. The system is mainly characterized by a sequent calculus style of elimination and an equality judgment which alternates between normalization and reasoning by equality chains. The soundness of the system, with respect to Church's simple type theory, has been formalized in Agda. I will also mention some implementation details and the empirical results on the TPTP higher-order problems, although not much has changed since my last presentation.**Wednesday April 4**: Krasimir Angelov:*Types in Grammatical Framework**ABSTRACT:*In this talk I will outline the current status of the support for dependent types in GF and point out the differences and the similarities between GF and Agda. After that I will show why type checking and proof search in type theory are important for natural language applications. 
The presentation will mostly cover features that are already available, but I will also mention the new challenges that we encounter when we try to combine the logic-based approach of type theory with statistical language processing.**Wednesday March 28**at**15:00**, probably in**6128**: Thierry Coquand:*Models of type theory with extensional equality**ABSTRACT:*I will present work in progress with Simon giving a constructive version of Voevodsky's Kan simplicial set model of type theory. This provides a way to implement a type-checker for type theory with an identity type which satisfies the axiom of extensionality (with the required canonicity property). Some extensions (axiom of univalence, propositional reflection) seem possible.**Wednesday March 21**: Discussion about universes and universe polymorphism**Wednesday March 7:**Peter Dybjer:*Tests, games, and Martin-Löf's meaning explanations for intuitionistic type theory**ABSTRACT:*In this talk I shall argue that program testing provides the basis for constructive mathematical truth. This point of view leads to a modification of Martin-Löf's meaning explanations for intuitionistic type theory, where the judgements of type theory are viewed as conjectures which can be tested in order to be corroborated or refuted. In particular we get a new interpretation of hypothetical judgements, since tests for such judgements need methods for generating inputs. We sketch a "test manual" for intuitionistic type theory which employs concepts from game semantics; for example, input generation corresponds to opponent's moves in a game. In order to test a typing judgement we simultaneously play a strategy generated by a type and a strategy generated by a term, where the correct moves for the strategy of the term are determined dynamically by playing the strategy of the type. 
The talk is on joint work in progress with Pierre Clairambault, Cambridge.**Wednesday February 15**: Dimitrios Vytiniotis:*Stop when you are Almost-Full: adventures in constructive termination*(Joint work with Thierry Coquand and David Wahlstedt.)**Wednesday January 25**in**6128**: Guilhem Moulin:*A computational interpretation of parametricity, Cont.***Wednesday January 18**: Guilhem Moulin:*A computational interpretation of parametricity*(Joint work with Jean-Philippe Bernardy)*ABSTRACT:*Reynolds' abstraction theorem has recently been extended to lambda-calculi with dependent types. In this paper, we show how this theorem can be internalized. More precisely, we describe an extension of Pure Type Systems with a special parametricity rule (with computational content), and prove fundamental properties such as Church-Rosser and strong normalization. All instances of the abstraction theorem can be both expressed and proved in the calculus itself. Moreover, one can apply parametricity to the parametricity rule: parametricity is itself parametric.**Tuesday January 10**at**11:15**in**6128**: Marc Bezem:*A set that is streamless but not provably noetherian**ABSTRACT:*Coquand and Spiwack recently coined the concepts streamless and noetherian as constructive notions, classically equivalent to finiteness. Noetherian implies streamless, and Coquand and Spiwack conjectured the existence of a set that is streamless but not provably noetherian (all in the constructive sense). In this talk we present an example of such a set, which is highly undecidable. This result leads to a more difficult question: the existence (or not) of an arithmetical or even decidable set with such properties.
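To make the streamless notion concrete, here is a hedged Haskell toy (the talk's set is highly undecidable, quite unlike this example): a type with finitely many values is streamless because, by pigeonhole, any stream over it must repeat a value within a known prefix.

```haskell
-- Search a finite prefix of a stream for two positions holding equal values.
-- For a type with n values, a prefix of length n + 1 always suffices.
findDup :: Eq a => Int -> [a] -> Maybe (Int, Int)
findDup prefixLen xs =
  let pre = zip [0 ..] (take prefixLen xs)
  in case [ (i, j) | (i, x) <- pre, (j, y) <- pre, i < j, x == y ] of
       []      -> Nothing
       (p : _) -> Just p
```

For `Bool` (two values) a duplicate is guaranteed in any prefix of length 3; the interest of the constructive notions is precisely that for general streamless sets no such uniform, computable bound need be available.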

**2011**

**Wednesday December 14**: Dan Rosén:*First steps in Automated Induction in Agda**ABSTRACT:*After Moa's talk last week about induction, I made an attempt at writing an automated induction solver in Agda, for the theory of natural numbers. It does induction on one variable and instantiates user-specified lemmas. If it gets stuck it will report the current state to the user. The set of functions is extendable: the user can choose unary and binary operators over natural numbers, provided that the normalisation functions are supplied. I will show a demo of the inductive prover in action, and discuss future work, which includes using stronger rewriting techniques like rippling and adding induction over other datatypes than natural numbers.**Wednesday December 7**: Moa Johansson:*Automating Induction: A quick Rippling tutorial**ABSTRACT:*Following last week's talk when induction came up, I thought it might be good to have a talk introducing the rippling heuristic for automating inductive proofs. Rippling is used to rewrite the step-case of inductive proofs, with the main idea being that the conclusion should only be manipulated in such a way that it becomes 'more similar' to the inductive hypothesis, as we want to use the hypothesis to conclude the proof. This is implemented by having a similarity measure between the conclusion and the hypothesis which is required to decrease in each step. This not only guides the search, but also means that rewrite rules do not need to be oriented in advance and that termination can be guaranteed. 
I will also say something about how rippling is used in the context of proof-planning, and perhaps give a quick demo of the IsaPlanner system and its automated induction.**Wednesday November 30**: Peter Dybjer:*Using Agda for combining Interactive and Automatic Theorem Proving in First Order Theories of Combinators*(Joint work with Ana Bove and Andrés Sicard-Ramírez.)*ABSTRACT:*We propose a new approach to the computer-assisted verification of functional programs. We work in first order theories of functional programs which are obtained by extending Aczel's first order theory of combinatory formal arithmetic with positive inductive and coinductive predicates. Rather than building a special purpose system we implement our theories in Agda, a proof assistant for dependent type theory which can be used as a generic theorem prover. Agda provides support for interactive reasoning by encoding first order theories using the formulae-as-types principle. Further support is provided by off-the-shelf automatic theorem provers for first order logic which can be called by a tool which translates Agda representations of first order formulae into the TPTP language understood by the provers. We show some examples where we combine interactive and automatic reasoning, covering both proof by induction and coinduction.**Wednesday November 23**: Thierry Coquand talked about a constructive proof of the correctness of the size-change principle.**Wednesday 2nd of November**, Ana Bove:*Gilbreath trick in Agda**ABSTRACT:*In 1991 Gérard Huet presented a proof development of the Gilbreath trick (a non-trivial property of binary sequences) in the Coq proof assistant (version 5.6). With Huet's article as a starting point I formalised the trick in Agda.

Here is my Agda code (could be prettier :-). Here is Coq code by Vincent Siles.**Wednesday October 26**, Thierry Coquand talked about equality.**Tuesday October 11**, Nils Anders Danielsson:*Dependent lenses**ABSTRACT:*Haskell's record update notation is not perfectly suited for updating a field of a record nested in another. One might write something awkward like `r { field₁ = (field₁ r) { field₂ = g (field₂ (field₁ r)) } }`. Fortunately some people [1] have found a better way to do this, using so-called lenses. A lens of type `Lens A B` basically provides an updatable `B`-view of `A`s. If we have a lens `f₁` for `field₁`, and another lens `f₂` for `field₂`, then the code above can be replaced by `modify (f₂ ∘ f₁) r g`. Here `_∘_` is composition of lenses. The lenses above are non-dependent; the `B` type is not indexed. However, in a setting with dependent records one probably wants to use *dependent* lenses, where `B` is indexed by values of type `A`. I will discuss some possible designs for dependent lenses. [1] I don't have a full overview of the lens literature, but lenses seem to have their roots in database research done in the 1970's.**Tuesday September 27**, Vincent Siles:*How to formalize mathematics using Coq/Ssreflect**ABSTRACT:*When we read an algorithm in a textbook, we often see statements like "let x be a non-zero element of the matrix M, then ...". However, if you want to implement this particular algorithm, you might want to avoid searching a whole 10000x10000 matrix for a non-zero element. We identified two phases in the development of mathematical libraries:
- a first one in which we can simply describe what the algorithm does and why it is correct (Ssreflect is a good framework for this);
- a second one where we implement the algorithm in a more efficient way (Ssreflect is *not* a good framework for this) and we prove that our "computable" version is equal to our first description.

We successfully applied this schema to two big examples: matrix and polynomial manipulation. (My running example is going to be about matrix manipulation and how to compute the Smith normal form.)**Wednesday August 23 in 5128**, Nils Anders Danielsson:*NBE for outrageous but meaningful coincidences**ABSTRACT:*I am implementing a reusable library for well-typed de Bruijn indices, using ideas from Conor McBride's papers Type-Preserving Renaming and Substitution and Outrageous but Meaningful Coincidences: Dependent type-safe syntax and evaluation. This library contains parametrised definitions of things like contexts, variables and substitutions, along with a number of lemmas. In order to test the library I am implementing a normaliser for a simple dependently typed language, using normalisation by evaluation. My presentation will focus on this implementation; expect to see lots of code. 
The (evolving) code is available:**Wednesday July 06**, Guilhem Moulin:*Proving termination / contextual equivalence**ABSTRACT:*I'll try to summarize the lecture Amal Ahmed gave last week at OPLSS. Starting with reducibility predicates, I'll recall (sketch) the strong normalization proofs for Gödel's system T and Girard's system F, then amend the predicates to be able to add recursive types. Eventually, I'll show how this technique, namely step-indexed logical relations, can be extended to prove contextual equivalence and approximation.*REFERENCES:*
- Appel, A.W., McAllester, D.: An indexed model of recursive types for foundational proof-carrying code. ACM TOPLAS 23(5) (2001) 657-683
- Ahmed, A.: Step-indexed syntactic logical relations for recursive and quantified types. Lecture Notes in Computer Science (2006) 69-83
- Pitts, A.: Step-Indexed Biorthogonality: a Tutorial Example
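As a reminder of the shape of the definitions used in the references above (a textbook-style sketch with minor notational liberties, not the exact formulation of any one paper), the step-indexed value and expression relations for a function type look roughly like:

```latex
% Step-indexed logical relation (unary case, function type only; sketch)
\mathcal{V}_k[\![\tau \to \sigma]\!] =
  \{\, \lambda x.\, e \mid \forall j \le k,\ \forall v \in \mathcal{V}_j[\![\tau]\!].\
      e[v/x] \in \mathcal{E}_j[\![\sigma]\!] \,\}

\mathcal{E}_k[\![\tau]\!] =
  \{\, e \mid \forall j < k.\ (e \mapsto^{j} e' \ \text{with } e' \text{ irreducible})
      \Rightarrow e' \in \mathcal{V}_{k-j}[\![\tau]\!] \,\}
```

The index k bounds how many evaluation steps the definition may "look ahead", which is what breaks the circularity that recursive types would otherwise introduce.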

**Wednesday June 22**, Phil Scott:*Towards Geometry of Interaction for Polarized Linear Logic***Thursday June 9 in 3320**, Anders Mörtberg:*Homology of simplicial complexes in Haskell**ABSTRACT:*The main motivation for studying the Smith normal form algorithm in the ForMath project is to compute so-called homology groups of simplicial complexes. I will describe what these are and why it is interesting to compute them. I will end with a small application of computing the number of connected components and holes in 2D digital images that I've implemented in Haskell.**Wednesday June 8**, Patricia Johann:*A generic induction rule for inductive types***Wednesday May 25**, Simon Huber:*Bar induction, bar recursion, and a computational interpretation of classical analysis**ABSTRACT:*In last week's talk by Nisse, some questions arose concerning bar induction and bar recursion, as well as how one gives a computational meaning to the double negation shift (DNS). I will try to explain the two aforementioned notions and show how modified bar recursion by Paulo Oliva and Ulrich Berger gives a realizer for DNS; moreover, I will discuss how the latter gives rise to an extraction method for the impredicative theory of Peano Arithmetic plus countable choice.**Wednesday May 18**, Nils Anders Danielsson:*Infinitely often**ABSTRACT:*In the previous talk Marc Bezem discussed several ways in which one can state that "this predicate only holds for a finite number of natural numbers". In this talk I'll define "this predicate holds infinitely often" in several ways, and discuss how the definitions are related. 
I'll also discuss under what circumstances the construction commutes with binary sums (in the double-negation monad), and why this property can be useful.**Friday May 13 at 10:15 in 6128**, Marc Bezem:*On streams that are almost always black*(joint work with Keiko Nakata and Tarmo Uustalu)*ABSTRACT:*Mixing induction and coinduction, we give novel definitions of streams being almost always black. We organize our definitions into a hierarchy including also some well-known alternatives in intuitionistic analysis. The hierarchy collapses classically, but is intuitionistically of strictly decreasing strength. We characterize the differences in strength in a precise way by weak instances of the Law of Excluded Middle.**Wednesday May 11th**, Vincent Siles:*A decision procedure for regular expression equivalence*(joint work with Thierry Coquand)*ABSTRACT:*As part of the ForMath project, Thierry and I have been formalizing an algorithm (due to Brzozowski) to decide the equivalence of regular expressions, using Coq/ssreflect: two regular expressions are equivalent iff they recognize the same language. I'll present a variant of Brzozowski's algorithm based on a new (inductive) definition of finite sets, not equivalent to the usual one. Our algorithm uses well-founded induction, which is known to often prevent computation within Coq. Thanks to a "logical trick" by Bruno Barras, we managed to compute with it within Coq, without performing extraction, which gives us a completely certified program to decide the equivalence.**Wednesday 4th of May**, Anders Mörtberg:*An overview of the SSReflect extension to the Coq system**ABSTRACT:*The Small Scale Reflection (SSReflect) extension to the Coq system was initially developed by Georges Gonthier during the formalisation of the four color theorem and has since been successfully used for formalising many other mathematical theories. 
I will give an overview of what SSReflect adds to the existing Coq system in terms of "new" tactics, mathematical theories and the small-scale reflection methodology.**Wednesday 30th of March**, Simon Huber & Guilhem Moulin:*µOTT: a toy implementation of Observational Type Theory**ABSTRACT:*We'll present the project we did for the AFP course, namely an experimental "proof assistant" written in Haskell for a fragment of Observational Type Theory, as introduced in the paper Observational equality, now! by Thorsten Altenkirch, Conor McBride & Wouter Swierstra.**Wednesday 16th of March**, Thierry Coquand:*Report on the meeting on the homotopy interpretation of constructive type theory**ABSTRACT:*There was an interesting one-week workshop, 02/27-03/05, on homotopy and type theory with lectures by V. Voevodsky. I will try to give a report on the lectures and discussions.**Wednesday 2nd of March**: Nils Anders Danielsson talked about Conor McBride's paper Outrageous but Meaningful Coincidences: Dependent type-safe syntax and evaluation**Wednesday 23rd of February**, Jason Reich:*Verifying a Compiler for the Reduceron in Agda**ABSTRACT:*My research is concerned with the production of a dependable optimising compiler for the Reduceron platform. I am currently using Agda to verify its correctness. In this talk, I shall first introduce the Reduceron, an FPGA-based graph reduction machine for executing lazy functional programs, and outline the current compilation process. Supercompilation will be briefly mentioned, as an optimisation currently under consideration. 
Then, I will outline the correctness proof and discuss how I am expressing it in Agda.**Wednesday 16th of February**at**15:15**, Vincent Siles,*A unified meta-theory of Pure Type Systems**ABSTRACT:*There are two usual ways to describe equality in a dependent typing system: one that uses an external notion of computation like beta-reduction, and one that introduces a typed judgment of beta-equality directly in the typing system. After being open for some time, the general equivalence between both approaches was solved by Adams for a large class of Pure Type Systems (PTSs) called functional. During my PhD thesis, I managed to extend this result to any kind of PTS, without restriction, by combining the ideas of Adams with a study of the general shape of types in PTSs. The elaboration of such a technical proof was eased by the use of the proof assistant Coq. In this talk, I'll expose the main problems raised by this equivalence, and how the use of Coq helped me to finally settle this proof.**Wednesday 8th of February**, Nils Anders Danielsson:*A Formalisation of Homotopy Levels and Weak Equivalence Based on an Axiomatic Equality***Thursday 20th of January**, OBS: 15.15 in room 6128!! Helmut Schwichtenberg:*TBA*.**Wednesday 19th of January**, Matthieu Sozeau:*Equations: a dependent pattern-matching compiler**ABSTRACT:*We present a compiler for definitions made by pattern matching on inductive families in the Coq system. It allows one to write structured, recursive, dependently-typed functions, automatically find their realization in the core type theory, and generate proofs to ease reasoning about them. The high-level interface allows one to write dependently-typed functions on inductive families in a style close to Agda or Epigram, while their low-level implementation is accepted by the vanilla core type theory of Coq. This setup uses the smallest possible trusted code base, and additional tools are provided to maintain a high-level view of definitions. 
The compiler makes heavy use of type classes and the high-level tactic language of Coq for greater genericity and extensibility.

**2010**

**Wednesday 24th of November**, Fredrik Lindblad:*Auto in Agda*.*ABSTRACT:*The Auto command has been around in Agda for about a year. In the latest release it was formally announced. In this talk I'll summarize the technology behind the proof search. Then I'll show how to use the command. Different ways of using it are accompanied by examples. Finally, I'll report on an experiment we did on the higher-order subset of the TPTP problems. The results show that the performance of our approach is comparable to other systems solving HOL problems.**Wednesday 17th of November**, Ana Bove:*10 Years of Partiality and General Recursion*.*ABSTRACT:*In this talk I will present the invited talk I gave at DTP-2010 in Edinburgh during FLoC. Constructive type theory (CTT), even if originally developed for formalising mathematics, has been seen as a programming language with dependent types for the past 30 years. CTT is a theory of total functions. This clearly rules out genuine partial functions. Most systems based on CTT only allow structurally recursive functions as a way to guarantee totality. These limitations are a real problem when we are faced with the task of formalising and proving correct a function that does not satisfy the requirements. The search for methodologies to simplify the formalisation of general recursive and partial functions in CTT has been a hot topic during the last decade. Many ideas of different natures have been developed in order to facilitate the definition of this class of functions and their subsequent proofs of correctness. However, the issue of how to best formalise recursive and partial functions is far from being resolved. In this talk we will have a look at some of the methodologies that have been popularised in the last few years. We will mainly concentrate on methodologies available for systems like Coq and Agda, but we will also discuss some of the methodologies available for Isabelle/HOL. 
Though Isabelle/HOL is not based on CTT but on classical higher-order logic, functions should be total in Isabelle/HOL if one is to use the system as a theorem prover.**Wednesday 3rd of November**, JP Bernardy:*Realizability and Parametricity in PTSs*.*ABSTRACT:*We describe a systematic method to build a logic from any programming language described as a Pure Type System (PTS). The formulas of this logic express properties about programs. We define a parametricity theory about programs and a realizability theory for the logic. The logic is expressive enough to internalize both theories. Thanks to the PTS setting, we abstract most idiosyncrasies specific to particular type theories. This confers generality to the results, and reveals parallels between parametricity and realizability. If time allows, I will also discuss the implications of the construction on the design of dependently typed languages and proof assistants.**Wednesday 27th of October**, Andrew Pitts:*Nominal sets, part II.**ABSTRACT:*This is the second of two talks about the category of nominal sets as a place for modelling computation on structures involving names and name-binding operations. A copy of the slides for this seminar is available at <http://www.cl.cam.ac.uk/users/amp12/talks/nominal-sets-2.pdf>. Recommended reading: "Structural Recursion with Locally Scoped Names", which is available at <http://www.cl.cam.ac.uk/~amp12/papers/strrls/strrls.pdf>.**Wednesday 20th of October**at 11.00,*Discussion about important research problems in type theory*.**Wednesday 20th of October**, Andrew Pitts:*Nominal sets, part I*.**Tuesday 5th of October**in the EDIT room, Andrew Pitts:*Semantics of Scope*.*ABSTRACT:*This talk concerns the operational semantics of programming constructs involving locally scoped names. 
Typically such semantics involves stateful *dynamic allocation*: a set of currently-used names forms part of the state, and upon entering a scope the set is augmented by a new name bound to the scoped identifier. (Deallocation of names upon leaving a scope is sometimes appropriate too.) More abstractly, one can see this as a transformation of local scopes by expanding them outward to an (implicit) top level. By contrast, in a neglected paper from POPL 1994, Martin Odersky gave a stateless lambda calculus with locally scoped names whose dynamics contracts scopes inward. The properties of "Odersky-style" local names are quite different from dynamically allocated ones: the former enjoy much nicer equational properties up to contextual equivalence than the latter. Until now, however, it has not been clear what the expressive power of Odersky's system is. Here we show that Odersky-style local names provide a direct semantics of scope from which the dynamic-allocation semantics of scope can be obtained by continuation-passing translation. More precisely, we show that there is a CPS translation of typed lambda calculus with dynamically allocated names (the Pitts-Stark nu-calculus) into Odersky's λν-calculus that is computationally adequate with respect to contextual equivalence in the two calculi. This is joint work with Steffen Loesch.

**Wednesday 16th of June**, Marc Lasson: *Internalized realizability in pure type systems*. *ABSTRACT:* Let P be any pure type system. We are going to show how we can extend P into a PTS P' which will be used as a proof system whose formulas express properties about sets of terms of P. We will show that P' is (strongly) normalizable if and only if P is. Given a term t in P and a formula F in P', P' is expressive enough to construct a formula "t ⊩ F" that is interpreted as "t is a realizer of F".
We then prove the following adequacy lemma: if F is provable, then by projecting its proof back to a term t in P we obtain a proof that "t ⊩ F". Finally, we will present the relation between this realizability construction and Jean-Philippe's work on Reynolds' parametricity in PTSs.

**Wednesday 9th of June**, Koen Claessen: *Generating Counterexamples for Structural Inductions by Exploiting Nonstandard Models*. *ABSTRACT:* Induction proofs often fail because the stated theorem is non-inductive, in which case the user must strengthen the theorem or prove auxiliary properties before performing the induction step. (Counter)model finders are useful for detecting non-theorems, but they will not find any counterexamples for non-inductive theorems. We explain how to apply a well-known concept from first-order logic, nonstandard models, to the detection of non-inductive invariants. Our work was done in the context of the proof assistant Isabelle/HOL and the counterexample generator Nitpick. This is joint work with Jasmin Blanchette.

**Wednesday 14th of April** in the EDIT room, Bob Constable: *Recent Advances in Computational Type Theory*. *ABSTRACT:* Computational Type Theory (CTT) is a theory of computation that has been implemented in the Nuprl and MetaPRL theorem provers. It is in the same family as Martin-Löf's Intuitionistic Type Theory (ITT) implemented in Agda and the Calculus of Inductive Constructions (CIC) implemented in Coq. The Nuprl system has been used to specify, develop, and verify software systems, to prove theorems in constructive mathematics, and to solve open problems in mathematics. The type system of CTT is explicitly designed to capture concepts used in modern software systems, including distributed systems. This lecture will examine some of the new type constructors and computational primitives added to CTT over the past seven years, including dependent intersection types, subtyping, unguessable atoms, and processes.
I will discuss how the process type allows us to implement the notion of constructive proofs as processes.

**Wednesday 28th of April** in room 5128, Pierre Clairambault: *Game semantics*.

**Thursday 29th of April** at 13.15 in the Philosophy Department, Olof Wijksgatan 6 (Gamla Hovrätten), room T340, David Wahlstedt (discussion leader): *Truth and Provability*.

**Wednesday 5th of May**, Jean-Philippe Bernardy: *Parametricity in Agda, Inductive definitions and Extensional equality*. *ABSTRACT:* We extend Reynolds' abstraction theorem to an Agda-like type system. Issues related to inductive definitions and extensionality will be discussed.

**Wednesday 26th of May**, Miroslav Dobsicek: *On analogies between physics, topology, logic and computation*. *ABSTRACT:* I will present a recent paper by John C. Baez and M. Stay: "Physics, Topology, Logic and Computation: A Rosetta Stone", available at <http://arxiv.org/abs/0903.0340/>.
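Several of the talks above (Bernardy, Lasson) revolve around Reynolds' parametricity and the "free theorems" it yields. As a purely illustrative aside, not taken from any of the talks, the classic free theorem for a polymorphic list transformer — any function of type ∀a. List a → List a must commute with map — can be spot-checked in a few lines of Python (the function `g` here is a hypothetical stand-in for an arbitrary polymorphic transformer):

```python
# Toy illustration of a "free theorem" from Reynolds' parametricity:
# any parametrically polymorphic g : forall a. List a -> List a
# must satisfy  g(map(f, xs)) == map(f, g(xs))  for every f and xs.

def g(xs):
    """An arbitrary 'polymorphic' transformer: reverse, then keep two items."""
    return xs[::-1][:2]

def commutes_with_map(f, xs):
    """Check one instance of the free theorem for g."""
    return g([f(x) for x in xs]) == [f(x) for x in g(xs)]

print(commutes_with_map(lambda n: n * n, [1, 2, 3, 4]))  # prints True
```

A finite check is of course no proof; the point of the abstraction theorem discussed in the talks is that the equation follows once and for all from the type alone.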

**PREVIOUS YEARS' TALKS**

Other past talks can be found on the old pages.