Language and Computation track advanced course, ESSLLI 2018
Instructors: Jakub Dotlačil, Adrian Brasoveanu
August 6-10, 2018, 2:00-3:30 pm


This course introduces a new framework that integrates (i) formal syntactic and semantic theories, (ii) mechanistic processing models, and (iii) Bayesian methods of data analysis and parameter estimation. The main goal of the framework is a theoretical one: it enables us to build evidence-based, mathematically and computationally explicit theories/systems/models of natural language meaning (product) and interpretation (process).

The integration proceeds in two parts. First, competence-level generative theories and, in particular, dynamic semantics approaches to natural language meaning and interpretation (DRT: Kamp 1981, Kamp and Reyle 1993; FCS: Heim 1982; DPL: Groenendijk and Stokhof 1991) are embedded in performance-level processing theories formulated in the ACT-R cognitive architecture (Adaptive Control of Thought-Rational; Anderson and Lebiere 1998, Lewis and Vasishth 2005, Anderson 2007 a.o.).
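To give a concrete feel for the product side of this integration, i.e., the kind of semantic representation that is incrementally built during processing, here is a minimal Python sketch of a DRS being updated for the two-sentence discourse "A man walks in. He whistles." The encoding below (a set of discourse referents plus a list of conditions) is our own illustrative simplification, not the representation used in the course materials.

    # A minimal, illustrative DRS encoding, updated incrementally while
    # processing "A man walks in. He whistles." (our own simplification).
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Condition:
        pred: str     # predicate symbol, e.g. "man"
        args: tuple   # discourse referents the predicate applies to

    @dataclass
    class DRS:
        drefs: set = field(default_factory=set)    # universe of discourse referents
        conds: list = field(default_factory=list)  # conditions on those referents

    drs = DRS()

    # "A man walks in." introduces a new discourse referent x1 and two conditions.
    drs.drefs.add("x1")
    drs.conds += [Condition("man", ("x1",)), Condition("walk_in", ("x1",))]

    # "He whistles." introduces no new referent: the pronoun is resolved to the
    # accessible referent x1, and a further condition is added.
    drs.conds.append(Condition("whistle", ("x1",)))

    print(drs)

The point of the embedding is that updates of this kind are carried out step by step by ACT-R productions and memory retrievals as the sentence is read, rather than being assigned to a sentence all at once.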

Second, these integrated competence-performance theories, formalized as mechanistic processing models, become part of a Bayesian model, which can be fitted to experimental data. The main upshot is that we are able to consider alternative syntactic and semantic theories, as well as alternative theoretically-motivated processing models, and quantitatively compare how well they fit data from a variety of experimental tasks (lexical decision, forced choice, self-paced reading, eye-tracking while reading, etc.).
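As a toy illustration of the Bayesian component (our own sketch, not one of the course models): suppose we observe mean retrieval latencies at several activation levels and want to estimate the latency factor F in ACT-R's standard latency equation T = F * exp(-A). One possible way to set this up, using pymc3, is:

    # A toy sketch of fitting an ACT-R parameter to data with pymc3 (one possible
    # tool). The data and priors below are made up purely for illustration.
    # ACT-R latency equation: retrieval time T = F * exp(-A), where F is the
    # latency factor and A is the activation of the retrieved chunk.
    import numpy as np
    import pymc3 as pm

    activations = np.array([0.5, 1.0, 1.5, 2.0])       # hypothetical activations
    observed_rt = np.array([0.36, 0.22, 0.14, 0.08])   # hypothetical latencies (s)

    with pm.Model():
        F = pm.HalfNormal("latency_factor", sigma=1.0)  # prior over F
        noise = pm.HalfNormal("noise", sigma=0.1)       # residual noise
        mu = F * np.exp(-activations)                   # predicted latencies
        pm.Normal("rt", mu=mu, sigma=noise, observed=observed_rt)
        trace = pm.sample(1000, tune=1000, cores=1)

    print(pm.summary(trace))

Once competing processing models are expressed this way, they can be compared quantitatively, for example via their posterior predictions for the observed reading-time or accuracy data.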

The main goal of the framework is a theoretical one: we want to formulate integrated competence-performance theories of natural language syntax and semantics, fit them to experimental data, and quantitatively compare them. But this theoretical goal can be achieved only if we computationally implement these theories and models. Therefore, a crucial component of the framework is a new Python3 implementation of ACT-R, pyactr, together with a growing collection of implemented mechanistic processing models for a variety of syntactic and semantic phenomena.
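To give a flavor of what a pyactr model specification looks like, here is a minimal toy model (our own example, written against pyactr's publicly documented chunk and production syntax; it is not one of the course models). A single production fires on the goal chunk and requests retrieval of the matching lexical entry from declarative memory:

    # A minimal pyactr sketch: a toy lexical-retrieval model (our own example).
    import pyactr as actr

    actr.chunktype("word", "form, meaning")
    actr.chunktype("parse_goal", "task, form")

    model = actr.ACTRModel()

    # declarative memory: one lexical entry
    model.decmem.add(actr.chunkstring(string="""
        isa word
        form whistles
        meaning whistle"""))

    # goal buffer: retrieve the entry for the word form currently being read
    model.goal.add(actr.chunkstring(string="""
        isa parse_goal
        task retrieve_word
        form whistles"""))

    model.productionstring(name="retrieve_lexical_entry", string="""
        =g>
        isa parse_goal
        task retrieve_word
        form =f
        ==>
        =g>
        isa parse_goal
        task done
        +retrieval>
        isa word
        form =f""")

    sim = model.simulation()
    sim.run()   # prints a timestamped trace of rule firings and retrievals

The course's models are much richer than this, but they are specified in the same declarative style: chunk types, declarative memory contents, and production rules.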

In sum, our DRT + ACT-R + Bayes framework — more generally: generative theories + ACT-R + Bayes — enables us to connect rich syntactic and semantic theories, mechanistic processing models and a variety of experimental tasks in a formally and computationally explicit way. The course is based on our forthcoming book Formal Linguistics and Cognitive Architecture (Brasoveanu & Dotlačil, in preparation) and related work.

Slides and other materials for the 5 lectures:


Computing Dynamic Meanings: Day 1 [ESSLLI 2018 Course]

Sun 05 August 2018 by Jakub Dotlačil, Adrian Brasoveanu


Computing Dynamic Meanings: Day 2 [ESSLLI 2018 Course]

Sun 05 August 2018 by Jakub Dotlačil, Adrian Brasoveanu

Computing Dynamic Meanings: Day 3 [ESSLLI 2018 Course]

Sun 05 August 2018 by Jakub Dotlačil, Adrian Brasoveanu


Computing Dynamic Meanings: Days 4-5 [ESSLLI 2018 Course]

Sun 05 August 2018 by Jakub Dotlačil, Adrian Brasoveanu

  • Mechanistic processing models for formal semantics (DRT + ACT-R + Bayes):

    We introduce mechanistic processing models for formal semantics that integrate dynamic semantics, specifically Discourse Representation Theory (DRT: Kamp 1981, Kamp & Reyle 1993), and the ACT-R cognitive architecture.

    We show how to embed these mechanistic processing models into Bayesian models …
