Universität Potsdam: Institut für Linguistik
 
Bereich Computerlinguistik
Lehrstuhl "Theoretische Computerlinguistik"
Institut für Linguistik
University of Potsdam
Postfach 601553
D-14415 Potsdam
GERMANY
 
Workshop
MATHEMATICAL METHODS IN COMPUTATIONAL LINGUISTICS

April 28-29, 2004
University of Potsdam

Speakers:

Peter beim Graben, Christophe Costa Florencio, Christian Ebert, Nissim Francez, Hans-Martin Gärtner, Thomas Holder, Gerhard Jäger, Aravind Joshi, Bryan Jurish, Ed Keenan, Stephan Kepser, Marcus Kracht, Jens Michaelis, Uwe Mönnich, Peter Staudacher

Organizers:

Gerhard Jäger, Jens Michaelis and Tom Hanneforth


The Schedule:

Wednesday, April 28, 2004
Campus Golm, Building 14, Room 0.45
09:15-09:30 Peter Staudacher (Potsdam) Opening
09:30-10:30 Peter beim Graben and Bryan Jurish (Potsdam) Context-free parsing by dynamical systems
10:30-10:50 Coffee
10:50-11:50 Christophe Costa Florencio (Utrecht) Formal learning theory and computational linguistics
12:00-13:00 Uwe Mönnich (Tübingen) Closure Properties of Linear Context-Free Tree Languages
13:00-14:15 Lunch Break
14:15-15:00 Hans-Martin Gärtner and Jens Michaelis (Berlin/Potsdam) Does Adding Late Adjunction to Minimalist Grammars Affect their Complexity?
15:05-15:50 Gerhard Jäger (Potsdam/Stanford) Parsing with proof nets
15:55-16:40 Peter Staudacher (Potsdam) A Note on Mild Context-Sensitivity
16:40-17:00 Coffee
17:00-18:00 Aravind Joshi (Philadelphia) Starting with Complex Primitives Pays off: Relevance to Weak and Strong Generative Capacity
19:30 Dinner at the restaurant Barokoko in downtown Potsdam
Thursday, April 29, 2004
Campus Neues Palais, Building 8, Room Foyer I/II
09:30-10:30 Nissim Francez (Haifa) Categorial grammar with ontology-refined types
10:30-10:50 Coffee
10:50-11:50 Marcus Kracht (Los Angeles) Compositionality and Semantics
12:00-13:00 Stephan Kepser (Tübingen) Some notes on the complexity of Optimality Theory
13:00-14:30 Lunch Break
14:30-15:30 Christian Ebert (London) The Expressive Power of Underspecified Representation Formalisms
15:40-16:40 Thomas Holder (Berlin) Grammar as Geometry
16:40-17:00 Coffee
17:00-18:00 Ed Keenan (Los Angeles) Excursions in Natural Logic
18:00 End of Day 2

Abstracts of the talks can be found below.
The workshop is open to everyone who is interested.


Abstracts:

Peter beim Graben and Bryan Jurish (Potsdam)
Context-free parsing by dynamical systems

We describe a technique for partitioning a locally ambiguous context-free grammar into a finite set of unambiguous sub-grammars for which deterministic pushdown automata can be defined.  Using a theorem proved by Moore, we map these automata onto discrete-time dynamical systems acting on the unit square, where local ambiguities are represented by a control parameter.  The actual states of the automata are rectangles in the unit square that can be interpreted as cylinder sets in the sense of symbolic dynamics.  We show that applying a wrong processing preference to a certain input string leads to an unwanted invariant set in the parser's dynamics; syntactic reanalysis and repair can then be modeled by switching the control parameter.
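
As a rough illustration of the encoding such constructions rest on (a minimal sketch under assumptions of my own, not the authors' implementation): if stack and remaining input are read as base-B expansions of points in [0,1], a configuration becomes a point in the unit square and each parser action a piecewise-affine map.

    # Minimal sketch: pushdown configurations as points in the unit square.
    B = 4  # assumed alphabet size; symbols are coded as digits 0..B-1

    def encode(seq):
        """Map a symbol sequence (first element = topmost/most significant)
        to a point in [0,1) via its base-B expansion."""
        return sum(d * B ** -(i + 1) for i, d in enumerate(seq))

    def push(x, d):
        """Pushing symbol d prepends a digit: an affine contraction."""
        return (x + d) / B

    def pop(x):
        """Popping reads off and removes the leading digit: an expansion."""
        d = int(x * B)
        return d, x * B - d

    # A configuration (stack, input) is (encode(stack), encode(input)).
    x = encode([2, 1, 3])
    top, rest = pop(x)
    assert top == 2 and abs(rest - encode([1, 3])) < 1e-9
    assert abs(push(rest, top) - x) < 1e-9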
 

Christophe Costa Florencio (Utrecht)
Formal learning theory and computational linguistics

In 1967, E. M. Gold published a paper in which the language classes of the Chomsky hierarchy were analyzed in terms of learnability, in the technical sense of identification in the limit. His results were mostly negative, and perhaps because of this his work had little impact on linguistics. In the early eighties there was renewed interest in the paradigm, mainly because of work by Angluin and Wright. Around the same time, Arikawa and his co-workers refined the paradigm by applying it to so-called Elementary Formal Systems. Using this approach, Takeshi Shinohara obtained an impressive result: any class of context-sensitive grammars with a bound on its number of rules is learnable.

Some linguistically motivated work on learnability also appeared from this point on, most notably Wexler & Culicover 1980 and Kanazawa 1994. The latter investigates the learnability of various classes of categorial grammar, inspired by work by Buszkowski and Penn, and raises some interesting questions. We follow up on this work by exploring complexity issues relevant to learning these classes, answering an open question from Kanazawa 1994, and applying the same kind of approach to obtain (non-)learnable classes of Combinatory Categorial Grammars, Tree Adjoining Grammars, Minimalist Grammars, Generalized Quantifiers, and some variants of Lambek Grammars. We also discuss work on learning tree languages and its application to learning Dependency Grammars, and conclude by summarizing our main results.
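
To make the learning paradigm concrete, here is a toy learner by enumeration in the Gold style (an illustrative sketch; the miniature hypothesis class is invented, not taken from the talk). After each positive example, the learner conjectures the first enumerated hypothesis consistent with all data seen so far; on any text for a language in the class, the conjectures eventually stabilize on a correct one.

    # Toy identification in the limit: learning by enumeration over an
    # invented class of three languages given by membership predicates.
    HYPOTHESES = [
        ("a+",    lambda w: len(w) > 0 and set(w) <= {"a"}),
        ("ab*",   lambda w: w.startswith("a") and set(w[1:]) <= {"b"}),
        ("(ab)+", lambda w: len(w) > 0 and len(w) % 2 == 0
                            and all(w[i] == "ab"[i % 2] for i in range(len(w)))),
    ]

    def learner(text):
        """Conjecture, after each example, the first hypothesis in the
        enumeration consistent with all examples seen so far."""
        seen = []
        for w in text:
            seen.append(w)
            yield next((name for name, member in HYPOTHESES
                        if all(member(x) for x in seen)), None)

    # On a text for (ab)+ the conjectures converge: ab*, (ab)+, (ab)+, ...
    print(list(learner(["ab", "abab", "ab", "ababab"])))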
 

Uwe Mönnich (Tübingen)
Closure Properties of Linear Context-Free Tree Languages

Context-free tree grammars are natural extensions of classical string grammars.  Originally defined by Rounds, they have been studied for more than three decades now. If a context-free tree grammar is linear, it does not permit the copying of subtrees.  A language generated by a linear context-free tree grammar is called a linear context-free tree language.  This class of tree languages is of particular interest for linguistics, because all known non-context-free phenomena of natural language can be described within it. For linear context-free tree languages, we show closure under linear frontier-to-root tree transducer mappings, linear root-to-frontier tree transducer mappings, and intersection with regular tree languages.
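
For readers unfamiliar with the formalism, here is a standard textbook example (not one of the talk's results) of a linear context-free tree grammar, with trees over a concatenation symbol written as strings. The nonterminal F has rank 2, each variable occurs exactly once on every right-hand side (so no subtree is copied), and yet the yield language is not context-free:

    \begin{align*}
      S          &\to F(\varepsilon,\varepsilon)\\
      F(x_1,x_2) &\to F(a\,x_1\,b,\; c\,x_2\,d)\\
      F(x_1,x_2) &\to x_1\,x_2
    \end{align*}

The yield language is $\{\,a^n b^n c^n d^n \mid n \ge 0\,\}$.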
 

Hans-Martin Gärtner (Berlin) and Jens Michaelis (Potsdam)
Does Adding Late Adjunction to Minimalist Grammars Affect their Complexity?

Minimalist grammars (MGs), as introduced in Stabler 1997, have proven a useful instrument in the formal analysis of syntactic theories developed within the minimalist branch of the principles and parameters framework (cf. Chomsky 1995, 2000). In fact, as shown in Michaelis 2001, MGs belong to the class of mildly context-sensitive grammars. Interestingly, without any rise in (at least weak) generative power, (extensions and variants of) MGs accommodate a wide variety of (arguably) "odd" items from the syntactician's toolbox, such as head movement (Stabler 1997, 2001), affix hopping (Stabler 2001), (strict) remnant movement (Stabler 1997, 1999), adjunction (Frey and Gärtner 2002), and (to some extent) scrambling (Frey and Gärtner 2002). Here, we want to shed some light on the effects on generative capacity when MGs are enriched with another controversial mechanism, namely countercyclic adjunction.
 

Gerhard Jäger (Potsdam/Stanford)
Parsing with proof nets

Since Pentus (2003), it has been known that provability in the associative Lambek calculus is NP-complete. Nonetheless, once a categorial lexicon is given, parsing is polynomial in the length of the sentence to be parsed. I will present a parsing algorithm that assigns proof nets to strings in polynomial time. Furthermore, I will propose a novel, tree-like representation format for proof nets which enables the extraction of parsed proof-net corpora from standard treebanks like the Penn Treebank.
It is possible to define a similarity measure between proof nets on the basis of their topological structure. If a parsed proof-net corpus is available, this measure can be used to rank parses by means of standard machine learning techniques such as Support Vector Machines.
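
One well-known ingredient of proof-net correctness for the associative (non-commutative) Lambek calculus is that axiom links, drawn above the sequence of atom occurrences, must not cross. A minimal check of this planarity condition (an illustrative sketch, not the parsing algorithm of the talk):

    # Links are pairs (i, j) of atom positions with i < j; two links cross
    # iff one starts strictly inside the other and ends strictly outside.
    def crosses(a, b):
        (i, j), (k, l) = a, b
        return i < k < j < l or k < i < l < j

    def planar(links):
        return not any(crosses(a, b)
                       for n, a in enumerate(links) for b in links[n + 1:])

    # Atoms of np, np\s |- s in linear order: np np s s (positions 0-3).
    print(planar([(0, 1), (2, 3)]))  # nested linking: True
    print(planar([(0, 2), (1, 3)]))  # crossing linking: False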
 

Aravind Joshi (Philadelphia)
Starting with Complex Primitives Pays off: Relevance to Weak and Strong Generative Capacity

In setting up a formal system to specify a grammar formalism, the conventional (mathematical) wisdom is to start with primitives (basic primitive structures) as simple as possible, and then to introduce various operations for constructing more complex structures. An alternative approach is to start with complex (more complicated) primitives, which directly capture some crucial linguistic properties, and then to introduce some general operations for composing these complex structures. This latter approach has given some insights into Strong Generative Capacity (SGC). Although SGC is highly relevant to linguistic description, there are very few results concerning it. After briefly commenting on the above issues (including their relevance to statistical analysis), I will describe some interesting results related to structural descriptions (and thus to SGC). These results are obtained by rather simple combinatorial case analyses.
 

Nissim Francez (Haifa)
Categorial grammar with ontology-refined types

In grammars based on the standard Lambek calculus with the usual syntax-semantics interface, sentences (1) and (2) below have the same derivation, because both the syntactic category and the semantic type of 'boy' and 'table' are the same.

(1) Every boy smiles.
(2) Every table smiles.
This work extends the standard Lambek calculus so that selectional restrictions can be imposed in such a way that bad sentences like (2) are not derived.
The idea is to use an ontology for a type $\sigma$ in a way that enables us to restrict functions that standardly apply to any argument of type $\sigma$ so that they apply to arguments corresponding to some concepts but not others. For example, assume that in an ontology for type $e$ there are the concepts 'animate' and 'inanimate'. Intuitively, neither of these concepts is more general than the other. Now, the concept associated with the word 'boy' is at least as specific as 'animate'. Similarly, the concept associated with the noun 'table' is at least as specific as 'inanimate'. The denotation of the verb 'smile' will then be restricted, by decorating its type with ontological concepts, to apply only to arguments that are 'animate' (or more specific). Since the denotation of 'boy' is such, but not the denotation of 'table', sentence (1) will be derivable in our system while sentence (2) will not.
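
A toy rendering of the idea (an illustrative sketch with an invented mini-ontology; the talk's formal system is richer): application succeeds only if the argument's concept is at least as specific as the concept decorating the function type.

    # Assumed mini-ontology over type e: each concept points to its more
    # general parent; 'at least as specific' means reachable by going up.
    PARENT = {"boy": "animate", "table": "inanimate",
              "animate": "e", "inanimate": "e"}

    def at_least_as_specific(c, d):
        while c != d:
            if c not in PARENT:
                return False
            c = PARENT[c]
        return True

    def apply_refined(required, arg_concept, sentence):
        """Apply a function decorated with concept `required` only to
        arguments whose concept is at least as specific."""
        ok = at_least_as_specific(arg_concept, required)
        return f"{sentence}: {'derivable' if ok else 'blocked'}"

    # 'smiles' is decorated to require an argument at least as specific
    # as 'animate':
    print(apply_refined("animate", "boy", "Every boy smiles"))
    print(apply_refined("animate", "table", "Every table smiles"))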
 

Marcus Kracht (Los Angeles)
Compositionality and Semantics

Usually, the requirement of compositionality on a language is believed to be empirically vacuous: any language is compositional; it is just a matter of presenting it in the correct way. We believe, however, that this is misguided. The principal problem in previous definitions is the liberal conception of syntax/morphology and semantics. In the talk we shall try to flesh out a formal definition of compositionality that does justice to the intuitive meaning of the word. To this end, we shall try to delimit the notions of syntax/morphology and, especially, semantics. Most semantic representations that have been considered contain syntactic information, though often in an implicit form. The best example is type theory, which connects meaning with syntax via the slash-cancellation mechanism. Once semantics is purified so that it contains no more traces of syntax, one can see that the question of compositionality is really nontrivial.
 

Stephan Kepser (Tübingen) (joint work with Uwe Mönnich)
Some notes on the complexity of Optimality Theory

We will present modularity results for Optimality Theory based on work by Jäger and by Wartena. An OT system consists of a generator and a sequence of constraints.  For unidirectional optimality, we show that a system consisting of a linear context-free grammar and a linear F-transducer as the generator and MSO formulae as the constraints can be rendered using finite-state techniques.  We also show how to extend a modularity result on bidirectional Optimality Theory by Jäger for regular string languages to the case of regular tree languages.
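
Independently of the finite-state rendering the talk is about, the optimization step itself can be pictured with a toy evaluator over a finite candidate set (the input, candidates, and constraints below are invented for illustration):

    # Unidirectional optimization: rank the constraints, count violations,
    # and keep the candidates with lexicographically minimal profiles.
    def optimal(candidates, constraints):
        profile = lambda c: tuple(con(c) for con in constraints)
        best = min(map(profile, candidates))
        return [c for c in candidates if profile(c) == best]

    # Invented mini-tableau for the input 'ta':
    candidates = ["ta", "at", "a"]
    constraints = [
        lambda c: 0 if c and c[0] not in "aeiou" else 1,  # ONSET
        lambda c: max(2 - len(c), 0),                     # MAX (input length 2)
    ]
    print(optimal(candidates, constraints))  # -> ['ta']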
 

Christian Ebert (London)
The Expressive Power of Underspecified Representation Formalisms

Underspecified representation formalisms have been widely used in computational semantics to represent the meaning of (scopally) ambiguous expressions in a compact way. Among the more prominent approaches, Underspecified DRT, Hole Semantics, Minimal Recursion Semantics, and Normal Dominance Constraints share some key features. Because of this resemblance, a long-standing question is whether these formalisms are equivalent with respect to their expressive power. After phrasing the problem in a formal setting, I will give a negative answer to this question. Furthermore, I will discuss what it means for underspecified representation formalisms to be 'expressively complete' and show that all four approaches discussed are incomplete in this respect.
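
To fix intuitions, here is the standard textbook example that formalisms of this kind are designed to express, in dominance-constraint style (an illustration, not an example taken from the talk): a single compact description of 'Every boy loves a girl' with two solved forms.

    \[
      F_1:\ \forall x\,(\mathit{boy}(x)\to \Box_1) \qquad
      F_2:\ \exists y\,(\mathit{girl}(y)\land \Box_2) \qquad
      F_3:\ \mathit{love}(x,y)
    \]

Both holes $\Box_1$ and $\Box_2$ are required to dominate $F_3$. Plugging $F_2$ below $\Box_1$ and $F_3$ below $\Box_2$ yields the wide-scope-universal reading; plugging $F_1$ below $\Box_2$ yields the wide-scope-existential reading.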
 

Thomas Holder (Berlin)
Grammar as Geometry

Keenan and Stabler [1999] propose, within their framework of Bare Grammar, a definition of the notion 'syntactic structure of a language L' based on the invariance of expressions under automorphisms of L. Problems with the claim that this notion faithfully models linguists' intuitions about structure arise in the case of lexical extensions. As an illustration, a simple agreement system is given. Starting from a notion of syntactic homomorphism, we show that the problem can be circumvented by defining the structure of L not directly on L itself but on a language L* obtained from L by identifying expressions under the action of a suitable subgroup of the group of automorphisms of L.

Reference:
Keenan, E. / Stabler, E.: Bare Grammar. ESSLLI, Aix-en-Provence, 1999.
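
A toy computation in this spirit (an illustrative sketch; the agreement system of the talk need not be this one): enumerate the permutations of a small lexicon that commute with Merge, and read off the orbit that a quotient language L* would identify.

    from itertools import permutations

    # Invented agreement fragment: NPs and VPs carry number, and Merge is
    # defined only under agreement.
    NPS = {"he": "sg", "she": "sg", "they": "pl"}
    VPS = {"smiles": "sg", "smile": "pl"}

    def merge(np, vp):
        return np + " " + vp if NPS[np] == VPS[vp] else None

    def automorphisms():
        """Brute force over NP permutations (VPs kept fixed for brevity):
        keep those preserving definedness of Merge."""
        for p in permutations(NPS):
            f = dict(zip(NPS, p))
            if all((merge(np, vp) is None) == (merge(f[np], vp) is None)
                   for np in NPS for vp in VPS):
                yield f

    for f in automorphisms():
        print(f)
    # Output: the identity and the map swapping 'he' and 'she'; the orbit
    # {he, she} is exactly what the quotient construction identifies.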
 

Ed Keenan (Los Angeles)
Excursions in Natural Logic

I present two novel entailment paradigms in English and begin to characterize them semantically.  Despite reliable judgments of entailment, these patterns have gone largely unnoticed in work on philosophical logic and natural language semantics, possibly because many of the sentence pairs instantiating the patterns naturally invoke quantifier types not studied in standard logic, indeed not even definable in first-order logic.  Also, in one large class of cases the judgments of entailment rely on relations between pairs of quantifiers, so it is more natural to consider each pair as a binary quantifier rather than as an instance of iterated unary quantifiers.  We obtain a characterization result for invariant self-dual quantifiers and another for quantifiers that fix the post-complement relation.
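
For reference, the standard definitions behind the last sentence (textbook background, not the talk's new results): for a type <1> quantifier Q over a universe E, the post-complement and the dual are

    \[
      (Q\neg)(A) \;=\; Q(E\setminus A), \qquad Q^{d} \;=\; \neg\,(Q\neg),
    \]

and Q is self-dual iff $Q^{d}=Q$. For instance, Montague individuals $I_j$, with $I_j(A)=1$ iff $j\in A$, are self-dual:

    \[
      I_j^{d}(A) \;=\; \neg\, I_j(E\setminus A) \;=\; \neg\,(j\in E\setminus A)
                 \;=\; (j\in A) \;=\; I_j(A).
    \]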
 

Comments and suggestions: Gerhard Jäger