15 May: linguistics weekend



howdy.

below is a tentative schedule for this weekend. please let me know
asap if you'd like to be included in the dinner reservation for
saturday night (it's likely to cost about 25 pounds, plus drinks).

the venues will be finalized in a subsequent email.

carl



Dublin Computational Linguistics Research Seminar Series (DCLRS) 2000/1


Friday, May 18th, DCU

16:00 Dr. Wilfried Meyer-Viol and Prof. Ruth Kempson, King's College
London, Department of Philosophy: "Dynamic Syntax... What goes
First?"



London-Dublin Computational and Theoretical Linguistics Colloquium II

Saturday, May 19th, DCU

09:50 Welcome & Introduction

10:00 Prof. Ruth Kempson, King's College London, Department of
Philosophy: "Tree Growth... And what goes last?"

10:45 Dr. Wilfried Meyer-Viol, King's College London, Department of
Philosophy: "Dynamic Syntax and Tree-Automata"

11:30 coffee

11:45 Dr. John Saeed, Trinity College Dublin, Centre for Language and
Communication Studies: "The interaction between syntax
and information structure in Somali, a morphological focus
system"

12:30 lunch

14:00 Dr. Jonathan Ginzburg, King's College London, Department of
Computer Science: "Defeasibly Structuring Facts"

14:45 Dr. Fintan Costello, Dublin City University, School of Computer
Applications: "Red-headed butchers, skilful violinists, fake
surgeons, and pet fish: The semantics of membership in noun-noun
and adjective-noun conjunctions."

15:30 tea

15:45 Dr. Carl Vogel, Trinity College Dublin, Department of Computer
Science, "Simulating the Emergence of Meaning and
Understanding."

16:30 Dr. Josef van Genabith, Dublin City University, School of
Computer Applications: "Experiments in Structure Preserving
Grammar Compaction"



Sunday, May 20th, TCD

10:00 Dr. Julie Berndsen, University College Dublin, Department of
Computer Science, "Phonological Constraints in Speech
Technology"

10:45 Dr. Martin Emms, Trinity College Dublin, Department of Computer
Science, "In Defence of Detail"

11:30 coffee

11:45 Prof. Shalom Lappin, King's College London, Department of
Computer Science: "A Framework for the Hyperintensional
Semantics of Natural Language with Two Implementations"

12:30 Dr. Tim Fernando, Trinity College Dublin, Department of Computer
Science, "Situations and Alternatives"

13:15 plenary session



ABSTRACTS:

"Dynamic Syntax.... What goes First?"

Wilfried Meyer-Viol and Ruth Kempson, KCL

Abstract:

The claim made by the Dynamic Syntax framework is that grammar
formalisms for natural language should reflect the left-to-right
dynamics of language processing.

In this talk, we introduce the formal concepts of tree growth central
to the framework and use them to explore the interaction between
long-distance dependency effects and anaphora construal. What we shall
demonstrate is that the full range of different types of
"left-dislocation" effects can be naturally expressed as a consequence
of adopting a processing-oriented perspective, while providing the
flexibility necessary to express the array of cross-linguistic
variation. We shall use the general principles of tree growth to
articulate a typology of left-periphery effects, while sustaining a
unitary concept of anaphora as a pragmatically driven substitution
process. Structures considered include Hanging Topic Left
Dislocation, Topicalisation, Clitic Left Dislocation, and Clitic
Doubling.



"Tree Growth .... And what goes last?"

Ruth Kempson, KCL

Abstract:

Right-periphery phenomena pose a puzzle for all current frameworks,
displaying an array of effects that are tantalisingly similar but
nonetheless distinct from left-periphery phenomena.

In this talk I take the concepts of node requirements, metavariables,
unfixed nodes and linked structures introduced in the previous talk,
and explore the extent to which they can be used to characterise right
periphery phenomena. What I shall show is that with a concept of
anticipatory metavariable (a variable that must be assigned a value by
some subsequent term) we can express the differences between left- and
right-periphery phenomena while nevertheless making use
of the same general principles of tree growth across linked
structures. Construction types to be covered include Clitic Doubling
(including the puzzling Porteño Spanish data), Expletives, Heavy NP
Shift and Right Node Raising. Finally I shall explore the challenge
posed by giving formal content to the concept of anticipatory
metavariable.



"Dynamic Syntax and Tree-Automata"

Wilfried Meyer-Viol, KCL

Abstract:

In this talk the formal framework of Dynamic Syntax (DS) will be
analyzed from the perspective of formal language theory in order to
relate DS to more standard, static, approaches to syntax. This
relation will be established with the use of tree-automata. In DS,
the evaluation or parsing process can be represented as runs of
so-called alternating tree automata (of an extended kind). Standardly,
a (top-down) tree automaton moves down a tree, leaving a copy of
itself, in a fixed state, at every node. In DS, among these copies
there is at all times exactly one copy that is `in control', and this
control can be transferred. Moreover, a DS-automaton can change the
state at a node, so DS-automata have more complex behaviour than
standard alternating tree-automata. But this is not the main
difference between standard- and DS-recognizable sets: in DS, words
function as mappings between recognizable sets and are not identified
with the terminal nodes of the trees in these sets. That is, the trees
analyzing sentence structure in DS are an extension of context-free
trees, but the order of the terminal nodes in such trees bears no
relation to surface order.


R. Kempson, W. Meyer-Viol, D. Gabbay: "Dynamic Syntax". Blackwell,
Oxford, 2000.
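
For orientation, here is a minimal top-down tree automaton in Python.
It illustrates only the standard notion the abstract starts from, with
an invented two-rule transition table and state name; the extended
alternating DS-automata, with control transfer and state change at a
node, go well beyond this sketch.

    from dataclasses import dataclass

    @dataclass
    class Tree:
        label: str
        children: tuple = ()

    # Transition table for a toy automaton (hypothetical example):
    # (state, node label, arity) -> states assigned to the children.
    TRANSITIONS = {
        ("q", "a", 2): ("q", "q"),   # an a-node spawns two copies
        ("q", "b", 0): (),           # a b-leaf terminates a copy
    }

    def accepts(state, tree):
        """Move down the tree, leaving a copy of the automaton in a
        fixed state at every child node; accept iff every copy finds
        a licensed transition."""
        key = (state, tree.label, len(tree.children))
        if key not in TRANSITIONS:
            return False
        return all(accepts(s, c)
                   for s, c in zip(TRANSITIONS[key], tree.children))

    # accepts("q", Tree("a", (Tree("b"), Tree("b"))))  ->  True
    # accepts("q", Tree("a", (Tree("b"),)))            ->  False (wrong arity)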



"The interaction between syntax and information structure in Somali, a
morphological focus system"

John Saeed, Trinity College Dublin

In this paper I discuss the relationship between the grammatical
category of focus and the information structure notions of theme
(given) and rheme (new) in Somali, a Cushitic language of North-East
Africa. Somali and related languages employ grammatical morphemes to
mark focus and have been used as evidence in the grammatical typology
of focus. It appears that rather than simply adding a feature (or
feature bundle) FOCUS to a constituent (as has sometimes been proposed
for intonational focus in languages like English), these morphemes
display parallels to relative clause grammar. This has been used as
evidence for grammaticalization (Heine & Reh 1983). Specifically these
Cushitic languages have been described as strongly grammaticalized
focus systems with obligatory focus marking. I review this
categorization as background to discussing the semantic and discourse
functions of these focus morphemes. The paper re-emphasises the
independence of the focus/ground distinction from the theme/rheme
distinction, a point made by a number of writers including Steedman
(2000).




"Defeasibly Structuring Facts"

Jonathan Ginzburg, KCL

It is standard in formal semantics to assume that contexts include the
set of assumptions common to the conversational participants. Call
this component FACTS. The motivation for FACTS is, primarily, to
account for various presuppositional phenomena. In this talk I will
consider additional phenomena one might use FACTS for, namely fact
ellipsis (e.g. (1)) and hasty accommodation (e.g. (2)), where a
speaker can coherently presuppose material that has not been accepted
into the common ground and, after discussion, retract it:

(1) A: Joyce just phoned. B: How amazing!

(2) A: Joyce just phoned. It amazes me. B: No, he's dead. A: Hmm, I
wonder who it was then.

I will show that providing an account of the potential antecedents for
fact ellipsis means that additional structure needs to be imposed on
FACTS. Hasty accommodation requires one to allow FACTS to be somehow
defeasible. I will suggest that, rather than employing complex belief
revision machinery such as that given by AGM-style techniques, one can
profitably use independently motivated contextual structure to ensure
that some facts---those under discussion---are easily defeated. In so
doing I will build on insights developed in previous work on
propositional anaphora (Polanyi, Webber, Asher) and on default
unification (Carpenter, Grover et al.).
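
To make the shape of the proposal concrete, here is a toy
data-structure sketch. It is my own illustration of the general idea
(the class and method names are invented, and none of this is
Ginzburg's formalism): facts under discussion live in a separate layer
of FACTS that can be dropped cheaply, with no general-purpose belief
revision.

    class Context:
        """Toy FACTS structure: settled common ground plus an easily
        defeasible layer of facts under discussion (illustrative only)."""

        def __init__(self):
            self.settled = set()           # accepted common ground
            self.under_discussion = set()  # hastily accommodated facts

        def accommodate(self, fact):
            self.under_discussion.add(fact)

        def accept(self, fact):
            self.under_discussion.discard(fact)
            self.settled.add(fact)

        def retract(self, fact):
            # Only facts still under discussion are easily defeated.
            self.under_discussion.discard(fact)

        def facts(self):
            return self.settled | self.under_discussion

    # Dialogue (2): A accommodates "Joyce phoned"; B objects; A retracts.
    ctx = Context()
    ctx.accommodate("Joyce phoned")
    ctx.retract("Joyce phoned")
    assert "Joyce phoned" not in ctx.facts()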




"Red-headed butchers, skilful violinists, fake surgeons, and pet
fish: The semantics of membership in noun-noun and adjective-noun
conjunctions."

Fintan Costello, DCU

Abstract:

How are the meanings of multiword phrases such as "pet fish", "skilful
violinist", "fake surgeon", or "red-headed butcher" produced from the
meaning of their constituent words? Meaning depends, at least in
part, on category membership: to understand a single word such as
"fish" is to be able to correctly classify items in the single
category FISH; to understand a noun-noun phrase such as "pet fish" is
to be able to correctly classify items in the conjunctive category
PET-FISH. To explain the semantics of combined phrases like "pet
fish", we must first explain how people classify items in conjunctive
categories. This talk describes a computational model which
successfully explains how people classify items in single noun
categories and in noun-noun conjunctions. The talk goes on to show
that the model also explains classification in combinations involving
three different types of adjective: intersective adjectives
(e.g. "red-headed butcher"), subsective adjectives ("skilful
violinist"), and privative adjectives ("fake surgeon"). For these
different adjective types, different relationships hold between
membership in a combination and membership in its adjective and noun
constituents. Most current theories give separate mechanisms for each
adjective type and for noun-noun conjunctions. The model gives a
unified account of these adjective-noun and noun-noun combinations.
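
For orientation, the three adjective types mentioned are standardly
characterised set-theoretically as follows (these are textbook
definitions, not part of the abstract; the talk's point is that a
single mechanism can replace the separate ones):

    \begin{align*}
    \text{intersective:} &\quad
      [\![\text{red-headed butcher}]\!]
        = [\![\text{red-headed}]\!] \cap [\![\text{butcher}]\!] \\
    \text{subsective:}   &\quad
      [\![\text{skilful violinist}]\!] \subseteq [\![\text{violinist}]\!] \\
    \text{privative:}    &\quad
      [\![\text{fake surgeon}]\!] \cap [\![\text{surgeon}]\!] = \emptyset
    \end{align*}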



"Simulating the Emergence of Meaning and Understanding."

Carl Vogel, TCD

Abstract:

A number of theories of language acquisition and linguistic evolution
incorporate the seductive yet indefensible assumption that semantic
bootstrapping takes place---that initially, meanings are shared, and
that language is built on the foundation of perfect communication. It
seems to me that if the assumption were valid, there would be no point
in having language (at least not without the simultaneous evolution of
writing systems). In this talk I'll present a system designed for
simulating language evolution in which the effect of this assumption
can be explored as a parameter setting (among other parameters, such
as the number of discriminable phonemes). The question to be explored
is whether systematic relations between words and meanings can develop
in the absence of semantic bootstrapping. A possibly surprising
answer is that to some nontrivial extent such relations can emerge.
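
As an illustration of the kind of simulation at issue, here is a toy
signalling game in Python. It is my own sketch, not the system the
talk presents; all names and numbers are invented. Semantic
bootstrapping appears as a boolean parameter: with it on, hearers are
simply told the intended meaning; with it off, they must guess from
their own word-meaning mappings.

    import random

    MEANINGS = ["food", "danger", "water"]
    PHONEMES = "ab"   # the discriminable-phoneme parameter, kept tiny here

    def new_word(rng):
        return "".join(rng.choice(PHONEMES) for _ in range(3))

    def simulate(bootstrapping, agents=10, rounds=5000, seed=1):
        rng = random.Random(seed)
        # Each agent starts with private, random word-meaning mappings.
        lexicons = [{m: new_word(rng) for m in MEANINGS}
                    for _ in range(agents)]
        for _ in range(rounds):
            s, h = rng.sample(range(agents), 2)
            meaning = rng.choice(MEANINGS)
            word = lexicons[s][meaning]
            if bootstrapping:
                guess = meaning  # meanings shared: perfect uptake
            else:
                matches = [m for m, w in lexicons[h].items() if w == word]
                guess = rng.choice(matches or MEANINGS)
            if guess == meaning:           # success: hearer aligns
                lexicons[h][meaning] = word
        # Fraction of agent pairs agreeing on the word for each meaning.
        pairs = [(a, b) for a in lexicons for b in lexicons if a is not b]
        return sum(a[m] == b[m] for a, b in pairs for m in MEANINGS) / (
            len(pairs) * len(MEANINGS))

    # simulate(True) converges quickly; the question above is how far
    # simulate(False) can get without shared meanings.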



"Experiments in Structure Preserving Grammar Compaction"

Mark Hepple (University of Sheffield) and Josef van Genabith (DCU)

Abstract:

Structure preserving grammar compaction (SPC) is a simple CFG
compaction technique originally described by van Genabith, Sadler and
Way (1999). It works by generalising category labels, thereby
"plugging holes" in the grammar. In the talk we present research on
applying SPC to a large grammar extracted from the Penn Treebank. We
examine its effects on grammar size and rule accession rates (as an
indicator of grammar completeness) and the performance of compacted
grammars under standard PCFG models.
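
A minimal sketch of what label generalisation can look like, under one
invented generalisation (stripping Penn-Treebank-style functional tags
and co-indices); the actual SPC procedure is the one described by van
Genabith, Sadler and Way (1999):

    # Toy treebank-style CFG rules: (lhs, rhs) pairs with annotated labels.
    rules = {
        ("S",  ("NP-SBJ",   "VP")),
        ("S",  ("NP-SBJ-1", "VP")),
        ("VP", ("VB", "NP-OBJ")),
        ("VP", ("VB", "NP")),
    }

    def generalise(label):
        """Drop functional tags and co-indices: NP-SBJ-1 -> NP."""
        return label.split("-")[0]

    compacted = {(generalise(lhs), tuple(map(generalise, rhs)))
                 for lhs, rhs in rules}

    print(len(rules), "->", len(compacted))  # 4 -> 2: distinct rules merge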



"Phonological Constraints in Speech Technology"

Julie Berndsen, UCD

Abstract:

Although stochastic approaches are currently at the forefront of
speech recognition applications, it has now been recognised that
linguistic structure is required in order to deal with the problem of
recognising new words. This talk presents a constraint model for
multilinear representations of speech utterances which can provide
important fine-grained information for speech recognition
applications. The model combines aspects of event logic with efficient
finite state processing strategies and uses explicit phonotactic
constraints specifying overlap (coarticulation) and precedence
relations between phonological features to recognise well-formed
syllable structures. The constraints are provided with a ranking which
is enhanced by a constraint relaxation procedure to cater for
underspecified input and to extrapolate output representations in a
top-down fashion based on the phonotactic constraints. The
constraint-based phonotactic model can be employed in both the
recognition and the synthesis task domains.
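
As a much-simplified illustration of finite-state phonotactics (the
segment inventories below are invented, and this is nothing like the
multilinear, feature-based model of the talk), a syllable recogniser
can be stated as onset-nucleus-coda membership, which denotes a
regular language:

    # Hypothetical phonotactic inventories for a toy language.
    ONSETS = {"", "p", "t", "k", "pr", "tr", "kr"}
    NUCLEI = {"a", "e", "i", "o", "u"}
    CODAS  = {"", "n", "s", "nt"}

    def well_formed_syllable(s):
        """Accept s iff it factors as onset + nucleus + coda; this is
        equivalent to running a small finite-state automaton."""
        return any(s[:i] in ONSETS and s[i:j] in NUCLEI and s[j:] in CODAS
                   for i in range(len(s) + 1)
                   for j in range(i, len(s) + 1))

    # well_formed_syllable("print") -> True   ("pr" + "i" + "nt")
    # well_formed_syllable("rpint") -> False  (no licensed onset "rp")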




"In Defence of Detail"

Martin Emms, TCD

Abstract:

The standard conception of the syntax/semantics interface is that to
each rule of syntactic composition there corresponds a rule of
semantic composition. Thus all NP-VP sentences are 'interpreted' via
the same rule of semantics. A consequence of this is that the very
same operation is used regardless of the content of the NPs and
VPs. This in turn requires that the ingredients to be composed be both
as varied as the topics they concern and yet sufficiently alike for
topic-neutral composition to make sense. Not many things square that
particular circle, set-theoretical constructions being the most
notable example.
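
The rule-to-rule picture being questioned here can be stated in one
line: the syntactic rule S -> NP VP is paired with a single semantic
operation, typically function application over set-theoretic
denotations, whatever the constituents' content:

    [\![\text{S}]\!] \;=\; [\![\text{VP}]\!]\bigl([\![\text{NP}]\!]\bigr),
    \qquad
    [\![\text{VP}]\!] : D_e \to \{0,1\}, \quad [\![\text{NP}]\!] \in D_e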

I will be considering examples which suggest that this assumption of
topic-neutral semantic composition is wrong. I will be looking for
answers to questions such as these: why is it that (1) can be true yet
(2) is probably neither true nor even well-formed:

(1) On Sunday over Houston, Mir was falling at 10 m/s
(2) On Sunday over Houston, Mir was spinning at 10 m/s

even if the things falling and spinning over Houston on Sunday are the
same? Why can you say (3) but not (4)

(3) line l [meets plane p]_vp at an angle of 45 deg
(4) line l [lies in a plane not parallel to p]_vp at an angle of 45 deg

even if the VPs are logically equivalent? Why does the following
argument sound strange:

(5) I clicked my fingers
(6) my fingers are a part of my body
(7) I clicked a part of my body




"A Framework for the Hyperintensional Semantics of Natural Language
with Two Implementations"

Chris Fox and Shalom Lappin, KCL

Abstract:

In this paper we present a framework for constructing a
hyperintensional semantics for natural language. On this approach, the
axiom of extensionality is discarded from the axiom base of a
logic. Weaker conditions are specified for the connection between
equivalence and identity which prevent the reduction of the former
relation to the latter. In addition, by axiomatising an intensional
number theory we can provide an internal account of proportional
cardinality quantifiers, like 'most'. We use a (pre-)lattice defined
in terms of a (pre-)order that models the entailment relation.
Possible worlds/situations/indices are then prime filters of
propositions in the (pre-)lattice, and truth in a world/situation is
reducible to membership in a prime filter. We show how this approach
can be implemented within (i) an intensional higher-order type theory,
and (ii) first-order property theory.
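
In standard order-theoretic terms (a restatement of the construction
just described, not additional material from the paper):

    \begin{align*}
    & p \le q \ \text{ iff } p \text{ entails } q
        \quad\text{(pre-order on propositions)}\\
    & w \text{ is a prime filter iff } w \text{ is upward closed under } \le,\\
    &\qquad p \wedge q \in w \iff p \in w \text{ and } q \in w,
        \qquad p \vee q \in w \implies p \in w \text{ or } q \in w\\
    & w \models p \ \text{ iff } \ p \in w
        \quad\text{(truth as membership in a prime filter)}
    \end{align*}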





"Situations and Alternatives"

Tim Fernando, TCD

Abstract:

I'll talk about notions of situations and alternatives, drawing on
Lappin's recent account of the word MANY (Linguistics and Philosophy,
December 2000) for linguistic motivation, and on Reichenbach (1947)
and Barwise & Perry (1983) for basic intuitions. The work is intended
as a contribution to Robin
Cooper's program of interpreting situation semantics
proof/type-theoretically.
