
Computational
Linguistics
CSC 485 Summer 2020
5A
5a. Extending grammars with features
Gerald Penn
Department of Computer Science, University of Toronto
Reading: Jurafsky & Martin: 12.3.4–6, 15.0–3; [Allen: 4.1–5]; Bird et al: 9.
Copyright © 2017 Suzanne Stevenson, Graeme Hirst and Gerald Penn. All rights reserved.

Agreement and inflection
• Problem: Agreement phenomena.
  Nadia {washes/*wash} the dog.
  The boys {*washes/wash} the dog.
  You {*washes/wash} the dog.
• Morphological inflection of the verb must match the subject noun in person and number.

Subject–verb agreement 1
Present tense

wash:        Singular            Plural
  1          I wash              we wash
  2          you wash            you wash
  3          he/she/it washes    they wash

be:          Singular            Plural
  1          I am                we are
  2          you are             you are
  3          he/she/it is        they are

Subject–verb agreement 2
Past tense

wash:        Singular            Plural
  1          I washed            we washed
  2          you washed          you washed
  3          he/she/it washed    they washed

be:          Singular            Plural
  1          I was               we were
  2          you were            you were
  3          he/she/it was       they were
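The two paradigms above can be sketched as a lookup table with a regular-inflection fallback. This is an illustrative sketch only: the function name `inflect` and the feature encoding (tense, person, number as tuple keys) are not from the slides.

```python
# Irregular cells of the paradigms above; regular forms fall through
# to a default rule. Keys are (lemma, tense, person, number).
PARADIGM = {
    ("wash", "pres", 3, "sg"): "washes",
    ("be", "pres", 1, "sg"): "am",
    ("be", "pres", 2, "sg"): "are",
    ("be", "pres", 3, "sg"): "is",
    ("be", "pres", 1, "pl"): "are",
    ("be", "pres", 2, "pl"): "are",
    ("be", "pres", 3, "pl"): "are",
    ("be", "past", 1, "sg"): "was",
    ("be", "past", 3, "sg"): "was",
}

def inflect(lemma, tense, person, number):
    """Return the inflected form: look up irregular cells first,
    then default to the bare stem (present) or stem + 'ed' (past)."""
    if (lemma, tense, person, number) in PARADIGM:
        return PARADIGM[(lemma, tense, person, number)]
    if lemma == "be":          # all remaining cells of 'be' are 'were'
        return "were"
    return lemma if tense == "pres" else lemma + "ed"
```

Note how few cells need explicit listing: for regular verbs only third-person-singular present is distinct, which is exactly why the grammar needs an agreement mechanism rather than a full paradigm per verb.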

Agreement features 1
• English agreement rules are fairly simple.
• Subject agrees with verb in person and number.
• No agreement is required between verb and object.
• Many languages have other agreements.
• E.g., German: article and adjective endings depend on the noun’s gender and case:

Agreement features 2
Nominative case (subject case)

Definite article:    Masculine  Feminine  Neuter  Plural
                     der        die       das     die

  der neue Wagen       ‘the new car’
  die schöne Stadt     ‘the beautiful city’
  das alte Auto        ‘the old car’
  die neuen Bücher     ‘the new books’

Indefinite article:  Masculine  Feminine  Neuter  Plural
                     ein        eine      ein     keine

  ein neuer Wagen      ‘a new car’
  eine schöne Stadt    ‘a beautiful city’
  ein altes Auto       ‘an old car’
  keine neuen Bücher   ‘no new books’
Ask about.com: German language: Adjective endings I and II. http://german.about.com/library/weekly/aa030298.htm and aa033098.htm

Agreement features 2
Accusative case (direct object)

Definite article:    Masculine  Feminine  Neuter  Plural
                     den        die       das     die

  den neuen Wagen      ‘the new car’
  die schöne Stadt     ‘the beautiful city’
  das alte Auto        ‘the old car’
  die neuen Bücher     ‘the new books’

Indefinite article:  Masculine  Feminine  Neuter  Plural
                     einen      eine      ein     keine

  einen neuen Wagen    ‘a new car’
  eine schöne Stadt    ‘a beautiful city’
  ein altes Auto       ‘an old car’
  keine neuen Bücher   ‘no new books’

Agreement features 3
E.g., Chinese: numeral classifiers, often based on shape, aggregation, …:

  两条鱼    liang tiao yu       ‘two CLASSIF-LONG-ROPELIKE fish’
  两条河    liang tiao he       ‘two CLASSIF-LONG-ROPELIKE rivers’
  两条腿    liang tiao tui      ‘two CLASSIF-LONG-ROPELIKE legs’
  两条裤子  liang tiao kuzi     ‘two CLASSIF-LONG-ROPELIKE pants’
  两只胳膊  liang zhi gebo      ‘two CLASSIF-GENERAL arms’
  两件上衣  liang jian shangyi  ‘two CLASSIF-CLOTHES-ABOVE-WAIST tops’
  两套西装  liang tao xizhuang  ‘two CLASSIF-SET suits’
Zhang, Hong (2007). Numeral classifiers in Mandarin Chinese. Journal of East Asian Linguistics, 16(1), 43–59. Thanks also to Tong Wang, Vanessa Wei Feng, and Helena Hong Gao.
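Classifier choice is largely lexical: each noun selects its classifier, so a lexicon can simply store the classifier as a feature of the noun. A minimal sketch over the examples above (the dict `CLASSIFIER` and function `number_phrase` are illustrative names, not from the slides):

```python
# Each noun's classifier, stored as a lexical feature.
CLASSIFIER = {
    "yu": "tiao",       # fish
    "he": "tiao",       # river
    "tui": "tiao",      # leg
    "kuzi": "tiao",     # pants
    "gebo": "zhi",      # arm
    "shangyi": "jian",  # top (clothing)
    "xizhuang": "tao",  # suit
}

def number_phrase(numeral, noun):
    """Build 'numeral + classifier + noun', e.g. 'liang tiao yu'."""
    return f"{numeral} {CLASSIFIER[noun]} {noun}"
```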

Agreement features 1
• English agreement rules are fairly simple.
• Many languages have other agreements.
• Some languages have multiple grammatical genders.
  • E.g., Chichewa has genders for men, women, bridges, houses, diminutives, men inside houses, etc.: between 12 and 18 in total.
• Some languages overtly realize many of these distinctions.
  • E.g., some Hungarian verbs have as many as 4096 inflected forms.

Inflectional morphology
• A word may be inflected …
  • … to indicate paradigmatic properties, e.g., singular/plural, past/present, …
  • … to indicate some (other) semantic properties
  • … to agree with the inflection of other words.
• Each (open-class) word type has a base form / stem / lemma.
• Each occurrence of a word includes inflection by a (possibly null) morphological change.

Rule proliferation 1
• Problem: How to account for this in the grammar.
• Possible solution: Replace every NP, V, and VP throughout the grammar with agreement-specific variants.

Original grammar:
  S  → NP VP
  NP → you, dog, dogs, bear, bears, …
  VP → V NP
  V  → washes, wash, washed, is, was, …

Expanded grammar:
  S → NP3s VP3s      NP3s → dog, bear, …      V3s → is, was, washes, washed, …
  S → NP3p VP3p      NP3p → dogs, bears, …    V3p → are, were, wash, washed, …
  S → NP2  VP2       NP2  → you               V1s → am, was, wash, washed, …
  S → NP1s VP1s      ⋮                        ⋮
  S → NP1p VP1p      VP3s → V3s NP
                     VP3p → V3p NP
                     ⋮

Rule proliferation 2
• Drawback 1: The result is big … really big.
• Drawback 2: It loses the generalization:
  • All these Ss, NPs, and VPs have the same structure.
  • That structure doesn’t depend on the particular verb, noun, or number.
• CF rules collapse structural and featural information together.
  • All information must be completely and directly specified.
  • E.g., we can’t just say that two values of some feature must be equal without saying exactly what the values are.

Feature structures 1
• Solution: Separate feature information from syntactic, structural, and lexical information.
• A feature structure is a list of pairs: [feature-name feature-value]
• Feature-values may be atoms or feature structures.
• Can consider syntactic category or word to be bundle of features too.
• Can represent syntactic structure.

Feature structures 2
• Drawback: many equivalent notations. All of the following represent the third-person-singular noun dog:

  N[Num s, Pers 3, Lex dog]
  dog[Cat N, Num s, Pers 3]
  N/dog[Num s, Pers 3]
  [Cat N, Num s, Pers 3, Lex dog]
  [Cat N, Agr [Num s, Pers 3], Lex dog]

• Feature paths: features of features; e.g., (Agr Pers 3).

Feature structures 3
Det: [Cat Det, Num s, Pers 3, Lex a]
N:   [Cat N, Num s, Pers 3, Lex dog]

NP formed from Det and N. Feature values in the components become feature names in the new constituent:

NP:  [Cat NP, Num s, Pers 3,
      Det [Num s, Pers 3, Lex a],
      N   [Num s, Pers 3, Lex dog]]

Components of feature use
• 1. Lexical specification: Description of properties of a word:
morphological, syntactic, semantic, …
  N → dog
    (N Agr) = 3s
  N → dogs
    (N Agr) = 3p
  V → sleeps
    (V Agr) = 3s
  V → sleep
    (V Agr) = {1s, 2s, 1p, 2p, 3p}

Or, as feature structures:

  dog:   [Cat N, Agr 3s]      sleeps: [Cat V, Agr 3s]
  dogs:  [Cat N, Agr 3p]      sleep:  [Cat V, Agr {1s, 2s, 1p, 2p, 3p}]

Components of feature use
• 2. Agreement:
  • Constraints on co-occurrence in a rule — within or across phrases.
  • Typically equational constraints:

  NP → Det N
    (Det Num) = (N Num)
  S → NP VP
    (NP Agr) = (VP Agr)

Components of feature use
• 3. Projection:
  • Sharing of features between the head of a phrase and the phrase itself:

  VP → V …
    (VP Agr) = (V Agr)

• Head features:
  • Agr is typical, but so is the head word itself as a feature.
    (Common enough that there’s usually a mechanism for “declaring” head features and omitting them from rules.)

Constraints on feature values 1
• What does it mean for two features to be “equal”?
• Either a copy of the value or feature structure, or a pointer to the same value or feature structure (re-entrancy, shared feature paths):

  Copy:    [Cat N, Agr [Num s, Pers 3], Lex dog]    [Cat N, Agr [Num s, Pers 3], Lex sky]
  Pointer: [Cat N, Agr ➀ [Num s, Pers 3], Lex dog]  [Cat N, Agr ➀, Lex sky]

Constraints on feature values 2
• But: It may be sufficient that two features are not equal, just compatible — that they can be unified.
• E.g., [Cat N, Pers 3, Num s] and [Cat N, Pers 3, Gndr F] are compatible.

Subsumption of feature structures 1
• Feature structure X subsumes feature structure Y if Y is consistent with, and at least as specific as, X.
  • We also say that Y extends X: Y can add (non-contradictory) features to those in X.
• Definition: X subsumes Y (X ⊑ Y) iff there is a simulation of X inside Y, i.e., a function sim such that:
  • sim(X) = Y
  • If X is atomic, then so is Y, and X = Y.
  • Otherwise, for all feature values X.f: Y.f is defined, and sim simulates X.f inside Y.f.

Subsumption of feature structures 2
• Examples:

  [Cat N] ⊑ [Cat N, Pers 3]
  [Cat N, Pers 3] ⊑ [Cat N, Pers 3, Gndr F]
  but [Cat N, Pers 3, Num s] ⋢ [Cat N, Pers 3, Gndr F]

  [Cat VP, Agr ➀, Subj [Agr ➀]]
    ⊑ [Cat VP, Agr ➀ [Num s, Pers 3], Subj [Agr ➀]]

Third example from Jurafsky & Martin, p. 496
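The simulation definition can be read off directly as a recursive check over nested dicts. This is a minimal sketch: feature structures are plain dicts, atoms are strings, and re-entrancy (the ➀ tags) is not modelled; the name `subsumes` is illustrative.

```python
def subsumes(x, y):
    """X subsumes Y iff every feature of X is present in Y with a
    value that X's value in turn subsumes; atoms must match exactly."""
    if not isinstance(x, dict):            # X atomic: Y must equal it
        return not isinstance(y, dict) and x == y
    if not isinstance(y, dict):            # X complex but Y atomic
        return False
    return all(f in y and subsumes(v, y[f]) for f, v in x.items())
```

The second example above fails exactly as the ⋢ case shows: the Num feature of the left structure has no counterpart in the right one.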

Unification 1
• The unification of X and Y (X ⨆ Y) is the most general feature structure Z that is subsumed by both X and Y.
• Z is the smallest feature structure that extends both X and Y.
• Unification is a constructive operation.
• If any feature values in X and Y are incompatible, unification fails.
• Else it produces a feature structure that includes all the features in X and all the features in Y.

Unification 2
[Cat N, Pers 3, Num s] ⨆ [Cat N, Pers 3, Gndr F] = [Cat N, Pers 3, Num s, Gndr F]
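Unification pairs naturally with the subsumption definition: take the union of the features, recursing where both sides define the same feature, and fail on a clash. A minimal sketch over plain dicts (again without re-entrancy; `unify` returning `None` for failure is an illustrative convention, not the course's implementation):

```python
def unify(x, y):
    """Return the most general structure extending both x and y,
    or None if they are incompatible."""
    if not isinstance(x, dict) or not isinstance(y, dict):
        return x if x == y else None       # atoms must match exactly
    z = dict(x)                            # start from a copy of x
    for f, v in y.items():
        if f in z:
            u = unify(z[f], v)             # shared feature: recurse
            if u is None:
                return None                # clash anywhere fails all
            z[f] = u
        else:
            z[f] = v                       # feature only in y: add it
    return z
```

Run on the example above, the two structures merge into one carrying Num, Gndr, and the shared Cat and Pers.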

Features in chart parsing
• Each constituent has an associated feature structure.
• Constituents with children have a feature structure for each child.
• Arc addition:
  • The feature structure of the new arc is initialized with all known constraints.
• Arc extension:
  • The feature structure of the predicted constituent must unify with that of the completed constituent extending the arc.

Sample grammar fragment
S → NP VP
  (NP Agr) = (VP Agr)
NP → Det N
  (NP Agr) = (N Agr)
  (Det Agr) = (N Agr)
VP → V
  (VP Agr) = (V Agr)

Det → a      [Agr 3s]
Det → all    [Agr 3p]
Det → the    [Agr {3s,3p}]
N   → dog    [Agr 3s]
N   → dogs   [Agr 3p]
V   → sleeps [Agr 3s]
V   → sleep  [Agr ^3s]
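The set-valued entry {3s,3p} and the negated entry ^3s can both be modelled as Python sets over the six person/number values; an agreement equation then succeeds iff the intersection is non-empty. A sketch of the fragment above checking `Det N V` sentences (the function `parse_s` is an illustrative name; a real parser would thread this through chart arcs):

```python
# The six agreement values; ^3s is the complement of {3s}.
ALL = {"1s", "2s", "3s", "1p", "2p", "3p"}
LEX = {
    "a": {"3s"}, "all": {"3p"}, "the": {"3s", "3p"},
    "dog": {"3s"}, "dogs": {"3p"},
    "sleeps": {"3s"}, "sleep": ALL - {"3s"},   # ^3s
}

def parse_s(det, n, v):
    """Check 'Det N V' against the fragment's agreement equations:
    (Det Agr) = (N Agr), then (NP Agr) = (VP Agr).
    Returns the sentence's Agr set, or None on agreement failure."""
    np_agr = LEX[det] & LEX[n]     # NP inherits N's Agr, ∩ Det's
    s_agr = np_agr & LEX[v]        # subject must agree with verb
    return s_agr or None
```

So "a dog sleeps" narrows Agr to {3s}, while "a dog sleep" fails because {3s} and ^3s have empty intersection, exactly the cases the next two slides illustrate.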

Mismatched features fail
  S
  ├─ NP [Agr ➀ 3s]
  │   ├─ Det [Agr 3s]  a
  │   └─ N   [Agr 3s]  dog
  └─ VP [Agr ➁]
      └─ V [Agr ➁ ^3s]  sleep

[Agr ➀] ⨆ [Agr ➁] : FAIL   (3s is incompatible with ^3s)

Unifiable features succeed
  S
  ├─ NP [Agr ➀ 3s]
  │   ├─ Det [Agr 3s]  a
  │   └─ N   [Agr 3s]  dog
  └─ VP [Agr ➁]
      └─ V [Agr ➁ 3s]  sleeps

[Agr ➀] ⨆ [Agr ➁] : SUCCEED   (both are 3s)

Advantages of this approach
• Distinguishes structure from “functional” information.
• Allows for economy of specification:
  • Equations in rules:
    S → NP VP
      (NP Agr) = (VP Agr)
  • Sets of values in the lexicon:
    N → fish
      (N Agr) = {3s, 3p}    ← must unify with the context
• Allows for indirect specification and transfer of information, e.g., head features.

Features and the lexicon
• The lexicon may contain each inflected form, with feature values and base form listed.
• Or the lexicon may contain only base forms, and a process of morphological analysis maps each inflected form to its base form plus feature values.
  • A time–space trade-off; varies by language.
• The lexicon may contain semantics for each form.

Morphological analysis
• Morphological analysis is simple in English: reverse the rules for inflections, including spelling changes.

  dogs → dog [Agr 3p]           eats → eat [Agr 3s, Tns pres]
  dog → dog [Agr 3s]            ripped → rip [Tns past]
  berries → berry [Agr 3p]      tarried → tarry [Tns past]
  buses → bus [Agr 3p]          running → run [Tns pp]

• Irregular forms will always have to be explicitly listed in the lexicon.

  children → child [Agr 3p]     sang → sing [Tns past]
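Rule reversal can be sketched as: consult an irregular lexicon first, then try suffix rules (with their spelling changes) in order of specificity. This is only a sketch covering the examples above; the rule ordering, the `analyze` name, and the flat feature dicts are illustrative, and real analyzers use finite-state transducers.

```python
# Irregular forms must be listed explicitly.
IRREGULAR = {
    "children": ("child", {"Agr": "3p"}),
    "sang": ("sing", {"Tns": "past"}),
}

def analyze(word):
    """Map an inflected form to (base form, features)."""
    if word in IRREGULAR:
        return IRREGULAR[word]
    if word.endswith("ies"):                           # berries -> berry
        return word[:-3] + "y", {"Agr": "3p"}
    if word.endswith(("ses", "shes", "ches", "xes")):  # buses -> bus
        return word[:-2], {"Agr": "3p"}
    if word.endswith("ied"):                           # tarried -> tarry
        return word[:-3] + "y", {"Tns": "past"}
    if word.endswith("ing"):                           # running -> run
        stem = word[:-3]
        if len(stem) > 2 and stem[-1] == stem[-2]:     # undouble: runn -> run
            stem = stem[:-1]
        return stem, {"Tns": "pp"}
    if word.endswith("ed"):                            # ripped -> rip
        stem = word[:-2]
        if len(stem) > 2 and stem[-1] == stem[-2]:     # undouble: ripp -> rip
            stem = stem[:-1]
        return stem, {"Tns": "past"}
    if word.endswith("s"):                             # dogs -> dog
        return word[:-1], {"Agr": "3p"}
    return word, {"Agr": "3s"}                         # dog -> dog
```

Note the ordering matters: "tarried" must hit the -ied rule before the generic -ed rule, and "berries" the -ies rule before the final -s rule.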

Morphology in other languages
• Rules may be more complex in other (even European) languages.
• Languages with compounding (e.g., German) or agglutination (e.g., Finnish) require more sophisticated methods.
  • E.g., German Verdauungsspaziergang, a stroll that one takes after a meal to assist in digestion.

Semantics as a lexical feature
• Add a Sem feature (typewriter font for semantic objects):

  [Cat N, Num s, Pers 3, Lex dog, Sem dog]

• The meaning of dog is dog.
  The meanings of chien and Hund are both dog.
  The meaning of dog is G52790.

Verb subcategorization 1
• Problem: Constraints on verbs and their complements.

  Nadia told / instructed / *said / *informed Ross to sit down.
  Nadia *told / *instructed / said / *informed to sit down.
  Nadia told / *instructed / *said / informed Ross of the requirement to sit down.

  Nadia gave / donated her painting to the museum.
  Nadia gave / *donated the museum her painting.

  Nadia put / ate the cake in the kitchen.
  Nadia *put / ate the cake.

Verb subcategorization 2
• VPs are much more complex than just V with an optional NP and/or PP.
  • Can include more than one NP.
  • Can include clauses of various types:
    that Ross fed the marmoset
    to pay him the money
• Subcat: A feature on a verb indicating the kinds of verb phrase it allows: _np, _np_np, _inf, _np_inf, …
  (Written this way to distinguish frames from constituents.)
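The told/said/informed contrasts from the previous slide can be captured by listing each verb's permitted Subcat frames. A sketch using the slide's underscore notation; the frame name `_np_pp(of)` and the `allows` helper are illustrative inventions, not a standard inventory:

```python
# Each verb's permitted subcategorization frames, in the slide's
# underscore notation (_np_inf = NP plus infinitival clause, etc.).
SUBCAT = {
    "told":       {"_np_inf", "_np_pp(of)"},  # told Ross to sit / told Ross of ...
    "instructed": {"_np_inf"},                # instructed Ross to sit
    "said":       {"_inf"},                   # said to sit down
    "informed":   {"_np_pp(of)"},             # informed Ross of ...
}

def allows(verb, frame):
    """Does this verb subcategorize for this complement frame?"""
    return frame in SUBCAT.get(verb, set())
```

A parser would unify the frame demanded by the VP rule with the verb's Subcat value, rejecting *said Ross to sit down* the same way agreement clashes are rejected.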

Verb tense and aspect 1
• Tense and aspect markings on the verb:
  • Locate the event in time (relative to another time).
  • Mark the event as complete/finished or in progress.

  Nadia rides the horse.       — In progress now.
  Nadia rode the horse.        — Completed before now.
  Nadia had ridden the horse.  — Completed before before now.
  Nadia was riding the horse.  — In progress before now.
  ⋮

Verb tense and aspect 2
• Tense: past or present
• Aspect: simple, progressive, or perfect

  Nadia … the horse:
               Present      Past
  Simple       rides        rode
  Progressive  is riding    was riding
  Perfect      has ridden   had ridden

  (Progressive and perfect require an auxiliary verb.)

Verb tense and aspect 3
• Tense: past or present
• Aspect: simple, progressive, or perfect

  Nadia … the horse:
                       Present           Past
  Simple               rides             rode
  Perfect progressive  has been riding   had been riding
  (continuous)

  (Perfect progressive requires two auxiliary verbs.)

Modal verbs
• Modal verbs: Auxiliary verbs that express degrees of certainty, obligation, possibility, prediction, etc.
  Nadia {could, should, must, ought to, might, will, …}
        {ride, be riding, have ridden, have been riding} the horse.

English auxiliary system
• Structure (so far):
[MODAL] [HAVE] [BE] MAIN-VERB
• General pattern:
VP → AUX VP
AUX → MODAL | HAVE | BE
• Use features to capture necessary agreements.
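The [MODAL] [HAVE] [BE] MAIN-VERB pattern can be sketched as right-to-left generation: each auxiliary fixes the form of whatever follows it (BE selects a present participle, HAVE a past participle, a modal the base form). A sketch with an illustrative `verb_group` function; finite inflection of the leftmost word (has/had, is/was) is deliberately omitted here, since that is exactly what the agreement features would supply:

```python
# base form, past participle, present participle
FORMS = {
    "ride": ("ride", "ridden", "riding"),
    "be":   ("be",   "been",   "being"),
    "have": ("have", "had",    "having"),
}

def verb_group(main, modal=None, perfect=False, progressive=False):
    """Build the verb group right to left: each auxiliary added on
    the left demands a particular form of the current head."""
    word, chain = main, []
    if progressive:                  # BE + present participle
        chain.insert(0, FORMS[word][2])
        word = "be"
    if perfect:                      # HAVE + past participle
        chain.insert(0, FORMS[word][1])
        word = "have"
    if modal:                        # MODAL + base form
        chain.insert(0, FORMS[word][0])
        word = modal
    chain.insert(0, word)
    return " ".join(chain)
```

In the feature-based grammar this chaining is exactly what the AUX CompForm feature enforces: each AUX's CompForm must unify with the VForm of its complement VP.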

Voice 1
The goalie kicked the ball.   (ACTIVE)

  Event: kicked
  Role: Agent (doer)            → Thing: the goalie
  Role: Theme (thing affected)  → Thing: the ball

  kick(agent = goalie, theme = ball)

Voice 2
The ball was kicked.   (PASSIVE)

  Event: kicked
  Role: Theme (thing affected)  → Thing: the ball

  kick(agent = ?, theme = ball)

Voice 3
The ball was kicked by the goalie.   (PASSIVE)

  Event: kicked
  Role: Theme (thing affected)  → Thing: the ball
  Role: Agent (doer)            → Thing: the goalie

  kick(agent = goalie, theme = ball)

Passive as diathetic alternation
the goalie kicked the ball

the ball was kicked by the goalie

  the ball:    from object position in the VP to subject position in the S
  the goalie:  from subject position in the S to a PP in the VP

But the semantic representation doesn’t change.

Voice 4
• Voice: System of assigning thematic roles to syntactic positions.
• English has active and passive voices.
• Passive is expressed with be + past participle. Other auxiliaries may also apply, including progressive be.

  Nadia was kissed.        Nadia was being kissed.
  Nadia had been kissed.   Nadia had been being kissed.
  Nadia could be kissed.   Nadia could have been being kissed.

• Structure: [MODAL] [HAVE] [BE1] [BE2] MAIN-VERB

Some useful features
• VForm: The tense/aspect form of a verb: passive, pastprt, …
• CompForm: The tense/aspect form of the complement of an auxiliary.

Augmenting rules for passive voice
• For all rules of the form:
    VP → V NP X
      (V Subcat) = _y
  add:
    VP → V X
      (V Subcat) = _y
      (V VForm) = passive
      (VP VForm) = passive
  (A metarule to ease grammar coding.)

• Augment Aux+VP rules:
    VP1 → AUX VP2
      (AUX Root) = Be2
      (AUX CompForm) = (VP2 VForm)
      (VP2 VForm) = passive
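A metarule is a function from rules to rules, so it can be sketched directly: match rules of the shape VP → V NP X, drop the object NP, and add the two VForm constraints. The (lhs, rhs, constraints) encoding and the name `passivize` are illustrative, not the course's grammar-compiler format.

```python
def passivize(rule):
    """Apply the passive metarule to a (lhs, rhs, constraints)
    triple; return None where the metarule does not apply."""
    lhs, rhs, constraints = rule
    if lhs != "VP" or rhs[:2] != ["V", "NP"]:
        return None                        # not a VP -> V NP X rule
    return (lhs,
            [rhs[0]] + rhs[2:],            # drop the object NP
            constraints + ["(V VForm) = passive",
                           "(VP VForm) = passive"])

# E.g., the ditransitive-with-PP rule gains a passive variant:
active = ("VP", ["V", "NP", "PP"], ["(V Subcat) = _np_pp"])
passive = passivize(active)
```

Running the metarule over the whole grammar at compile time gives the passive VP rules without hand-writing each one.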

The GAP feature for passive voice
S → NP VP
  1 (NP Agr) = (VP Agr)
  2 (VP VForm) = passive
  3 (VP Gap Cat) = NP
  4 (VP Gap Agr) = (NP Agr)
  5 (VP Gap Sem) = (NP Sem)

VP1 → AUX VP2
  1 (VP1 Agr) = (AUX Agr)
  2 (VP1 VForm) = (VP2 VForm)
  3 (VP1 Gap) = (VP2 Gap)
  4 (AUX Lex) = be2
  5 (VP2 VForm) = passive

VP → V NP
  1 (VP VForm) = (V VForm)
  2 (VP Gap) = (NP Gap)
  3 (V Subcat) = _np

NP → ε    (empty string)
  1 (NP Gap Cat) = NP
  2 (NP Gap Agr) = (NP Agr)
  3 (NP Gap Sem) = (NP Sem)

V → kicked
  1 (V VForm) = {pastprt, passive}
  2 (V Subcat) = _np
  3 (V Lex) = kick
  4 (V Sem) = kick

NP → cans
  1 (NP Agr) = 3p
  2 (NP Lex) = can
  3 (NP Sem) = cans

AUX → were
  1 (AUX Agr) = 3p
  2 (AUX Lex) = be2

[Figure: feature-annotated parse tree for “cans were kicked”, built with the rules above:
 S → NP (cans) VP; VP1 → AUX (were) VP2; VP → V (kicked) NP; NP → ε.
 The Gap feature introduced at the empty NP percolates up the VP chain to the S rule, which unifies the subject’s Agr and Sem into the gap, so cans is interpreted as the object of kick.]

Note: The green ➊’s of the S were ➄’s until the 4th constraint of the rule S → NP VP applied. The 5th constraint fills in the Sem of the Gap.

Other cases of gap percolation
• Other constructions involve NPs in syntactic configurations where they would not get the right thematic roles using linear order alone:

  Nadia seems to like Ross.
  Nadia seems to be liked.
  Nadia is easy to like.
  Who did Nadia like?
  I fed the dog that Nadia likes to walk.

• Grammar rules with gap features can ensure the correct structure/interpretation of these as well.

Summary
• Features help capture syntactic constructions in a general and elegant grammar.
• Features can encode the compositional semantics of a sentence as you parse it.
• Features can accomplish mapping functions between syntax and semantics that simplify the interpretation process.