
Video 1: Parsing with semantic attachments
ANLP Week 9/Unit 3: Syntax/Semantics interface (Semantic analysis)
Sharon Goldwater
(based on slides by James Martin and Johanna Moore)
At last: Semantic Analysis
Syntax Driven Semantic Analysis
• Given this way of representing meanings, how do we compute meaning representations from sentences?
• This is the task of semantic analysis or semantic parsing.
• Most methods rely on a (prior or concurrent) syntactic parse.
• Here: a compositional rule-to-rule approach based on FOL augmented with λ-expressions.
• Based on the principle of compositionality.
– meaning of the whole built up from the meaning of the parts
– more specifically, in a way that is guided by word order and syntactic relations.
• Build up the MR by augmenting CFG rules with semantic composition rules.
• The representation produced is the literal meaning: context-independent and free of inference.
Note: other syntax-driven semantic parsing formalisms exist, e.g. Combinatory Categorial Grammar (Steedman, 2000) has seen a surge in popularity recently.

Example of final analysis

• Ex: AyCaramba serves meat (with its parse tree)
• What we’re hoping to build:
∃e. Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)

CFG Rules with Semantic Attachments

• To compute the final MR, we add semantic attachments to our CFG rules.
• These specify how to compute the MR of the parent from those of its children.
• Rules will look like:
A → α1 … αn   {f(αj.sem, …, αk.sem)}
• A.sem (the MR for A) is computed by applying the function f to the MRs of some subset of A’s children.

Proposed rules

• Rules with semantic attachments for nouns and NPs:
ProperNoun → AyCaramba   {AyCaramba}
MassNoun → meat          {Meat}
NP → ProperNoun          {ProperNoun.sem}
NP → MassNoun            {MassNoun.sem}
• Unary rules normally just copy the semantics of the child to the parent (as in the NP rules here).
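To make this concrete, here is a minimal Python sketch (an illustration, not the lecture's own code) of rules carrying semantic attachments: each rule pairs a syntactic expansion with a function that computes the parent's MR from its children's MRs. The Rule class and the string-valued MRs are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    lhs: str          # parent category, e.g. "NP"
    rhs: List[str]    # child categories or words
    sem: Callable     # f(child sems...) -> parent's MR

RULES = [
    # Lexical rules: the attachment is a constant MR (a string here).
    Rule("ProperNoun", ["AyCaramba"], lambda: "AyCaramba"),
    Rule("MassNoun",   ["meat"],      lambda: "Meat"),
    # Unary rules just copy the child's semantics up to the parent.
    Rule("NP", ["ProperNoun"], lambda child_sem: child_sem),
    Rule("NP", ["MassNoun"],   lambda child_sem: child_sem),
]
```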

What about verbs?
• Before event reification, we had verbs with meanings like: λy. λx. Serving(x,y)
• λs allowed us to compose arguments with the predicate.
• We can do the same with reified events:
λy. λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, y)
• This MR is the semantic attachment of the verb:
Verb → serves
{ λy. λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, y) }
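As a quick aside, the curried λ-expression maps directly onto nested Python lambdas, and function application performs the β-reduction. This is a sketch under the assumption that strings stand in for FOL formulas:

```python
# λy. λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, y) as nested lambdas;
# the strings are stand-ins for FOL formulas, not real logic objects.
serves_sem = lambda y: lambda x: f"∃e. Serving(e) ∧ Server(e, {x}) ∧ Served(e, {y})"

# Each application binds one λ-variable (one β-reduction):
print(serves_sem("Meat")("AyCaramba"))
# ∃e. Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)
```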
Building larger constituents

• The remaining rules specify how to apply λ-expressions to their arguments. So, the VP rule is:
VP → Verb NP   {Verb.sem(NP.sem)}
where Verb.sem = λy. λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, y)
and NP.sem = Meat
• So, VP.sem = λy. λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, y) (Meat)
= λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, Meat)

Finishing the analysis
• Final rule is:
S → NP VP {VP.sem(NP.sem)}
• now with VP.sem =
λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, Meat)
and NP.sem = AyCaramba
• So, S.sem = λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, Meat) (AyCaramba)
= ∃e. Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)
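A sketch of the whole composition as running code (same string stand-ins as above; the variable names are invented for illustration):

```python
# Compose the MR for "AyCaramba serves meat" rule by rule, bottom-up.
serves_sem = lambda y: lambda x: f"∃e. Serving(e) ∧ Server(e, {x}) ∧ Served(e, {y})"

np_obj_sem  = "Meat"                  # NP → MassNoun → meat
vp_sem      = serves_sem(np_obj_sem)  # VP → Verb NP  {Verb.sem(NP.sem)}
np_subj_sem = "AyCaramba"             # NP → ProperNoun → AyCaramba
s_sem       = vp_sem(np_subj_sem)     # S → NP VP     {VP.sem(NP.sem)}

print(s_sem)
# ∃e. Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)
```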
Video 2: Type raising and research directions

Problem with these rules
• Consider the sentence Every child sleeps.
∀x. Child(x) ⇒ ∃e. Sleeping(e) ∧ Sleeper(e, x)
• Meaning of Every child (involving x) is interleaved with meaning of sleeps
• As the next slides show, our existing rules can’t handle this example, or quantifiers (from NPs with determiners) in general.
• We’ll show the problem, then the solution.
Breaking it down
• What is the meaning of Every child anyway?
• Every child …
…sleeps        ∀x. Child(x) ⇒ ∃e. Sleeping(e) ∧ Sleeper(e, x)
…cries         ∀x. Child(x) ⇒ ∃e. Crying(e) ∧ Crier(e, x)
…talks         ∀x. Child(x) ⇒ ∃e. Talking(e) ∧ Talker(e, x)
…likes pizza   ∀x. Child(x) ⇒ ∃e. Liking(e) ∧ Liker(e, x) ∧ Likee(e, pizza)

Could this work with our rules?

• So it looks like the meaning of Every child is something like
∀x. Child(x) ⇒ Q(x)
• where Q(x) is some (potentially quite complex) expression with a predicate-like meaning.
• We said S.sem should be VP.sem(NP.sem), but
λy. ∃e. Sleeping(e) ∧ Sleeper(e, y) (∀x. Child(x) ⇒ Q(x))
yields
∃e. Sleeping(e) ∧ Sleeper(e, ∀x. Child(x) ⇒ Q(x))
• This isn’t valid FOL: complex expressions cannot be arguments to predicates.
Switching things around

• But if we define S.sem as NP.sem(VP.sem), it works!
• First, we must make NP.sem into a functor by adding a λ:
λQ. ∀x. Child(x) ⇒ Q(x)
• Then, apply it to VP.sem:
λQ. ∀x. Child(x) ⇒ Q(x) (λy. ∃e. Sleeping(e) ∧ Sleeper(e, y))
= ∀x. Child(x) ⇒ (λy. ∃e. Sleeping(e) ∧ Sleeper(e, y))(x)
= ∀x. Child(x) ⇒ ∃e. Sleeping(e) ∧ Sleeper(e, x)
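To see the type-raised NP actually run, here is a sketch over a tiny finite model; the domain, the Child predicate, and who sleeps are all invented for illustration, and the universal quantifier becomes a check over the domain:

```python
# A tiny invented model, so the λ-terms can execute as real functions.
DOMAIN   = ["kate", "sam", "rex"]
CHILDREN = {"kate", "sam"}
SLEEPERS = {"kate", "sam", "rex"}

child  = lambda x: x in CHILDREN   # λx. Child(x)
sleeps = lambda y: y in SLEEPERS   # stands in for λy. ∃e. Sleeping(e) ∧ Sleeper(e, y)

# Type-raised NP.sem for "every child": λQ. ∀x. Child(x) ⇒ Q(x).
# Material implication P ⇒ Q is encoded as (not P) or Q.
every_child = lambda Q: all((not child(x)) or Q(x) for x in DOMAIN)

# S.sem = NP.sem(VP.sem): the NP takes the VP's meaning as its argument.
print(every_child(sleeps))   # True: in this model, every child sleeps
```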

But, how can we get the right NP.sem?

• We will need a new set of noun rules:
Noun → Child    {λx. Child(x)}
Det → Every     {λP. λQ. ∀x. P(x) ⇒ Q(x)}
NP → Det Noun   {Det.sem(Noun.sem)}
• So, Every child is derived as
λP. λQ. ∀x. P(x) ⇒ Q(x) (λx. Child(x))
= λQ. ∀x. (λx. Child(x))(x) ⇒ Q(x)
= λQ. ∀x. Child(x) ⇒ Q(x)

One last problem

• Our previous MRs for proper nouns were not functors, so they don’t work with our new rule S → NP VP {NP.sem(VP.sem)}.
• For Kate sleeps (parse tree: S → NP VP, with NP → ProperNoun → Kate and VP → Verb → sleeps), NP.sem(VP.sem) gives
Kate (λy. ∃e. Sleeping(e) ∧ Sleeper(e, y)) ⇒ Not valid!

λ to the rescue again

• Assign a different MR to proper nouns, allowing them to take VPs as arguments:
ProperNoun → Kate   {λP. P(Kate)}
• For Kate sleeps, this gives us
λP. P(Kate) (λy. ∃e. Sleeping(e) ∧ Sleeper(e, y))
= (λy. ∃e. Sleeping(e) ∧ Sleeper(e, y))(Kate)
= ∃e. Sleeping(e) ∧ Sleeper(e, Kate)
• Terminology: we type-raised the argument a of a function f, turning it into a function g that takes f as its argument.
– The final returned value is the same in either case.
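Again as a runnable sketch (strings as stand-in MRs; the names are invented): type-raising turns the entity Kate into a functor over VP meanings:

```python
sleeps_sem = lambda y: f"∃e. Sleeping(e) ∧ Sleeper(e, {y})"   # VP.sem

# Type-raised proper noun: λP. P(Kate).
kate_sem = lambda P: P("Kate")

# S → NP VP {NP.sem(VP.sem)} now works for proper-noun subjects too:
print(kate_sem(sleeps_sem))
# ∃e. Sleeping(e) ∧ Sleeper(e, Kate)
```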

Final grammar?

S → NP VP           {NP.sem(VP.sem)}
VP → Verb           {Verb.sem}
VP → Verb NP        {Verb.sem(NP.sem)}
NP → Det Noun       {Det.sem(Noun.sem)}
NP → ProperNoun     {ProperNoun.sem}
Det → Every         {λP. λQ. ∀x. P(x) ⇒ Q(x)}
Noun → Child        {λx. Child(x)}
ProperNoun → Kate   {λP. P(Kate)}
Verb → sleeps       {λx. ∃e. Sleeping(e) ∧ Sleeper(e, x)}
Verb → serves       {λy. λx. ∃e. Serving(e) ∧ Server(e, x) ∧ Served(e, y)}
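For concreteness, here is a sketch of this grammar's attachments as executable Python (curried lambdas for the λ-expressions, strings for formulas; illustrative only, not the lecture's code):

```python
LEXICON = {
    "Every":  lambda P: lambda Q: f"∀x. {P('x')} ⇒ {Q('x')}",   # λP. λQ. ∀x. P(x) ⇒ Q(x)
    "child":  lambda x: f"Child({x})",                          # λx. Child(x)
    "Kate":   lambda P: P("Kate"),                              # λP. P(Kate)
    "sleeps": lambda x: f"∃e. Sleeping(e) ∧ Sleeper(e, {x})",
    "serves": lambda y: lambda x: f"∃e. Serving(e) ∧ Server(e, {x}) ∧ Served(e, {y})",
}

# Branching rules apply one child's semantics to the other's:
s_sem  = lambda np, vp: np(vp)     # S  → NP VP    {NP.sem(VP.sem)}
vp_sem = lambda v, np: v(np)       # VP → Verb NP  {Verb.sem(NP.sem)}
np_sem = lambda det, n: det(n)     # NP → Det Noun {Det.sem(Noun.sem)}

# "Every child sleeps":
print(s_sem(np_sem(LEXICON["Every"], LEXICON["child"]), LEXICON["sleeps"]))
# ∀x. Child(x) ⇒ ∃e. Sleeping(e) ∧ Sleeper(e, x)
```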
Complications

• This grammar still applies Verbs to NPs when building the VP.
• Try doing this with our new type-raised NPs and you will see it doesn’t work.
• In practice, we need automatic type-raising rules that can be used exactly when needed; otherwise we keep the base type.
– e.g., the “base type” of a proper noun is “entity”, not “function from (functions from entities to truth values) to truth values”.

What we did achieve

We developed a grammar with semantic attachments using many ideas now in use:
• existentially quantified variables represent events
• lexical items have function-like λ-expressions as MRs
• non-branching rules copy semantics from child to parent
• branching rules apply semantics of one child to the other(s) using λ-reduction.

Semantic parsing algorithms
• Given a CFG with semantic attachments, how do we obtain the semantic analysis of a sentence?
• One option (integrated): modify the syntactic parser to apply semantic attachments at the time syntactic constituents are constructed.
• A second option (pipelined): complete the syntactic parse, then walk the tree bottom-up to apply semantic attachments (a sketch of this option follows below).
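A minimal sketch of the pipelined option, assuming a parse tree is already available from the syntactic parser; the tuple encoding of trees and the tiny lexicon are assumptions made for this illustration:

```python
# Nodes are (category, child...) tuples; leaves are word strings.
TREE = ("S",
        ("NP", ("Det", "Every"), ("Noun", "child")),
        ("VP", ("Verb", "sleeps")))

LEXICON = {
    "Every":  lambda P: lambda Q: f"∀x. {P('x')} ⇒ {Q('x')}",
    "child":  lambda x: f"Child({x})",
    "sleeps": lambda x: f"∃e. Sleeping(e) ∧ Sleeper(e, {x})",
}

ATTACHMENTS = {
    "S":    lambda np, vp: np(vp),   # {NP.sem(VP.sem)}
    "NP":   lambda det, n: det(n),   # {Det.sem(Noun.sem)}
    "VP":   lambda v: v,             # unary rule copies the child's sem
    "Det":  lambda w: LEXICON[w],    # preterminals look the word up
    "Noun": lambda w: LEXICON[w],
    "Verb": lambda w: LEXICON[w],
}

def interpret(node):
    """Walk the parsed tree bottom-up, applying each rule's attachment."""
    cat, *children = node
    sems = [interpret(c) if isinstance(c, tuple) else c for c in children]
    return ATTACHMENTS[cat](*sems)

print(interpret(TREE))
# ∀x. Child(x) ⇒ ∃e. Sleeping(e) ∧ Sleeper(e, x)
```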
Learning a semantic parser

• Much current research focuses on learning semantic grammars rather than hand-engineering them.
• Given sentences paired with meaning representations, e.g.,
Every child sleeps      ∀x. Child(x) ⇒ ∃e. Sleeping(e) ∧ Sleeper(e, x)
AyCaramba serves meat   ∃e. Serving(e) ∧ Server(e, AyCaramba) ∧ Served(e, Meat)
• Can we automatically learn
– Which words are associated with which bits of MR?
– How those bits combine (in parallel with the syntax) to yield the final MR?
• And, can we do this with less well-specified semantic representations? See, e.g., Zettlemoyer and Collins (2005); Kwiatkowski et al. (2010); Reddy et al. (2014); Choi et al. (2015)
Sharon Goldwater ANLP Week 9/Unit 3 29
Summary

• Semantic analysis/semantic parsing: the process of deriving a meaning representation from a sentence.
• Uses the grammar and lexicon (augmented with semantic information) to create context-independent literal meanings.
• λ-expressions handle compositionality, building semantics of larger forms from smaller ones.
• Final meaning representations are expressions in first-order logic.

References

Choi, E., Kwiatkowski, T., and Zettlemoyer, L. (2015). Scalable semantic parsing with partial ontologies. In Proceedings of the Association for Computational Linguistics.

Kwiatkowski, T., Zettlemoyer, L., Goldwater, S., and Steedman, M. (2010). Inducing probabilistic CCG grammars from logical form with higher-order unification. In Proceedings of the Conference on Empirical Methods in Natural Language Processing.

Reddy, S., Lapata, M., and Steedman, M. (2014). Large-scale semantic parsing without question-answer pairs. Transactions of the Association for Computational Linguistics, 2:377–392.

Zettlemoyer, L. S. and Collins, M. (2005). Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars. In Proceedings of the Conference on Uncertainty in Artificial Intelligence.