Memory-Based Semantic Parsing

Abstract

We present a memory-based model for context-dependent semantic parsing. Previous approaches focus on enabling the decoder to copy or modify the parse from the previous utterance, assuming a dependency between the current and previous parses. In this work, we propose to represent contextual information using an external memory. We learn a context memory controller that manages the memory by maintaining the cumulative meaning of sequential user utterances. We evaluate our approach on three semantic parsing benchmarks. Experimental results show that our model processes context-dependent information more effectively and achieves improved performance without using task-specific decoders.
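To make the idea concrete, the following is a minimal sketch of an external memory updated by a controller across a sequence of utterance encodings. The slot count, the content-based addressing, and the erase/write gating scheme are illustrative assumptions for exposition, not the paper's exact architecture, and the projection matrices stand in for parameters that would normally be learned.

```python
import numpy as np

class ContextMemoryController:
    """Toy sketch of an external context memory for multi-turn parsing.

    The memory accumulates the meaning of sequential utterances; the
    gating scheme here is an illustrative assumption, not the paper's
    exact design.
    """

    def __init__(self, num_slots: int, dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.memory = np.zeros((num_slots, dim))
        # Placeholder projections; in the real model these are learned.
        self.W_erase = rng.standard_normal((dim, dim)) * 0.1
        self.W_write = rng.standard_normal((dim, dim)) * 0.1

    def _address(self, u: np.ndarray) -> np.ndarray:
        # Content-based addressing: softmax over slot/utterance similarity.
        scores = self.memory @ u
        e = np.exp(scores - scores.max())
        return e / e.sum()

    def update(self, u: np.ndarray) -> np.ndarray:
        """Incorporate one utterance encoding u (shape: (dim,))."""
        a = self._address(u)                            # (num_slots,)
        erase = 1.0 / (1.0 + np.exp(-(self.W_erase @ u)))  # sigmoid gate
        write = np.tanh(self.W_write @ u)
        # Erase then add, weighted by the addressing distribution, so the
        # memory accumulates context across turns.
        self.memory = (self.memory * (1.0 - np.outer(a, erase))
                       + np.outer(a, write))
        return self.memory
```

A downstream decoder could then attend over `self.memory` instead of conditioning only on the previous parse, which is the contrast the abstract draws with prior copy/modify approaches.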
Presented at EMNLP 2021
Article at MIT Press