Interleaving natural language parsing and generation through uniform processing

Abstract

We present a new model of natural language processing in which natural language parsing and generation are strongly interleaved tasks. Interleaving of parsing and generation is important if we assume that natural language understanding and production are not only performed in isolation but also work together to obtain subsentential interactions in text revision or dialog systems.

The core of the model is a new uniform agenda-driven tabular algorithm, called UTA. Although uniformly defined, UTA is able to configure itself dynamically for either parsing or generation, because it is fully driven by the structure of the actual input: a string for parsing and a semantic expression for generation.

Efficient interleaving of parsing and generation is obtained through item sharing between parsing and generation. This novel processing strategy makes items (i.e., partial results) computed in one direction automatically available to the other direction as well.

The advantage of UTA in combination with the item sharing method is that we are able to extend the use of memoization techniques to the interleaved approach. To demonstrate UTA's utility for developing high-level performance methods, we present a new algorithm for incremental self-monitoring during natural language production.
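Since the abstract only names the mechanisms, the following minimal Python sketch (our own illustration, not the paper's definition of UTA) shows the general shape of an agenda-driven tabular engine whose behaviour is fixed by its seed items and inference rules rather than by separate parsing and generation code, and whose chart can be handed from one run to the next to mimic item sharing. All identifiers here (Item, uniform_tabular_engine, combine) are assumptions introduced for this example.

from collections import deque

class Item:
    """A partial result; hashable so the chart can block re-derivations."""
    def __init__(self, label, payload):
        self.label = label        # e.g. a grammar category or semantic predicate
        self.payload = payload    # e.g. a covered string span or a logical form
    def __eq__(self, other):
        return isinstance(other, Item) and \
            (self.label, self.payload) == (other.label, other.payload)
    def __hash__(self):
        return hash((self.label, self.payload))
    def __repr__(self):
        return f"Item({self.label!r}, {self.payload!r})"

def uniform_tabular_engine(seed_items, combine, chart=None):
    """Generic agenda-driven deduction loop.

    seed_items encode the actual input (string tokens when parsing, parts of
    a semantic expression when generating); combine(a, b) returns the items
    derivable from two chart entries. Passing in a chart filled by an earlier
    run in the other direction lets its items be reused, which is the rough
    intuition behind item sharing.
    """
    chart = set() if chart is None else chart
    agenda = deque(seed_items)
    while agenda:
        item = agenda.popleft()
        if item in chart:             # tabulation: never process an item twice
            continue
        chart.add(item)
        for other in list(chart):
            for derived in list(combine(item, other)) + list(combine(other, item)):
                if derived not in chart:
                    agenda.append(derived)
    return chart

In the paper's setting the inference rules would come from a reversible grammar, and the self-monitor re-parses generated output against a chart already populated during generation; the optional pre-filled chart argument above merely gestures at that reuse.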

Keywords: Natural language processing, Reversible grammars, Chart-based uniform algorithm for parsing and generation, Self-monitoring and revision

Review process: Available online 23 June 1998.

Paper link: https://doi.org/10.1016/S0004-3702(97)00072-6