1-Pager: One Pass Answer Generation and Evidence Retrieval
Abstract
We present 1-PAGER, the first system that answers a question and retrieves evidence using
a single Transformer-based model and decoding process. 1-PAGER incrementally partitions
the retrieval corpus using constrained decoding to select a document and answer string,
and we show that this is competitive with comparable retrieve-and-read alternatives according to both retrieval and answer accuracy metrics. 1-PAGER also outperforms the equivalent ‘closed-book’ question answering model,
by grounding predictions in an evidence corpus. While 1-PAGER is not yet on par with
more expensive systems that read many more
documents before generating an answer, we argue that it provides an important step toward
attributed generation by folding retrieval into
the sequence-to-sequence paradigm that is currently dominant in NLP. We also show that the
search paths used to partition the corpus are
easy to read and understand, paving a way forward for interpretable neural retrieval.
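To make the core idea concrete, the sketch below illustrates what "incrementally partitioning the retrieval corpus using constrained decoding" can look like: allowed evidence paths are stored in a prefix trie, and at each decoding step the model's choice is restricted to tokens the trie permits, so each generated token narrows the corpus partition. This is a minimal, hypothetical illustration; the toy corpus, the `score_fn` stand-in for a Transformer's next-token distribution, and all names are assumptions, not the paper's actual implementation.

```python
# Minimal sketch of trie-constrained decoding over a toy corpus.
# All names and data here are illustrative, not 1-PAGER's real setup.

# Toy "corpus": each evidence path is a sequence of keyword tokens
# ending in a document identifier.
CORPUS_PATHS = [
    ("france", "capital", "doc_paris"),
    ("france", "population", "doc_demographics"),
    ("germany", "capital", "doc_berlin"),
]

def build_trie(paths):
    """Prefix trie: each node maps a token to its child node."""
    root = {}
    for path in paths:
        node = root
        for tok in path:
            node = node.setdefault(tok, {})
    return root

def constrained_decode(score_fn, trie):
    """Greedy decoding restricted to tokens the trie allows.

    score_fn(prefix, candidates) -> chosen candidate; it stands in
    for a Transformer's next-token distribution.
    """
    prefix, node = [], trie
    while node:  # an empty node is a leaf, i.e. a complete path
        candidates = list(node.keys())  # current corpus partition
        tok = score_fn(prefix, candidates)
        prefix.append(tok)
        node = node[tok]
    return prefix

def make_scorer(question):
    """Stand-in scorer: prefer tokens sharing words with the question."""
    q = set(question.lower().split())
    def score(prefix, candidates):
        return max(candidates, key=lambda c: len(q & set(c.split("_"))))
    return score

trie = build_trie(CORPUS_PATHS)
path = constrained_decode(make_scorer("what is the capital of france"), trie)
print(path)  # ['france', 'capital', 'doc_paris']
```

Because every prefix in the trie corresponds to a subset of documents, each decoding step provably stays inside the corpus, which is what grounds the final answer in retrievable evidence.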