Google Research

Improving Zero-shot Entity Linking with Long Range Sequence Modeling

EMNLP Findings (2020)

Abstract

This paper considers the problem of zero-shot entity linking, in which entities encountered at test time may not have been seen during training. Following prevailing BERT-based research efforts, we find that a simple yet effective improvement is to extend long-range sequence modeling. Unlike many previous methods, our approach does not require expensive pre-training of BERT with longer position embeddings. Instead, we propose an efficient position-embedding initialization method called Embedding-repeat, which initializes a larger position-embedding table from an existing BERT model. On the zero-shot entity linking task, our method improves the state of the art from 76.06% to 79.08%.
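The abstract describes initializing a longer position-embedding table from a pretrained BERT model by reusing its existing embeddings. Below is a minimal sketch of one such scheme: tiling the pretrained table until it covers the new sequence length. The function name and the tiling strategy are illustrative assumptions; the paper's exact Embedding-repeat procedure may differ in detail.

```python
import numpy as np

def embedding_repeat(pos_emb: np.ndarray, new_len: int) -> np.ndarray:
    """Initialize a longer position-embedding table by repeating the
    pretrained table (shape [old_len, dim]) until it spans new_len rows.

    Illustrative sketch only; not the paper's exact algorithm.
    """
    old_len, dim = pos_emb.shape
    reps = -(-new_len // old_len)  # ceiling division
    return np.tile(pos_emb, (reps, 1))[:new_len]

# Example: extend BERT's 512-position table to 2048 positions.
pretrained = np.random.randn(512, 768)
extended = embedding_repeat(pretrained, 2048)
```

The appeal of such an initialization is that the new positions start from embeddings the model has already learned to use, avoiding a costly pre-training pass with longer position embeddings.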
