Improving Zero-shot Entity Linking with Long Range Sequence Modeling

Zonghai Yao
Liangliang Cao
Huapu Pan
EMNLP Findings (2020)

Abstract

This paper considers the problem of zero-shot entity linking, in which an entity encountered at test time may be unseen during training.
Following the prevailing BERT-based research efforts, we find that a simple yet effective improvement is to expand the model's long-range sequence modeling capacity.
Unlike many previous methods, ours does not require expensive pre-training of BERT with long position embeddings. Instead,
we propose Embedding-repeat, an efficient initialization method that constructs longer position embeddings from a pretrained BERT model.
For the zero-shot entity linking task, our method improves the state of the art from 76.06% to 79.08%.
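The Embedding-repeat idea could be sketched as follows: tile the pretrained position-embedding table until it covers the desired longer context, then truncate. This is a minimal NumPy sketch of that initialization, assuming a simple repetition scheme; the function name and exact details are illustrative and may differ from the authors' implementation.

```python
import numpy as np

def embedding_repeat(pos_emb: np.ndarray, new_len: int) -> np.ndarray:
    """Initialize a longer position-embedding table by repeating a
    pretrained one (sketch of an Embedding-repeat-style scheme)."""
    old_len, _ = pos_emb.shape
    reps = -(-new_len // old_len)  # ceiling division
    # Tile the pretrained table along the position axis, then truncate.
    return np.tile(pos_emb, (reps, 1))[:new_len]

# Example: extend BERT's 512 positions (hidden size 768) to 1024.
bert_pos = np.random.randn(512, 768).astype(np.float32)
long_pos = embedding_repeat(bert_pos, 1024)
print(long_pos.shape)  # (1024, 768)
```

The appeal of such an initialization is that every new position starts from a learned embedding rather than random noise, so fine-tuning on longer sequences can converge without re-running BERT pre-training.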