StructFormer: Joint Unsupervised Induction of Dependency and Constituency Structure from Masked Language Modeling

Yikang Shen
Yi Tay
Che Zheng
Aaron Courville
ACL 2021 (to appear)

Abstract

The grammar of natural language has two major classes: dependency grammar, which models one-to-one correspondences between words, and constituency grammar, which models the assembly of one or several corresponding words. While previous unsupervised parsing methods mostly focus on inducing one class of grammar, we introduce a novel model, StructFormer, that can induce dependency and constituency structure at the same time. To achieve this, we propose a new self-attention mechanism with novel hierarchical and dependency constraints. Experimental results show that our model achieves strong performance on unsupervised constituency parsing, unsupervised dependency parsing, and masked language modeling.
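The abstract only sketches the constrained self-attention mechanism at a high level. As a rough, hedged illustration of the general idea of modulating attention with structural masks, the NumPy sketch below multiplies standard scaled dot-product attention weights by a soft constraint mask and renormalizes. The function name, tensor shapes, and the random placeholder mask are assumptions for illustration only; they do not reproduce StructFormer's actual hierarchical or dependency masks.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def constrained_self_attention(q, k, v, constraint_mask):
    """Scaled dot-product self-attention whose weights are modulated
    by a soft [seq, seq] constraint mask in [0, 1].

    q, k, v:         [seq, dim] query/key/value matrices.
    constraint_mask: [seq, seq]; entry (i, j) scales how much token i
                     may attend to token j. In StructFormer this role is
                     played by masks derived from induced hierarchical and
                     dependency structure; here it is just a placeholder.
    """
    dim = q.shape[-1]
    scores = q @ k.T / np.sqrt(dim)           # raw attention logits
    weights = softmax(scores, axis=-1)        # standard attention weights
    weights = weights * constraint_mask       # apply structural constraints
    weights = weights / (weights.sum(axis=-1, keepdims=True) + 1e-9)  # renormalize
    return weights @ v

# Toy usage with random inputs and a random soft constraint mask.
rng = np.random.default_rng(0)
seq, dim = 5, 8
q, k, v = (rng.normal(size=(seq, dim)) for _ in range(3))
mask = rng.uniform(size=(seq, seq))
out = constrained_self_attention(q, k, v, mask)
print(out.shape)  # (5, 8)
```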

Research Areas