Google Research

T-STAR: Truthful Style Transfer using AMR Graph as Intermediate Representation

EMNLP 2022


The unavailability of parallel corpora for training text style transfer (TST) models is a challenging yet common scenario. TST models must also implicitly preserve the content while transforming a source sentence into the target style. To tackle these problems, an intermediate representation is often constructed that is devoid of style while still preserving the meaning of the source sentence. In this work, we study the usefulness of Abstract Meaning Representation (AMR) graphs as this intermediate, style-agnostic representation. We posit that semantic notations like AMR are a natural choice for an intermediate representation. We therefore propose the T-STAR model, comprising two components: a text-to-AMR parser and an AMR-to-text generator. We ensure that the intermediate representation is style agnostic, and use style-aware pretraining to improve AMR-to-text performance. Automatic and human evaluations show that the proposed model outperforms state-of-the-art TST models in both content preservation and style accuracy.
