Vinh Q. Tran

I’m a Staff Research Engineer at Google Research NY, where I work on methods for sequence modeling / NLP and machine learning. My research interests are in all things related to improving, expanding, or rethinking the functionality of Transformers and other state-of-the-art sequence models.
Authored Publications
DSI++: Updating Transformer Memory with New Documents
Yi Tay, Jinfeng Rao, Emma Strubell
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (2023)

Dense Feature Memory Augmented Transformers for COVID-19 Vaccination Search Classification
Yi Tay, Chaitanya Kamath, Shailesh Bavadekar, Evgeniy Gabrilovich
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (2022)

A New Generation of Perspective API: Efficient Multilingual Character-level Transformers
Alyssa Whitlock Lees, Yi Tay
Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (2022)