This week, Berlin hosts the
2016 Annual Meeting of the Association for Computational Linguistics (ACL 2016), the premier conference in computational linguistics, covering a broad spectrum of research areas concerned with computational approaches to natural language. As a leader in
Natural Language Processing (NLP) and a Platinum Sponsor of the conference, Google will be on hand to showcase research interests that include syntax, semantics, discourse, conversation, multilingual modeling, sentiment analysis, question answering, summarization, and generally building better learners using labeled and unlabeled data, state-of-the-art modeling, and learning from indirect supervision.
Our systems are used in numerous ways across Google, impacting user experience in Search, mobile, apps, ads, Translate, and more. Our work spans the range of traditional NLP tasks, with general-purpose syntax and semantic algorithms underpinning more specialized systems.
Our researchers are experts in natural language processing and machine learning, and combine methodological research with applied science, and our engineers are equally involved in long-term research efforts and driving immediate applications of our technology.
If you’re attending ACL 2016, we hope that you’ll stop by the booth to check out some demos, meet our researchers and discuss projects and opportunities at Google that go into solving interesting problems for billions of people. Learn more about Google research being presented at ACL 2016 below (Googlers highlighted in
blue), and visit the Natural Language Understanding Team page at
g.co/NLUTeam.
Papers

Generalized Transition-based Dependency Parsing via Control Parameters
Bernd Bohnet, Ryan McDonald, Emily Pitler, Ji Ma

Learning the Curriculum with Bayesian Optimization for Task-Specific Word Representation Learning
Yulia Tsvetkov, Manaal Faruqui, Wang Ling (Google DeepMind), Chris Dyer (Google DeepMind)

Morpho-syntactic Lexicon Generation Using Graph-based Semi-supervised Learning (TACL)
Manaal Faruqui, Ryan McDonald, Radu Soricut

Many Languages, One Parser (TACL)
Waleed Ammar, George Mulcaire, Miguel Ballesteros, Chris Dyer (Google DeepMind)*, Noah A. Smith
Latent Predictor Networks for Code Generation
Wang Ling (Google DeepMind), Phil Blunsom (Google DeepMind), Edward Grefenstette (Google DeepMind), Karl Moritz Hermann (Google DeepMind), Tomáš Kočiský (Google DeepMind), Fumin Wang (Google DeepMind), Andrew Senior (Google DeepMind)
Collective Entity Resolution with Multi-Focal Attention
Amir Globerson, Nevena Lazic, Soumen Chakrabarti, Amarnag Subramanya, Michael Ringgaard, Fernando Pereira

Plato: A Selective Context Model for Entity Resolution (TACL)
Nevena Lazic, Amarnag Subramanya, Michael Ringgaard, Fernando Pereira
WikiReading: A Novel Large-scale Language Understanding Task over Wikipedia
Daniel Hewlett, Alexandre Lacoste, Llion Jones, Illia Polosukhin, Andrew Fandrianto, Jay Han, Matthew Kelcey, David Berthelot
Stack-propagation: Improved Representation Learning for Syntax
Yuan Zhang, David Weiss

Cross-lingual Models of Word Embeddings: An Empirical Comparison
Shyam Upadhyay, Manaal Faruqui, Chris Dyer (Google DeepMind), Dan Roth
Globally Normalized Transition-Based Neural Networks (Outstanding Papers Session)
Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, Michael Collins

Posters

Cross-lingual projection for class-based language models
Beat Gfeller, Vlad Schogol, Keith Hall

Synthesizing Compound Words for Machine Translation
Austin Matthews, Eva Schlinger*, Alon Lavie, Chris Dyer (Google DeepMind)*
Cross-Lingual Morphological Tagging for Low-Resource Languages
Jan Buys, Jan A. Botha

Workshops

1st Workshop on Representation Learning for NLP
Keynote Speakers include: Raia Hadsell (Google DeepMind)
Workshop Organizers include: Edward Grefenstette (Google DeepMind), Phil Blunsom (Google DeepMind), Karl Moritz Hermann (Google DeepMind)
Program Committee members include: Tomáš Kočiský (Google DeepMind), Wang Ling (Google DeepMind), Ankur Parikh (Google), John Platt (Google), Oriol Vinyals (Google DeepMind)

1st Workshop on Evaluating Vector-Space Representations for NLP
Contributed Papers:
Problems With Evaluation of Word Embeddings Using Word Similarity Tasks
Manaal Faruqui, Yulia Tsvetkov, Pushpendre Rastogi, Chris Dyer (Google DeepMind)*

Correlation-based Intrinsic Evaluation of Word Vector Representations
Yulia Tsvetkov, Manaal Faruqui, Chris Dyer (Google DeepMind)

SIGFSM Workshop on Statistical NLP and Weighted Automata
Contributed Papers:
Distributed representation and estimation of WFST-based n-gram models
Cyril Allauzen, Michael Riley, Brian Roark

Pynini: A Python library for weighted finite-state grammar compilation
Kyle Gorman
* Work completed at CMU