Qin Cao

Authored Publications
    Recognizing Multimodal Entailment (tutorial at ACL 2021)
    Afsaneh Hajiamin Shirazi
    Blaž Bratanič
    Christina Liu
    Gabriel Fedrigo Barcik
    Georg Fritz Osang
    Jared Frank
    Lucas Smaira
    Ricardo Abasolo Marino
    Roma Patel
    Vaiva Imbrasaite
    (2021) (to appear)
    Abstract: How information is created, shared and consumed has changed rapidly in recent decades, in part thanks to new social platforms and technologies on the web. With ever-larger amounts of unstructured data and limited labels, organizing and reconciling information from different sources and modalities is a central challenge in machine learning. This cutting-edge tutorial aims to introduce the multimodal entailment task, which can be useful for detecting semantic alignments when a single modality alone does not suffice for whole-content understanding. Starting with a brief overview of natural language processing, computer vision, structured data and neural graph learning, we lay the foundations for the multimodal sections to follow. We then discuss recent multimodal learning literature covering visual, audio and language streams, and explore case studies focusing on tasks which require fine-grained understanding of visual and linguistic semantics: question answering, veracity and hatred classification. Finally, we introduce a new dataset for recognizing multimodal entailment, exploring it in a hands-on collaborative section. Overall, this tutorial gives an overview of multimodal learning, introduces a multimodal entailment dataset, and encourages future research on the topic. View details
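    The entailment task described above can be sketched as pairwise classification over fused per-modality embeddings. The following is a minimal toy sketch, not the tutorial's actual model: the late-fusion scheme, the `[u; v; |u - v|]` pair features, and all weights are illustrative assumptions, with untrained random parameters standing in for learned ones.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fuse(text_emb, image_emb):
        # Late fusion: concatenate per-modality embeddings into one vector.
        return np.concatenate([text_emb, image_emb])

    def entailment_prob(premise, hypothesis, w, b):
        """Probability that the premise item entails the hypothesis item.

        `premise` and `hypothesis` are (text_emb, image_emb) tuples. The
        pair is represented as [u; v; |u - v|], a common feature scheme
        for sentence-pair classification, scored by a linear layer.
        """
        u, v = fuse(*premise), fuse(*hypothesis)
        features = np.concatenate([u, v, np.abs(u - v)])
        return sigmoid(features @ w + b)

    rng = np.random.default_rng(0)
    d = 4  # toy embedding size per modality
    premise = (rng.normal(size=d), rng.normal(size=d))
    hypothesis = (rng.normal(size=d), rng.normal(size=d))
    w = rng.normal(size=3 * 2 * d)  # untrained toy weights
    p = entailment_prob(premise, hypothesis, w, b=0.0)
    ```

    In practice the embeddings would come from pretrained text and vision encoders and the classifier would be trained end to end; the sketch only shows the shape of the task.
    
    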
    How Does Noise Help Robustness? Explanation and Exploration under the Neural SDE Framework
    Cho-Jui Hsieh
    Tesi Xiao
    Xuanqing Liu
    Conference on Computer Vision and Pattern Recognition (CVPR) (2020)
    Abstract: Neural Ordinary Differential Equation (Neural ODE) has been proposed as a continuous approximation to the traditional ResNet structure. However, the resulting ODE system is often unstable: a small input perturbation will be amplified through the ODE system and could eventually blow up. In this paper, we propose a new continuous neural network framework called Neural Stochastic Differential Equation (Neural SDE), which injects random noise by forming a stochastic differential equation. Our framework can model different noise injection regularization techniques in discrete networks, such as dropout and additive/multiplicative noise injection at each block. We provide a theoretical analysis showing the improved robustness of Neural SDE against small input perturbations. Furthermore, we show that the Neural SDE framework can achieve better generalization error than Neural ODE on real datasets and is more stable to small adversarial and non-adversarial input perturbations in practice. View details