AttentiveVideo: A Multimodal Approach to Quantify Emotional Responses to Mobile Advertisements

Phuong Pham
ACM Transactions on Interactive Intelligent Systems, vol. 9 (2019), pp. 1-30

Abstract

Understanding a target audience’s emotional responses to a video advertisement is crucial for evaluating the advertisement’s effectiveness. However, traditional methods for collecting such information are slow, expensive, and coarse-grained. We propose AttentiveVideo, a scalable intelligent mobile interface with corresponding inference algorithms to monitor and quantify the effects of mobile video advertising in real time. Without requiring additional sensors, AttentiveVideo employs a combination of implicit photoplethysmography (PPG) sensing and facial expression analysis (FEA) to detect the attention, engagement, and sentiment of viewers as they watch video advertisements on unmodified smartphones. In a 24-participant study, AttentiveVideo achieved good accuracy on a wide range of emotional measures (best average accuracy of 82.6% across nine measures). While feature fusion alone did not improve prediction accuracy with a single model, it significantly improved accuracy when combined with model fusion. We also found that the PPG sensing channel and the FEA technique have different strengths in data availability, detection latency, accuracy, and usage environment. These findings show the potential for both low-cost collection and deep understanding of emotional responses to mobile video advertisements.
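To make the fusion terminology concrete, the sketch below contrasts feature-level fusion (concatenating PPG and FEA features into one classifier) with model-level fusion (training one classifier per modality and averaging their scores), and then combines the two. This is a minimal illustration using synthetic data and scikit-learn logistic regression; the feature names, classifier choice, and labels are assumptions for exposition, not the paper's actual pipeline.

```python
# Illustrative sketch (not the paper's implementation): contrasts feature-level
# fusion with model-level (late) fusion, then combines both. Feature matrices
# and labels are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 200
ppg_features = rng.normal(size=(n, 8))    # placeholder PPG-derived features
fea_features = rng.normal(size=(n, 12))   # placeholder facial-expression features
labels = rng.integers(0, 2, size=n)       # placeholder binary engagement label

idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# Feature fusion: a single classifier over the concatenated feature vector.
fused = np.hstack([ppg_features, fea_features])
clf_fused = LogisticRegression(max_iter=1000).fit(fused[idx_train], labels[idx_train])
acc_feature_fusion = accuracy_score(labels[idx_test], clf_fused.predict(fused[idx_test]))

# Model fusion: one classifier per modality; average predicted probabilities.
clf_ppg = LogisticRegression(max_iter=1000).fit(ppg_features[idx_train], labels[idx_train])
clf_fea = LogisticRegression(max_iter=1000).fit(fea_features[idx_train], labels[idx_train])
late_prob = (clf_ppg.predict_proba(ppg_features[idx_test])[:, 1]
             + clf_fea.predict_proba(fea_features[idx_test])[:, 1]) / 2
acc_model_fusion = accuracy_score(labels[idx_test], (late_prob >= 0.5).astype(int))

# Both together: include the fused-feature model in the late-fusion ensemble.
combined_prob = (clf_ppg.predict_proba(ppg_features[idx_test])[:, 1]
                 + clf_fea.predict_proba(fea_features[idx_test])[:, 1]
                 + clf_fused.predict_proba(fused[idx_test])[:, 1]) / 3
acc_combined = accuracy_score(labels[idx_test], (combined_prob >= 0.5).astype(int))

print(f"feature fusion accuracy:      {acc_feature_fusion:.3f}")
print(f"model (late) fusion accuracy: {acc_model_fusion:.3f}")
print(f"combined fusion accuracy:     {acc_combined:.3f}")
```

On real multimodal data, the combined variant corresponds to the paper's observation that feature fusion helps when it contributes an additional model to a late-fusion ensemble rather than replacing it.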