- David Pearl
- Laura Beth Fulton
- Megan Cackett
This paper presents an evaluation of “Quick Commands” for controlling an Assistant in a variety of hands-free contexts. Quick Commands are an application of NLP that removes the need for invocation words such as “Hey Google,” “Hey Siri,” or “Alexa,” lending voice interaction an element of naturalness. Among the situations where non-invoked commands may be useful, we anticipated media as a key opportunity: Quick Commands allow control of media through hands-free, invocation-free interaction.
Research goals included validating the value proposition of Quick Commands for earbud interaction in terms of usability: feature helpfulness, naturalness, and comfort with use in public and private spaces. The study design included recruiting a group of over 80 users to test the assistant in different contexts, followed by a focus group of 18 users who provided detailed feedback. The study ran for one week and encouraged testing of Quick Commands in different scenarios: alone in a quiet room, alone with quiet ambient noise (e.g., when walking), and alone in a noisy room (e.g., with the TV on). During the week of testing, participants completed check-in surveys at the mid-week point and at the end of the week. A debrief session was scheduled with a random subset of participants to understand commonalities and gather feedback in a small-group setting.
Results of this study demonstrate pain points and delights for Quick Commands, as well as user comfort with voice interactions while wearing earbuds in private versus public settings. The evaluation methods can be replicated to validate future NLP advances and Assistant features before implementation in public-facing applications.