Matthew Simpson
Matt Simpson is a Staff User Experience Designer in the Geo group at Google. With a background in architecture, interaction design, and web design, Matt works on Google Maps and on several of Geo's enterprise projects, including Earth Builder. Matt formerly lectured at the University of Queensland, teaching design and studio process in an information technology context. During this time, Matt was a researcher with the Australasian CRC for Interaction Design (ACID), exploring locative experiences and the relationship of social networks to the design process.
Authored Publications
    On the move: Mixed methods for research in Mobile HCI
    Victoria Schwanda Sosik
    Phil Adams
    Robert Beaton
    Rebecca Gulotta
    Rob Schill
    SIGCHI Conference on Designing Interactive Systems (DIS 2016) (2016) (to appear)
    Mixed methods are commonly used within HCI, reflecting the field's many contributing disciplines. While there is a rich literature on mixed methodology in the social sciences, particularly in education, health, and evaluation, HCI researchers have engaged little with this literature or its resulting frameworks for mixed methods research design. The goal of this workshop is to bring together researchers in industry and academia who practice mixed methods for mobile technology design in order to formalize the processes that we have used with success, to document our challenges, and to make recommendations for designing mixed methods research in the subdiscipline. We will accomplish these goals with a number of hands-on activities designed to share participants' individual expertise, challenge preconceived notions, and later converge on a set of frameworks.
    Backtracking Events as Indicators of Usability Problems in Creation-Oriented Applications
    David Akers
    Robin Jeffries
    Terry Winograd
    ACM Transactions on Computer-Human Interaction (TOCHI), vol. 19 Issue 2, July 2012 (2012)
    A diversity of user goals and strategies makes creation-oriented applications such as word processors or photo editors difficult to test comprehensively. Evaluating such applications requires testing a large pool of participants to capture the diversity of experience, but traditional usability testing can be prohibitively expensive. To address this problem, this article contributes a new usability evaluation method called backtracking analysis, designed to automate the process of detecting and characterizing usability problems in creation-oriented applications. The key insight is that interaction breakdowns in creation-oriented applications often manifest themselves in backtracking operations that can be automatically logged (e.g., undo and erase operations). Backtracking analysis synchronizes these events with contextual data such as screen-capture video, helping the evaluator characterize specific usability problems. The results from three experiments demonstrate that backtracking events can be effective indicators of usability problems in creation-oriented applications and can yield a cost-effective alternative to traditional laboratory usability testing.
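    The event-grouping side of this idea can be illustrated with a short sketch. This is not the paper's implementation: the event names, the five-second grouping window, and the helper `backtracking_episodes` are hypothetical, chosen only to show how logged undo/erase events might be clustered into episodes an evaluator could cue up against screen-capture video.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical set of event names treated as backtracking indicators.
BACKTRACKING_EVENTS = {"undo", "erase"}

@dataclass
class Event:
    timestamp: float  # seconds since session start
    name: str         # e.g. "undo", "erase", "draw"

def backtracking_episodes(events: List[Event],
                          window: float = 5.0) -> List[Tuple[float, float]]:
    """Group backtracking events occurring within `window` seconds of each
    other into episodes, returning (start, end) time spans that an evaluator
    could jump to in the screen-capture video."""
    times = sorted(e.timestamp for e in events if e.name in BACKTRACKING_EVENTS)
    episodes: List[Tuple[float, float]] = []
    for t in times:
        if episodes and t - episodes[-1][1] <= window:
            # Extend the current episode to include this event.
            episodes[-1] = (episodes[-1][0], t)
        else:
            # Start a new episode.
            episodes.append((t, t))
    return episodes

log = [Event(1.0, "draw"), Event(4.0, "undo"), Event(6.0, "undo"),
       Event(30.0, "erase"), Event(31.5, "undo")]
print(backtracking_episodes(log))  # → [(4.0, 6.0), (30.0, 31.5)]
```

    Non-backtracking events (like "draw") are filtered out before grouping; in practice the time spans would be paired with the recorded video rather than printed.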
    Undo and Erase Events as Indicators of Usability Problems
    David Akers
    Robin Jeffries
    Terry Winograd
    Proceedings of SIGCHI 2009, ACM (2009)
    One approach to reducing the costs of usability testing is to facilitate the automatic detection of critical incidents: serious breakdowns in interaction that stand out during software use. This research evaluates the use of undo and erase events as indicators of critical incidents in Google SketchUp (a 3D modeling application), measuring an indicator's usefulness by the numbers and types of usability problems discovered. Our evaluation also compares problems identified using undo and erase events to problems identified using the user-reported critical incident technique [CITE]. In a within-subjects experiment with 37 participants, undo and erase episodes together revealed over 80% of the problems rated as severe, one third of which would not have been discovered by self-report alone. Moreover, problems found by all three techniques were rated as significantly more severe than those identified by only a subset of techniques. These results suggest that undo and erase events will serve as a useful complement to user-reported critical incidents for low-cost usability evaluation of design-oriented applications like Google SketchUp.
    Zebra: Exploring users’ engagement in fieldwork
    Yann Riche
    Stephen Viller
    DIS 2008: Designing Interactive Systems