Dustin C Smith
Dustin C Smith is a human factors psychologist and staff user experience researcher at Google. He studies how people are affected by the systems and environments around them in a variety of contexts: software engineering, free-to-play gaming, and healthcare. His research at Google has emphasized identifying areas where software developers can feel happier and more productive during development.
Authored Publications
Validation of the GUESS-18: A Short Version of the Game User Experience Satisfaction Scale (GUESS)
Barbara Chaparro
Joseph R. Keebler
Mikki H. Phan
William J. Shelstad
Journal of Usability Studies, 16 (2020), pp. 49-62
Abstract
The Game User Experience Satisfaction Scale (GUESS) is a 55-item tool assessing nine constructs describing video game satisfaction. While the development of the GUESS followed best practices and resulted in a versatile, comprehensive tool for assessing video game user experience, responding to 55 items can be cumbersome in situations where repeated assessments are necessary. The aim of this research was to develop a shorter version of the scale for use in iterative game design, testing, and research. Two studies were conducted: the first created a configural model of the GUESS, which was then truncated to an 18-item short scale to establish an initial level of validity; the second, with a new sample, demonstrated cross-sample validity of the 18-item scale. Results from a confirmatory factor analysis (CFA) of the 18-item scale demonstrated excellent fit and construct validity relative to the original nine-construct instrument. Use of the GUESS-18 is encouraged as a brief, practical, yet comprehensive measure of video game satisfaction for practitioners and researchers.
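To illustrate how a shortened scale like this might be scored during iterative testing, here is a minimal sketch. It is not taken from the paper: the subscale names, the pairing of two items per construct, the 7-point response format, and the composite-as-sum-of-subscale-means rule are all assumptions based on the original GUESS rather than details given in this abstract.

import statistics

# Hypothetical GUESS-18 scoring sketch: nine subscales, two items each,
# rated 1-7 (assumptions; consult the published scale for the actual items,
# ordering, and scoring rules).
GUESS_SUBSCALES = [
    "Usability/Playability", "Narratives", "Play Engrossment", "Enjoyment",
    "Creative Freedom", "Audio Aesthetics", "Personal Gratification",
    "Social Connectivity", "Visual Aesthetics",
]

def score_guess18(responses):
    """responses: 18 item ratings (1-7), grouped in consecutive pairs per subscale."""
    if len(responses) != 18:
        raise ValueError("GUESS-18 expects exactly 18 item responses")
    subscale_means = {
        name: statistics.mean(responses[2 * i: 2 * i + 2])
        for i, name in enumerate(GUESS_SUBSCALES)
    }
    # Composite satisfaction as the sum of subscale means, mirroring the
    # composite scoring of the full GUESS (an assumption, not from the abstract).
    return subscale_means, sum(subscale_means.values())

In an iterative design loop, a function like this could be rerun after each build so teams can track movement in individual subscales between playtests.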
Accelerate State of DevOps Report
Abstract
The Accelerate State of DevOps Report represents six years of work and data from over 31,000 professionals worldwide. It is the largest and longest-running research of its kind, providing an independent view into the practices and capabilities that drive high performance. The results enable us to understand the practices that lead to the powerful business outcomes that result from excellence in technology delivery.
Our research employs rigorous statistical methods to present data-driven insights about the most effective and efficient ways to develop and deliver technology. Cluster analysis allows teams to benchmark against the industry, identifying themselves as low, medium, high, or elite performers at a glance. Teams can then leverage the findings of our predictive analysis to identify the specific capabilities they can use to improve their software delivery performance and become elite performers.
Our research continues to show that the industry-standard Four Key Metrics of software development and delivery drive organizational performance in technology transformations. This year’s report revalidates previous findings that it is possible to optimize for stability without sacrificing speed. We also identify the capabilities shown to drive improvement in the Four Key Metrics: these include technical practices, cloud adoption, organizational practices (including change approval processes), and culture. For organizations seeking guidance on how to improve, we point to the only real path forward: start with foundations, and then adopt a continuous improvement mindset by identifying your unique constraint (or set of constraints). Once those constraints no longer hold you back, repeat the process. We also provide guidance on the most effective strategies for enacting these changes.
This year, we investigate the ways in which organizations can support engineering productivity through initiatives like information search, more usable deployment toolchains, and reducing technical debt with flexible architecture and viewable systems. We also find that improved productivity helps reduce burnout, and we offer guidance on reducing burnout by supporting work recovery, or the ability to detach and “leave work at work.”
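To make the benchmarking idea concrete, the sketch below clusters teams on the Four Key Metrics (deployment frequency, lead time for changes, time to restore service, and change failure rate) and labels the resulting clusters from low to elite. This is an illustration only, not the report's analysis pipeline: the choice of k-means, the standardization step, and the ranking rule are assumptions, and it presumes all four columns have been oriented so that larger values mean better performance (e.g., lead time, restore time, and failure rate inverted or negated).

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

PERFORMANCE_LABELS = ["low", "medium", "high", "elite"]

def benchmark_teams(four_key_metrics):
    """four_key_metrics: (n_teams, 4) array-like, one column per metric,
    oriented so that higher values indicate better performance."""
    X = StandardScaler().fit_transform(np.asarray(four_key_metrics, dtype=float))
    clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    # Rank clusters by their mean standardized score and map the ranks to
    # low/medium/high/elite labels for at-a-glance benchmarking.
    order = np.argsort([X[clusters == c].mean() for c in range(4)])
    label_of_cluster = {c: PERFORMANCE_LABELS[rank] for rank, c in enumerate(order)}
    return [label_of_cluster[c] for c in clusters]

A team could compare its own row's label against the industry-wide clusters, then use the predictive findings described above to pick which capabilities to invest in next.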