UXPA Magazine, vol. 22.3 (2022)
Machine learning (ML) product designers face a growing problem. As datasets grow larger and more complex, the ML models built on them become correspondingly complex and opaque. Without clear explanations of how these models reason, end users are less likely to trust and adopt the technology. Complicating matters, the audiences for ML model explanations vary considerably in background, comfort with mathematical reasoning, and the contexts in which they apply these technologies. UX professionals can draw on explainable artificial intelligence (XAI) methods and techniques to communicate the reasoning behind ML products.