An interdisciplinary creative team helping to shape what machine learning does, where and why it is used, and how it interacts with and benefits our society and planet.
About the team
The Mural team uses design, data, code, and narrative to steer the development and distribution of new technologies within Google’s AI principles. Our practice involves:
- Discovering and addressing interconnected societal or cultural questions
- Revealing opportunities by prototyping and evaluating potential experiences
- Sharing reusable methods and tools to multiply progress across the field
- Engaging external communities in collaboration and critique on all of the above
From foresight practice to the power of listening, members of Google’s speculative design teams discuss how they’re rehearsing for what’s next. Our team lead, Alison Lentz, shares strategies for designing for preferred futures.
The Artists + Machine Intelligence program provides funding and mentorship to artists and researchers working with ML. The artists selected through this open call for proposals push the boundaries of generative ML and natural language processing.
Our team's dual focus on trust and privacy pushed us to explore how to enable ML innovation without raw user data ever reaching Google's servers. We developed principles and concepts for more private and powerful experiences, which launched as Protected Computing at I/O 2022.
Many visually inclined creatives can search the world around them with a mood in mind and capture images that match. Mood Board Search helps with searches for appropriately fuzzy aesthetics like “vibrant color palette that feels part memory, part dream”.
We believe that good, useful technology shouldn't come with a privacy cost. We worked with our federated analytics and Google Health teams to make sure contributions to a respiratory health study did not make personally identifiable study data available to Google or to third parties.
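The core idea behind federated analytics can be shown with a toy sketch. This is not the study's actual pipeline (production systems add secure aggregation and differential-privacy noise); the data fields and function names here are illustrative assumptions. Each device computes only a local aggregate over its own records, and the server sums those aggregates without ever seeing an individual record.

```python
def device_contribution(raw_records):
    """Each device computes only a local count over its raw records;
    the records themselves never leave the device (illustrative)."""
    return sum(1 for record in raw_records if record["cough_detected"])

def federated_count(devices):
    """The server sums per-device counts; it never sees any
    individual record, only each device's aggregate contribution."""
    return sum(device_contribution(records) for records in devices)

# Toy demo: two devices, each holding its own private records.
devices = [
    [{"cough_detected": True}, {"cough_detected": False}],
    [{"cough_detected": True}],
]
total = federated_count(devices)  # -> 2
```

Real deployments harden this pattern so the server cannot recover even per-device counts, but the privacy boundary is the same: raw data stays on the device.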
Federated learning was invented by Google Research in 2016, and today is being used worldwide for privacy-preserving machine learning. We kick-started comprehension with a cheeky comic, bringing both its real-world value for people and its nuts-and-bolts mechanics into the public discourse.
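The mechanics the comic explains can be sketched in a few lines of federated averaging. This is a minimal illustration, not Google's production implementation: the linear model, single gradient step, and client setup are all simplifying assumptions. The point is the data flow, since clients send only model updates to the server, never their raw examples.

```python
import numpy as np

def local_update(global_weights, local_data, lr=0.1):
    """One client's on-device training step: a single gradient step
    on a linear model. The raw (X, y) data never leaves this client."""
    X, y = local_data
    preds = X @ global_weights
    grad = X.T @ (preds - y) / len(y)
    return global_weights - lr * grad

def federated_average(global_weights, clients):
    """The server averages the clients' updated weights; it only
    ever sees weight vectors, not the underlying examples."""
    updates = [local_update(global_weights, data) for data in clients]
    return np.mean(updates, axis=0)

# Toy demo: two clients, each holding private (X, y) data locally.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(8, 3)), rng.normal(size=8)) for _ in range(2)]
weights = np.zeros(3)
for _ in range(20):  # repeated rounds of local training + averaging
    weights = federated_average(weights, clients)
```

After each round the server holds an improved shared model, while every client's dataset has stayed on that client's device.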
Our team explored how to evolve phone calls for hard-of-hearing users, with transitions between text and voice that keep conversations going strong. Live Caption transcribes the caller's voice and reads typed responses aloud, so the caller hears the text response spoken.
Live Translate brings us a tangible step closer to removing language barriers both digitally and in person. Our team researched interactions and use cases for this groundbreaking set of features – Live Translate for Pixel 6 in Interpreter Mode and in Messages.