
How Can Design Thinking Build Trust in the Age of Machine Learning?

October 2019

Article credits
Spotify Design Team

230 UX designers and machine learning (ML) experts from across industries gathered at Spotify’s New York City event space this October for an event that highlighted the intersection of cutting-edge tech and human-centered design. The gathering was conceived by Spotify Design as a way to connect with the broader UX and tech community around best practices and inspiring stories in the field of Design for ML. The SF-based Machine Learning & User Experience Meetup (MLUX) joined as a community partner.

As central as machine learning has become in recent years—doing everything from helping you find music for your workout, to helping you discover your next favorite podcast—those lines of code analyzing reams of data aren’t infallible. In order to do what they’re intended to do, ML products need to be created with humans in mind, and it’s at that intersection that good design becomes indispensable.

“Answers are only helpful when you’re asking the right questions,” Mark Kizelshteyn said to a rapt audience at the Spotify Design event. Kizelshteyn is a designer on Spotify’s home experience and has seen firsthand the issues created by having a one-size-fits-all approach to human taste. 

He uses the example of Spotify’s early recommendation-driven home experience, which simply suggested content based on the user’s listening history. The issue: that approach only answered the “what” rather than taking a holistic view of listening habits. “The answers we were getting didn’t capture the nuances of the human experience,” he said. “We knew we needed to reshape the algorithms in a human-centered way.” So Kizelshteyn and his team started asking more questions: What does it mean to like something when listening? Why would someone listen to this, in this context? What does someone need to know before making the choice of what to listen to? Answering those questions is no easy task, but Kizelshteyn believes designers working on ML-driven products have a responsibility to put people in the middle of machine learning platforms.

Matt Cronin and Jennifer Lind, both members of Spotify’s Data Curation team, concurred. They took the stage together to talk about how keeping “humans-in-the-loop” is central to creating ML-driven platforms that are as useful as they are powerful. That can be easier said than done: Lind and Cronin have found that separating the signal from the noise can mean the difference between a platform that turns users off and one that becomes indispensable. “You need to balance the quality and quantity of data when you’re training a model,” Lind said. “Humans-in-the-loop can help demystify those issues and focus on what’s relevant, creating a frictionless process.”

Not all problems are meant to be solved by machine learning though. Di Dang, an Emerging Tech Design Advocate at Google, encouraged the audience to first identify if machine learning adds unique value to your product. “‘Can we use AI/ML to ___?’ is the wrong question,” Dang told the audience. “We need to ask first what is a valuable problem to solve for our users, before we validate whether machine learning can solve that problem in a unique way that couldn’t be solved through other means.”

Dang has a deep background in UX design and co-created Google’s People + AI Guidebook, a valuable resource for anyone looking to understand how to make machine learning design decisions. She highlighted several of the Guidebook’s overarching themes throughout her presentation, including the role design plays in calibrating user trust. Because ML-driven products are based on statistics and probability, product creators need to make design decisions that help users understand the system’s predictions. “If [people] don’t understand what or why they’re seeing something, they may not trust it and end up abandoning a product altogether,” Dang said. “If there’s too much trust, they might assume an AI is magical and knows better than they do.” The sweet spot is calibrated trust: users should know what the system can do well, but also know when they need to use their own intelligence to override the ML.

Finding that happy medium is all about empathy. Dang spoke about the importance of first understanding users’ “mental models” in order to manage expectations for ML-driven features. It’s similar to how James Kirk, a Machine Learning Engineer on Spotify’s Listening Experiences team, described his approach to UX issues on ML-powered platforms. “Machine learning products are just guessing at their answers; they’re often wrong,” Kirk said, reiterating a common theme of the night. “Those experiences are going to be different for every user. It’s challenging to develop and share expectations between users. You need to take time to think about which aspects of the product should be personal and which ones are shared.” 

Diane Murphy, a Senior UX Writer on Spotify’s Personalization team, showed how composing tight, purposeful copy can go a long way in that calibration as well. Explaining machine learning processes to users who, in the moment, may not want to read more than a sentence is difficult—Murphy referenced a focus group participant who said that she “scrolls” when she sees too much text in her apps—but vital to setting expectations. Still, much like creating the right level of trust, formulating the right copy is a calibration game. “You can’t overpromise, you can’t lean into emotional language,” Murphy told the audience. “If something isn’t true, it can break your trust as a user.” The best way to solve these issues is to understand your users and adjust accordingly, something that requires both a strong ML-based approach and an equally comprehensive human perspective.

Maheen Sohail, a Lead Product Designer at Facebook working on AI and VR products, was the final person to take the stage and continued to advocate for putting humans at the center of ML-driven design. She underlined the role that designers play in crafting the platforms of the future and how those products will facilitate human connections. She pointed to the increasingly sophisticated technology in the Oculus VR headset and the Facebook Portal, two products designed with the human experience in mind. She asked the gathered crowd of machine learning experts and UX design leaders to consider tomorrow’s users as well as today’s when crafting ML products. “It’s critical for designers today to design for the future,” Sohail said in closing, reiterating the need for those working on machine learning products to keep people at the center of their work.

Sohail’s presentation closed out a thought-provoking Spotify Design event that brought together leaders from a wide array of disciplines. Collaboration is at the core of how Spotify Design works: cross-functional teams tackling problems with toolkits that span expertise and experience. It’s that multidisciplinary approach that shapes how Spotify Design engages with next-generation tools like machine learning, and that creates products that put the human experience front and center.

Want more on machine learning? Also check out Three Principles for Designing ML-Powered Products.

More resources:

MLUX Meetup

People + AI Guidebook, from the People + AI Research (PAIR) team at Google

Join the Spotify Design team

Credits

Spotify Design Team

We're a cross-disciplinary team of people who love to create great experiences and make meaningful connections between listeners and creators.
