
AI & user experience: As always, the user comes first

Nov. 17, 2019 | Article | 6-minute read

Most of us are aware of the importance of user experience, but UX goes beyond developing a relevant, usable interface for software. It’s a discipline that requires a thorough understanding of users’ needs and the context in which they use technology. Whatever solution you roll out, good UX means meeting those needs.

With the proliferation of AI-driven solutions and proofs of concept, it’s easy to focus on the data science and forget that what you’re developing eventually needs to serve a person on the job. Within pharma companies, the finished product is often the visualization of complex data that appears in software on a laptop or mobile device. However, if users can’t understand these insights or the insights aren’t valuable to them, then the data science was a wasted effort. That’s where UX comes in. More specifically, that’s why UX should have come in a long time ago.


To better understand the role of UX in developing AI solutions, I spoke with ZS Principal Natalie Hanson, leader of our UX practice, whose team has helped develop multiple AI-driven solutions for clients.

Q: Can you share some best practices and lessons learned from visualizing data in AI-driven solutions?

A: It’s not that different from user experience work in any other complex domain. However, when you’re working with AI or machine learning, it’s important to work closely with subject matter experts and data scientists. Those experts can look at data and see something exciting in what they’ve found because they deeply understand the domain and the data. To design a compelling visual for that data, we need to understand the story we’re trying to tell just as deeply.


What’s more challenging about this kind of work is that the data visualization tends to be more dimensional. The data might be better viewed in 3-D or represented across time, for example. In these cases, we may find inspiration from a non-adjacent field, like a video game or the way weather is represented to consumers.

Q: When you think of AI-driven tools such as a recommendation engine, what are some of the key UX components you should keep in mind and how do these components differ from work on other solutions?

A: Again, I’d say there are more similarities than differences. On a recent AI project for a sales audience, my team worked to understand how reps wanted to see and interact with the data. The people building the solution were focused on extracting value from the data we had, and that was a hard problem. But then we had to figure out how to share those valuable insights in a way that would be useful to this audience. The data scientists and machine learning engineers didn’t know anything about what a salesperson’s life is like. We can’t expect them to make that last-mile jump between all this amazing data and how it should be visualized and organized, so we looked at where the reps were in their daily lives when consuming data.

Having a good understanding of the context was key because context affects how the data should be represented. For example, we had a client with all these amazing, robust sales-related data points, which they had pushed to their reps’ iPads. But when reps logged into the tool, it started them at the national level, so they had to drill down to a region, then to a territory and then to a physician. They had this massive amount of data to wade through, and they’re on iPads, sitting in a parking lot. Suddenly it becomes onerous to make use of that data in that context.

We saw all kinds of crazy work-arounds. Reps were doing things like opening the physician view, grabbing a screenshot and saving it to photos, or printing it out and putting it in a binder. Basically, what they were saying was, “The data’s good, but it needs to be more accessible at this point in my day.” Our job was making sure that we were providing the right information at the right time, in the right way.
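To make that last point concrete, here’s a minimal sketch, in TypeScript, of what defaulting a tool to the rep’s own scope rather than the national level might look like. The Rep shape, ViewLevel type and initialView function are hypothetical illustrations, not part of any actual ZS solution:

```typescript
// Hypothetical sketch: pick a rep's starting view from their assignment
// instead of defaulting everyone to the national level.

type ViewLevel = "national" | "region" | "territory" | "physician";

interface Rep {
  id: string;
  regionId?: string;    // set if the rep is assigned to a region
  territoryId?: string; // set if the rep is assigned to a territory
}

interface View {
  level: ViewLevel;
  scopeId?: string;
}

// Start each rep at the most specific scope we know about, so someone
// opening an iPad in a parking lot lands on actionable data immediately.
function initialView(rep: Rep): View {
  if (rep.territoryId) return { level: "territory", scopeId: rep.territoryId };
  if (rep.regionId) return { level: "region", scopeId: rep.regionId };
  return { level: "national" };
}
```

The design choice is simply to invert the drill-down: national and regional roll-ups stay available, but the rep starts where their day actually happens.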


We also look at what we call information architecture, or taxonomy: how do the reps think about these kinds of insights, and how can we organize the information according to their mental models? And we consider what we call progressive disclosure. That’s about how much information is really needed at the outset before someone wants to dig deeper.
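As a rough illustration of progressive disclosure, here’s a minimal TypeScript sketch that surfaces a one-line headline first and reveals supporting detail only on request. The Insight shape, disclosure levels and renderInsight function are hypothetical, not drawn from any real product:

```typescript
// Hypothetical sketch of progressive disclosure: show a headline first,
// then reveal deeper layers only when the user asks for them.

interface Insight {
  headline: string;         // what the rep sees at a glance
  rationale: string;        // revealed when the rep asks why
  supportingData: string[]; // deepest layer, for users who want to dig
}

// disclosureLevel: 0 = headline only, 1 = add rationale, 2 = full detail
function renderInsight(insight: Insight, disclosureLevel: 0 | 1 | 2): string[] {
  const lines: string[] = [insight.headline];
  if (disclosureLevel >= 1) lines.push(insight.rationale);
  if (disclosureLevel >= 2) lines.push(...insight.supportingData);
  return lines;
}
```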

Q: What are your thoughts on innovations like conversational AI, which lets you hear insights spoken aloud, or augmented reality (AR) and virtual reality (VR), which let you consume information in 3-D?

A: A lot of this has to do with the users, and context is key. What we saw with reps, for example, is that they don’t want to talk to a chatbot because of privacy issues. They can’t be in a public place with their phone, saying, “Hey, tell me about physician so-and-so,” and have it talk back to them. That would disclose their physicians’ information. Also, many reps have security protocols on their devices. We also thought about helping them make optimal use of their time in the car, but many reps can’t use their devices while driving.


I think we're just in the infancy of exploring what AR and VR might do. For example, if you're a rep serving a large hospital and you're trying to build new relationships, how do you find a physician, a head of purchasing, a head of surgery? Could there be an AR solution that helps you locate these people?


What’s exciting and challenging about creating solutions in augmented or virtual reality is that we have to design for multiple senses in a way that we haven’t done in the enterprise context before. We’re drawing on design that may be familiar to gamers: sound plays a more critical role, for example, as do sensory cues like vibrations. These elements have to be used thoughtfully, in a way that informs and guides the user without overwhelming them and that’s appropriate for the context of use.


But it will be a long time before users are ready for the next generation of data visualization. When you think of the era of mobile reporting that started around 2013, we’re just now at the point where users have a strong point of view about how they like to see data, what helps them and what doesn’t. That’s six years. I don’t know if there’s any guidance on how long that evolution takes, but if you want to bring in AR, VR or other innovations, be practical and realistic about your audience’s readiness and how quickly they will adapt. What makes perfect sense for a young patient may not work for a seasoned sales rep operating in a hospital setting, for example.
