The SpoolCast with Jared Spool

The SpoolCast has been bringing UX learning to designers’ ears around the world since 2005. Dozens and dozens of hours of Jared Spool interviewing some of the greatest minds in design are available for you to explore.

Episode #224 Jason Ulaszek & Brian Winters - From Research to Experience Roadmaps

November 15, 2013  ·  30 minutes


Nowadays, design is an increasingly important business tool. As Jared Spool reminds us, Apple is one of the largest companies in the world, built largely on its continual commitment to good design. A great user experience can be a differentiator in the business landscape. For Jason Ulaszek, good design starts with good research that guides and directs the organization's decisions.

Show Notes

Jason is the VP of Experience Design at Manifest Digital in Chicago. In his virtual seminar, Design as a Business Tool: Research to Experience Roadmaps, Jason discusses the influence designers in today’s world have in the overall direction of an organization and how the toolset that designers possess leads to better decisions about the user experience. In this podcast, Jason is joined by his co-worker Brian Winters and UIE’s Adam Churchill to address questions our audience had during the live seminar.

  • What methods do you use to understand the competitive landscape?
  • How early in the research process do you involve end users?
  • Do you use observations in addition to interviews in your research?
  • How do you apply a mental model or gap analysis to a new product or service?
  • Is there an online tool available for creating mental models?
  • What is the difference between an experience map and a mental model?
  • How do mental models and story mapping compare?
  • How do you avoid jumping immediately to solutions?

Full Transcript

Adam Churchill: Welcome, everyone, to another edition of the SpoolCast. Recently, Jason Ulaszek joined us to present his virtual seminar, "Design as a Business Tool: From Research to Experience Roadmaps." Jason's seminar, along with 115 others that teach the tools and techniques you need to create great design, is now part of the UIE User Experience Training Library, and soon to be unveiled as part of UIE's "All You Can Learn."

Designers face the sometimes daunting responsibility of influencing where organizations spend money and time to build a stronger customer experience. In the seminar, Jason shows how he's wrapped several UX tools and processes into a repeatable framework for designing great customer experiences. He also shares how storytelling can provide vital insights that remove politics and create a foundation for artifacts that inform your business decisions.

We asked Jason to be part of the UIE virtual seminar program because we saw an earlier version of this presentation at Interaction 13 in Toronto in early 2013. At the time, it was a co-presentation with Brian Winters; the two work together at Manifest Digital in Chicago. For today's podcast, we've invited them both to come back and discuss some of the issues that were prevalent in the seminar and some of the remaining questions from our audience.

Jason and Brian, welcome, and thanks so much for taking some time to join us.
Jason Ulaszek: Thanks a lot. It's great to follow up with some discussion.
Brian Winters: Thanks, Adam.
Adam: For those that weren't with us for your seminar, Jason, can you give us an overview?
Jason: Sure, yeah. The seminar was really around, as you mentioned, connecting the dots as designers. I think, as designers, we have this opportunity and this innate ability to take the activities and tools and approaches that we have and really home in on how to make the right type of business decisions, help influence them, help inform them, right? I think that's what we're doing today as designers. Through many years of hard work in our industry, that's made its way to the point where we can really be at the table in terms of how to guide and help set some of the future direction for organizations.

This seminar was really a story about some of the work that we've done, Brian and I. I would call it a methodology, although that's a pretty strong word at times. An approach, maybe, is a better word for it. It's something we've devised, partly because we needed to be able to tell the story ourselves about how design can have an influence on setting the longer-term direction for organizations and creating products and services, whether that be from scratch or re-envisioning and re-imagining products and services that they've already had out there for any number of years.

That has led us into a bit of that approach that, if you joined us, you saw and listened to. We're going to chat a little bit about some of the follow-up questions around that today.

In reality, it's turning design and the processes and the skills that we have into this business-decision framework, and laying it out so that you understand what you're working with: the goals and objectives that companies have, that stakeholders have, and what our end users are looking for. We unlock that through research that's well-crafted and intended to inform the entire customer life cycle, and we can use it alongside some competitive analysis to inform where we need to support, where we need to fill in the gaps, and where we need to optimize the entire customer experience.

Then it's a bit about, out of all the things that we could do -- small, medium, and large -- how do we set the right direction? How do we do that in a way that takes some of the politics out of the decision-making process, looks at the drivers of importance and the goals and objectives for both customers and business stakeholders, prioritizes those into a bit of a road map, if you will, and ultimately wraps that all into a story that can be used to create shared understanding within organizations? That's the story of the storytelling, I guess I would say, as we work with clients.
Adam: Very cool. Let's get to some of those questions.

Obviously, you spent some time talking about assessing the competitive landscape, and there were some really good questions that came in regarding that. Jason, when you're trying to get your head around this understanding of your competitive experiences, what are some of the methods you use to do that?
Jason: That's great. I guess I'll cover the secondary research first. There's a ton of research available through a number of different partners and sources, organizations like Forrester, that can provide deep insights relative to specific industries or specific situations, which is extremely helpful when you may not necessarily have access to all the business insight around the competition in the space. We'll look at that to steep ourselves in a broad understanding.

Then, primarily, what we'll wind up doing is look at design -- and I say that in a broad sense -- from an interaction standpoint, from a branding and identity standpoint, a content-strategy standpoint, and front-end technology development. There's some marketing in there. You can throw in all these various disciplines that equate to the overall experience, and you start to look at them as a cross-discipline heuristic, almost, right?

We know that there are some solid practices to follow, and they're ones that we live by as designers many times over. We work with our clients to walk through those, identify additional ones we need to consider, and then we start to talk about, "Well, what are we trying to solve? What are we trying to best understand?"

When we look at, let's say, one app versus another, two competitors, it's important for us to understand things like organizational principles of the design. What's the personality of the design? What's the content that's being delivered? How is it being delivered? What's the reveal of that?

How is branding and identity treated as it relates to those competitors, laddering back up to brand promise and values? How is the interaction design handled, right? Think about the importance of many of those factors today around navigation and flow, and you wind up with either experiences that excel or experiences that [laughs] really suck, ones that leave a negative connotation or perception of the brand overall.

We identify a number of those criteria and start to assign levels of scoring. Although it's, at times, a little subjective, we try to objectify it by defining what constitutes a one versus a five, and we follow that same pattern throughout a number of direct competitors.

That's important because, while it's not extremely scientific, it gives us a way into understanding and talking about the value of design and the direction of the experience that one competitor takes over the other, and why something is more powerful, right? Or at least it unlocks additional questions: is this really one of the reasons it's driving growth and engagement, the personality and feedback, how responsive this experience is? That's the key indicator for us in setting the bar as we work with our clients.
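
To make that scoring pattern concrete, here is a minimal sketch in Python. The criteria, weights, and 1-to-5 ratings are invented purely for illustration and are not taken from the seminar; in practice they would come from the cross-discipline heuristics agreed on with the client, along with a written definition of what a one versus a five means.

    # A rough illustration of the 1-to-5 scoring pattern described above.
    # Criteria, weights, and ratings here are hypothetical examples.
    criteria_weights = {
        "interaction design": 0.30,
        "branding and identity": 0.20,
        "content strategy": 0.25,
        "front-end technology": 0.25,
    }

    # Hypothetical ratings for two direct competitors.
    ratings = {
        "Competitor A": {"interaction design": 4, "branding and identity": 3,
                         "content strategy": 5, "front-end technology": 3},
        "Competitor B": {"interaction design": 2, "branding and identity": 4,
                         "content strategy": 3, "front-end technology": 4},
    }

    for competitor, scores in ratings.items():
        weighted = sum(criteria_weights[c] * s for c, s in scores.items())
        print(f"{competitor}: {weighted:.2f} out of 5")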

Lastly, the other aspect to think about is the indirect competitors. Many times we forget about these indirect competitors. These are the brands that we all know and love. You walk into an Apple store, you have an experience with an Apple product or service or employee, and that sets the bar.

Much as if you went into a department store, like a Nordstrom's, those start to set the bar for your experience around similar brands that that persona or archetype will gravitate towards. Even if your client's in insurance, their customers are expecting the experience they get out of a Nordstrom's or an Apple store. That bar is raised, and it's important to understand that.
Adam: Brian, let's talk a bit about research activity selection and its purpose. How early in the decision cycle do you involve the end users?
Brian: From my perspective as a researcher, it's never too early to talk to an end user, because I think there's always something to learn with every conversation. It's important that you talk to people as soon as you have a clear idea of your objectives, the goals that you have for the design, and what you're going to use in terms of success factors.

Once you have that outlined, and you know the goals of the organization or the stakeholders you're dealing with, and you've done some stakeholder interviews to understand the customer from their point of view, you can then begin to craft an approach to talk to people. I like to talk to end users earlier rather than later, but that's not always feasible given timelines and clients' appetite for research.
Adam: Do you use observations as a research technique, in addition to the interviews you're conducting with them?
Brian: Oh, absolutely. It depends. Like a lot of things in user experience, you can answer with "it depends." We did some research with people who were shopping for homeowners insurance. We could've easily done that just through interviews, but we went into people's homes and did an interview, and then we would ask them a question like, "How do you go about comparing your policies?"

Then we watched people go to the back of their house, find a dusty box, open it up, then sift through it to try to find their insurance policy, while they're on their hands and knees, cursing the whole time that they have to find this document that they never, ever need to access.

We didn't know that that observation was going to be an important part of our understanding, but watching them go back and try to dig out their policy so they could do an apples-to-apples comparison while shopping, and the stress they went through to find it, was an important part of our findings. That leads to, is there maybe a possibility here, another solution: making their policies more easily accessible, and making it easier for them to do that analysis?

Yeah, when you do have things that require context as an important part of your understanding, observation is always a good thing to do.
Adam: One of the things that's interesting about this process is that Jason's presentation introduces the concepts of a mental model and a process called gap analysis. What happens when you're thinking about a new product or service, where there are no existing solutions available? How do those two pieces come into play?
Brian: When you're talking about a new product and a mental model, and doing the gap analysis, what do you have to gap against? I think you can create the model and then just take a step back and start to insert possible solutions, right? Instead of doing a gap analysis, you do some brainstorming and you try to innovate on what people are experiencing.

Do your synthesis on the top part of the model, and on the bottom part of the model is where you outline opportunities. Those are things we've done in the past, where we've actually skipped the process of doing the inventory of how the products and services of the client are supporting the mental model, and gone right to possibilities.

You can use it to understand where some experience gaps might be, because you're looking at how people are motivated, what their behaviors are, what they're thinking, feeling, and believing about an experience, agnostic of technology or brands. When you look at it that way, it's easy to take a step back and see where some of the pain points and gaps are in the experience, regardless of whether you have a product to match it against or not.
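
As a rough sketch of that idea, assuming hypothetical mental-model spaces and client offerings: the top of the model holds what people are doing, thinking, and feeling, the bottom holds whatever currently supports each space, and anything left unsupported surfaces as a gap, or, for a brand-new product, an opportunity to brainstorm against.

    # Hypothetical mental-model spaces (top of the model) mapped to the
    # client offerings that support them (bottom of the model). Empty
    # lists surface as gaps -- or, for a new product, as opportunities.
    mental_model = {
        "research coverage options": ["comparison tool", "agent chat"],
        "compare policies side by side": [],
        "file a claim after an incident": ["claims hotline"],
        "understand what changed at renewal": [],
    }

    for space, support in mental_model.items():
        status = ", ".join(support) if support else "GAP / opportunity"
        print(f"{space}: {status}")
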
Adam: Jason, I know you had something to share about how you modify this process for a new project versus an existing, ongoing project.
Jason: To add on to what Brian was talking about, the top half of the model versus the bottom half of the model: as you progress over time with your understanding of what's happening within the customer's experience from beginning to end, and everywhere in between, hopefully you continue to do things like usability testing, going out and visiting with your customers, rolling up your sleeves, doing various different types of research activities, and listening to what's going on in the organization.

That's really important, first and foremost, because this type of model is foundational. It's meant to be continually built upon.

What does that look like? That looks like, "Hey, we completed the first project, and we blew this thing out and we printed it out on a plotter. It's 10 feet long, and we put it on the wall. Awesome." It's now a physical thing that sits in the office, and people can go up to it and understand what's happening and transpiring in the customer's mindset.

Now, as things progress and you gather additional insights from usability testing or other research activities, we will typically wind up adding Post-It notes or just writing directly on the model itself. Those additions come from new insights from customers: quotes, emotions, feelings, activities, right? We start to fill in some more of the gaps where we maybe didn't have as much depth in the past.

And obviously, as organizations move forward, there's a constant influx of new digital or physical products and services being introduced. You need to make sure that you understand, "OK, the purpose of that tool is to serve the management of X service."

All right, awesome. Let's add that to the bottom of the model, where we think it belongs, and as we continue to move on, we'll have more discussion in an upcoming usability lab, when we have a few minutes to ask about, "Hey, are you using X tool? What do you think?" Right?

That gives us an ability to continue to move these things forward. From a management and update standpoint, it's really up to the team and the organization. When there's a gigantic amount of Post-It notes all over the wall, [laughs] all over the model, that's typically a good indicator of, "OK, let's pull these back into the model itself digitally and reprint the mental model."
Brian: I know of some teams out there, UX teams, that actually formed a mental-model team to continuously update the model, because what they found is, the first time they did it, they did it with one segment of users, and then they had a different business unit come to them and say, "Hey, what about these customers? Because we're really interested in this part of their experience."

They did more research, and they found that they had to augment it. What started out as a 12-foot-long model wound up being 60 feet long after a couple of years of continuing analysis.

What I think is great about that story is that they actually have a mental-model team that's dedicated to that. They actually go out and proactively advocate for the mental model and the customer point of view.

That's an important lesson. This is an artifact that is meant to be worked. To Jason's point about it being foundational, you continue to build that foundation so that it becomes very strong. Out of that come many different ideas, and you can attach all kinds of different projects to it, especially when it's a more mature model, and get something out of it.
Adam: Brian, one of the things about our audience is they're always looking for resources. During the virtual seminar, when the mental models piece came up, a lot of folks were asking, "Is there a digital online tool that you can recommend for this?" Is there, or is the wall of Post-It notes that Jason referred to and a couple of whiteboards still the best way to go?
Brian: I would love to find that digital online tool. [laughs] Just kidding.

I use Post-It notes for a very specific reason. I know Indi Young, who developed this methodology, uses Excel, and that's her way of doing it, but I use Post-It notes, one, because it helps me see the big picture. It's kind of like the matching game. When you've got your affinity clusters, I can take a step back and start to see patterns visually on a big wall in front of me.

The second thing it does is it gives me the opportunity to bring in stakeholders and immerse them in the research and get them talking about patterns, get them engaged, get them holding a Post-It note in their hand and doing something with it. We're together as human beings in a room, solving this problem we have of clustering all this data into something meaningful.

I think there's a very valuable part of using the Post-It notes, but in the end, you have to get to something that's digitized and that you can then print out.

Like I mentioned, Indi uses Excel -- I've used Excel as well -- to sort. She has, I think on her blog someplace, a Python script that you can use to extract the data from Excel, and I think it pumps it into Visio. You can also use something like OmniOutliner, where you basically put the outline of the mental model into OmniOutliner and then just drag the Outliner file onto the OmniGraffle icon, and it will ask you, "How do you want to build this?"

An instant org chart. It's a workaround, but it takes your data outline and puts it into an org chart, and then it's easier to then manipulate that into all the boxes and clusters and towers that you need to have.
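
As a rough sketch of that kind of spreadsheet-to-outline workaround, assuming the mental-model data has been exported from Excel as a CSV with hypothetical tower, group, and task columns, the few lines of Python below write an OPML outline that OmniOutliner can open and that can then be handed to OmniGraffle as Brian describes. The column names and file paths are illustrative; this is not Indi Young's actual script.

    # Reads a hypothetical CSV export of the mental model (columns: tower,
    # group, task) and writes an OPML outline file that OmniOutliner opens.
    import csv
    import xml.etree.ElementTree as ET

    def csv_to_opml(csv_path, opml_path):
        root = ET.Element("opml", version="2.0")
        head = ET.SubElement(root, "head")
        ET.SubElement(head, "title").text = "Mental model outline"
        body = ET.SubElement(root, "body")

        towers = {}  # tower name -> <outline> element
        groups = {}  # (tower, group) -> <outline> element
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                tower, group, task = row["tower"], row["group"], row["task"]
                if tower not in towers:
                    towers[tower] = ET.SubElement(body, "outline", text=tower)
                if (tower, group) not in groups:
                    groups[(tower, group)] = ET.SubElement(towers[tower], "outline", text=group)
                ET.SubElement(groups[(tower, group)], "outline", text=task)

        ET.ElementTree(root).write(opml_path, encoding="utf-8", xml_declaration=True)

    csv_to_opml("mental_model.csv", "mental_model.opml")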

There is not one tool out there that I've found that is a mental-model builder. Maybe that's an opportunity for someone out there to create such an application and deliver it to the market, because I think there's more you can build on top of that kind of application, not just the mental model. There's much more data that you can get out of that and slice and dice for other initiatives.
Jason: I think that would be an area where we would be excited to collaborate.
Brian: Yes.
Jason: [laughs]
Adam: One of the things that we run into is, every once in a while, something really cool gets labeled a certain way, and then there becomes a bit of confusion with how it might relate to something else. Jason, can you talk a bit about the difference between the experience map and a mental model?
Jason: Sure. I think you're right, Adam. We will use terms. In our profession, we have a tendency to debate the hell out of different definitions and terms and titles and roles and all sorts of fun things. I guess, at its most important level here, the thing that we need to understand is the mental model provides a foundation.

Think of it as a data store, a database, a way of saying, "Here's what we know about our customers. Here's how they think, how they act, how they feel, how they behave." Also, what are the things that they use in those spaces, amidst all those activities and feelings and beliefs that they have?

If you look at it that way, right, as this collection that you can build on over time, you can start to say, "OK. Now, what's an experience map?" Well, that really is mapping through various scenarios for a persona across the customer landscape and saying, how do I start to pull out some of the emotions, the feelings, the actions, and the tools that are currently being used, or, future-looking, forward-looking, the ones that we may need to create, innovate, or find to support those activities?

That becomes more about the flow and the interaction of the experience across various different products and services that your organization may have. If you think about it that way, that starts to, hopefully, clearly define the distinction between the two. Because you can throw in something like customer journey map as well. We will typically use the term "experience map" and "customer journey" pretty interchangeably.
Adam: It's certainly an interchange of definitions and labels that we're hearing.

I'm going to pitch the UIE virtual seminar library we're calling "All You Can Learn." We have a virtual seminar with Indi Young, talking about her mental models. We have one from Jeff Patton on story mapping. Can you compare and contrast those, explain the difference?
Brian: My understanding of story mapping is that it's a somewhat similar outcome but a different approach.

I'll start with the mental models. The data that comes out of the mental models is derived from non-directed interviews. These are open interviews that are really driven by the participant and somewhat guided by the moderator. It's them telling us their story.

It's telling us about their experience, why they're feeling certain things, what they're thinking about, how they're motivated. All of that data then winds up on, for me, Post-It notes, and then we organize that into clusters. It's a bottom-up kind of approach to building the understanding of the person's experience.

From what I understand, Jeff's story mapping is more of the internal view -- what do we already know about customers, and how do we take what we already know about customers and build a model to help us understand what we develop first? What's the first priority? What's the second priority?

I find that's something that would be really useful in the development environment, especially when you know pretty well what your customers need. You may be the kind of organization that's in touch with your customers. Or maybe you are the customer, because you use the product, right? You can start to develop an understanding of what people are going through based on your perspective or based on research you've already done.

I think the main difference is the amount of research and the type of research that goes into developing these. One is bottom-up and the other one is top-down, but hopefully it's informed by other things you've been doing in your organization.

I think they're both useful, but the context for a mental model is more about generating. It's generative research, right? What are the new ideas that we can find to delight people? What are we missing in the experience? Story mapping is taking things we already know and outlining it in such a way so that we can assign priority and map that out for a development plan. That's my understanding of it.
Adam: Brian, talk a bit about facilitating the process. Earlier, there was this mention of the mental models team. Really cool concept. When you're collaborating with a project team and getting into the mental models, designers and people working on projects like that, there's this tendency to immediately jump to the solutions. How do you avoid that?
Brian: I'm not sure you avoid it if you have low-hanging fruit that you can tackle, right? I've done mental-model research where we're into the third interview and the client is hearing things that they can fix right away.

There's nothing wrong with going ahead and attacking low-hanging fruit. But when the solutions start to be very broad and strategic and longer-term, that's where I think you can run into a slippery slope of narrowing your focus too much. It's OK to have those solutions pop up. I think, early on, you put them in a parking lot and you let the process flow as it's meant to.

The thing about uncovering solutions is that insights take time. It's not something where you get 1,300 data points -- which was a [laughs] recent mental model that I did, 1,300 Post-It notes on the wall -- and then you organize it and you have your answer. That's where I think we struggle, as consultants, with our clients, because we do this work and we come in and deliver something that hopefully they believe is wonderful, but it feels like magic to them. Well, it's not. It's a lot of sweat equity. It's a little bit of luck. It's good data.

It's all of those things that go into developing insights and innovation, and time. Time is probably the most critical part of this. It's like a good stew. It might taste good the first day, but you leave it in the fridge for a couple of days, and wow, it does get better. We've all had those experiences with food where you just let it sit there and everything melds together. That's how my brain works sometimes. I need to sleep on things. I need to let my brain figure out the problems while I'm napping. Things get clearer that way.

It's important that we express to people that this is not the final answer. This is a living document that is going to hopefully give you potential solutions for a long time to come, maybe for the next 5 to 10 years if you've done a really good, deep, and broad research effort.
Jason: One thing I would add on is, I think Brian hit it on the head there. There's a part of the process that's about gelling, and that requires a variety of different disciplines and skills: looking at what's coming off the wall and speaking to you, if you're using Post-It notes and have put them all over the place, and examining those, right? You really have to look at the statements that are on those notes and really analyze them.

It's a tough process. It's one where you have to set expectations going into it as clearly as you can. It's iterative, right? You may go down one path and find out, "Oh, we're starting to organize it the wrong way," and you find a bunch of nuggets that inform you, and you wind up almost pivoting at times, right? I mean, we've had that situation with a number of projects.

That's an important thing: whether you're doing this as a consultant or doing it internally for your own company, you have to make sure that you keep communication at a high level. Doing this should also happen in an open space, right? If you can find a highly visible space within the organization, within the office, to do this, I would highly encourage you to do that, because you need lots of really great insight and understanding, and people who have answers, to help inform a little bit more of the depth as you get into it.

That's really important, to get that point across to those stakeholders: it's not the design team or the mental-model team, just a bunch of designers getting in a room and throwing Post-Its on the wall, and then at the end we have this wonderful-looking thing, right? It's an iterative process that requires involvement across the board, and to do that, I think you have to think about the structure you create when you invite stakeholders into it.

As designers, just as much as we're designing the synthesis of the data and the plan for how we're going to do it, we have to design the experience for those stakeholders, because that's where they're going to soak up all that knowledge, if they weren't directly participating in all the research themselves, and take that knowledge, think about those customers, and bring that to their daily business lives, right? That's key.
Brian: I think it's also setting their expectation for what kind of research this is. It's not research to solve their problems only. You can find solutions to problems. Let's think about it in terms of possibilities. What's possible? What are we missing? What can we read between the lines of what people were telling us, or the data, to find something? That's insights, right?

If it's framed around problem resolution, well, maybe that's just usability tests. If you want to do some core generative research and pull something strategic out of this for the long term, and really deeply understand the people you're trying to serve, I think it's about possibilities more than anything else. That takes time. It's not an instant-solution tool.

Those opportunities will present themselves throughout the interviews and through the data analysis, and I think we should go after those low-hanging fruit if it makes sense. Let's keep perspective and consider the possibilities that come out of the data, more as strategic outputs than anything else.
Adam: Jason and Brian, this was great. Thanks for spending some time with us.
Jason: Thank you very much for inviting us. Appreciate it.
Brian: Yeah, thank you.
Adam: To our audience, thanks for listening in and for your support of the UIE virtual seminar program. Goodbye for now.