The SpoolCast with Jared Spool

The SpoolCast has been bringing UX learning to designers’ ears around the world since 2005. Dozens and dozens of hours of Jared Spool interviewing some of the greatest minds in design are available for you to explore.

Episode #151 Lou Rosenfeld - Beyond User Research Live!

October 28, 2011  ·  12 minutes

UX professionals have made a lot of progress in large organizations. Companies increasingly realize the importance of connecting with their users. User research is becoming firmly rooted in many organizations as companies try to produce better products and services for their users. But user research itself can be narrow in focus and full of biases. Lou Rosenfeld of Rosenfeld Media suggests that by breaking down the silos that exist between different research practices, we can create a complementary research experience. This will produce even better analysis and, therefore, better products as a whole.

Show Notes

In an attempt to map out organizational structure, Lou offers a set of dichotomies. In terms of research, web analytics folks and UX professionals both bring important insights to the table, but they focus on different things. It's this separation of insights that leads to the silo effect. Even though these insights would be completely complementary, the cross-pollination this would require often is not occurring.

It boils down to differences in how people think. User experience people tend to shy away from quantitative data and take a more qualitative approach. Neither approach is bad, but the differences between empathetic and analytical thinking, for instance, produce vastly different results.

By combining the efforts of these different practices we can arrive at tremendously useful insights. For example, Lou explains that by adding data to typical personas you can enrich them and enhance the design process. The personas may then align closer to the analytics data simply by adding what they would search for, resulting in a deeper understanding of your users.

Lou is presenting a UIE Virtual Seminar, 8 Better Practices for Great Information Architecture: Closing the Findability Gap, on November 3. There are new opportunities for Information Architects to add significant value to projects, and new metrics for measuring engagement with your site visitors. These measures will guide you toward design decisions that let your users find what they're after. Learn more about Lou's seminar.

Full Transcript

Lou Rosenfeld: We have this fragmentation problem, which I've already mentioned: things live in silos. Not just content, but now the insights that ought to help us figure out what to do with content and other design issues.

We've got differentiation. We don't really understand what that CRM stuff is about. I've never seen one of those things before. Yet, I sense it might be good to look at if I'm doing any kind of design work.

And then most importantly this combinatorial issue, the synthesis of all those insights into something that approaches an organizational brain, an organizational or institutional way to make smart design decisions.

So that's kind of what we're facing. In my limited experience with this, I've tried to map it and I think a lot of us are pretty good at doing this sort of mapping of an organization and how it works. It's almost like the same sort of urge that we use, that got us into doing things like site maps and wire frames.

I came up with a bunch of dichotomies. I couldn't map it, so let me run through some of these dichotomies. What I'm finding is, there are a lot of people who are really good at figuring out what is going on, and a lot of other people, often not the same ones, who are really good at figuring out why those things are going on.

So for example, people draw on information that comes from analytics research, the quantitative data. They may learn something really interesting. But it's all behavioral stuff. They don't really know what was going on in a user's head.

They can draw on it and infer interesting hypotheses, but they can't test those hypotheses. That's something that people who are really good at doing user studies, for example, like a lot of us, are really good at.

We, on the other hand, aren't always so good at knowing the right questions to ask. I'm going to start focusing a bit on two areas of practice, web analytics and user research, because these are the ones I know best.

This is really even more complex when you introduce all the other perspectives but let me just focus on these two. A lot of web analytics people can tell you what is going on. They can't tell you why. A lot of us can tell you why things are the way they are but we don't know what to test necessarily.

We don't have the right questions to explore without the data to help us figure that out. There's a whole kind of a breakdown between qualitative and quantitative people. I love this diagram with the two brains in there. I wish I had come up with it.

I'm not sure how well you can read it but numbers versus emotion, analysis versus empathy, the brain versus binky? Is that what it is? So you know, we have different ways of looking at problems, different ways we try to solve problems and we often are comfortable with different types of data or evidence to help us solve those problems or at least to help us understand what the problems might be.

So that's a big breakdown. A lot of what I'm saying right now, I'm trying to make a point and by making that point, I'm going to over generalize quite a bit but I think a lot of us kind of would fall into one of these categories.

I don't know that anyone is equally comfortable with qualitative and quantitative data. I've met very few people who seem to be able to do that. In many cases, I think some of us opt for qualitative studies because we're really uncomfortable with quantitative data, or vice versa. It's just the nature of how our minds work and what we're comfortable with.

A lot of us are in the business of making sure our organizations reach their goals. Web analytics people, as an example, express goals as KPIs, key performance indicators, things that are measurable.

A lot of us in this room have been trained to think more on behalf of the user: what their goals are, how to identify them, and how to make sure they're met. Sometimes those things are very easy to mesh together, especially on commerce sites, for example. Often, they're not.

We have to resolve these things but we're not always so good at it because usually, whoever is making the decision has a bias in one direction or the other. In effect they're thinking with half a brain.

I think a lot of us are really good at measuring the world that we know. Certainly, again, on the analytics side, you start with your KPIs based on metrics and you say, "I'm going to look at all that data and figure out whether we're performing against the goals that we've set out for ourselves as an organization. Are we doing well? Are we not doing well?"

Contrast that with looking at data for patterns, looking for things to emerge that were unexpected. That kind of emergent data analysis is really looking to learn about the world we don't know and therefore don't know how to measure.

And then yet another, and I'm sure there are more dichotomies: there's a breakdown between the comfort level and understanding of statistical data versus descriptive data. You could have people making very strong garbage-in, garbage-out arguments in both cases, and they'd probably both be right.

But that's not how they see it. Usually, we have a bias toward one direction or another. We like one, we like the other, usually not both but they tell us very interesting, but different things that often fit together nicely, as we'll see.

I've tried to, like I say here, reduce this to a very over generalized, over simplified set of dichotomies that I've just gone through. This is just a summary of what those are. And this is just for two areas. This is just for web analytics and user experience.

But if you look at these, I hope what you're starting to see is not just the differences but the fact that they come together quite nicely, that they're very complementary. That's where that combinatorial effect comes in, where the insights one has fit quite nicely with the insights of the other.

Now, I can't map this. It's just not in my wheelhouse but I bet some of you could. What I'm really hopeful for is that someone like Alex Osterwalder who wrote the Business Model Generation book. Is anyone familiar with it? Fantastic.

He actually created and published it with a bunch of people, created a whole new business model around publishing just to do one book. Amazing. But he did a whole bunch of mapping of essentially business models. It's over simplified but damn it, it's useful.

We need something like that to take all these types of insights and put them together in a way that would be really useful for us especially making design decisions. So without a map, why bother even trying?

You know, if we can't map, this is really a hard problem, what's the value of jumping in? Well, for one, we can really, really learn quite a bit from each other's data, right? So let me give you an example.

This is one of my favorite things in the world. It's a little snippet of site search analytics log data. All you really need to know is that if you look at it, the orange stuff, like "vincense plate," is what was searched.

There are a few other things that you can maybe figure out, an IP address so you know who it is, the time-date stamps, you know when it happened. The zero next to the last bit of information is how many search results there were.

Now, look at another line. Same time, roughly two seconds later, same IP address. Now they're searching license plate and they got, I think, 146 results. Interesting. What's going on here? Maybe they spelled it right this time. But what happens next?

Oh, it's a different user and they're searching on a real mouthful. This is a state government site and this user was searching that site for Regional Transportation Governance Commission. People search things that long? They know what those things are even called? Do you know what government agencies are called?

As you're looking through this, I bet you each one of you are already putting on your analysis hats and saying, "You know, obviously typos are an issue. How would I fix that problem? Maybe I would turn on the spell check on the search engine."

Now did they get what they wanted when they were searching on the license plate or not? You don't know. Is that a common thing that people search and what about this mouthful in the last line?

Basically, each one of us probably has a whole bunch of different ways of looking at this data, and we would start forming very different hypotheses. Just a couple of examples.

I think a lot of people from, to overgeneralize, the analytics community would be wondering things like, are we converting on license plate renewals? A lot of other people, like me, would be saying, "What are people searching for the most? Is license plate coming up a lot?"

If so, are we giving that information out easily? Are we presenting it on the main page so they can renew their license plate easily, and so forth? So we look at the same data, a tiny little snippet of data, and we probably all start to come up with different conclusions, or at least different hypotheses, and think about what we do next differently.
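The kind of log triage described here, spotting zero-result queries and the most common searches, can be sketched in a few lines. The log format, field order, and sample rows below are assumptions for illustration; real site search logs vary by platform, and these are not the actual logs of the state government site.

```python
from collections import Counter

# Hypothetical log format (an assumption): ip, timestamp, query,
# and result count, separated by tabs, one search per line.
LOG = """\
10.0.0.1\t2011-10-28T09:14:02\tvincense plate\t0
10.0.0.1\t2011-10-28T09:14:04\tlicense plate\t146
10.0.0.2\t2011-10-28T09:15:31\tRegional Transportation Governance Commission\t3
"""

def parse(log_text):
    """Turn raw log lines into a list of dicts."""
    rows = []
    for line in log_text.strip().splitlines():
        ip, ts, query, count = line.split("\t")
        rows.append({"ip": ip, "ts": ts, "query": query, "results": int(count)})
    return rows

rows = parse(LOG)

# A "what" question an analytics person might ask: most frequent queries.
top_queries = Counter(r["query"].lower() for r in rows).most_common(5)

# A "why" lead a UX person might chase: zero-result searches, which are
# often typos or vocabulary mismatches worth a follow-up user study.
zero_results = [r["query"] for r in rows if r["results"] == 0]

print(top_queries)
print(zero_results)  # → ['vincense plate']
```

The same few rows feed both kinds of questions, which is the complementary effect Lou is pointing at: the counts tell you what is happening, and the zero-result list tells you where to go ask why.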

What the next action would be could be very different if you're looking at this as an interaction designer versus an analytics person versus a content strategist. Another way we can really benefit each other is by helping improve each other's design tools.

So I grabbed an Adaptive Path persona and you know again, I love site search analytics but there are lots of other types of analytics out there that you can do this with but I threw some site search analytics data in there.

So you've got your typical persona stuff, right? And then, why don't we add some data? Wouldn't that enrich it in a new way: what does Steven search for? Now I can actually go to my analytics people and say, I could use some of that data.

In fact, maybe my personas might match up well with your audience segments. Maybe you can start putting these things together in some new and far more powerful ways. We can really help tell each other's stories.
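The enriched-persona idea can be sketched very simply. Everything below is invented for illustration: "Steven," his goal, and the query counts are placeholders; in practice the counts would come from the analytics team's site search report for the segment that best matches the persona.

```python
# Hypothetical sketch: enriching a persona with site search data.
persona = {
    "name": "Steven",
    "goal": "Renew his vehicle registration online",
}

# Imagine this came from the analytics team's query report for the
# audience segment matched to this persona (counts are made up).
query_counts = {
    "license plate": 1843,
    "renew registration": 912,
    "driver's license": 640,
    "regional transportation governance commission": 37,
}

# Attach the queries this segment runs most often, answering
# "what does Steven search?" directly on the persona itself.
persona["top_searches"] = sorted(
    query_counts, key=query_counts.get, reverse=True
)[:3]

print(persona["top_searches"])
# → ['license plate', 'renew registration', "driver's license"]
```

The design choice here is just to make the quantitative data travel with the qualitative artifact, so the persona and the audience segment can be lined up side by side.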

I love this example. Adaptive Path again. Jeff Veen and a team were working on a product to make analytics data easier to understand. I think it's called Measure Map. Is that right, anyone? Measure Map.

Google liked it. In fact, Google basically bought Measure Map but they really bought the team. And they'd already purchased the analytics application that they were going to make into Google Analytics but they wanted that team to work on it, to help tell the story of the data in a way that maybe someone from the web analytics world wouldn't have thought.