Episode #7: Spirits, Claws, and Analytics — A study in superstition and science
Listen Now
Much like a superstition, a metric that is believed to be important may not reflect the reality of your product or service’s experience. Understanding the behavior of your users, by introducing some science, is what leads to greater context and insight.
In this episode of the UIE Podcast, Chris Callaghan of McCann UK talks about his experience of joining a team and seeing the superstition firsthand: a roomful of folks attempting to derive meaning from numbers, but having the same conversations over and over. Straying from superstition and introducing science started with a simple question: “Do we know if anyone outside of this room can use this?”
Kate Rutter joins us to help dispel some of the myths surrounding analytics and offers insight on how to arrive at true meaning.
Full Transcript
Jared: This is the UIE Podcast. I’m Jared Spool.
There’s this scene in the movie Toy Story that takes place in the pizza shop. In this scene, Buzz Lightyear finds himself in a rocket-shaped claw game filled with little green aliens. (Buzz wishes to commandeer their rocket ship to get back to his home galaxy.) He asks the aliens who is in charge and they all point up to The Claw. “The Claw is our master. The Claw chooses who will go and who will stay.” A few moments later, when the evil kid Sid has used The Claw to get a prize, it snags one of the aliens, who says, “So long, my friends. I have been chosen. I move on to a better place.”
What we believe is predicated on the information we have at that moment in time. This doesn’t just happen in the movies. In the 1800s, when someone had a fever, a common treatment was to open all of the windows to allow the spirits to escape. Like the aliens’ acceptance of The Claw, people in those days accepted this as the right way to deal with disease. Our knowledge about germs and viruses wasn't fully formed. Germ theory wouldn’t come about until the 1850s.
One of the traits of being human is we love to give credit to things that don’t deserve it. This is the nature of superstition.
And there are few things in the world of design that involve more superstition than analytics. We see the reports of numbers and we want to attribute something bigger to them.
Chris: Every month or every period the same people would get together in the room and discuss what happened in the previous periods. They would look at the analytics, give headlines. And this was a good five-, six-hour meeting that's pretty much the same stuff every single month.
So when I joined the business I got an opportunity to sit in on a few of these meetings as a newbie. Kind of sitting in there, quite quiet, just watching the same things being said by the same people over and over again. So I simply asked the question, "Do we know if anyone else outside of this room can use this website?"
I'm Chris Callaghan.
I'm the UX and optimization director at McCann, Manchester, UK.
Jared: Chris’s client kept the trains running. Literally. The meeting was about the ticket-purchasing website. The client knew a lot about trains, but about analytics? Not so much.
They didn’t know what they didn’t know, and they thought they understood what they were looking at. I mentioned Chris’s story to someone I love talking about analytics with.
Kate: I'm Kate Rutter. I am a strategic sketcher and recovering UX designer. Metrics and analytics and looking at the quantitative data have absolutely changed what I put on an interface and why, and how I hope it will make the humans using it better in the world, and how I can measure that. So it's the one feedback mechanism that, I think, has directly changed and improved my designs.
I was listening and hearing the story about the same people sitting in the same rooms with the same numbers, trying to find that insight, that meaning out of them. My first inclination is, were there animal bones involved?
Because really, it's like throwing runes. It really is. And what's unfortunate is, I think, our teams rely on these numbers that some package or some expert has said are the way that you measure the effectiveness of digital products. Like, they have normalized a lack of understanding, is what's happened.
Jared: “Normalized lack of understanding.” That’s probably the best definition of superstition I’ve ever heard. And it completely describes how so many organizations think about analytics.
This normalized lack of understanding comes from the willful acceptance of interpretations without question. This is not to say that these beliefs are inherently wrong; they just don’t have the full picture. A few thousand years ago, it was a fact that everyone knew and believed to be true that a giant lifted an orb made of muenster into the sky every night.
That held until a few people began observing things differently. They injected science and changed our beliefs about what was truly happening. It turns out, observations are important in the scientific process.
Chris: We hadn't done anything observational. There was nothing qual-related. No one could tell me that they'd watched or listened to someone and could say whether people could do this easily or not. Just a very simple question really. So we had these analytics, and analytics is important, but there was just no why in there. We were just seeing the same numbers, and numbers move up and numbers move down. Train travel is seasonal. People travel around Christmas and holidays and things, so we got to see those numbers move up and down, but no one could say, fundamentally, whether there were big issues on the site, whether it was perfect, or anything in between.
Jared: At the heart of analytics are numbers. Ten. One thousand four hundred fifty-one. Forty-two. Ninety-seven. By themselves, the numbers don’t have any meaning.
But we’re human. And we want the meaning. And when the numbers don’t give us the meaning, we make our own. We create our own superstition.
Kate: You know, these numbers had meaning at some point. I think when we're first looking back at the early inklings of what I'd call the consumer-activated web, you know, when it was really starting to get quite a bit of visits, quite a bit of traffic, these were the numbers we had. Like, was someone landing somewhere on a trackable digital page, and were they clicking on something? And where did they go from there?
And those are just the basic, basic, bare bones. And then there became this whole culture about how to analyze these numbers, and what they mean, and associating meaning, and trying to trace causation and correlation through actual human behavior.
And I think that became, not snake oil, ‘cause there was a there there, but it became this analytics theater. And so as more and more people had digital products, the skill set was lowered. It was democratized so all kinds of people could do things, but really it's just the bare basic bones, and those bones don't have meaning anymore.
Jared: Democratizing the analytics meant coming up with a common language. Like page views. Bounce rate. Conversion rate. The words were tied to the meaning, and the superstition was extended. We want page views to go up. We want conversions to go up. We want bounce rates to go up. Or down? I know we want bounce rates to do something, but I can never remember what they’re supposed to do.
Kate: The companies who are driven by their quantitative analysis tend to be very rational and very causation-directed. They want a metric and they want to be able to move that metric intentionally. What's unfortunate is, just like any system, you can game it. So people have these numbers, they start using these numbers almost as weapons, and it’s an interesting power struggle. And now the numbers don't mean anything, because you can always interpret them in a way you think will benefit your department or division, or make your sales look better.
Chris: I think some of it was at face value. So if a number went up, that was a good thing. But from my point of view, at the time there were a number of different products on the website. So you could go and buy a ticket through the normal routes, and then you could buy a ticket through a different type of mechanic which offered a smaller range of tickets, but it was seen to be a good thing because it could offer them at low prices. So I would say, "If someone buys it in one place, they're not buying it somewhere else." If you click one place, you're not clicking somewhere else. I don't think we were having this level of discussion. I think it was a green light, numbers have gone up, that's good. Red light, numbers have gone down, let's do something about it. It was at that sort of level really.
They were saying things like, "Okay, we see these analytics that kind of match what we're seeing in our information packs in the business, or they don't, and what was the reason for that?" So again, I would say the same dozen people in the room having the same conversations about the same numbers.
Jared: The quantitative data tells you very little about your users. It tells you a whole lot about your design. If a link is clicked. If users are pressing a button. Where someone goes from the homepage. This is all good information to have, but it’s only half of the information you need.
Your design exists to help the user accomplish a task. The numbers can tell you if that’s happening but can’t tell you why or why not. To break those assumptions and superstitions, you need to introduce that qualitative side of the equation.
And that’s exactly what Chris did. He decided to move past the superstition. He introduced Science. Step one? Usability testing.
Chris: The core thing we were wanting to test was, can you buy a train ticket? So we were asking questions that would encourage people to maybe use the calendars, maybe use some drop-down lists. So we carefully crafted the questions to try to uncover a lot of stuff. But the core question was about people going away and buying tickets.
Jared: Chris tested the ticket booking application with a handful of users who were, interestingly enough, waiting in a train station coffee shop for their train. This little application of science yielded new insights for the team.
Chris: There was a big red button on the mobile journey which said, "Find my train," or something like that. And it was a big, red button. Huge, you know, it's like a hundred pixels high and the full width of the screen. All you had to do was tap it, and no one tapped it. They felt like it looked like an ad. They didn't want to tap on this big red thing. They preferred to click on the boring gray buttons further down the screen that were grouped together, that looked like buttons.
So, for me, as a researcher, you're kind of sitting there with your nails in the desk thinking, "Why are you not clicking that big red button?" Which, essentially, they weren't doing. But when you heard people, they were saying things like, "Well, it's next to that kind of carousel slider thing, so it looks like an ad. I know I should probably click on it but I don't want to." And things like "find my train." I'm not wanting to find my train, I'm wanting to get some tickets.
And then we took these observations that people weren't tapping this button, and we could understand why, and completely agree with them on why they wouldn't click it. When we looked at the analytics, it was happening at a huge scale. Something like 50% of people were just going all through the website, backwards and forwards, up and down, when they just needed to hit this red button, basically.
That was mind-blowing, because the one job that our mobile site needed to do was allow people to book tickets, and it was struggling.
Jared: Suddenly, because of the new data from Chris’s research, they had a different story to tell. They knew the why.
The numbers were there all along. But the team didn’t know where to look, because they were hidden by all the other numbers.
It’s not that the old numbers they watched were bad. It’s that they didn’t tell a useful story. When Chris’s research uncovered new numbers—better numbers—they had a story that not only informed the team what was happening, but gave clear direction on what to do differently.
It’s tempting to focus on the numbers. The numbers are concrete, and companies lean toward being pragmatic. The folly in all of it, however, is the belief that quantitative data is more scientific and reliable, just because numbers feel scientific.
Absent the knowledge we have today, you could dream up any number of reasons that the Sun rises. Even the term itself, “sunrise”, is technically a legacy term. Until observation showed that it was in fact the rotation of the Earth that caused the perception of the rising Sun, whatever notion people dreamed up was accepted.
It’s comforting to assign meaning to something, or to believe something is happening in line with your assumptions. It allows the veil to stay in place for those companies, and the superstitions carry over. The numbers that come from the analytics package feel good, but they don’t carry their own meaning.
Kate: For business in general, I think that they use numbers as the number one assessment tool. And not meaning, and not purpose, and not effectiveness. It's an efficiency play, and that's characterized by our industrial heritage, and I think it's a very hard thing to get out of that culture.
So analytics theater has fulfilled that need emotionally for companies that are terrified because they don't know where they're going.
Jared: Chris solved this problem by finding his own numbers. Numbers based on observing real passengers trying to book their tickets. Chris’s numbers filled the organization’s need to use numbers, but now they were numbers that actually helped improve the product.
By doing the research with actual users, Chris began to know what to look for in his analytics. He arrived at deeper meaning and the real truth of what was happening.
Chris: The case that we're talking about, with this find-your-train button on the mobile site: I think we observed three out of five, or something like that, have this particular problem. We couldn't understand why we had this problem. So we went to the analytics, in particular on mobile, where we didn't actually use the visual analytics. We were looking at where people would go next, because this is the home page, it's the landing page. We would expect, just like on the other sites, that a lot of people would want to buy train tickets.
So, what we were kind of looking at here is, if we're seeing three out of five people in the lab have problems with this button and move through the site up and down, backwards and forwards, that's a lot of interactions before we can actually get them to the booking matrix system. We would expect to find something similar in the analytics if that's true. And we found it in the analytics, but on a much bigger scale than we thought. It was absolutely staggering.
Only 25% went from the homepage into the booking matrix. And then what we could look at was the first interaction, which was to the tickets and timetables page. That was the button that provided the most information scent. And then the next interaction after tickets and timetables was looking at train fares. After that it was back to the homepage again. And after that it was somewhere else.
Jared: That means 75% of people were not doing what the booking site was designed for them to do.
Kate: You can't just adopt kind of industry standards of what gets measured, compare it to others, and then call yourself good. Because your customer behavior and adoption and usage and retention, and the financial value that your customers deliver to you in exchange for the value that you deliver to them, those equations are all off if you're looking at the wrong numbers.
Jared: Chris moved his team beyond superstition. He introduced them to science. Science that was tailored to the needs of their project to increase ticket sales. Science that gave them the why behind the way their users behaved.
Chris: For me it's quite nice because, when you kind of look back and do the retrospective, you know, "Okay, we heard people didn't want to tap it because it looked like an ad, and maybe the words were wrong," and things like that. But when you look at it from a visual-psychology point of view, it made absolute sense. There were five or six gray, dull-looking buttons further down the page that looked like buttons. They were grouped together, so those are all my options. Anything outside of that is something different. So when you look at some of the visual hierarchies and things, yes, I can completely understand why people didn't want to click that.
Kate: Even if you have these insights and you're like, oh, well, duh, and they're obvious in hindsight, don't spend a ton of time blaming your companies for not doing it sooner.
Jared: I often refer to user research as “The science of the obvious.” It’s the science part that’s key. That science moves us beyond the superstition. It gives us the power to answer the why question. And that power fuels our ability to improve our designs.
Kate: Learn the thing, make it better, and go learn something else. Stop trying to, like, presuppose that you could have done this, coulda woulda shoulda done this. I see a lot of blame happening when metrics wake-up calls and usability studies happen, and I think that's unfortunate, because the work that companies and businesses do is hard.
And we just gotta like, celebrate success when we can, and then move on to the next challenge.
Jared: Celebrating the success of learning something and moving to the next challenge. That’s a plan I can totally believe in.
This UIE podcast is brought to you by UIE’s All You Can Learn library of fantastic UX presentations and seminars.
We just recorded Kate Rutter’s wonderful UX Immersion: Interactions talk, Finding The Narrative In Numbers: Making The Most of Metrics. In this talk, she walked through how metrics are used to shape and influence our work. We also have my UI21 presentation, Is Design Metrically Opposed?, where I talk about building quantitative metrics directly from qualitative studies.
You can watch both of these presentations, and 298 more, at UIE’s All You Can Learn library for one low monthly fee. Just visit AllYouCanLearn.co for more information.
Also, if you’re in an organization that is looking to hire more UX designers anytime soon, I want to point you to our new school in Chattanooga, TN called Center Centre. Our students are learning what it takes to become industry-ready UX designers, and they need your help.
To help them learn the craft, they need great projects to work on. Companies supply the projects and, while they’re at it, they get to see what our students are capable of. It’s a great way to help grow our field while you’re doing preliminary recruiting. If you have a project that you think might work, please get in touch. You can learn more at Center Centre’s website. That’s C E N T E R C E N T R E dot com.
The UIE podcast is produced by myself and Sean Carmichael. We'd like to give special thanks to Kate Rutter and Chris Callaghan for appearing on this episode.
You can find more information about the UIE podcast on our newly launched UIE Podcast Network website: U I E dot F M. Go there now and look at all the great shows we’ve put together over the years.
This podcast is part of the UIE Podcast Network. Thanks so much for listening and, as always, thanks for encouraging our behavior.