The SpoolCast with Jared Spool

The SpoolCast has been bringing UX learning to designers’ ears around the world since 2005. Dozens and dozens of hours of Jared Spool interviewing some of the greatest minds in design are available for you to explore.

Episode #174 Adam Connor & Aaron Irizarry - Discussing Design: The Art of Critique

July 13, 2012  ·  28 minutes

Critique is an integral part of the design process. In contrast to feedback, critique is more focused and specific. Often, rather than a gut reaction, it is framed within the context of a dialogue. It is centered around arriving at an understanding.

Show Notes

Adam Connor brings his insights on critiquing to his virtual seminar, Discussing Design: The Art of Critique. In it, Adam discusses the different types of critique and the challenges and rules associated with the process. In keeping with the theme of a dialogue, Aaron Irizarry joins Adam for this podcast as they catch up with Adam Churchill to tackle some unanswered questions from the live seminar.

  • Can critique still happen when the goals aren’t clear or agreed upon?
  • What is the difference between a critique and a design review?
  • How do you redirect that conversation when people are compelled to jump directly to solutions?
  • Should suggestions be strictly forbidden?
  • How do you handle context and roles in a critique?
  • Should critiques be structured? And if so, what should it look like?
  • How does the delivery of usability test results relate to critiques?
  • Do you keep attendees of critiques consistent or introduce new people into the process?

Full Transcript

Adam Churchill: Welcome to the SpoolCast. Recently, Adam Connor joined us for his seminar, "Discussing Design: The Art of Critique." In the seminar, Adam offered up a reality check on critiques. He described how to give, receive, and, best of all, act upon feedback, while confidently guiding your projects through beneficial feedback loops. With the right approach to critique and collaboration, your designs could be stronger than ever. Now this seminar, "Discussing Design: The Art of Critique," has been added to UIE's user experience training library, which presently has over 90 recorded seminars from wonderful topic experts just like Adam Connor, giving you the tips you need to create great design. Adam joins us to circle back on some of the great questions that our audience asked. Often, we're fortunate to have really smart people like Adam come join us and talk about something they've been thinking about a lot. In this case, Adam tells us that critique is a dialogue. Therefore, to give this topic its due, we've got another really smart person with whom Adam spends time having a dialogue about having a dialogue. Enter Aaron Irizarry. Welcome, Aaron.
Aaron Irizarry: Hey, thank you.
Adam Churchill: Hey, Adam.
Adam Connor: Hey, how's it going?
Adam Churchill: So Adam, for the folks that weren't with us on the virtual seminar day, can you give us a brief overview on what you covered?
Adam Connor: Yeah, so to kick things off, we framed up critique against feedback. We talked a lot about feedback. But really, feedback is nothing more than just a gut reaction. Some people put more time and effort into it, but most people are just going to give you a gut reaction. Critique is something that's a bit more specific. It really focuses in on how a design is or isn't achieving certain goals, and it frames that in a context, in a dialogue, like you said, to help the creator understand those impacts. So we talked about the two sides of critique: there's giving critique and there's receiving critique, and how important intent is on both sides. When you're giving critique, it's all about trying to understand what the designer was trying to do, how they tried to do it, and if it's working, or maybe if they have more steps to go to get there. When asking for feedback or critique, you should be doing it because you want to understand the impact of the design decisions that you've made so far, and where you should focus your efforts going forward. We moved on to talk a little bit about the ways to integrate critique into a design process, and there's three major areas that Aaron and I have called out over time. There's the standalone critique, and that's nothing more than this: you're working on a design for a certain amount of time, and maybe you schedule these regularly, maybe you do them impromptu. But at some point, you say to yourself, "OK, I really need to understand how this is working. Am I on the right track? Am I hitting the goals and following the principles that I set out to?" So you grab some people and you sit down for an hour. You quickly bring them up to speed on the context, the things you're trying to do with the design. And then, you start to critique. There's also ways to integrate it into design reviews and other meetings you already might be having.
There are some challenges there, which we talked about in the session, and there are some tools to help overcome some of those challenges. And then, we also talked about how critique can be used in collaborative workshops for people familiar with design studio and other kinds of collaborative sessions. Critique has a really useful place in those sessions as this bridge between divergent thinking or generative thinking, coming up with lots of ideas, and then moving to convergent thinking and consolidating those ideas, iterating upon them, eliminating some of them. So, critique is a really powerful tool for that evolution in a design. We also talked about some rules when critiquing, whether it's internally or with clients, with other areas of your business. The three big rules are, you need to avoid problem-solving and design decisions. When you're critiquing, you're analyzing. You're not there to be coming up with other ideas or recommendations or deciding on the things you're going to change. You've also got to remember that everyone is equal, and this is a big challenge for organizations with a lot of hierarchy or a lot of egos. Everybody in that room, they're on equal ground. It doesn't matter whether you're an executive or a junior business analyst. Either way, what you say matters the same amount. And then, everyone participates. You've got to make sure that you have that dialog with everybody that's there with you. Silent people can really throw off the dynamic in a room. So, those are the three big rules. The other really big, really important thing that we talked about is the importance of goals, design principles, scenarios, and personas. These are the things that you are designing to and for. These are the reason you're designing and the reason you're designing the way you're designing something. 
So, these tools, these goals, these design principles that you've decided on, the scenarios, the personas, hopefully all the people you have with you, they know of them, they've agreed to them, they understand them, and these are the ways you set context for everybody. These are the ways you keep conversations from going off into personal preferences, because you can direct things back to, "How does that help us or not help us achieve this goal?" or "Why does that matter in this scenario that we're designing for?" or "How does that impact this persona that we identified?" Those are some very, very crucial tools to having these productive conversations around a design, because without them, you've got people with slightly different interpretations of the problem, different goals that they'd like to see achieved, different philosophies when it comes to design. And, without these centering tools, those elements of individuality across all the participants in the room are going to get out of hand. You want people to bring their perspectives, but you also want to be able to center them on this set of goals, on this set of desired outcomes.
Adam Churchill: Adam, what happens when the goals aren't clear or agreed upon? Can critiques still happen?
Adam Connor: It can, but it gets to be very, very challenging. Inevitably, what's going to happen is, somebody's going to start talking about how an aspect of the design does or doesn't meet a certain goal, and then the conversation will turn to, "Well, is that really a goal that we're trying to achieve?" So when the goals aren't present, when the goals aren't agreed upon, what you can use as a centering tool is just a predefined list of design principles. There's lots of these out there. Abby Covert just released a poster of her set, which I believe she calls, "Heuristics for Information Architecture Critique" or something along those lines. I believe Jared's talked about principles. Todd, I believe, has a set of interaction design principles. There's lots of collected sets out there. These are general principles. These are like best practices. And you can use those and critique against those. But I would say you need to make every effort to hold off on critique when you don't have the goals defined, and take a step back, and get everybody in a room to define those goals. Because that's going to make everything easier going forward.
Adam Churchill: Aaron?
Aaron: Yeah, definitely, Adam hit that perfectly. Something that I've found helpful is sending out that reminder before the critique session of "Here's what we're critiquing, and just a reminder, here's the goals to keep in mind as we take a look at this." Sometimes you can unearth that disagreement around the goals. So, "Wait, are those our goals?" Or, "I thought they were this." And then, you possibly have the chance to put the critique session on hold. You've unearthed that there's not an agreed-upon set of goals. You can address it then, and then use that moving forward to go into critique, and it becomes a little bit more valuable. I think you possibly can still critique if the goals aren't there. It's just a lot tougher. And your results aren't going to be what they could be.
Adam Churchill: Aaron, how do you define a critique session versus a design review? When is a critique a critique?
Aaron: It's interesting. Surrounding critique, there's a lot of words that are used so interchangeably. In the beginning of our session we talk about the differences between critique and feedback. But then we keep using the term feedback very regularly. I think sometimes the same thing happens with the design review. For someone who's just working in an organization, where that's just part of the process, sometimes it's very hard to tell the difference. A design review is really centered around signoff and approval, and "Let me get this done and get the stamp of approval from stakeholders and managers so I can go on to the next thing." Critique isn't that. Critique is so much more of a continual process of analysis and evaluation. So you're not really ever going to be shooting for approval and signoff in a critique session as much as opening up better discussion around improving what you're designing and working on. And so, they are different. Similar things may happen within each setting. But a design review is so much more about accomplishment and getting that thumbs up, and even sometimes a validation. And that does happen. It still happens in organizations. I still deal with them. And you can critique in that setting. It's a little more difficult. But critique is so much more about the product itself and its journey.
Adam Connor: I think it's also important to try to think of why a critique would happen versus why a design review would happen. Design reviews are process-driven. Critiques should be designer-driven. It should be something the designer is using as a tool to iterate with. So a designer calls a critique because they want more information to use in making design decisions. There should always be this expectation that when a critique happens, the designer is going to continue to work on the design with the information they've collected from that critique. Whereas a design review, like Aaron said, is more about approval. It's like, "Are we done yet, so that we can go do something else?" And that's a big differentiator between the two. The intent in a design review is not an intent that lines up with critique very well.
Adam Churchill: Adam, how do you redirect that conversation when people are compelled to jump directly to solutions?
Adam Connor: In a design review, when things start getting into, "OK, we need to do this instead," or "You need to change that to this," or even in a critique, when people start throwing out, "I think it would be better if we did it this way, and we put this on the left and turned that into green...," you need to bring it back and say, "OK, we're here to analyze the solution. Those are great ideas. Let me take note of them and I'll follow up with you later, but let's focus on what we're looking at now, and the strengths and the areas we're improving on this design, so that we know what to do as we move forward and can make informed decisions." Like I said, the problem with getting into solutions and recommendations is that you start to think about this other thing that doesn't exist. You start to get into conceptual thinking and you've moved away from analysis. Redirecting the conversation is really a facilitation skill. There's a lot of facilitation skills involved in critique. It's about stopping the conversation. You don't have to be abrupt and say, "Please shut up." You can just say, "Thanks, it sounds like you have a lot of great ideas. I'm going to take note of some of those, and I will follow up with you later," and then turn the conversation back by maybe calling out one of the scenarios, calling out a persona, talking about how the way it's designed now doesn't work in this scenario or doesn't help us with this principle.
Adam Churchill: Aaron?
Aaron: Adam hit that right on the head. It really is facilitation and redirection. It's going to be people's natural inclination in a design review or a critique setting. They're going to have ideas. As they're analyzing, ideas will pop up, but, like Adam said, it's re-harnessing that and putting it to the side to revisit later. It's definitely tough, and really working on facilitation skills proves to be really helpful in that situation.
Adam Churchill: Aaron, should suggestions be strictly forbidden? This question comes from Jason who comments that in their critiques he feels that the suggestions that come up can be helpful in moving the discussion forward.
Aaron: This is tough, because I'm not strict. I will sometimes be loosey-goosey about stuff. You get to work with a lot of smart people, and sometimes those people are going to have great insights, so it's hard. You don't want people to feel like they can't contribute or they're not a part of the product building process. It's really more around the education of the right time to do that. I really try to avoid them, and I think this tails right off what Adam was saying, in that you just have to redirect it a little bit. In our critique presentation we always talk about the follow-up, and I think that's a great opportunity to go work with someone one-on-one on maybe some of the suggestions that they had. You can even explain that to them like, "Hey, I'm going to be doing some follow-up after this session. Could we talk about those suggestions a little more in-depth then, so I can really give you my attention and focus at that time?" Should they strictly be forbidden? Yes and no. I'll try to avoid it. I'll say that. Sometimes it's hard to steer the conversation, but you do the best you can. If it's working for them, I would hate to tell them not to do it, but I'm not in Jason's situation. I would avoid it if possible.
Adam Churchill: Adam, what do you think?
Adam Connor: I think you get into a tricky situation when you say you absolutely cannot do this, or you try to throw in some penalty: if you do this, I will wag the finger. We've seen people do this with the "I like" and "I don't like" rule around critique, that you can't say "I like" and "I don't like." It's the same thing with this. As soon as you tell somebody they can't do it, it's inevitable that they are going to do it. It just happens naturally. It's a reflex, it's instinctual. Knowing when to cut it off, getting a feel for when they've said their piece, this is as far as this needs to go, now I can stop this, I can document it here, we can follow up later: that's a really good facilitation skill to have. As far as finding it helpful in moving the discussion forward, I would say: which discussion are you trying to have? Are you having a discussion about what the design should be? In that case, you're concepting anyway, so, yeah, it's useful, but you're not actually in a critique at that point. If you're having a discussion around an analysis of the design decisions that have been made so far, then it's not useful in that discussion, so that's where you want to let it go as far as it needs to, stop it, and follow up later.
Adam Churchill: Adam, what about context and roles? This question comes from Duncan and the example he offers up is a situation where business subject matter experts are addressing business objectives where developers are addressing technical objectives or issues.
Adam Connor: One of the great things about having a cross-functional team is everybody's got their little pocket of expertise and you can harness that in a critique by involving people from different areas. You can ask developers directly about how a design is or isn't meeting specific technical objectives. You can ask business analysts directly about business objectives. At the same time, you can give everybody a chance to voice their thoughts on things that are outside of their expertise. One of the things Aaron and I try to encourage is to think about what it is you want to analyze in the critique session. Are you looking specifically at a set of business objectives and, in that case, maybe you want to focus your attendee list on being more business centric people, but still include a couple technical people or a marketer or something like that. Try to choose based on personalities and roles and skill sets who you're going to bring into the session. Don't necessarily constrain it to just that. You can let other people get a chance to talk about other things because you never know where the greatest idea or the greatest piece of feedback is going to come from. It could come from somebody who's in a completely different area of the company. Leave it open a little bit, but make your selections in an informed way based on what it is you're trying to get out of a session.
Adam Churchill: Aaron, what do you think?
Aaron: I definitely agree. I've found it so helpful sometimes to schedule my critique with individuals to whom the subject matter is super relevant, people that will actually be using, maybe, what we're building, if it's an internal type of admin tool or something used by a certain department of the company. I make sure to bring them into those sessions so that they can really help shape the tool they're going to use. I think each of those people, the business analyst, the developer, has a specific skill set that makes them good at what they do. It's a way of thinking. It's an approach to what they do. Having that variety in a critique session, even if they're not just talking specifically to their area of expertise but starting to apply the way they think towards other areas as you move through the critique process, can provide great perspective and can be extremely helpful.
Adam Churchill: Aaron, a question that came up during the seminar, I thought this one was really interesting, is do you have a standard spiel that you give at the beginning of a critique to explain how it should be structured? I'm going to add onto that. Can you say a bit about the structure? Should it be structured and what does it look like?
Aaron: Definitely. I don't always get up and give a 10-minute presentation to start the process, but what I do, and what we've recommended people do, is, in a more kick-off setting, send out the materials ahead of time and explain what's going to be critiqued and what the goals are. Not necessarily how you're going to reach those goals, but you start setting up that framework ahead of time. As for the rules of critique, which Adam discussed in his overview in the very beginning, bring those rules with you. If you have to post them on the wall, really do a little bit of homework and extra legwork in the beginning to inform everybody about how the structure's going to be set up. That way, when you do get in, you can refer to your previous correspondence with them. I think structure is really huge, because you get a group of people in a room around a product. A lot of them have ideas. We're all brilliant, so we're all going to have something to say. If you don't have the right structure, it's going to be really difficult at times to harness the information you need to help refine the product and make it better. And so, there's a lot of different tools, a lot of different little techniques, whether it be using something like a round-robin technique, where you go around the room and ask each person for their specific insights, or you do something where there's a quota, where you ask, "OK, we're all going to take a look at this, and I want to hear two things you have concerns about, where this isn't meeting goals, but I'd also like to hear one thing, one area where you think it is meeting goals." 
The idea, and I picture it this way for me, is that you set up this template, this framework, that you are dropping everyone into, and you're facilitating and getting the information you need from them, because not everyone is adept at being in a critique process or giving feedback in a constructive manner. So the more framework you set up, the safer a place it becomes for this type of dialogue to happen. And so, in the beginning, I do try to review the goals, the purpose of the session, what we're going to try to accomplish, the rules, and the format we're going to use, whether that's the round-robin format or another format. And then when we're done, a little bit of closing around the same thing, letting them know that I'll be following up with an email and that, as needed, I'll be following up with individuals one-on-one. It really helps to have a good structure in place. It just helps things go smoothly. And it helps people feel a little bit more comfortable getting used to the critique process.
Adam Churchill: Adam, talk a bit about how the delivery of usability results relates to critique.
Adam Connor: I've seen it go two ways. I've seen critique serve as a lens for construction of a usability study. So, in a critique, you might identify areas of concern, things that you might think might cause usability issues. When you go to construct the script for the study, decide what areas of the design you're going to focus the study on. You might focus more on those areas that you've called out. So it can help inform a study, and then the results can be used to confirm or eliminate suspicions coming out of a critique. The other side is sometimes the usability study will show problem areas but won't necessarily be able to draw conclusions as to why a problem area is a problem area. And that can be addressed in a critique. So you can use the results themselves to inform what you want to discuss in a critique. So you find out that something is causing an issue, people aren't understanding something, they aren't getting a concept out of the design. So, you spend some time in the critique looking at, "OK, these were the goals we had for this section, and these were the principles we were adhering to. This is the feedback we heard in the usability study. Why isn't this working?" And you can critique that way.
Adam Churchill: Aaron, when you have critique sessions ongoing, do you have the same attendees? Are they consistent, or are you introducing new people during the process?
Aaron: I believe it's a little bit of both. It's always good to have a variety, and it's always good to get new perspective. Also, with critique being so centered in communication and this dialogue, it becomes very important to find people who fit into that role well. You may find out through critique sessions that some people aren't really cut out for this, or they're just not that interested in it, so they bring a level of unproductiveness to the process, and it's good to switch those people out. Some people may really enjoy it, too. They really may like the conversation, and talking about the product, and feeling like they're putting their mark, their stamp, on it, and they're really engaging, so I would continue to use those people. Like we were just saying a little while ago, sometimes subject matter can determine new people to invite, but I think that it's OK to have the same people. Not all the time, you should switch it up, but it really is going to come down to who's participating. If you're dealing with difficult people, you usually don't want to invite them over and over again, because it will cause difficulty every time, and that can really derail the whole process, but I think it is OK to bring the same people. There is a value to someone who's been a part of the product building process over time, when the conversation is fresh in their head, so I do think there is a good point to that. I think it's fun to have variety too, and that's quite all right.
Adam Churchill: Adam, what's happened in the successful critiques you've observed?
Adam Connor: Well, I think a lot of who you invite over time also has to do with the element of collaboration that you want to keep within the project. If there are key people you have to collaborate with because you're going to need them in order to make this thing work. You don't want to suddenly cut them out entirely from critiques, unless you're going to replace that with some other way of talking to them, and getting their feedback, and working with them. You want to make sure when you stop using people that it's not something that's going to severely inhibit your ability to collaborate as a group. To what Aaron said, just choose who you bring in based on what the goals are for the session. Who's the right subject matter expert to talk about it? You can have smaller critiques and then larger critiques. If you want to do an impromptu critique because you want to look at this small little feature, and it matters to a specific set of subject matter experts, you can do that, and then have a slightly broader critique with a few more people in it later on that maybe talks about that section a little bit more, but also covers some other things. Just think about that element of collaboration, and, "How do you keep that going? How do you keep people feeling like they're informed, they're involved, they're providing good information and contributing in a productive manner," and I think you'll be pretty good.
Adam Churchill: Well, gentlemen, that was awesome. Thanks for taking some time out of your days to circle back with me on these questions. My understanding is we're going to be lucky to see both of you this fall at UI17 in Boston.
Adam Connor: Yeah we're looking forward to it.
Aaron: Yeah, very excited.
Adam Churchill: All right, gentlemen, thank you. And for those listening in, thanks for your time and for your support of the UIE virtual seminar program. Remember, you can get all the details on upcoming seminars on the UIE website. Goodbye for now.