Transcript: Innovate on Demand, Episode 2: Human-Centred Impact Evaluation
Todd: I'm Todd Lyons.
Natalie: I'm Natalie Crandall.
Valeria: I'm Valeria Sosa.
Catherine: I'm Catherine Charbonneau.
Jordana: I'm Jordana Globerman.
Todd: And this is the Innovate On Demand podcast.
What's an Innovation Lab? What is Rapid Impact Assessment, and how is it different from Iteration? And how can acknowledging failure increase connectedness to the people affected, and bring a better outcome? More honest experiences from the field, on this edition of Innovate On Demand.
Valeria: Welcome! Welcome, Catherine. Welcome, Jordana. Perhaps you can tell us a little bit about yourselves.
Catherine: This is Catherine. I work in the Innovation Lab at ESDC -- Employment and Social Development Canada. Basically, I'm the manager of design projects. These are projects where anybody from the department comes to the Innovation Lab, and we help them using different human-centred methods, mixed together and tailored to their needs. And we bring the perspective of Canadians to those projects. They vary in size, in scope, and in length of time.
Jordana: I also work in the Lab with Cat. I'm a designer and a facilitator. So when we say design, we don't mean standard Adobe computer design -- although there is a little bit of graphic design involved in what we do -- but more strategic design: designing for programs, for delivery of services, and for different innovative solutions to different problems that we see happening around ESDC.
Natalie: Thank you. Welcome. It's very interesting. I keep meeting public servants who tell me that they work in an innovation lab. So tell me a little bit about what that actually means.
Jordana: Yeah, so I think for our Lab, it's in some ways a bit of a misnomer. We call everything "innovation" that is a bit different in the government, or that falls outside the traditional ways of working. But what we are is primarily a Human-Centred Design Lab, which means that we bring the user into everything we do. So we try to create a conduit between the government and actual Canadians.
Catherine: Yeah, and I think, in answering your question, a difference with being an innovation lab is that we have a bit of space. We're given room, in a way, that allows us to re-frame and rethink some problem areas, where a more operational team may not have the space that's required for that. So that's the difference -- the main difference, I guess -- with working in an operational or policy shop versus an innovation lab: just having a bit more space to do those kinds of thinking.
Natalie: It's maybe also some space to try different methodologies. I heard you talk about how you work.
Catherine: Yes, we bring behavioural insights -- we have a behavioural insights team in the Lab. And we also have a lot of skill sets like Jordana's in design thinking, a methodology that really is all about keeping the perspective of the end user at the forefront of any idea. And that's really the main difference with traditional policy development or traditional program design.
Natalie: Coming back to the concept of people first.
Catherine: Yes, exactly.
Valeria: I have to say, so, as a Free Agent, one of my first assignments was at ESDC Innovation Lab, and I have a lot of fond memories from my time there. But I also came back to the Free Agents. And I strongly recommend that every Free Agent should probably do at least one stint in an innovation lab because of everything that you learn when you're in that space. And the ESDC Innovation Lab in particular, they're focused on the human aspect and element and trying to bring it in. It's just so different in comparison to how we work as public servants on a regular basis. So…
Jordana: Yeah, it's kind of a funny idea, too, because you wouldn't think that that would be such a massive, innovative step, just going out and talking to Canadians before you design for them. But it has such a big impact.
Natalie: Maybe you guys could walk us through one of your projects and some of the methodologies and processes you're using to deliver this.
Jordana: Sure, yeah. So Cat and I were recently working on a project for the New Horizons for Seniors program -- an evaluation for that program. As part of this evaluation, we worked on what we call a Rapid Impact Assessment. This in itself is a bit of an innovative methodology; it's not used all the time. But it's much faster than a standard evaluation, and much more iterative. What you do is design an alternative scenario to the current program -- they call this a counterfactual. It's just another scenario, or group of scenarios, that is just as logical, just as possible, just as legal. And then you compare the actual current program against it.
Catherine: The neat thing about it is, oftentimes in the public service, everybody has ideas: we should have designed this this way, it makes no sense. Or in a program space or a policy shop, there's a sense that something is off already. What the evaluation, and doing a Rapid Impact Assessment, gives you is the flexibility to imagine: what if you were to redesign the program, or rethink an area where you really think something is off? You can re-imagine that alternative. And I guess the neat thing, in terms of what we brought to this thinking, is that in the past, when other departments in the government tried a Rapid Impact Assessment, the alternative scenario they used was the absence of the program -- basically, what if the program didn't exist? What we wanted to bring to this process was: what if we actually designed and rethought this program from the perspective of the actual beneficiaries? That's the human-centred part we brought in. And so, with the use of design thinking, and by talking directly with seniors and with front-line organizations, we really started to bring forward some of the irritants, pains, and challenges that they faced with the current design of the program.
Jordana: And so our counterfactuals were very different. They're actually fully different programmatic tweaks, or entirely different programmatic structures. And those were created collaboratively with Canadians. We had a few different workshops, and it was really exciting for the Lab, because we had our biggest external workshop for what we call a focus project -- so, not our full-year project. We had about 80 different people. There were seniors, there were program beneficiaries, there were small businesses, all coming together to talk about the challenges they were facing, what could have served them better, and to work on different programmatic tweaks that might yield a much better program for them in the end. That was really exciting.
Valeria: What's the difference between that and iterating?
Catherine: It's actually a great question, because part of it is that the Rapid Impact Assessment allows you to look backwards -- at the things that are not really working well with the program -- but it also allows us to look forward. If we were to make some changes to the program, they may not need to be radical changes; they could be incremental changes and tweaks. Basically, the Rapid Impact Assessment allows that iterative process to happen with a little more agility, rather than waiting until the end for a summative evaluation. So the Rapid Impact Assessment -- and we think this will be the case with the New Horizons for Seniors program -- will be a line of evidence to support the summative evaluation, but it already gives some insight and some new evidence to the program, so that if they want to make some incremental changes, they have some of that evidence now from having talked directly to users. The one thing I would add, to what Jordana mentioned about the size of the workshop, is that it was very important for us to have multiple perspectives and multiple voices. That's a big tenet of design thinking. We purposefully had in that workshop people who actually hated or didn't like the New Horizons for Seniors program and purposefully would not apply to it. We had applicants who weren't successful as well as applicants who were. We had the technical experts -- those who are internal to the department and know the ins and outs of the operating context. So all of those different kinds of subject matter expertise were brought into this big event of 80-plus people. And I guess it came out of a failure. I wanted Jordana to talk a little bit about that failure.
Jordana: Oh yeah, it did. But there's a really good lesson in this failure, which is that it's important to admit when you've failed at something, or you never get better. So we had held a workshop before that one, with a mix of internal and external folks -- people from government side by side with senior-serving organizations. But part of the issue was that it was a two-day workshop, and on the first day it was just internal stakeholders. So on the second day, you had all the external people arriving, and it was, "Oh, by the way, you guys missed a lot on day one." There was this huge jump in content that they weren't filled in on. There was also a dynamic set up in the room where, coming into this, it seemed like they were a bit shut out. And to add to that, they weren't exceptionally happy about everything they had received from the program. They had things they wanted to get off their chests, they wanted to use this opportunity to share them, and they felt that they didn't really have it, because they were kind of forced into this design-for-a-day to get to an outcome.
And another issue we realized was that there maybe weren't enough conversations with them leading up to the workshop. It would have been helpful to introduce them to the process and to find out more: what did they really want to get out of that day? So, I goofed it.
Catherine: So we picked up on that. We picked up that when they left, they weren't so happy -- the externals, especially. And we didn't want that to continue.
Jordana: No. And so I called all the externals and had feedback calls with them, to see what we could have done better and to learn from it. On one of my calls, I actually talked to the head of the Council on Aging, and explained to her the setup -- explained that we would have loved to have more external guests, but it just didn't work out that way. And so she offered to connect everyone for this massive workshop. That was a huge win for us. But it only came because we followed up with our tails between our legs.
Catherine: And I guess the cool thing here -- what we realized as soon as we were done -- as soon as I hung up the phone, we were high-fiving in our cubicles. We were really excited. But at the end of the day, we don't own the program. We have a client -- Bruce, who works with us in the Evaluation Directorate, is just amazing. We literally ran from our office to his office to say, hey, there's this opportunity, we need to go there. Part of it is that we wanted to check it from a privacy standpoint: can we do this? Is there agility in the organization to actually react to this opportunity? The Lab has this mandate, but at the end of the day, the client is the project authority. So Bruce in Evaluation had to say whether or not he wanted to pursue this. Good for us, we were quite convincing, too.
Jordana: We said it was a very exciting day for government -- us running up and bursting into his office. Bruce!
Valeria: For the record, Jordana's doing a lot of actions to go along with her-- [laughs]
Catherine: And so he just gave us the green light. And we also had the support of our own director in the Innovation Lab, Jeannie. Both Bruce and Jeannie supported this. And then, within a week, we turned around this massive event, and everybody came in. We were also able to bring in, during this event, a few technical experts on the program to answer questions the public might have, in a way that didn't go too deep into the inner workings of government. Because Joe and Jane Canadian don't really care about how the inner workings work, but they do have something to get off their chests that they want to talk about, and having the technical experts there with us gave some credibility to the whole workshop. So it was set up in a way that let us play out all of the frustrations that they have around the program. So--
Jordana: And it gave the externals a really nice opportunity that some of them who were able to attend the first one had wished they'd had: there was about half an hour dedicated just to them asking questions of the program. They never normally have the opportunity to do that. So that was an amazing opportunity for externals to be able to communicate in a one-on-one conversation with government. They were really happy about that.
Catherine: So how we ended up developing the counterfactual through this exercise is -- obviously, we can't just ask them in the field, "Do you like this or don't you?", because that becomes more of an opinion conversation. And we're not in the interest of opinion; we're in the interest of what actually happens, and of surfacing the actual behaviours the program prompts. So what we ended up doing is we created tension cards -- it's kind of like a balancing act; that's what we call it. We played out the elements of the program where choices have to be made. For instance, is the operating structure one hub or multiple hubs? Is there one stream or multiple streams, and how do the streams come in? So we played out all of the different components, put those on cards, and asked the participants to evaluate what the right balance between those two tensions was.
Jordana: And it's almost like a game, too. They had these playing cards, so they could throw them out, or they could trade with each other to get a different one if they didn't like it. And it got them to deal with more nuanced programmatic tensions than they might have been able to engage with otherwise, just because they're not within the program. We had some really interesting output from it.
Catherine: Mm hmm. And so, based on the input we got from the cards, we started getting some really interesting qualitative feedback to support the development of the actual alternative design. We collected the different input from the multiple tables that we had, and then we sat with our internal clients in the Evaluation Directorate, as well as the program, to look at all that input together and start thinking about what that alternative design could be. It all came from what we heard. And we were very excited about this, because it's the first time that a Rapid Impact Assessment actually took that kind of shape.
Natalie: I find it really interesting, because one thing that you're saying is resonating so loudly for me right now. So many times, when I speak to public servants about landing something -- a project, or moving in a new direction that they really felt was innovative -- there's always that moment, which you guys articulated so well. The excitement and the passion that you felt when you went running into his office is still palpable. And it's very interesting to me how there seem to be two things that are part of that magical set of ingredients. One is employees who took the time to look at things from a different perspective and see an avenue forward that wouldn't have been contemplated in another context. And the other is leadership that says, "Yeah, let's do it."
Catherine: Yeah, we were fortunate. And we know the recipe -- some of the conditions for this to have worked. Obviously, it worked with our Evaluation Directorate client, but also with the program. Now, for the folks in the program space, of course, the counterfactual creates an inherent tension. The people in the program space know their program very well. They know what works and what doesn't. It's not the Lab that comes in and tells them what doesn't work; they have a sense already. But they work within bounds of constraints and resources and all kinds of timelines, which the Innovation Lab doesn't necessarily have. So when we support them in exploring what that alternative design may be, there are sometimes tensions around what we keep, because they know the bounds of the program. And then we come in and say, what if we changed that frame of mind? What if we were to change that? And that can be met with anything from "Wow, that's an interesting thought" to some resistance, because maybe it's not the right time -- whatever the reasons are.
Jordana: It's just been done a certain way for such a long time that it's hard for people to really break from that framework.
Catherine: Yeah. So the alternative design allows for, I guess, a pause in the current way of doing things -- to pause and say, what if that would work? What would it look like if we were to do a program that looked this way? The space the evaluation provides, and the space the rapid impact provides, is in a way that moment of being able to imagine, like we were saying…
Jordana: It's like a generative evaluation, which you don't normally get. You're normally only looking backwards, but this is projecting its future potential as well.
Valeria: So I'm also curious about the timing. How does this method compare to a traditional evaluation method? What kind of timelines are we looking at?
Jordana: So this began in November, and it's wrapped -- our part is pretty much done. There's another workshop that's supposed to come out of it, which other consultants will take on. So it's pretty fast for a traditional evaluation.
Catherine: It's interesting you ask this question, Valeria, because even our evaluation client, when we talked to other departments who have attempted the Rapid Impact Assessment -- their alternative is no design at all, so theirs is much, much faster. So our rapid impact is a bit longer than other rapid impact assessments, but it's still faster than waiting for a full summative evaluation. And because of the way we've worked, with the lessons learned, it likely wouldn't take as much time if we were to redo another rapid impact on another program. So it really is gathering this new line of evidence to help with program iteration. Yeah.
Valeria: So, I guess the way you refer to it, so this is a methodology? Where did it come from? Where did it originate from? How common is it?
Jordana: So the Rapid Impact Assessment methodology, I think it's been around for a little while, I'm not exactly sure how it originated, but our approach to it is entirely new. And that's something we've developed in the lab merging that with design thinking and bringing the user perspective into it.
Catherine: Yeah, and there is a Treasury Board guideline around Rapid Impact Assessment, where they highlight the thinking around why to do one. It's relatively new. The details of when it was presented, I don't know -- our client in Evaluation would know more.
Jordana: Like it's been happening for about five years or something.,
Catherine: Yes, I was going to say three or four years, but maybe five. So there was a high-level presentation along the lines of "this is a new methodology evaluation groups should try and attempt." Our Evaluation group really wanted to try this innovative method, and they leveraged the Innovation Lab to support their work.
Jordana: And this was the pilot as well.
Catherine: They didn't know we were going to bring the human-centred approach. It was more a curiosity about what it is you guys can do. And then when they explained the Rapid Impact Assessment to us, we -- kind of like what you were saying, Natalie -- just went, "Oh my gosh, we could totally design from the perspective of the actual beneficiaries. We could totally do this." And we started getting very excited that this could be an interesting new way of doing a Rapid Impact Assessment. So, yeah.
Jordana: And I think they were gung-ho about it. But I think it wasn't until the second, or even the last, workshop that they really saw the value it brought. But as a pilot, it seems to have gone very well, and I think they would like to do it again. So…
Valeria: Any final thoughts before we wrap up?
Jordana: All I would say is, I think another thing about this project that's been interesting for me -- and I should say, I only joined the Lab, I only moved back to Canada, in September, so this was the first main project I saw from beginning to end in the Lab -- is that it seems like a small thing on the surface. It seems like a small thing to do an evaluation with a more user-centred approach. But it's radically different from anything that's been done before. And I think it's a really good example of how innovation doesn't necessarily mean a big overhaul of what we think of as a program or a service; it can be a new approach to something that we've always done, just done a bit differently. So that was what was really impactful for me.
Catherine: And I guess the other thing, from my perspective, is that we set up the conditions to be able to be agile. There was a privacy protocol in place to allow us to do that work with a client. That agility meant that when there's a snowball effect around something really interesting we want to pursue, it's a quick turnaround to actually follow that lead or that initiative. And part of it is, when we started thinking about what the project would look like, we didn't know that we would have an 80-plus-person workshop, or that we would have a completely flopped workshop. Obviously, we didn't plan the flopped workshop, but part of it, I guess, is the thinking behind working in an innovation space. Again, to tie back to your original question: in the iterative process of a project, we do have in mind what we want to achieve out of it, but we don't necessarily know exactly what that will become. And I hope there is appetite and interest for that kind of thinking moving forward. I know it's not always the right timing for that kind of thinking. But if we can do it, it really reaps great benefits, and the user stories are so powerful, and applicable to so many other areas. Yeah.
Valeria: No, this is a great approach. Thank you so much for coming and sharing this with us.
Natalie: Thank you for your final thoughts. That was fantastic.
Catherine: Thank you for having us. It's really fun.
Jordana: Yeah, thank you so much.
Todd: You've been listening to Innovate On Demand, brought to you by the Canada School of Public Service. Our music is by grapes. I'm Todd Lyons, Producer of this series. Thank you for listening.