Nonprofit Program Evaluation Made Simple with Chari Smith

It's March Madness, people. No, not the sports ball March Madness. It's my 13th anniversary of being in business. And I have a goal of hitting 100,000 podcast downloads. Here are three ways to help. One, download one or more of your favorite episodes, including setting your fundraising mindset with Rhea Wong, ethical storytelling with Caliopy Glaros, what the best fundraisers do differently with Sabrina Walker Hernandez, and my special series on what's next in social media for nonprofits, just to name a few. Number two, share an episode with a friend or a colleague. You can go to pod.link/nonprofitnation to find descriptions and links to all the episodes. Number three, take a screenshot of the podcast and share with your network. Be sure to tag me so I can find it and share it out also. I truly appreciate all of you, your time, your attention, and your passion to make the world a better place. Now let's get to today's episode. Hello and welcome to Nonprofit Nation. I'm your host Julia Campbell, and I'm going to sit down with nonprofit industry experts, fundraisers, marketers, and everyone in between to get real and discuss what it takes to build that movement that you've been dreaming of. I created the Nonprofit Nation podcast to share practical wisdom and strategies to help you confidently find your voice, definitively grow your audience, and effectively build your movement. If you're a nonprofit newbie or an experienced professional who's looking to get more visibility, reach more people, and create even more impact, then you're in the right place. Let's get started. Hi everyone and welcome back to another episode of Nonprofit Nation. I'm just really excited that you've joined me and my guest today. And a question for you: are you overwhelmed by the thought of conducting a program evaluation for your nonprofit? Do you not know how to structure it? 
Maybe you don't even know where to start, and there are many ways to do program evaluation, which makes it difficult to know which model is best or which format to follow. But help is here. My guest this week is Chari Smith, founder of Evaluation Into Action. Evaluation Into Action helps nonprofit professionals create realistic and meaningful program evaluation processes, and Chari believes evaluation should be accessible, practical, and usable. She's on the podcast today to discuss her book, Nonprofit Program Evaluation Made Simple: Get Your Data, Show Your Impact, Improve Your Programs. I love that title. And the book outlines a clear approach, like a blueprint, filled with real world stories as well as examples of evaluation plans, surveys, and reports. So welcome, Chari. Thank you for having me here today, Julia. Yes, I'm excited. And I want to mention that this is being released on March 15th. There's an upcoming event that you're coordinating, Survey Design Made Simple. It's going to be on Tuesday, April 18th, 9 to 11 a.m. Pacific time. Registration information is in the show notes. So make sure you click on that link. And you can use the discount code getyourdata. That's all one word, no spaces, getyourdata, for some money off that registration. So thank you for that. That's for Nonprofit Nation listeners. So I appreciate that, Chari. Of course. Happy to. Yes. And where did you get your start? We usually start with your journey into nonprofit work, and then how you started to focus on program evaluation. Yeah, that's a great question. So my journey into program evaluation started here in Portland, Oregon in 2001. I landed a job at the Northwest Regional Educational Laboratory. That is a mouthful. A few years ago, they rebranded as Education Northwest. 
And there I learned about gathering data through a specific framework to help these programs understand what is going well and where improvements might be needed, so that they are doing data-driven decision making. And I just fell in love with it. I just love the whole process. And then in 2005, I broke out on my own and started Evaluation Into Action. And that gives me the opportunity to work with a range of organizations. At Education Northwest, as the name implies, it was all education programs. So I have the great honor of working with just so many different organizations today. And you wrote this book, which I understand can be incredibly challenging, but also incredibly rewarding. So yeah, talk about your book: who is it for and why did you write it? Well, my book, Nonprofit Program Evaluation Made Simple. I wasn't planning to write a book. I don't know how many people out there are like, yeah, I'm going to write a book. Okay, I wasn't planning on it. I do a lot of different workshops at different kinds of conferences and for organizations. And every time I did a workshop, I would have someone come up to me and say, hey, do you have a book? And I was always like, no, I do not have a book. And so one time in 2017, I was presenting on building a culture of evaluation at the Oregon Program Evaluators Network, also referred to as OPEN because it's easier to say. Anyway, and so after I presented there, someone came up to me and said, do you have a book? And without a thought, what tumbled out of my mouth was, not yet. And I thought, wait, wait a minute. Was that the universe, or was I manifesting it? And this little spark started of like, yeah, you should just write a book. And it's really primarily for nonprofit organizations, nonprofit professionals. But it can also be for evaluators who are looking for a different way to approach program evaluation and to understand how to do program evaluation collaboratively with nonprofit organizations. 
So that was kind of the start of it. And what I get really excited about with the book is there is a companion website. And in the book, there's the web address, as well as a passcode. And on that companion website are real world evaluation plans, reports, templates, and tools. And I update it on a regular basis with the generous permission of the organizations that I work with. That's incredible. So to not just have the book that's going to give people all sorts of fantastic ideas and how-tos, but then actually having the templates and the cheat sheets and those kinds of documents available to them, I think that's wonderful that you did that. That's fantastic. I learn by example. Like, you can teach something to me all you want, but actually show me an exact real world example. It would be too long and too cumbersome to put an entire evaluation plan in a physical book. So on the companion website, there are full evaluation plans. And you can kind of see the structure that I talk about in the book and how it's applied and shows up differently and is customized for different organizations. And I'm sure in the research for this book and also just in your work every day, you come across some maybe common pitfalls, some common challenges. Can you talk about some of these commonalities when nonprofits are planning to do a program evaluation? Yes, I can. So I have a big grin because I get really excited about talking about some of these common pitfalls and the easy fixes to avoid them. And so there are a number of them. I'm going to talk about one in particular that I probably see most often. And that is: be realistic about what you can collect. I cannot tell you how many organizations I've met with where they hang their head in shame and they admit to me, well, we did a survey and now we found no one has the time or the skills to analyze the data. 
So the surveys are just in the filing cabinet or they're on the network or they're in our SurveyMonkey account. Like no one has the time or the skills to analyze the data, as well as synthesize it into a report so the data can be used. So the data are just sitting there. Participants spent their time completing the survey and no one's using the data, which breaks my heart. So this is such an easy fix. This is where you bring in that really organized person to be at the helm of the program evaluation planning process. Someone with that project management kind of gene in them. Because in the beginning, when you're doing a survey, one of the questions needs to be: okay, if we're doing a survey, at the end of it, who's going to analyze the data? Who's going to synthesize it into a report? How are we going to share out the findings with the participants and the community at large? How are we going to use the data? So answering those questions in the beginning, when you're also creating the survey itself, will ensure that not only do you gather the data, but then you actually are able to analyze it, use it, and communicate it. Having the end in mind is so important, especially when I teach marketing plans and digital marketing. And it's exactly like you said, there's just so much that happens. And oh, we have to set up all these platforms, and oh, we have to create all this content, and we have to register all these URLs, and there's so much franticness around the platforms and the tools, but very little thinking about the why. Why are we doing this? What do we hope to accomplish? How are we going to use these channels? So thinking all of those questions through, I would imagine, is sort of the key pillar of any program evaluation. Absolutely. Project management and planning is key, right? You have to think about it with the end in mind. I love that because I think that's just really important to avoid that really common pitfall. And also, I love that you mentioned the people component. 
You know, who's actually going to do this work? This sounds great, or maybe a funder requires it, or maybe a board member asked for it, but who's actually going to do it? So that leads me to my second question, because it's a team effort. You can't just be the development director or the grant writer or the marketing person or the executive director and just sort of create this in a vacuum. So how can nonprofits build this buy-in with their teams and address kind of any staff resistance or fears that they might have? Oh, this is a big one. This is what got me so excited about writing the book to begin with, because I felt like this was a missing link. People were jumping into creating a plan or doing surveys. They were jumping into those pieces. It's like building a house before you've created the foundation. And building that buy-in and addressing staff resistance or fears helps to build that foundation, so that you have a solid house that you're building on top of that, right? Program evaluation at its core is a learning opportunity. And oftentimes, people are doing program evaluation because funders require it, right? I'm sure that you've seen this too. Like, we have to get these data because the funder is requiring a report in two months or whatever it is. So there's, you know, there's panic with it. There's resistance because, oh my gosh, I'm already so busy doing all of these different things. I don't have time to gather the data. Now, when people feel mandated because it's required by funders, they show up differently, right? They may also have a fear of like, what if the data show that we're not meeting our goals? You know, will we lose the funding? Will I lose my job? Will the program be discontinued? All of this fear, whether it's conscious or not, I think feeds into how people show up and do program evaluation. If we flip that script, if people change the narrative to, don't we need data to know if we are making the difference we intend to make? 
Then they put themselves, as the nonprofit, in the driver's seat of determining what kinds of data they need to collect, rather than it being mandated by the funder. Oh, I will say you still have to gather the data required by funders to, you know, satisfy those report requirements. But there's no reason why you can't have a conversation with a funder once you have a plan in place that works for you as the nonprofit and say, this is what we're already doing. Can we provide you these data? Will that fulfill the requirements that you have? Julia, I'm not going to guarantee that it'll work 100% of the time. But my experience has been, when I work with organizations and they have that conversation with the funder, the funder is just super happy that there's a program evaluation plan in place and they're like, that looks great. I'm not going to guarantee it, but that's been my professional experience so far. Having that dialogue is really important, right? About why are we doing program evaluation work? Like, what is it all about? If it feels like it's mandated, there's going to be resistance. If it feels like it will help us understand whether we are doing the best possible job at serving the people we want to be serving, people show up differently. It's a mindset. So once that mindset shifts for people, then we make sure everybody is on the program evaluation train before it leaves the station. Because what I saw happen early in my career is that only some people would be on the train, or maybe someone higher up would require everybody to be on the train, and that doesn't work. Don't do that. But once everybody is on that train and it leaves the station, everything that follows, gathering data, those pitfalls that we were talking about, those don't occur, because there's buy-in to doing it. So then you're successful at measuring what matters. 
The mindset shift is incredibly important, and I'm thinking about the mindset shift that often happens, has to happen, in fundraising, where fundraisers, to be effective, really need to move from this mindset of, oh, I'm bothering donors, or donors are fatigued, or donors don't want to hear from me, to: I'm providing them with an opportunity to make a meaningful gift and to have agency and to be involved in something that they're passionate about. So this mindset shift you're talking about around evaluation, rather than, oh, it's on the to-do list, or it's on our plate, or we have to do it and it's a slog, to: let's see how this can actually improve our services, or maybe shed light on some things that are happening that we are excited about, and then we can implement more of the things that are working and maybe get rid of some things that aren't working. So that mindset piece is just so important. Because I know people, when they hear this episode, are going to want to jump right into the tactics, like what tools should we use for surveys and how do we get more people to respond to surveys, but I want to talk about measurable outcomes. Now, I remember when I was a development director and I worked at a domestic violence shelter and we got a grant from the United Way, and it was when United Way was shifting over to impact and logic models, and we could no longer just rely on outputs. We had to create measurable outcomes and it was so challenging. So for nonprofits struggling with this, how can we start to create these measurable outcomes? 
Well, that is a great question, because the key is that it has to be a collaborative process. And I joke around when I do workshops and whatnot that collaboration is probably the word I say most frequently, because collaboration is key for a successful program evaluation process. And it starts with making sure everybody who collects, uses, reports, or touches the data in any way is in the room to define those measurable outcomes together. And this is where you can make sure the data being collected are realistic to collect, because program staff will raise their hand and say, no, no, I know we can't collect that. Let's maybe compromise and talk about collecting these other things, or whatever they might be. So creating measurable outcomes collaboratively really ensures that when grant writers need to put those measurable outcomes into a grant proposal, they know with confidence program staff are gathering those data, right? Because too often I've heard the story that grant writers do that in a silo and they put in outcomes to secure the funds, and the program staff may or may not be already gathering those data. So that can create some tension and some report writing at 2 a.m. Easy fix, easy fix: you have to collaborate. I want to share an example with you. I work with Project Lemonade, and they have an internship program for foster youth aging out of the system. So we brought together staff, participants, and partners to define measurable outcomes. That is: what exactly do you expect to have changed as a result of the program activities? So together, talking it through, we created seven measurable outcomes. And I just want to read one of them for the listeners today, and that is: to improve life skills such as communication, social interaction, and working in a team. 
By having that very specific measurable outcome, development staff now know what to put into grant proposals, and program staff are gathering data around that particular outcome that can be reported out. So everybody's literally on the same page about what they expect to have changed. I will also say, like you were saying earlier about impact models and logic models, measurable outcomes are a cornerstone of whichever model you choose to do. You have to have those measurable outcomes in there, right? Because an impact model or a logic model is basically a visual summary of what your program does and the change it's expected to make. Measurable outcomes are a part of that story. They need to be in there. But just to drive the point home, it has to be done collaboratively. Do not assign it to one person to go off into a cubicle and create it by themselves. It has to be a collaborative process. What about specifics in these outcomes? Because I found that when I was writing grant applications and working with some funders, they might require you to put numbers in. We will improve these job skills by 50%, something like that. So how do you calculate those numbers, or do you recommend leaving them out if you don't need to put them in? Well, I just refer to those as targets, right? And yes, you're right. Some funders do require that. And I feel like it's somewhat subjective. What do you feel is realistic to expect to occur within the program that you're doing, right? Are you doing this program three times? If you're doing it three times over two months in, say, classrooms, how much change can you truly expect? So I think this is an opportunity sometimes to have that conversation, potentially have that conversation with the funder on expectations. 
But in terms of how to measure it, then yeah, you need to gather some baseline data to understand where people feel they are initially, perhaps in improving life skills, for example, and then measure that over time. So you could report out the percentage that have changed to see if that target has been met. Okay, that is helpful, because I imagine in these evaluations, and I know you talk about this in the book, there are quantitative and qualitative outcomes that you can measure and talk about. And I usually recommend putting a story in, if there's a narrative place in the report where you can actually add that kind of context, that story, beyond the numbers and within the character count. Amongst all the numbers, I think it helps tell a better, fuller story of what the organization's impact is. Yeah, I 100% agree. I think you have to have the numbers and the stories to be able to talk about progress towards a particular outcome, right? So when you report it out, you restate the outcome, and then you share the numbers and the stories that align to that outcome to show to what degree progress was made. And that's how the alignment works, right? So you create the measurable outcome, you gather data that align to the measurable outcome, and then you report out the data, aligning back to or pointing to that measurable outcome. So there's consistency throughout. That's important. And one of the key components of a program evaluation is the survey or surveys. I know there are a lot of myths and misconceptions and pitfalls and challenges in creating a survey. So what goes into a useful survey, so that we can measure what matters? Well, that's where you build on those measurable outcomes, right? I think that is an excellent point because it leads perfectly into this idea of alignment. You have to have alignment between your measurable outcome and your survey items. 
Julia, one of the most common questions I get is, Chari, we need to do a survey but we're not sure what to ask. And I say, well, what do you expect to have changed? You have to define the impact you expect to see occur, then you can measure it. If you just create a survey by, you know, let's just brainstorm what we want to ask, then it's not going to align to what you expect to have changed. So it's harder then to tell your story, right? Once you have the outcome, it provides kind of the plot line for your story, right? So I'm going to talk through another example. I work with Northwest Real Estate Capital Corporation, and they manage a number of affordable housing communities. In 14 of those communities, they have a resident services program. So we set up a program evaluation to understand what is going well and where improvements might be needed, right? So collaboratively, and I don't know if anybody's keeping track, that might be the fifth time I've said collaboration, but collaboratively, we defined five measurable outcomes to meet everybody's data needs, right? So that all staff are getting the data that they need. One of those outcomes is around building community. And I'm just going to read it to you, and then I'm going to read off one of the survey items, because I want everybody to hear the alignment. When you create your outcome and you're creating your surveys, you should be creating your survey items based on what's in your outcomes, right? So one of their outcomes is to improve socialization within the community, which will lead to reduced feelings of stress and isolation. So again, all about alignment, we did a lot of different survey items to measure that particular outcome. But one of them, that residents completed on the survey, is: because of the resident services program, my overall stress level is... and they have four choices: better, the same, worse, I don't know. So do you hear that alignment? 
I'm hoping everybody listening is nodding. Yes, I hear the alignment. So right there is the alignment. Yes, it's not a throwaway survey question. It actually leads into the measurable outcome. So once you have the data, right, you have the number, in this case, it's a quantitative piece of data. So then you can report back out X number of people or X percentage of residents said, and whatever they had to say about feeling like their stress level has gotten better, stayed the same, or gotten worse because of the resident services program. You're asking them to comment on: did this change because of the program, right? And then you're reporting out those data. So again, we asked more survey items than just that, but we were able to then speak about building community and what that looks like. And so it's really empowering to have those data to understand what is really going well and where improvements may be needed. Like you really uncover things you would have otherwise not known. Another thing with survey design, so you're measuring what matters: go through your survey, and if there are any questions you're asking just because you feel like you should, take them off your survey. Only ask questions where you will use the data in some capacity, either to report out to funders, for program planning, or for some other purpose. A lot of times I feel people ask for data because they feel like they should. Really, they should ask: will you use it? Go through with the lens of, will you use it? If the answer is no, don't ask it, right? That's how we measure what matters. So how can we get our survey response rates up? I know this is something that people do struggle with. I love that. First of all, I want to comment on the piece of, will you use this? Because I work with clients on donor surveys or constituent stakeholder surveys around, what would you like to hear from us? What kind of stories, marketing content, where are you on social media, that kind of thing? 
And I always say, do not ask a question unless you plan on using that data. Because it takes away time and it probably decreases the survey completion rate if you just have filler questions. So short and sweet, I always agree with that. So what is your secret to increasing survey response rates? Communication before, during, and after. So ideally, before, when someone first signs on to your program, if you do an agreement or there's a registration form, you can include a little paragraph: by participating in our program, you agree to, or will be asked to, complete surveys, participate in focus groups, whatever your data collection methods are. Basically, you're setting an expectation when they first become a part of your program. So you're communicating beforehand: hey, we are going to be asking you to participate in data collection activities, and your input is really valuable to us. Please plan on participating when asked, right? So you're setting an expectation. Then when you actually have a survey that you're doing, maybe a month before you do the survey, it goes out in the newsletter, maybe on social, wherever your audience is. You want to let them know: hey, in one month, we're going to be sending out a link to a survey. The purpose of the survey is to understand what you think about our program. We can only improve if we get feedback from you. So then you're setting it up, right? Then once it goes out, I typically will leave it open for at least 10 days. So while it's open, there are reminders, right? Maybe three or four days in. That's a good benchmark, 10 days? Okay. That's generally what I do. It can vary. Sometimes it's longer. Sometimes it could be as much as a month. I work with organizations on what's going to make the most sense for their participants, right? But 10 days is generally what I do. So you send it out and then you send a reminder. Thanks if you've already completed the survey; if not, just a reminder. 
It's due by, and remind them of the date, remind them why their voice is important. And also let them know you're going to be sharing back the results. So for everybody listening right now, raise your hand if you've ever done a survey and you never heard about the results of the survey you completed. Almost every time. Absolutely. I bet you are all raising your hand right now, right? Or about what's going to be done with the survey responses? Right. So just like before, when we talked about some common pitfalls and being realistic, the same is true here. You want to have an intentional plan. It doesn't have to be a robust plan, just an intentional plan on how you're going to share out the survey results. So during the survey, you're reminding them about it. And then after the survey closes, one last email to everybody: thanks to everybody that completed the survey. Our response rate was whatever it was. Look for a summary of survey results in the next month, or whenever it's going to be. And then share that survey summary, right? A survey summary can be very high level, very simple. Here are the five things we learned, here are the five things we found were going well. Here are the action steps we're going to take based on the data. We look forward to you continuing to provide your feedback in the future. Wow. And that really leads into my next question, which is reporting out this data. So we've spent a lot of time and we've been really intentional creating these measurable outcomes, crafting a survey or crafting some kind of data collection mechanism. We've been collecting the data and collecting these stories. So in your blog and in your book, you talk about visual communication being important, and I totally get that. No one wants to see a bunch of numbers on a chart, and you talk about using data visualization to create compelling reports. So what are some ways that small nonprofits can do this kind of on a shoestring? 
Well, you said it, right? Gone are the days of just text and tables, thank goodness, because those were some long reports. We want interesting graphics, and it doesn't have to be really fancy, to be honest with you. If you are on a shoestring budget, you can just change the font size, change the color, put it in your brand colors. So perhaps when you're sharing out, you know, 87% said this, your 87% is larger. Maybe it's a 16 point font and it's green, if that's part of your brand colors, right? With a statement that also supports that 87%, so you have the numbers and the stories and outcomes. I do have on my website, under case studies, some examples. You don't have to have the book to access my website. You can just go to my website, go to the case studies page, and there are examples you can look at, because again, I think learning happens by example. And what you'll see there is, like I was saying, the usage of different font sizes, different colors, photographs, different graphics. So you are visually sharing the results with people. It's so true that a picture is worth a thousand words. I'm not a graphic designer, and you don't have to be a graphic designer to do this. You just have to think about, visually, how can you communicate the results? I mean, you're still going to have words on the page, but there are opportunities. And if you need support, yes, I talk about it in my book, but one of the great data visualization gurus is Ann Emery, and she has a blog with so much great information. I think she even has classes and academies and so on. And she does a wonderful job of breaking down, like, here's how you may have shared demographics in the past, and here's a more interesting way to share your demographic data in the report. So she does a lot of wonderful work of teaching by example. 
So again, I do talk about it in the book, but if someone's really looking for some more information, I do recommend her as a resource. That's fantastic. That's really incredible. And I know, just to kind of shift gears here a little bit, that you are incredibly creative. And we've talked about writing a book, which is one way to be creative, but you write songs, musicals, and plays. So how do you keep up this creativity? Tell me about your creative side and some of the things that you do. Yes, I have been creating, and particularly writing songs on the piano, most of my life. My son just started taking piano lessons. Then you'll love this book. Okay, so I was a piano teacher a million years ago, like a couple careers ago, because I started out wanting to play piano, wanting to be a rock star. Yeah, you said you went to Berklee. I went for it. I started out at Berklee College of Music. That's not where I ended, but yes, my plan was to be a rock star. You can all see how that went. But I do love writing. My most recent creative adventure was in March of 2020, when COVID first hit. My daughter is an artist. She was in high school at the time. And we did a children's picture book called The Piano. And it's about the friendship that evolves between a musician and her piano and how it evolves over time as she gets older. And it's told from the perspective of the piano. Oh my gosh, I need this for my son now. I wrote the original story actually in '98, before my daughter was born. But my daughter's always been an artist. So we just busted it out. We got very focused, since no one knew what was going to be happening in spring of 2020. It was such an uncertain, scary time. We just channeled that energy into a creative project. We got very lucky and found a publisher. Black Rose Writing published it almost a year ago, February 3rd, 2022. And just the best part of that creative project was creating it with my daughter, Elle. 
Because, you know, she's a freshman in college now. And, you know, when your baby's leaving you, it's a little bit hard. But yes, we have this book forever and ever that we created together, and just our Sunday meetings and talking through the process. It brought joy into a time that was extremely uncertain. So yeah, that is something that we did. So thanks for asking about that. Do you want to talk about what your dream is? Because you said it to me, you wrote it down. Oh, right. So my dream. It's your dream musical. And I love this. I love musicals. And I think this is amazing. Okay. So anybody listening that maybe has some funding dollars around, I want to collaborate around this. My dream is to get a gig writing a musical about a day in the life of a nonprofit. Because I love writing musicals. It's just so much fun. And I think that being a part of the nonprofit sector for, you know, a couple decades now, I could in my head already start, I'm sure you could too, right? Start to write what that would look like. I also think it could be really fun at a nonprofit conference, instead of a keynote, to have a mini musical about a day in the life of a nonprofit. I think that's a fantastic idea. I love that idea. Do you follow Vu Le? So Vu Le's latest thing on Instagram, because, you know, as we're recording this, it's right around Valentine's Day, is to do reels about, like, romantic, bodice-ripping novels, but about nonprofit professionals. Oh, I haven't seen that. I'll have to check that out. Yeah, they're hilarious. And the ASMR for nonprofit professionals. So I think Vu Le might be a great collaborator for you. Okay, Vu Le, if you're listening, let's do it. Let's write a musical. Yes. Okay. Awesome. And also I wanted to ask you about your upcoming event. I talked about it a little bit, but could you just give us the who, what, when, where, and how people can register? You bet.
So it's Survey Design Made Simple, on Tuesday, April 18th, from 9 to 11 a.m. Pacific time. I am so excited about this because it's going to be very interactive. We're going to dive into what to ask and how to ask it. I know we already talked today about increasing your response rate, but we're going to get deeper into all of these different topics. So at the end of the two hours, people will have concrete ways to design their surveys with confidence, so they're truly measuring what matters. And I know you have it in the show notes. You're going to have the link. I have it in the show notes. And yeah, just make sure to use the code, get your data, all one word, all lowercase. Okay. And where can people find out more about you, the book, and some of your services? You bet. So my website's the best place: evaluationintoaction.com. There's a lot of information on there about my services and about different resources. There's a webpage dedicated to my book, so I invite you to check that out. And I hope you enjoy your program evaluation learning journey. Yes. Well, thanks, Chari. And I will be sure to update all of my listeners on the evolution of this nonprofit day-in-the-life musical. Oh my gosh. It would be, I'm telling you, if someone's listening that coordinates conferences, reach out. We're going to make it happen. All right. Thank you so much for being here. Well, thanks for having me, Julia. It was really wonderful. Well, hey there. I wanted to say thank you for tuning into my show and for listening all the way to the end. If you really enjoyed today's conversation, make sure to subscribe to the show in your favorite podcast app, and you'll get new episodes downloaded as soon as they come out. I would love it if you left me a rating or a review, because this tells other people that my podcast is worth listening to. And then my guests and I can reach even more earbuds and create even more impact. So that's pretty much it.
I'll be back soon with a brand new episode. But until then, you can find me on Instagram at @juliacampbell77. Keep changing the world, you nonprofit unicorn. You're welcome. Thank you. Bye.