How to Create Donor Surveys That Improve Your Fundraising
If you don’t regularly survey your target audiences, your fundraising efforts are merely educated guesses. And if your surveys aren’t carefully crafted, you’ll end up receiving misleading data.
But done well, surveys can empower you with real, actionable, and surprising insights into what your current and prospective donors do, think, and need. Smart surveys can turn your fundraising strategy into a fine-tuned, sophisticated engine of revenue.
In this webinar, we examine all of the components of a great donor survey project—including how to build it, deploy it, and analyze the results. Perhaps more importantly, we’ll identify and help you sidestep the countless pitfalls that plague so many surveys.
By the end of this webinar, you will be able to:
- Identify the types of surveys—and when to use each
- Create a survey strategy that gets you the data you actually need
- Analyze the data objectively
Hello and thanks for joining us today for “How to Create Donor Surveys that Improve Your Fundraising.” We’re going to be talking today about how donor surveys can help you learn more about your donors so that you can better engage them. We’ll discuss strategies to get the answers you’re looking for, tactics you can use to build well-crafted surveys, and considerations for analysis.
I want to say right up front: this is not a session for “data nerds.” Rather, it’s a deeper dive into how to execute a meaningful survey for the rest of us, those of us who may not have formal training in research but still need to do the work within our organizations. And this likely goes without saying, but most of you will not be surveying your major donors. You typically have a relationship with them; you might take them out to lunch. That doesn’t mean you can’t ask them some of these questions, it just means that as you move through this session today, know that you are more than likely going to be having one-on-one conversations with those types of donors. This session is more for your smaller individual donors, peer-to-peer donors, things like that.
And everything else I say here applies to most surveys. We are going to talk specifically about donor surveys today, but if you need to run a survey about your brand, your website, your communications, or an event, these lessons will hold true for those as well.
Alright, let’s get started.
Who am I?
So, I am Rachel Clemens, the chief marketing officer at Mighty Citizen. We’re a branding, digital, and marketing agency located in Austin, Texas, and we serve mission-driven organizations. In practice, that means we help nonprofits, associations, government, and education (primarily higher ed) increase their revenue, boost awareness, and better their communities. Typically we do this through branding, fundraising websites, online marketing, analytics, you name it; if it’s communications-related, we probably do it. I feel it’s important to speak on donor surveys because we need to understand the motivations that drive our donors. When we do things like campaigns or websites or any other kind of work, we should really understand what our audiences need from us, what they’re struggling with, and how we can help them and better engage them.
What surveys can tell us
So, there are a lot of things that surveys can tell us. The first is: what motivates our donors to give? Why are they giving to us? How can we better engage with our donors? Who are our donors, not just from a demographic perspective but also from a psychographic one? Why aren’t certain audiences (insert your audience here) giving more? For example: why aren’t younger audiences, or younger alumni, giving more? Why aren’t our peer-to-peer donors giving more? And how good is our stewardship? Are we stewarding our donors in a way that resonates with them?
If I were to walk into your organization and start asking these questions, I’m sure I’m going to get answers, but whether the room agrees on those answers is debatable. Sometimes people have different feelings about these answers. And that’s what we want to avoid: the feelings and gut instincts people have about the answers to these questions. We want to get to the truth underlying them.
Research Kills Opinions
We often say here at Mighty Citizen that research kills opinions. If I walk in and ask those questions in your organization, I may get lots of opinions on the answers, but unless those opinions are grounded in research, we don’t know that they’re actually true. So we want data available to us that allows us to make sure we’re moving in the right direction, and that allows us to push back on initiatives the data tells us just won’t work.
By the end of this webinar today, you’ll be able to:
- Describe the shortcomings and benefits of donor surveys. There are shortcomings - we’re going to talk about those, as well as the benefits.
- You’ll list the six principles of an effective survey. That’s the strategy work we do at the very beginning, before we even craft a survey question. It lets us know what we’re really trying to accomplish with our survey.
- We’ll talk about writing good survey questions. It’s really easy to write bad questions, we want to avoid that, we’ll talk about making sure we write good ones.
- We want to identify the three unique types of surveys that we might need to do with our audiences.
- And we’re going to look at analyzing and interpreting our survey results.
The Shortcomings and Benefits of Donor Surveys
First things first, let’s talk about the shortcomings and benefits of surveys. And you will want to consider both of these before you conduct a survey. Let’s talk first about the case against surveys.
The Case Against Surveys
People tend to estimate. If I give you a scale of one to ten, your reaction in answering is usually, “well, it’s kind of like a seven.” Human nature is to estimate. There’s a lot of “that sounds about right” happening when we fill out surveys, and that adds up over time and volume.
People tend to give inaccurate answers. They will sometimes tell you what they think you want to hear, or they could be swayed by bad questions; we’ll take a look at that in a minute.
If you ask questions about the future, people don’t know how to answer them; we don’t have crystal balls. Humans have been shown to be terrible at predicting the future. So if you ask me, “are you likely to give again in the future?” I can tell you about my intention to give again, but whether that’s actually accurate is hard to say.
Bad questions look like good questions. We have a saying around here that “bad questions don’t smell,” meaning they don’t identify themselves as bad. From a writing perspective, it’s really easy to write bad questions without realizing it, and from a respondent perspective, it’s really easy to answer bad questions without realizing they’re bad. What I mean by “bad” is typically biased, but that’s not always the case. We’ll take a closer look at those today.
It’s hard to reach statistical significance. Statistical significance basically means knowing that you have asked enough people, and gotten a high enough response, to be able to hold your answer up as truth. So statistical significance means truth. It also typically means a lot of money and a lot of time, because you have to make sure you get enough respondents, and you spend a lot of effort to do that. Most of our clients are not in this for statistical significance. They’re not in it for absolute truth; they want to use surveys as an indicator of truth. They want enough research done that they feel strongly about how they can move forward, but typically we are not reaching statistical significance. And I think that’s important to know up front.
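If you want a rough sense of why statistical significance costs so much, the standard margin-of-error formula for a simple random sample gives a feel for the numbers. Here is a minimal sketch in Python (the function names are mine, and it assumes the worst-case 50/50 split at 95% confidence):

```python
import math

def margin_of_error(sample_size: int, confidence_z: float = 1.96) -> float:
    """Worst-case margin of error (p = 0.5) for a simple random sample."""
    return confidence_z * math.sqrt(0.25 / sample_size)

def sample_size_needed(margin: float, confidence_z: float = 1.96) -> int:
    """Respondents needed to hit a target margin of error at 95% confidence."""
    return math.ceil((confidence_z ** 2) * 0.25 / margin ** 2)

# Roughly 385 completed surveys for a +/-5% margin; about 97 for +/-10%
print(sample_size_needed(0.05))  # 385
print(sample_size_needed(0.10))  # 97
```

Notice that halving the margin of error roughly quadruples the respondents you need, which is exactly why most organizations settle for surveys as an indicator of truth rather than proof.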
Keep in mind that surveys are wide but shallow. You can reach a lot of people with surveys, but you can’t go very deep into their feelings about something. Most people just will not fill out a survey long enough to allow that. So typically, you want to marry your surveys with stakeholder interviews, one-on-one interviews, and/or focus groups. When we work with our clients on research, we often do surveys first, so that we get some understanding of the lay of the land, and then come back and do one-on-one interviews with key stakeholders to dig a little deeper: “we’ve been hearing X, Y, and Z; what do you think about that?” So keep in mind that you typically want to marry your surveys with some additional form of qualitative research.
4 Benefits of Donor Surveys (F-A-C-E)
So that’s the case against surveys. There are lots of benefits to surveys as well.
The first is that they’re flexible and can be conducted in lots of different ways: via phone, website, email, potentially social media, and over lots of different time periods.
They also allow your respondents to remain anonymous. This can be great for getting really frank, unbiased feedback. We’ll talk more about that.
They’re really cost effective: typically a low investment in dollars with a potentially high return. You get to learn a lot about your audiences for what can be very little money. They’re also pretty quick to launch and see through. If a survey is open for two to three weeks, start to finish it can be done in a couple of months.
They’re also extensive. No other method provides such a large potential number of responses, so you can get a lot of information really quickly.
So, should you conduct a donor survey?
The answer, I believe, is yes, but only if you’re willing to do the hard work. Here’s what I mean by that: just like Indiana Jones swapping a bag of sand for the golden idol, there’s a great reward on the other side, but you have to take a step of faith, bring a willingness to change, and accept a little risk in not knowing what answers you’re going to get back. Do not undertake a donor survey unless you’re willing to make a fundamental transformation in your nonprofit, because what you learn may have real implications and require changes in your organization. Not just from a fundraising or communications standpoint: say you come to find out that the audience you’re surveying really hates one of your programs.
That might require a fundamental change to that program. So you’ve got to go into a survey knowing that you may learn something you didn’t expect, and that you need to react and make changes based on that new understanding. We often talk about a goodwill jar between you and your donors. When you ask a donor to take a survey, you’re asking them to make a deposit into the goodwill jar. You in turn need to take what they give you and make real change; that is your deposit back into the goodwill jar. There’s no point asking a donor to give if you’re not willing to give in return.
Six Principles of Effective Survey Design
Let’s talk about the Six Principles of Effective Survey Design. You’ve now decided you’re ready and willing to do a survey, so let’s talk about where you start. This is how you set up the strategy for your survey, before you even begin to write a question. I want you to think about these pieces before you get started.
So first of all, a survey must be purposeful. That means you need to establish right up front what your goal is - what are you trying to accomplish?
What’s the number one thing this survey needs to do for you? Now, you may find that you have lots of different goals as part of the survey. I want you to list those in priority order, so that later, when you start to have some conflict between your goals, you can stick with the priority order and let it help you make decisions.
What will you do with the information once you have it? This is really important. A goal can’t be just to learn or to know. I’ll give you an example: if you’re going to ask a question of your audience, in your strategy I want you to write down the question you are asking, and then follow every question with what you’ll do with the information. So “what motivates our donors to give?” might be your question, and “we will use this information to better communicate our mission and value to potential donors” is what you’ll do with it. Have an answer for every question you ask of what you’ll do with that information. If your answer is “because we’re curious,” it’s not a good enough question. Again, think of the goodwill jar. You need to actually do something with the information you’re collecting.
Who has to have buy-in on the survey? I’ll give you an example. I’ve spoken on this a few times with a woman named Julie Shannan, who was formerly at an organization called Girlstart. We were doing a donor survey with them, and I asked, “Ok, who needs to be involved? Does the executive director need to be involved?” And she said, “I don’t think so.” Then we got halfway through the project and the executive director wanted to add a few questions to the survey. That’s fair, it happens all the time, but you want to think about it as early as possible. Get everybody who might need to be involved in the room early in the process, working on those questions together. Sometimes that’s event people, communications, development, programs, your ED, or your executive team. Ask yourself who needs to have buy-in early on.
You may have different surveys for different goals. One goal could be learning about an audience; maybe you just need to learn more about your peer-to-peer audiences. Or you might want to compare audiences. I’ll give you an example: I mentioned Girlstart. We did a donor survey with them, and their goal was to find out why their donors give to them. They had two audiences they really wanted to look at, both donors who give less than $500 a year, over the last two years. So this is how we polled their audience: donors who had given gifts between $1 and $499 in the last two years, not including peer-to-peer donors; and donors who had given $1 to $499 in the last two years, only including peer-to-peer donors. They were comparing the motivations for giving between peer-to-peer and non-peer-to-peer donors under $500 over the last two years. That’s a good example of how you can compare audiences.
A good survey is targeted. Which donors are you going to target? You can slice and dice your donors any number of ways. You might want to look at them by age, by donation amount, by how often they give, how they give - would they give by email, or online giving, or mail. All this to say, you do need to have confidence in your database, so you need to make sure you feel like your database is a good representation of your donors and has solid data in order to do a good donor survey that is segmented.
You may want to target by different relationships. Maybe it’s monthly donors versus one time donors, younger donors versus older donors. There’s lots of different ways to look at targeting your audiences.
You may want to target by different delivery methods. If you have an audience that you know only gives via direct mail, maybe you want to target those people via a direct mail survey. Ok, so there’s lots of different ways that you can survey your audience. Now, I’m going to talk almost exclusively in this session about online surveys, because they are so much easier to do, to administer, and also to analyze. So I’ll be talking a lot today about online surveys, but you can survey via phone or via mail if you feel your audience is going to respond better to that. And again, it kind of comes down to which target audience you’re trying to appeal to.
How many people should I survey? We get this question a lot when we’re talking about segmenting our targets for surveys. Typically it depends on the purpose of your survey. If you are trying to get an idea of the broad appeal of your brand, you might want to survey lots of people. If you’re trying to get a feel for a certain program, you may just want to survey audiences that have engaged with or given to that particular program. We typically say that small and representative is better than big and broad. If you’re trying to drill in on a particular question you need answered, it’s better to segment the audience a little smaller and get their representative voice. One caveat: make sure your audience is big enough that when you survey them, you get enough responses to base some analysis on. For example, if your audience has a hundred people in it, and your response rate is expected to be about 5%, then you’re going to get 5 people to fill out that survey. That’s just not enough data to say, “oh, we have definitive answers to our questions, because 5 people responded.” Surveys are meant to have many responses; you want enough that you can hold the results up and say, “this is representative.”
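That back-of-the-envelope math is worth doing before you launch. A tiny sketch (the function name and the 2,000-person figure are mine, for illustration):

```python
def expected_responses(audience_size: int, response_rate: float) -> int:
    """Rough estimate of completed surveys from a given audience."""
    return round(audience_size * response_rate)

# The example from the talk: 100 people at a 5% response rate is only 5 surveys
print(expected_responses(100, 0.05))   # 5  -- too few to analyze
print(expected_responses(2000, 0.05))  # 100 -- a more workable floor
```

Running this before you pick your segment tells you immediately whether the audience is too small to yield anything representative.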
Surveys should be short. (That’s Napoleon, by the way, for those of you who may not have gotten my witty joke.) Surveys should be short, and if they are short, they produce a higher response rate. For any kind of survey, your response rate is typically between 2 and 5%. With donor surveys we personally seem to see that go higher, probably because these audiences are generally more engaged with us; they care about our organization and our mission. It’s not a surprise for us to see response rates well above 10%.
Ideally your survey takes fewer than five minutes to complete - it should be nice and short, easy to fill out. You want to let your audience know up front how long that survey will take. If it’s going to take 3 minutes to fill out, you know, let them know in the email or in the communication at the front of the survey, “this survey should take about three minutes to complete.” You want to make sure they understand how short and easy it’s going to be.
Ideally your survey has fewer than ten questions, and fewer still if you include a lot of open-ended questions. We’ll take a look at open-ended questions in a minute, but those are the questions where you ask a question and the respondent fills in a response themselves; it’s not the click of a button. They just take longer to complete, and you want to ask as few of them as possible.
Good surveys are delivered well. You need to make sure your survey is intuitive and works on multiple browsers. Work with whoever manages your Google Analytics, have them pull the most popular browsers for your website, and check your survey on those, because those are the browsers your audience is using to engage with you.
Also make sure it is mobile friendly. A lot of us check our email on our phones while we’re out and about. If we check our email on our phones and it has a link to a survey, that survey is also going to be opened on our phones; so make sure that your survey works well, functions, looks good on mobile.
You’re going to send an email about your survey letting people know it’s open and asking them to complete it. When you send a reminder a little later, go ahead and remove the people who have already completed the survey from that list. If you keep everyone on the list, those who have already completed the survey are going to start to doubt whether they actually did: did it go through? Maybe you didn’t get it; maybe they should complete it again. That creates doubt. Don’t do that to them, and don’t do that to your organization or your brand. Clean out and remove people who have already answered your survey.
Surveys are often incentivized. Not always, but incentives can be good because they increase your response rate, by 5 to 20%. What do we mean? Incentives are offering someone something in exchange for completing your survey. Keep in mind that you don’t want your incentive to be too large; it can bias your results. For example, if you tell people you’re going to give away a $500 Amazon gift card, that is a really big lure, and a lot of people might fill out the survey just for the chance to win it. You want people who are maybe lured by the incentive a little bit, but who also just want to help you out by filling out the survey.
We’ve seen our clients do all sorts of things with incentives. Some have given away a $100 gift card, which is a little more reasonable, as part of a drawing: say, five people who fill out the survey each win a $100 gift card. I’ve had clients give away CDs that they had in the warehouse and wanted to get rid of. These were public radio stations, so the CDs were in demand by their audiences, and they mailed one out to every single person who answered the survey. That’s a great incentive, but it does require a lot of fulfillment, so make sure your incentives are easy to deliver. (I’m dating how long ago this was, by the way, because we’re talking about CDs and not streaming.) They had to mail out each of those CDs, so there were costs in the team’s time and in postage. Just make sure you’re thinking about those things.
Keep in mind, too, that if you want to run an anonymous survey, you cannot necessarily have an incentive. If the survey is meant to be anonymous, you won’t know who the respondents are, and therefore you can’t send them an incentive. So don’t dangle an incentive to lure people if you actually intend for it to be an anonymous survey.
Your survey needs to be tested over and over again. You only get one shot at people filling it out, so make sure you have dotted all your Is and crossed all your Ts. Once you’ve got your questions into Survey Monkey or whatever software you’re going to use, read them aloud; you’re more likely to catch grammatical errors when you read out loud.
Test it over and over and over again; I cannot stress this enough. Test it on all those browsers, test it on mobile. And if you have a logic tree, where the question a respondent gets next depends on their answer to an earlier question, make sure you test all the scenarios they may encounter.
Test with people who haven’t seen it before. It’s all well and good to test with other people inside your organization, but make sure you also test with people outside it. We tend to use jargon and phrases that our audiences don’t always know. You may know your young leadership program is called the “Young Leadership Alliance,” but other members of your audience may have no idea that’s what it’s called. So I always like to ask a neighbor, “hey, would you mind reading this survey and letting me know if there’s anything you don’t understand?” They tend to be a good gauge.
How to Write Good Survey Questions
Alright, so we’ve looked at how to set up our strategy, what every good survey needs to be successful, let’s talk about writing good survey questions.
There are two types of survey questions: closed questions and open questions. Closed questions are measurable, and open questions are revealing. Let’s take a closer look at what I mean by that.
Closed Questions are Measurable
Closed questions give answers that your user can select from. Here’s an example: “In which time range were you born?” The options are: “before 1940,” “1940-1959,” “1960-1979,” “1980-1999,” and “2000-now.” This question lumps people into 20-year time frames. What you’ll notice is that I’ve given the user all the options they have to choose from, and I’ve made sure the options are mutually exclusive. If the options were “1940-1960” and “1960-1980,” and I was born in 1960, there would be two correct answers to my question. You have to make sure that doesn’t happen.
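When your closed options are numeric ranges, that overlap-or-gap check is easy to automate before you ship the survey. A minimal sketch in Python (the helper name is mine):

```python
def ranges_are_valid(ranges):
    """Check that consecutive (start, end) options neither overlap nor leave gaps."""
    for (_, prev_end), (next_start, _) in zip(ranges, ranges[1:]):
        if next_start != prev_end + 1:
            return False  # adjacent options overlap or skip some values
    return True

# The birth-year options from the example above
good = [(1940, 1959), (1960, 1979), (1980, 1999)]
bad = [(1940, 1960), (1960, 1980)]  # 1960 appears twice: two "correct" answers
print(ranges_are_valid(good))  # True
print(ranges_are_valid(bad))   # False
```

The same idea applies to donation-amount brackets: "$1-$500" followed by "$500-$1,000" has the same double-answer bug.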
Closed questions provide a list of acceptable responses; they are typically multiple choice, yes or no, checklists, etc. They’re easier and less time-consuming on both ends of the survey. They’re easier for your audience to answer: you give me options, I click one. And they’re easier on the analysis side for you, because tools like Survey Monkey or Google Forms will give you bar charts that show the answers visually as well as numerically. It’s much easier to see those and analyze them.
Because you are providing the list of acceptable responses, be sure not to bias your audience. So, be sure that you are not leading them in a certain direction. And I’ll show you more of that in a little bit.
So here’s an example of a survey from “This American Life,” a podcast and radio show which I adore. It was an eight-page survey; we’ll talk a little bit about that in a minute. A bit over the top, but clearly the audience was people who really love “This American Life.” These are examples of their closed questions: “what is your age?” (a dropdown), “what is your gender?” and “what is the highest level of education you’ve received?” I want you to pay particular attention to number 16, “what is your gender?” Depending on your organization and your culture, it may not be enough to list ‘male’ and ‘female.’ Times are changing, and sometimes people don’t want to answer the question, or they have an answer that is ‘other,’ and you’ll notice that here. So make sure you are being cognizant of your audiences, cognizant of the culture and the time we’re in, and aware of your own brand, so that you cover all the potential answers to your questions, even one seemingly as simple as “what is your gender?”
A lot of us want to ask our donors if we’re named in their Wills. I put this in here because it’s a great question for some of you to get at your planned giving. It’s an education point at the same time that it’s asking a question: “Bequests left to our <charity> by people in their Wills are of enormous benefit to our work to help <your cause here>. Many people like to leave money in their Will or Trust to our <charity>. Have you included a gift to our <charity> in your Will?” And there are lots of answer options here. This example is provided by Sean Triner at Moceanic; they have lots of information about including a question like this in your survey, and I’ll point you over there for more.
Open Questions are Revealing
Unlike closed questions, open questions are revealing. And these are the questions again, where you ask a question and you ask the respondent to type in their answer; to think about, consider, and then respond with their own text. Some examples include: “why did you choose to become a donor?” “what’s the one way we could best show our appreciation for you?” “what one word would you use to describe our Scholarship program?” So you’re getting their input into the questions you want to know.
Open questions allow respondents to answer in their own words. That’s really important, and I love this because I’m on the communications side. I really love the answers to open questions, because they tell me the words that my audiences are actually using to describe our organization, our cause, our programs, and I can use those words right back to them and show them that we know why they care about us.
Open questions provide unanticipated insights, there is always something that comes out of a survey that is unexpected. Always, always, always. Sometimes they confirm things, and a lot of times they surprise us with new information. And nothing does that better than an open question.
Open questions typically encourage more reflection: the respondent has to think a little harder about what they’re going to write and formulate it. And therefore the answers tend to be more accurate, because they’ve been given more thought.
Keep in mind that open questions require more human time to analyze. So, much like closed questions are easier for both ends of the survey, open questions are harder for both ends of the survey. Your respondent has to think a little deeper about their response, has to formulate it in their head, has to type it out; and you - on the analysis side of the survey - have to read all of those responses and begin to tag them, as similar to each other, formulate them, see the patterns, they just take more time to read and understand. Remember in your surveys, you can always have a survey that does not have an open-ended question, that is always an option. So keep that in mind too.
Here’s an example of the only two open questions that were part of that “This American Life” survey; remember, it was eight pages. Only two of the questions were open, because somebody has to pore through all the responses. Those questions were worded like, “what do you like most about this podcast, and what do you like least?” Pretty open-ended.
Should I ask closed questions, open questions, or both in my survey?
If you’re wondering whether you should ask open questions, closed questions, or both in your survey, the answer is probably both.
This goes back to what is the strategy for your survey? What do you need to know in the end, and can you get that with just closed questions? If you can get answers to everything you need to know with closed questions, why not make that easier on yourself? But that’s not always the case.
If too many open questions are needed to get the answers you’re looking for, ask whether this is the right research method. If you find that you really need only open questions for your survey, then you probably need to interview people or run focus groups instead.
You will likely ask no more than 1-2 open questions in your survey. Keep in mind that your survey is ideally under 10 questions total; really try not to ask more than 1 or 2 open questions.
Dos and Don’ts for Survey Questions
Some Dos and Don’ts for your survey questions.
Make sure you give clear instructions up front. If you’re asking about a particular program or a particular time period, make that very clear.
Ask one question at a time. This is easier to mess up than you think: especially when we’re trying to get the survey in under 10 questions, people sometimes end up combining questions in an effort to do that. So be on the lookout for it.
Ensure your response options are exhaustive, meaning you’ve considered all the possible answers to your closed questions, and that they are mutually exclusive. We talked a little bit about that earlier.
I recommend that you save personal info until the end of the survey. Oftentimes you are sending this to people already familiar with you; if they’re your donors, you already have their email information. There’s no point trying to collect it up front, and they’re more likely to fill it out once they’ve already committed to answering the full survey.
Here’s some don’ts.
Really work hard not to use jargon or technical phrases. It’s very easy to slip into because you live it every day, so make sure someone outside your organization looks at your survey before you launch it.
Try not to frame your questions in a negative or positive way. It’s really easy to do. For example, we’re in political season right now and I’ve seen all these “approval polls,” and I think they should probably be “opinion polls,” because what if I don’t approve? You’re setting me up to approve by calling it an “approval poll.” That’s a good example.
Don’t use loaded words or phrases. Pay attention to the words you choose; for example, you might want to use ‘tired’ instead of ‘exhausted,’ because ‘exhausted’ carries emotional connotations.
Try not to bounce between topics and time periods. What I will typically do is write out all the questions I intend to ask and what I plan to do with that information, and then sort them by topic. Often you’ll notice a natural flow to the survey that makes sense for why you would ask a given question next. So at the end, make sure you go back and look for that flow.
And just don’t do this. I love “This American Life,” but I got to a page of their survey that said, at the top, that the survey had six more pages; nothing kills my motivation more than that. Below it was a wall of scales, each running one through ten, which is a lot, and there were a lot of them to answer. I got there and gave up pretty quickly. And this was only question 10, so I did not get very far before I was put in front of this wall of scales and didn’t want to fill them out.
What’s wrong with this question?
So let’s do a little Q and A, you can play along on your end. Feel free to answer out loud. What’s wrong with this question?
What is the most affordable and most fun summer camp?
Hopefully you guessed that it’s two questions in one. The word ‘and’ means the respondent doesn’t know which metric they’re evaluating. How can we solve this? We can turn it into two questions: “What’s the most affordable summer camp?” and “What’s the most fun summer camp?” Now some of you may be thinking, “that’s a really subjective question,” and that’s ok. Sometimes you want a survey that’s just getting a feel for people’s perceptions. With this, you’re going to find out what they think is the most affordable summer camp, even if it’s not reality. If one of your talking points is that you are really affordable, is that coming through? Are they hearing that from you? Do they think you’re an affordable option? Surveys can be great for getting at perceptions.
What’s wrong with this question? “How often do you visit our website?” with the options “often,” “sometimes,” “regularly,” and “never.” I’m hoping you said, “those words mean different things to different people; my ‘regularly’ is your ‘sometimes.’” Here’s how we’d solve it: be more specific with your options. “Which option most closely matches how often you visit our website?” with the options “I never visit your website,” “once per year,” “once per month,” “once per week,” and “more frequently than once per week.” A couple of things here. First, most people know whether they’ve been to your website in the last seven days or the last 30 days; beyond that, they may know within the last year, but two months can feel like nine months. These options are better representations of the time frames people can actually remember. The other thing I want to call your attention to is “I never visit your website”: you have to remember that your question may not apply to everyone, so make sure you’re covering all the options that might apply.
Ok, last one: what’s wrong with this question? “How much do you enjoy our <annual event name>?” Insert your own event there. Hopefully you realize that we have biased respondents by asking how much they enjoyed it, meaning we are expecting a particular answer. This is the positively framed question we talked about. You can make your question impartial by asking, “What is your opinion of our <annual event>?” or “Using one word, how would you describe our <annual event>?”
Three Common Donor Surveys
Let’s take a quick look at the three common donor surveys.
Which Survey is Right for Us?
The first of these is the Deep Dive Survey. This is typically used to understand motivations or priorities among your donors. So your audience is typically donors, and your questions are mostly closed with a few open. That’s the case across the board; I’ll show you one where it’s not, but you generally only want a few open questions in your survey. For channels, do these both in person and online: if you are deep diving into your donors, make sure you are actually talking to some of them as well as surveying them. That’s really important.
Then there’s the Micro Survey. Micro surveys are really good at getting demographics and preferences. I’ll give you an example: when someone goes to your website and donates to your organization, they donate on your donation page, and then they get a thank-you page. Those thank-you pages are usually really boring and so under-utilized. They say things like, “check your email for a receipt.” Well, that’s anticlimactic; I just gave and I’m feeling great about it, and I’m feeling great about your mission. Instead, why not ask a few questions that help you segment donors later and engage them more? “What’s your age? Which program do you most care about?” and then you list your programs. So this is just a few questions, no more than say three to five, all closed, and very easy for people to fill out. One example: when I gave to the ASPCA, they did not have a survey on their thank-you page, but they did send me a micro survey via email almost immediately, with four questions. Super easy, all closed, great, loved it. The channel for these is online only.
And then lastly, brand surveys. These are important when you’re thinking about going through a rebrand or understanding perceptions of your organization. They tend to have a goal of assessing awareness. And often you want to send them to donors, but also to prospects. When we’ve done these in the past, we’ve interviewed people who have engaged with our organization and also people who have never engaged but maybe should have: they’re in our ideal wheelhouse, they’re in our target, but they haven’t engaged with us for one reason or another, and we want to talk to them as well. Again, most of your questions are closed, with a few open. And again, you can do this online; this is where you might run a broader survey, but I would also pair it with some in-person interviews or focus groups.
Four Simple Survey Analysis Techniques
Let’s talk quickly about analysis. I could spend a lot more time on analysis, and there are a lot of ways to get it wrong, but let’s look at a few techniques.
Read all Responses for Patterns
Make sure you read all of your responses for patterns, and here I’m talking especially about open questions. Read your survey responses multiple times. It’s tempting to look at responses as they come in, but I try really hard to wait until my survey closes, so that I’m not making early judgements about the information I’m getting or too quickly making calls about what I’m seeing. So wait until the survey closes, then read through the responses once. Don’t worry about cataloguing anything; just get a general sense of what you’re hearing and look for the patterns that emerge. After that first read-through, you’ll be able to say, “you know what, there are three big takeaways from this information, things I heard multiple times.”
On your second reading, begin to name those patterns and group the responses to a given question together. For example, if the question is about how someone feels about a certain program, say your summer camp program, patterns will emerge in how people feel about it, and you can begin to tag those responses and group them together. And when I say tag, I’m speaking specifically about software like SurveyMonkey, which lets you apply tags to responses so you can more easily group them. You can also consider creating a word cloud.
You’ll notice that a lot of my questions throughout this presentation have said, “what one word would you use to describe <blank>?” The reason is that it’s really easy to work with those one-word answers. They’re freely given, not selected from a closed list of options, so there’s no telling what people will choose, but you can turn them into a word cloud. Word clouds are quick and easy options for analyzing open questions, especially one-word answers. Just make sure you only run a word cloud on a single open-ended question; running one across multiple open questions will muddle your results. More than anything, the beauty of word clouds is that they show you what language to use in the future. And if you haven’t seen a word cloud before, let me back up a minute: imagine the one-word answers people used to describe a summer camp. The most popular words appear biggest, so ‘play,’ ‘kids,’ ‘love,’ ‘playing’ are the kinds of words I’m going to want to use in my communications back to those audiences, because that’s how they think about it.
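If you want to see the mechanics behind a word cloud, here is a minimal sketch, with a hypothetical set of one-word answers, of the word-frequency count that a word cloud visualizes. It uses only Python's standard library; the responses and the question wording are made up for illustration:

```python
from collections import Counter

# Hypothetical one-word answers to "What one word would you
# use to describe our summer camp?"
responses = ["play", "Kids", "love", "play", "fun", "kids", "playing", "play"]

# Normalize case so "Kids" and "kids" count as the same word,
# then tally how often each word appears.
counts = Counter(word.lower() for word in responses)

# The most common words are the ones a word cloud draws largest,
# and the ones to echo back in your communications.
print(counts.most_common(3))  # [('play', 3), ('kids', 2), ('love', 1)]
```

A dedicated word-cloud tool would take these same counts and scale each word's size by its frequency; the counting step is all the analysis there is.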
Be Careful with Numbers
Be careful with your numbers. When you’re using scales, for example 1 through 5, be careful about using averages. Say my question is, “do you agree that cats are better than dogs?” (which, by the way, is a biased question), and you get five responses on a five-point scale: a 2, a 2, a 4, a 4, and a 5. From those answers you would calculate an average of 3.4, which reads as a kind of ‘meh’ neutral, barely agree. But if the responses were two 1’s and three 5’s, your average would also be 3.4. What’s missing when you use averages is that they don’t show the discrepancy, the big divide, in that second set of numbers.
A better way to look at numbers is to show them on the scale itself. For example: 40% of respondents ‘somewhat agree,’ 40% ‘somewhat disagree,’ and 20% ‘strongly agree.’ That way you can see the divide, if there is one.
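To make the averages pitfall concrete, here is a small sketch using the two hypothetical response sets from the cats-versus-dogs example. Both sets produce the identical mean, while a percentage breakdown across the scale exposes the divide:

```python
from collections import Counter
from statistics import mean

# Two hypothetical sets of answers on a 1-5 agreement scale.
moderate = [2, 2, 4, 4, 5]   # clustered near the middle
divided = [1, 1, 5, 5, 5]    # strongly split

# Both sets average out to the same "meh" score...
print(mean(moderate), mean(divided))  # 3.4 3.4

# ...but showing the percentage of responses at each scale point
# reveals how differently the two groups actually answered.
def breakdown(scores):
    counts = Counter(scores)
    return {point: 100 * counts[point] / len(scores) for point in range(1, 6)}

print(breakdown(divided))  # 40% chose 1, 60% chose 5, nothing in between
```

The average alone would report both groups as "slightly above neutral"; the breakdown shows one group is genuinely lukewarm while the other is polarized.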
Focus on the Big Picture
Be sure you’re focusing on the big picture; it’s easy to fixate on specifics and outliers. For example, if you just threw an amazing event and one person absolutely hated it, you tend to put a lot more emphasis on that one piece of feedback, even when the response was overwhelmingly positive. Just know that it’s human nature to do that.
Instead, try to notice the big items. What are the things that come up over and over again? What are kind of the big takeaways that get repeated?
Again, if there are outliers or strange anomalies, remember that you can always test again. Say you have a 10-question survey, one question is about your summer camp, and you find out that most of your donors either don’t understand that program at all or really don’t like it. Do a follow-up survey that is solely about that program. Ask more questions, deeper questions; always test again.
Try to be a little more skeptical of good news than bad news. If you see good news, feel free to investigate that too; don’t feel like you only have to investigate bad news. Sometimes an overwhelming glut of good news can be a question mark as well. Are we not challenging ourselves enough? So make sure you’re keeping an eye on both. And remember that donors are predisposed to like you.
Confirm, Then Change
Finally, make sure you’re confirming your results before making changes. Surveys are just one of the many research tools at your disposal; I’ve talked about focus groups and one-on-one informational interviews. A lot of times we will survey up front and then come back with those more in-depth methods to dive a little deeper into what we’re seeing, to confirm our survey findings or to get clarity around them.
Keep in mind that you can conduct a survey more than once. Say you’re going through a rebrand: you can conduct a survey before the rebrand to get a read on perceptions, on what people think about your organization, and then come back a year or two after the rebrand and see if the results have changed.
Be sure to make changes in your organization once a clear picture emerges. There’s no point doing a survey unless you’re willing to wrestle with the results; unless you’re willing to do the hard work that survey responses sometimes demand, you should consider whether you really want to run one.
Start small with the change to make sure you’re headed in the right direction. To go back to the program example: you get more information about the program and start to make some changes; maybe six months later you survey that group again to see if their perceptions have changed, or if they’ve noticed the change. You can always start small and confirm you’re headed in the right direction.
Quick recap. Remember that surveys are an inexpensive way to gather lots of data. They do have some pitfalls, so make sure you know them going in and that you’re considering them as you build your strategy and your questions.
For the most part, surveys are indicators, not 100% truth, because again, we are not reaching statistical significance. So use them as indicators, as one form of data. If you get a lot of responses, that’s just more information letting you know you’re on the right path.
Start your survey with a goal and a plan. Again, don’t jump right to questions. Make sure you know why you’re doing your survey, what audiences you’re going to target, what questions you’re going to ask, and what you’re going to do with the information once you have it.
Good survey design is purposeful, targeted, brief, delivered well, sometimes incentivized, and tested.
Make sure you’re writing clear questions and providing context as needed. We looked at bad questions, do not ask bad questions. Make sure you’re removing bias and really asking other people outside your organization to read through your questions as well.
Make sure you’re reading the results, avoiding quick assumptions, and not letting the numbers fool you. Look at numbers not so much as averages, but as where they live within the context of your question. All of this is in an effort, in the end, to get to research that can inform us, cut through the opinions we might have about our audiences, and get to the truth of what our audiences care about.
Thank you for joining us! If you’d like more resources from Mighty Citizen, including webinars, free how-to guides, and templates, you can get those at mightycitizen.com/tools. We have a fundraising campaign metrics template, a donor survey guide, and a tool for communications surveys. So go check those out at mightycitizen.com/tools, and I hope you’ve learned something. Thank you!