How Donor Opinion Can Steer You Right

July 20, 2018      Kevin Schulman, Founder, DonorVoice and DVCanvass

Yesterday’s post cataloged a multitude of reasons that surveys and donor listening can steer you wrong.

That said, there are many things we must ask donors that are both easy to answer accurately and vital for us to know:

  • How easy was the online donation process?
  • How committed are you to the organization?
  • Are you a cat person or dog person? (and other donor identity questions)
  • And, of course, et cetera

Also, just because we can’t ask people what their motivation is doesn’t mean we can’t learn their motivation from a survey.  It just means we can’t ask it directly if we want an accurate answer.

So how can you get this valuable information?  A few thoughts from some of DonorVoice’s commitment studies and pre-test tool surveys that predict how messaging will perform before the first email is sent or the first stamp applied:

Select your audience very carefully. When we want to assess how a direct mail audience will respond to messaging, we will select a panel of people who are 50-55+ years old and have given to similar organizations in the past year.  While this introduces some selection bias, it biases the panel toward what an acquisition audience will actually look like.
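If you are screening a raw panel file yourself, the pull is simple. Here’s a minimal Python/pandas sketch of that kind of screen; the file name and columns (age, gave_to_similar_org_last_12mo) are hypothetical stand-ins for whatever your panel vendor supplies.

```python
import pandas as pd

panel = pd.read_csv("panel_members.csv")  # hypothetical panel export

# Keep the older age band plus people with a gift to a similar organization
# in the past year -- mirroring what an acquisition audience will look like.
test_audience = panel[
    (panel["age"] >= 50) & (panel["gave_to_similar_org_last_12mo"])
]
print(f"{len(test_audience)} of {len(panel)} panel members qualify")
```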

But don’t worry too much about channel. We recently collected a large sample of offline donors for an international relief organization and compared them to the online donors.  There was no substantive difference.  The people who have self-selected into donating to your organization are more like each other (for your purposes) than they are like their demographic counterparts.  Another reminder of why demographics are near useless for segmentation.
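One way to run that offline-versus-online check yourself is to compare the two groups on an attitudinal score and look at effect size, not just significance (with large samples, trivial differences can still be “significant”). A sketch, assuming a hypothetical survey export with channel and commitment_score columns:

```python
import numpy as np
import pandas as pd
from scipy import stats

donors = pd.read_csv("donor_survey.csv")  # hypothetical export
offline = donors.loc[donors["channel"] == "offline", "commitment_score"]
online = donors.loc[donors["channel"] == "online", "commitment_score"]

# Welch's t-test for the difference in means
t_stat, p_value = stats.ttest_ind(offline, online, equal_var=False)

# Cohen's d (approximate pooled SD) to judge whether the gap is substantive
pooled_sd = np.sqrt((offline.var(ddof=1) + online.var(ddof=1)) / 2)
cohens_d = (offline.mean() - online.mean()) / pooled_sd
print(f"p = {p_value:.3f}, Cohen's d = {cohens_d:.2f}")
```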

Don’t tip what result is desired. With our pre-test tool surveys, we randomize different aspects of a mail piece or email to the A or B condition.  This way, there is no preferred condition that we might subtly suggest to the donor.
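The mechanics of that randomization are straightforward: each respondent gets every testable element assigned to its A or B variant independently, so no single “preferred” package is ever signaled. A small illustrative sketch (the elements and variants below are made up):

```python
import random

ELEMENTS = {
    "teaser": ["A: plain envelope", "B: urgent teaser copy"],
    "story": ["A: single-beneficiary story", "B: statistics-led copy"],
    "ask": ["A: specific dollar handles", "B: open-ended ask"],
}

def assign_conditions(respondent_id: str) -> dict:
    # Seed on the respondent ID so each person's assignment is reproducible.
    rng = random.Random(respondent_id)
    return {element: rng.choice(variants) for element, variants in ELEMENTS.items()}

print(assign_conditions("resp_0001"))
```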

Have professionals help with your survey design. The largest set of errors comes in the design of the survey.  There are plenty of questions that sound good but lead to bad results.  There are simple traps like leading questions (e.g., “How much do you like our organization?”) and double-barreled questions (e.g., “What do you think of the type and amount of communications you get from us?”) that most people know to avoid, but there are hundreds of factors like these, and even seemingly minor choices (e.g., whether you include an “I don’t know” option) can have major impacts.

Never ask why. We think that we like the donor relations person because she was crisp and efficient. Actually, her voice subconsciously reminded us of our favorite 2nd-grade teacher.

So don’t ask your donor; ask math.  All you need for this is an outcome variable (e.g., level of satisfaction with the organization) and your input variables (e.g., satisfaction with each experience with the organization). Then, a bit of modeling later (even a straightforward regression analysis will get you answers), you can tease out what’s working, what’s not working, and, just as important, what doesn’t matter.
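A minimal sketch of that “ask math” step, regressing overall satisfaction on satisfaction with individual experiences. The column names are hypothetical; substitute whatever your survey actually captures.

```python
import pandas as pd
import statsmodels.api as sm

survey = pd.read_csv("donor_survey.csv")  # hypothetical export

drivers = ["ack_letter_sat", "newsletter_sat", "donation_process_sat", "phone_contact_sat"]
X = sm.add_constant(survey[drivers])
y = survey["overall_satisfaction"]

model = sm.OLS(y, X, missing="drop").fit()

# Large, significant coefficients flag experiences that move overall
# satisfaction; near-zero coefficients flag things that don't matter.
print(model.summary())
```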

By asking for people’s ratings of various aspects of their relationship with you alongside their overall satisfaction, you can build an accurate model of what creates satisfaction for your donors.  In fact, if you marry this to behavioral data, you can even project the financial value of increasing satisfaction, loyalty, or commitment. Nothing warm and fuzzy in this process.
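A rough sketch of that “marry it to behavioral data” idea: join survey scores to a giving file and estimate what a one-point gain in commitment is associated with in annual giving. File names, IDs, and columns are all hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

survey = pd.read_csv("donor_survey.csv")   # donor_id, commitment_score, ...
giving = pd.read_csv("gift_history.csv")   # donor_id, annual_giving

merged = survey.merge(giving, on="donor_id", how="inner")

X = sm.add_constant(merged[["commitment_score"]])
fit = sm.OLS(merged["annual_giving"], X, missing="drop").fit()

per_point = fit.params["commitment_score"]  # dollars of annual giving per commitment point
print(f"Estimated value of a one-point commitment gain: ${per_point:,.2f} per donor")
# Multiply by file size and a realistic lift to size the opportunity --
# remembering this is correlational, as the next paragraph notes.
```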

This will not produce causal data.  Let’s say that, controlling for everything else, people who like your acknowledgment letters a lot also like your organization a lot and give more. People who don’t, don’t.

Even if this is correlation, not causation, it is enough evidence to make your acknowledgment program a point of emphasis. After all, does it matter whether your acknowledgments cause someone to be more committed or whether more committed people like your acknowledgments more? Either way, acknowledgments are important.

Here’s how Operation Smile made this work:

Set your hypothesis and assessment criteria first. If you can, get them in writing.  This guards against leaders accepting only the results that bolster their claims.

What steps are you taking to get donor opinions that steer you on the correct path?

Nick