What About People Who Don’t Answer Donor Surveys?

June 21, 2018      Kevin Schulman, Founder, DonorVoice and DVCanvass

When you are in the business of asking donors about themselves and customizing their donor journeys based on that, you almost always get the question: “But what about those folks who don’t answer the survey?”

There are a couple of answers to this.  The first is: keep asking.  If there's a datum that you need to talk to your donors effectively, getting it becomes mission critical.  We've talked about how one organization increased response rate and average gift by 15% each by asking for additional data at the point of acquisition and customizing based on that.  Unless you have something else that can raise revenue per donor by 30%, this becomes the most important thing in your universe.

But not everyone will answer despite your best efforts.

That leads to the second answer: not answering any surveys with any additional information is, itself, an answer.

I’ve seen lower retention rates for people who don’t answer surveys, but that could just be because people who don’t answer surveys don’t get customized appeals.  Thankfully, a peer-reviewed study, “Donor Retention in Online Crowdfunding Communities: A Case Study of DonorsChoose.org,” found the same thing.

The authors, Althoff and Leskovec, put together a nicely predictive retention model for DonorsChoose.org donors based only on the first donation interaction.  The full study is worth a look, but skipping to the bottom line, they found four variables predictive of whether first-time donors would give again:

  • The donor’s initial commitment through the means by which they enter the site (teacher-referred or not).
  • Their proximity to the project they are supporting.
  • The amount they are giving.
  • Whether they disclose personal information (in this case, a picture or location).

It’s this last one that supports “not answering is an answer.”  In fact, the difference between those who shared additional information about themselves and those who didn’t was stark:

Retention rate         Gave personal info    No info given
Teacher-referred       31.8%                 10.4%
Not teacher-referred   41.1%                 16.4%

Translating out of the numbers for a moment: if the model let you know only one thing — whether a teacher referred the donor, or whether the donor provided personal information — you would much rather know whether they provided information.  As the authors put it:

“This effect is very large with differences of over 20% in both cases (which more than doubles the return rate).”
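The quote’s arithmetic is easy to verify from the retention table above.  A quick sketch (the rates are the study’s published figures; the variable names are mine):

```python
# Retention rates (%) from the Althoff & Leskovec DonorsChoose.org study
retention = {
    "teacher-referred":     {"shared_info": 31.8, "no_info": 10.4},
    "not teacher-referred": {"shared_info": 41.1, "no_info": 16.4},
}

for group, rates in retention.items():
    diff = rates["shared_info"] - rates["no_info"]   # gap in percentage points
    ratio = rates["shared_info"] / rates["no_info"]  # how many times the base rate
    print(f"{group}: +{diff:.1f} points, {ratio:.1f}x the return rate")
```

In both segments the gap is over 20 percentage points, and sharing personal information is associated with roughly 2.5–3x the return rate — exactly the “more than doubles” the authors describe.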

And, while not wholly relevant to the retention discussion, yes, people who gave personal info also gave larger gifts:

Average gift           Gave personal info    No info given
Teacher-referred       $53                   $41
Not teacher-referred   $67                   $44

Now, this is correlation, not causation.  The authors can’t say, and I won’t opine, whether getting additional information from a person caused them to return.

But that’s also irrelevant.  Think how much effort you put into knowing which donors are more likely to retain.  You are likely paying for at least one solution right now that purports to have a glimpse at this answer, whether it’s a list co-op, an external modeling firm, or even RFM analysis (bless your heart, as they would say here in Tennessee).  And if you aren’t paying for a prediction of who will lapse, you are paying for the lack of one in the form of additional costs and lapsed donors.

If you can get key insight into this knotty, high-cost problem simply by seeing whether people answer a few questions, and those same answers let you customize their donor journeys, that sounds like a win-win-win.

Nick