TEST RESULTS: External Validators Are Vitally Important–Except When They Aren’t

August 22, 2018      Kevin Schulman, Founder, DonorVoice and DVCanvass

We’re looking at external validators – seals and such – in our week-long series on how to frame overhead and impact.  These validators were the second most important factor to get right, lagging only how overhead is presented (which we covered yesterday).

In the DonorVoice study with the DMA Nonprofit Federation, we looked at five conditions: no trust indicator (the control), a Charity Navigator rating, a GuideStar seal, a DMA certification seal, and a donor testimonial and rating.

The most important news is that every external validator beats having no validator.  Even the lowest-performing trust indicator (in this case, the testimonial and rating) substantially outperforms showing no trust indicator at all.

Of the ones presented, the Charity Navigator rating performed best, followed by GuideStar and DMA certification, which were virtually tied.

This is despite the fact that the DMA does not actually offer a DMA certification, so a totally bogus seal still helped increase trust.  Please don’t tell the Russians – they’ve messed with our trust enough.

So the lesson here is that even if you don’t have a four-star Charity Navigator rating, you can bolster your credentials with a BBB rating, GuideStar seal, or even donor testimonials.

The catch, however, is that this was done with a fake charity.  It had no brand name or credentials and was thus uniquely reliant on validation.  What would happen if we did this with a real charity?

It turns out researchers have done this.  One researcher looked at Charity Navigator ratings specifically.  He found that a one-star increase in a charity’s rating led to a 19.5% increase in charitable contributions.
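To make that concrete with a hypothetical figure (ours, not the study’s): a smaller charity raising $500,000 a year that moved up one star would, by that estimate, expect roughly $500,000 × 0.195 ≈ $97,500 in additional contributions.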

But wait!  That was only the result for smaller, lesser-known charities.  For larger charities, he found that third-party ratings had an insignificant impact on donations.

This goes to what we covered on Monday – people will look at efficiency measures, but only in the absence of other information about a charity (specifically, their personal preferences and identity).

Thus, these external validators may be more important as you are introducing yourself to donors (i.e., acquisition) than to those who already love you.

Tomorrow, we’ll talk about how donors’ preferences for how their contributions are used can make a big difference for you.

Nick

7 responses to “TEST RESULTS: External Validators Are Vitally Important–Except When They Aren’t”

  1. ⭐️⭐️⭐️⭐️⭐️
    “Terrific content!” 🙂

    I’m seriously shocked at the results, given how much variance there is among the seals. Great work.

  2. AC says:

    “Thus, these external validators may be more important as you are introducing yourself to donors (i.e., acquisition) than to those who already love you.”

    I’ve also found this to be true in testing with both large, established orgs and smaller less known ones.

    However, one thing that I would like to point out that I think holds an important place in this conversation – Integrity.

    I would never argue for my digital team to add heuristic/symbolic validators from organizations such as Charity Navigator. Although they do some good [like their tips for donors on their site], in the grand scheme of things, companies like them have done far too much damage in the past decade or so by actively ensuring that the conversation in the public sphere is framed ONLY around “cost of fundraising” or CEO compensation, to the neglect of the actual impact that orgs are creating in collaboration with their donors. They’ve misled the public for way too long and have solidified these false metrics as [in my experience] the first consideration a <$100 annual donor [for example] cares about.

    I've personally found this shameful on their part. My integrity [and thus my organization's], and my commitment to seeing the sector flourish as a whole, would keep me from promoting their approach even for a 10-20% lift. There are many ways to get that lift without making a proverbial "deal with the devil" [maybe a bit harsh there 🙂 ] and feeding the cyclical snowball effect of promoting them.

    Caveat – I believe they have taken some steps in the past few years to improve their metrics, although I'm not sure that undoes the damage they've consistently done. Personal opinion, of course.

    AC

  3. AC, I agree with your concerns about Charity Navigator. They’ve improved over the past two years or so, but the core of their financial rating is still the overhead ratio (they have seven metrics, four of which – program expense %, admin expense %, fundraising expense %, and fundraising efficiency – all relate to overhead).
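    Roughly speaking, those four are computed as ratios of reported expenses – e.g., program expense % = program expenses ÷ total expenses, and fundraising efficiency = dollars spent on fundraising ÷ dollars raised – which is why they all tell the same overhead story.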

    That’s why I wanted to do this test – to see what might work better. Even though CN is most effective, this shows that if you use GuideStar (which is IMHO a better measure) or donor testimonials, you can still get a bump over the control. And this full test also shows that you can more than close any gap by handling overhead, identity, and/or the way you tell your stories well.

  4. Tom Ahern says:

    Nothing to add, Nick … just keep it up!

  5. I fear we collude in perpetuating the overhead myth, and the starvation cycle, by including these indicators. We should talk about impact, not overhead. Most charities’ overhead costs are ridiculously low compared with for-profit businesses. And because of that, (1) charities accomplish less than they could, and (2) staff burn out. That’s what’s sad, IMHO.

  6. For me, having the DMA seal (which doesn’t exist) succeed as a validator shows some promise for additional testing. If a variety of validators work, from the alleged watchdogs to donor ratings, what else might? Quotes from prominent supporters? Testimonials from those helped? Scripture for religious organizations?

    If those who use overhead to measure us are the rodeo clowns, these may all work as ways to refocus donors on real impact.

  7. That was the intention of my pun. Oh well.

    This reminds me of the value/trust of e-commerce/SSL seals, you know, like “VeriSign Trusted”. Most are still in use today; they do very little to convince consumers to make a purchase, but consumers abandon more frequently without one. https://www.entrust.com/site-seal-faqs/

    What would be nice is a seal based on things that truly affect efficiency, like some basic database and fundraising metrics: data hygiene score, time to acknowledgement (if any), opt-out policy adherence, etc. Or even one that impacts the entire organization, like employee turnover!