iBlackHole
The new M+R Benchmarking data are out; I highly recommend them.
I was looking to summarize them for you but, as with my Agitator post today, I got stuck on one chart:
That’s right. The average mobile donation page conversion rate is in the single digits.
This is why, even though mobile is the only type of website traffic that is growing and half of all nonprofit web traffic comes from a phone or tablet, desktop traffic still accounts for 76% of all online donations.
Yes, we will never get online conversion rates up to 100%. People go to donation pages for all manner of reasons – curiosity, misunderstanding, or muscle spasm.
But 92 percent of people who go to a mobile donation page do not donate. That would have to be a thumb-spasm epidemic of epic proportions. (And, yes, I know the irony of writing this for the DonorVoice site. We are currently doing a redesign that will be mobile-friendly.)
M+R has some hypotheses for why mobile conversion rates are worse than the odds of the Packers winning the Super Bowl this year (Aaron Rodgers notwithstanding):
“The reasons for these differences are complex and open to interpretation. Supporters may simply be more comfortable pulling out their credit card for a large gift while sitting down at a computer at work or home. Perhaps mobile users are more likely to be consuming nonprofit content on their phones when they’re on the go, resulting in smaller and less frequent gifts. Demographics may also play a role — groups that tend to give at lower rates or make smaller gifts (e.g. younger people) could also be a larger proportion of mobile-first users.”
I’ll posit another, simpler one: we aren’t asking our donors. We aren’t asking the ones who donated what we could do better. And, more importantly, we aren’t asking the ones who didn’t donate what barrier stopped them. We talk about this issue in depth here, but the TL;DR version is that the only way to find out what the fatal errors are is to ask those for whom the error was fatal.
Put another way: by asking your current donors, you’ll find out about the errors that are annoying but that people can push through. Those donors can’t tell you about the issues people can’t push through.
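To make that concrete, here is a minimal sketch of what asking the would-be donor might look like: a one-question survey that fires when a visitor moves to leave the donation page. The endpoint, question wording, and helper names here are hypothetical; any survey or feedback tool can play this role.

```typescript
// Minimal sketch (hypothetical endpoint and UI): ask the would-be donor
// what the barrier was, at the moment they appear to abandon the page.
const SURVEY_QUESTION =
  "You didn't finish your donation. What got in the way?";

let surveyShown = false;

function showBarrierSurvey(): void {
  // Placeholder UI; swap in your survey widget or a proper modal.
  const answer = window.prompt(SURVEY_QUESTION);
  if (answer) {
    // Hypothetical endpoint; ships the free-text barrier for later analysis.
    navigator.sendBeacon("/api/barrier-feedback", answer);
  }
}

// The cursor leaving through the top of the viewport is a common proxy
// for "about to close the tab or hit the back button".
document.addEventListener("mouseout", (e: MouseEvent) => {
  if (e.relatedTarget === null && e.clientY <= 0 && !surveyShown) {
    surveyShown = true;
    showBarrierSurvey();
  }
});
```

The point isn’t the trigger mechanics; it’s that the question goes to the people who left, not just the ones who completed the gift.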
How does measuring the experience of donors, and would-be donors, pan out? One organization measured their donors’ experience of their site and donation form. With a few tweaks recommended by those donors, they increased their conversion rate from 12% to 32%.
Before testing, 50-60% of people said the form was “very difficult” to use. Text analysis of their open-ended responses gave us a few culprits:
- Their email and web confirmation pages ran four pages when printed
- Their form wasn’t mobile-friendly
- A huge image at the top of the page made it slow to load
- Instead of simple “$50” buttons, donors had to parse multiplier copy reading “$50.00 to send $10,500 of X” (a fix sketched below)
- They hadn’t incorporated PayPal
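For a sense of how small these fixes can be, here is a rough sketch of the “simple $50 buttons” item. The element ids and preset amounts are illustrative, not the organization’s actual markup.

```typescript
// Sketch: plain preset-amount buttons instead of multiplier copy.
// Element ids and amounts are illustrative only.
const PRESET_AMOUNTS = [25, 50, 100, 250];

function renderAmountButtons(
  container: HTMLElement,
  amountInput: HTMLInputElement
): void {
  for (const amount of PRESET_AMOUNTS) {
    const btn = document.createElement("button");
    btn.type = "button";
    btn.textContent = `$${amount}`; // just "$50", not "$50.00 to send $10,500 of X"
    btn.addEventListener("click", () => {
      amountInput.value = String(amount); // one tap sets the gift amount
    });
    container.appendChild(btn);
  }
}

const container = document.querySelector<HTMLElement>("#amount-buttons");
const amountInput = document.querySelector<HTMLInputElement>("#donation-amount");
if (container && amountInput) renderAmountButtons(container, amountInput);
```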
So over the course of a month, they knocked these out one at a time until everything was fixed. The share of people saying the form was “very difficult” went from 50-60% to less than 1%. And conversion went up over 160%.
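In case “over 160%” looks mismatched with “12% to 32%”: that’s a relative lift, not percentage points. The arithmetic, for the skeptics:

```typescript
// 12% -> 32% is a 20-point jump, but a ~167% relative increase.
const before = 0.12;
const after = 0.32;
const relativeLift = (after - before) / before; // ≈ 1.67
console.log(`${(relativeLift * 100).toFixed(0)}% relative increase`); // "167% relative increase"
```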
Was this an outlier? Yes; it’s one of the biggest one-month improvements I’ve seen on a single form. But I’ve also never seen donor and would-have-been-donor feedback fail to improve a site’s conversion rate.
After all, with a mobile conversion rate of 8%, there certainly isn’t a lot of room to fall.