Online Surveys vs. Telephone Surveys

November 15, 2012

Using the 2012 presidential election as the backdrop for our post on predictive analytics, I highlighted the work of Nate Silver, the wunderkind statistician who runs the 538 Blog at the New York Times, and noted that his forecasts were driving the seat-of-the-pants pundits nuts.

Despite the incredulity of some political pundits, Silver correctly predicted the winner in all 50 states in the presidential election.

Because Silver’s model was built on the aggregated results from many, many polls in the ‘swing’ or ‘battleground’ states, he’s now gone back and analyzed which polls were the most — and least — accurate.

You can see his full analysis here; for the political junkies among Agitator readers, it's a great read.

What’s important for today’s post is not that he singled out the most accurate pollsters (IBD/TIPP, Google Surveys, and Mellman) and the least accurate (Gallup, Rasmussen, and Mason-Dixon), but that he addressed the channels through which the polls were taken: online vs. live telephone vs. automated (‘robo’) telephone.

Each time we publish survey results from research done by The Agitator or by our sister companies DonorVoice or DonorTrends, we run into the objection that most of our research is conducted through online surveys.

Here’s what Nate Silver found when he analyzed the accuracy of online surveys vs. live telephone surveys vs. ‘robopolls’ (automated phone calls); a rough sketch of the underlying arithmetic follows the list:

  • The average error in calling the election result was 2.1 percentage points for the nine firms that conducted their polls wholly or partially online;
  • The average error for polling firms using live telephone interviewers was 3.5 percentage points;
  • The average error for the robopolls was 5.0 percentage points.
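
For anyone curious what ‘average error’ means in practice, here is a minimal sketch, assuming each poll is scored by the absolute gap, in percentage points, between its projected margin and the actual margin, and errors are then averaged within each survey channel. The firm names and figures below are made up for illustration; they are not Silver’s data.

    from statistics import mean

    # Hypothetical polls: (firm, channel, projected margin, actual margin),
    # with margins in percentage points. Illustrative values only.
    polls = [
        ("Firm A", "online",      2.0, 3.1),
        ("Firm B", "online",      4.5, 3.1),
        ("Firm C", "live phone", -1.0, 3.1),
        ("Firm D", "robopoll",    8.0, 3.1),
    ]

    # Group each poll's absolute error by survey channel.
    errors_by_channel = {}
    for firm, channel, projected, actual in polls:
        errors_by_channel.setdefault(channel, []).append(abs(projected - actual))

    # Average the errors within each channel, mirroring the comparison above.
    for channel, errors in sorted(errors_by_channel.items()):
        print(f"{channel}: average error {mean(errors):.1f} points")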

The times they are a-changin’. At the end of his analysis, Silver muses: “Perhaps it won’t be long before Google, not Gallup, is the most trusted name in polling.”

Any Agitator readers want to share their experience with online vs. telephone surveys?

Roger