
Lessons on Internet Surveys

We’ve been watching the results of our reader survey roll in, with over 2,800 responses so far. You can still vote, but based on the current responses, I can likely predict how you’ll answer. In fact, the percentages of certain answers have been stable since the first few hours of the survey.
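
To see why so few responses can paint a stable picture, here’s a minimal sketch of sampling variability (my own illustration, not actual survey data; the 30 percent answer rate is invented). It prints the running percentage for a simulated question at a few checkpoints, which typically drifts by only a couple of percentage points once several hundred responses have accumulated.

    import random

    random.seed(1)
    assumed_rate = 0.30   # invented share of respondents picking one answer
    yes_so_far = 0

    for n in range(1, 2801):
        yes_so_far += random.random() < assumed_rate   # simulate one response
        if n in (100, 500, 1000, 2800):
            print(f"after {n:4d} responses: {yes_so_far / n:.1%}")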

This fact – that not much data is necessary to draw accurate conclusions – goes against the strongly held belief among many survey professionals that a high response rate is necessary. In fact, for a proposed survey to win a federally funded grant, one of the most important criteria is a predicted high response rate, and media pollsters performing quick surveys seldom report their response rates because they’re so low. But according to Dr. Jon Krosnick of Stanford University, that belief turns out to be wrong, something that researchers are just coming to realize.

Dr. Krosnick spoke last week as part of a speaker series organized by the Cornell University Survey Research Institute, and although Tonya and I felt somewhat out of place in a room of academics, we were pleasantly surprised to find Dr. Krosnick’s talk engaging and accessible even to those of us who have no formal training in surveying or statistics. If you’re extremely interested in the topic, I encourage you to listen to the talk (26.6 MB MP3); for the rest of us, I thought I’d offer a quick summary of the non-intuitive lessons Dr. Krosnick imparted and a few other facts of interest to anyone who has been asked to complete a survey in person, over the telephone, or on the Internet.

  1. Telephone interviews are not good substitutes for face-to-face interviews. They’ve become commonplace because of their lower cost – between $1.50 and $6 per minute per respondent, compared with up to $1,000 for an hour-long face-to-face interview once you factor in hiring and training interviewers, travel time, and so on. However, when examined on a number of scales, telephone interviews turn out to be significantly less accurate than face-to-face interviews.
  2. Telephone interviews may not be great, but mailed paper questionnaires are even less accurate. The thought is that people tend to whip through questionnaires too quickly, reducing their accuracy (in one telephone-versus-questionnaire comparison that asked pilots about dangerous experiences, the pilots who completed the paper survey rated their answers as less accurate than those who were interviewed over the phone, and remembered fewer dangerous incidents). The attraction of paper surveys is that they’re cheap, but it turns out that once “the Dillman method” of sending reminders and multiple copies is employed, the cost savings over telephone interviews are minimal.
  3. I’ve already noted the third lesson – that low response rates aren’t nearly as much of a problem as previously thought. That’s a good thing, because Dr. Krosnick noted that telephone survey response rates are dropping precipitously; one ongoing survey that has traditionally had high response rates is seeing them fall by half a percentage point per month.
  4. Computer-based surveys turn out to be significantly more accurate than telephone surveys, perhaps because people subconsciously consider computers to be more human than a stranger on the phone. Plus, with computer-based surveys, if questions are asked one at a time, respondents can think about their answers without the social awkwardness of silence on the telephone. There’s also some thought that people are more honest when not speaking directly to another person. The lesson here is that computer-based surveying over the Internet is big business already, and it will only get bigger as it takes over from telephone and questionnaire surveys.
  5. The problem faced by most Internet surveys is that they seldom rely on a random sample of respondents. Most Internet survey firms recruit users in ways that can easily produce non-random groups whose results are less accurate than those from a truly random sample. Apparently, only one Internet surveying firm – a company called Knowledge Networks – employs true random sampling, and in a test that pitted a number of Internet survey firms against a well-regarded telephone survey firm, Knowledge Networks’s results were the most accurate overall. In fact, Knowledge Networks goes so far as to provide panelists with an MSN TV Web browser and an Internet connection if necessary.
    In contrast, many other firms rely on people who want to earn money taking surveys, and as with so many other things on the Internet, it’s easy for these people to misrepresent themselves in order to participate in more lucrative surveys; the sketch after this list illustrates why that sort of self-selected pool stays biased no matter how many responses come in. Unfortunately, properly done Internet surveys end up costing roughly as much as telephone surveys, though it would seem that costs could drop.
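
To make lessons 3 and 5 concrete, here’s a small simulation of my own (the population size, the 30 percent opinion rate, and the opt-in probabilities are all invented, and deliberately exaggerated for clarity): a modest random sample lands close to the true figure, while an opt-in panel that over-represents people who hold the opinion stays far off no matter how many panelists it collects.

    import random

    random.seed(2)

    def draw_person():
        """Invented population: 30% hold some opinion, and holders are
        three times as likely to join a paid opt-in survey panel."""
        holds_opinion = random.random() < 0.30
        joins_panel = random.random() < (0.15 if holds_opinion else 0.05)
        return holds_opinion, joins_panel

    population = [draw_person() for _ in range(200_000)]

    def opinion_rate(people):
        return sum(holds for holds, _ in people) / len(people)

    random_sample = random.sample(population, 2_800)
    opt_in_panel = [person for person in population if person[1]][:2_800]

    print(f"whole population: {opinion_rate(population):.1%}")     # about 30%
    print(f"random sample:    {opinion_rate(random_sample):.1%}")  # close to 30%
    print(f"opt-in panel:     {opinion_rate(opt_in_panel):.1%}")   # about 56%; more panelists won't fix it

The skew here is intentionally large; Dr. Krosnick’s comparison found smaller real-world differences, but the mechanism is the same: collecting more respondents shrinks random noise, not systematic bias.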


A Novice’s Conclusion — We found Dr. Krosnick’s talk utterly fascinating, and although we didn’t have time to chat with him beyond the Q&A session at the end, it would seem that some conclusions could be drawn from his lessons about the kind of Internet polls and surveys we see so frequently.

First, although there is absolutely no disagreement that a random sample is ideal, the difference in accuracy between the random-sample and self-selected Internet surveys in that comparison wasn’t huge. For a question likely to have relatively divergent answers (such as the age of TidBITS readers), useful conclusions can easily be drawn without worrying that a self-selected sample would be horribly biased in one direction. Attempting to distinguish between answers separated by only a percentage point or two wouldn’t be possible, though.
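
As a rough back-of-the-envelope check (my arithmetic, not a figure from the talk): even if you treat the roughly 2,800 responses as a simple random sample, the widest possible 95 percent margin of error is nearly two percentage points, and self-selection only adds to that uncertainty, which is why one- or two-point gaps aren’t worth reading anything into.

    from math import sqrt

    n, p = 2800, 0.5   # p = 0.5 gives the widest possible interval
    margin = 1.96 * sqrt(p * (1 - p) / n)
    print(f"95% margin of error for n = {n}: ±{margin:.1%}")   # about ±1.9 points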

Second, even if the response rate isn’t huge, that wouldn’t seem to make much of a difference. We might end up with a response rate of less than 10 percent in our survey, but the added accuracy gained by a larger response rate certainly wouldn’t be worth harassing you all multiple times to answer our questions. Just how small that rate can be isn’t entirely clear, but single digits don’t appear to be a major problem.

Third and finally, unlike a survey gauging national voting plans, most Internet polls don’t attempt to use the results to predict the future, nor are the results likely to affect the future actions of other people. I can’t quite put into words why this seems like a relevant difference, but it’s related to the goal of the survey. If I learn what percentage of TidBITS readers regularly play computer games (28 percent), I can use that information when considering what articles to write, but I can’t see the publication of this fact causing people either to start or stop playing games. However, compare that with surveys that ask whom you plan to vote for in the next election; your answer has the power to help sway the opinions of other voters.

And that thought, I think, is where the answer to declining response rates lies. Surveys can be intrusive and badly timed, but if it’s reasonable to complete them, they should be seen as a way of spreading your opinions to the rest of the world. It’s the same reason I don’t mind using grocery store shopper cards; I know they’re tracking my purchases, and I want the fact that I’m buying more organic and less processed food to be recorded prominently. So the next time you’re polled, consider it a chance not just to be counted, but also perhaps to nudge the world in the direction you want.
