Non-response is a big topic in survey design and management, and quite a convoluted and complex one. The following summarises our learning over the last half-decade or so, and provides some background that helps in understanding “no answer” and “non-response” in surveys.


Non-response Bias – The Bigger Picture

Firstly, although it isn’t the main topic of this post, a few words about something called “non-response bias”. Non-response bias is where a survey does not accurately reflect your target group due to biases in the subgroup of people who respond (or don’t respond!). Imagine that your survey is for 100 workers, and 50 of them respond. This initially sounds like a great sample, but if the 50 who don’t respond are all happy with their work, while the 50 who do respond are not happy, then the survey results are not going to accurately reflect the workers (we have 100% of one group and exactly 0% of the other). A sample of the group is fine, but only if there isn’t a bias in that sample.
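The arithmetic of that example is easy to sketch (the satisfaction scores here are made up purely for illustration):

```python
# Toy illustration of the 100-worker example above: hypothetical
# satisfaction scores on a 1-5 scale.
happy_non_responders = [5] * 50   # satisfied workers who never reply
unhappy_responders = [1] * 50     # dissatisfied workers who do reply

# The true picture covers all 100 workers...
true_mean = sum(happy_non_responders + unhappy_responders) / 100

# ...but the survey only ever sees the 50 who responded.
observed_mean = sum(unhappy_responders) / len(unhappy_responders)

print(true_mean)      # 3.0
print(observed_mean)  # 1.0
```

A 50% response rate looks healthy on paper, yet the reported average is badly wrong because of who is missing.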

There are a number of different ways to handle this. One is to target a 100% response rate, although care has to be taken in how that is achieved, to avoid biasing the responses by changing them! By way of a silly example, if I offered everyone who responds to the survey a lifetime’s supply of ice-cream to boost the response rate, and the survey includes questions about future purchases of ice-cream, that may well affect the answers. Similarly, cajoling employees into completing an engagement survey may have a negative effect on their answers.

Non-response does not always have a negative effect on survey results (there is a large body of research in this area: Curtin, Presser, and Singer 2000; Keeter et al. 2000; and Earl Babbie, “The Practice of Social Research”). Anything over 50% is usually sufficient, and over 70% is viewed as a very good response rate. There are a number of reasons for survey non-response:

  • Awareness (people were not aware of the survey due to communication failure).
  • Capability (people do not have the knowledge/competence/tools to complete the survey).
  • Motivation (people are aware and able, but unmotivated/unwilling/refusing).

Each of these can be addressed specifically. When thinking about non-response bias, a basic analysis will help you to see whether specific groups are under- or over-represented in your survey responses (this is easily achieved in SurveyOptic using result splits and filters). Completion rates for surveys in SurveyOptic are often over 95%, so traditional non-response bias is generally not an issue; however, that is not the whole picture when it comes to non-response.
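A basic representation check of the kind just described can be sketched as follows (the departments and head-counts are invented for illustration; in SurveyOptic you would use result splits and filters rather than code):

```python
# Compare each group's share of the workforce with its share of the
# survey responses; a large gap suggests possible non-response bias.
population = {"Sales": 40, "Engineering": 40, "Support": 20}
responses = {"Sales": 38, "Engineering": 20, "Support": 19}

total_pop = sum(population.values())
total_resp = sum(responses.values())

shares = {}
for group, headcount in population.items():
    pop_share = headcount / total_pop
    resp_share = responses[group] / total_resp
    shares[group] = (pop_share, resp_share)
    flag = "under-represented" if resp_share < pop_share else "ok"
    print(f"{group}: {pop_share:.0%} of workforce, "
          f"{resp_share:.0%} of responses -> {flag}")
```

Here Engineering supplies 40% of the workforce but only about a quarter of the responses, so its views would be under-weighted in the raw results.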

“NA” – Question-Level Non-Response

At question level, there are many types of non-response:

  • Non-presented – the question has not been presented (seen), due to the person not getting to that point in the survey (yet).
  • Non-presented (skipped) – the question was not presented (shown) as it was skipped over (by the survey flow/skip logic).
  • Non-responded – the question was shown, but no answer was received from the participant.
  • Invalid response – the question was answered, but the response was not a valid one (e.g. an invalid date, or text where a number was expected).
  • Null response – the question was shown and an empty or null response was returned (this is usually a web browser issue).
  • NA – the question was explicitly NA’d – the participant clicked/tapped a “No Answer”/”Not Applicable” option.
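These states can be modelled explicitly. The sketch below is one way to do it in Python; the state names, the classify function, and the “NA” sentinel are our own illustrative choices, not SurveyOptic internals:

```python
from enum import Enum, auto

class ResponseState(Enum):
    NOT_PRESENTED = auto()  # participant hasn't reached the question yet
    SKIPPED = auto()        # hidden by survey flow / skip logic
    NOT_RESPONDED = auto()  # shown, but no answer received
    INVALID = auto()        # answered, but the value failed validation
    NULL = auto()           # shown, empty/null value returned (browser quirk)
    NA = auto()             # participant explicitly chose "No Answer"/"N/A"
    ANSWERED = auto()       # a genuine, valid answer

def classify(presented, skipped, submitted, value, is_valid):
    """Map a question's raw state onto one of the categories above."""
    if skipped:
        return ResponseState.SKIPPED
    if not presented:
        return ResponseState.NOT_PRESENTED
    if not submitted:
        return ResponseState.NOT_RESPONDED
    if value == "NA":                  # illustrative sentinel for explicit NA
        return ResponseState.NA
    if value in (None, ""):
        return ResponseState.NULL
    if not is_valid(value):
        return ResponseState.INVALID
    return ResponseState.ANSWERED

is_valid_rating = lambda v: isinstance(v, int) and 1 <= v <= 5
print(classify(True, False, True, 4, is_valid_rating))  # ResponseState.ANSWERED
print(classify(True, False, True, 9, is_valid_rating))  # ResponseState.INVALID
```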

We do a number of things to control and reduce invalid responses. One is interpolation, where we convert text answers into numbers, e.g. “two” becomes 2. Another is custom input controls: a date picker, so that only valid dates can be chosen; sliders for selecting number ranges; and number input fields on mobile devices, which bring up just the number pad.
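A rough sketch of the interpolation step (the function name and word list are ours, for illustration only):

```python
# Map common text answers onto numbers before treating them as invalid.
WORD_NUMBERS = {
    "zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
    "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9, "ten": 10,
}

def normalise_number(raw):
    """Return an int where one can be recovered, otherwise None."""
    text = raw.strip().lower()
    if text in WORD_NUMBERS:
        return WORD_NUMBERS[text]
    try:
        return int(text)
    except ValueError:
        return None

print(normalise_number("two"))    # 2
print(normalise_number(" 7 "))    # 7
print(normalise_number("maybe"))  # None
```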

“NA” responses are frequently not supported in survey tools, and when you dig into it, you can start to see why. “NA” has a nasty habit of being used interchangeably to mean “No Answer”, “Not Answered”, “Not Applicable” and “No Opinion” – and not all of those “NA”s are equal. Sometimes you do not want to code “NA” as an additional response choice, as it changes the percentages in your reports and can distract from more explicit answers. At other times you actually want to report on it: in the case of a “No Opinion” response, for example, you may want to factor it into the response calculations. A rating of Not Applicable, just like Not Answered, usually removes the response from reporting, for example from the average score, and doesn’t affect the minimum or maximum. SurveyOptic reporting is flexible enough to cover all of these options, so it is purely a matter of design choice.
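The effect of those design choices on a simple average can be illustrated like this (the scores, and the midpoint coding of “No Opinion”, are invented for the example):

```python
# Six answers to a 1-5 rating question, two of them explicit NAs.
answers = [4, 5, "NA", 3, "NA", 5]

# Treating NA as "Not Applicable"/"Not Answered": drop it from the stats.
scored = [a for a in answers if isinstance(a, int)]
avg_excluding_na = sum(scored) / len(scored)   # (4+5+3+5)/4 = 4.25

# Treating NA as "No Opinion", coded as the scale midpoint (3):
coded = [a if isinstance(a, int) else 3 for a in answers]
avg_with_no_opinion = sum(coded) / len(coded)  # 23/6 ≈ 3.83

print(avg_excluding_na)
print(round(avg_with_no_opinion, 2))
```

Same raw data, two defensible reports; which one is right is a design decision, not a calculation.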

The ‘No Opinion’ sense of NA is worthy of thought in survey design. If we force people to choose an answer when they don’t have a definite opinion, all we are doing is adding noise to our survey results, and clearly we don’t want that. Similarly, a refusal to answer can often be a signal about attitudes, or just point to a poorly worded question.

Computer Says NA

Just to keep our life interesting, in the world of computer programming, 0, “” (an empty string), [] (an empty list), and null are sometimes treated the same, and sometimes treated differently. To make things even more interesting, this treatment varies by computer language. And, to add even more to the confusion, different web browsers also handle empty responses slightly differently. Thankfully that is generally a problem for the programmers, and is hidden away behind the scenes, but it is worth being aware that NA also has its own set of meanings in the computing/data analysis world too.
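Python makes the point nicely: 0, an empty string, an empty list, and None are all “falsy”, yet none of them are equal to each other (and JavaScript draws these lines differently again):

```python
values = [0, "", [], None]

# All four behave as "false" in a boolean context...
print([bool(v) for v in values])   # [False, False, False, False]

# ...but they are distinct values, not interchangeable ones.
print(0 == "")     # False
print("" == [])    # False
print([] == None)  # False  (comparison shown for illustration)
```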

NA Problem

With all of that said, for a well-designed survey with a good response rate, non-responses are more of an opportunity than an issue. They help you understand more about your respondents (which is why you were doing the survey in the first place), and they can also flag problems with question wording, which occur especially when respondents aren’t answering in their first language.

Hopefully that has provided some answers! If you need help with survey design, please just ask, we are always happy to work out a way to help.