Your Guide To Writing Online Survey Questionnaires

When designing a questionnaire, there are an enormous number of factors that must be accounted for. Many of these factors are diametrically opposed to one another, and the questionnaire design team must make compromises between the conflicting interests. Here we will address some of the most significant factors and the trade-offs that can safely be made. Because of the broad scope of the subject, what follows focuses entirely on surveys designed for online fielding.

Prevent Respondent Fatigue

No matter what we tell ourselves, virtually no respondent enters a survey thinking “I hope it’s fun!” That’s why you need to constantly assess whether each question you add keeps the survey within the bounds of the main topic under study. Respondents primarily take part in surveys out of interest in panel rewards and/or curiosity about the subject matter, and completion rates will drop if the survey is too long.

Brevity beats dynamic design

Keeping these motivations in mind, we as researchers should seek not to entertain participants with decorative displays and dynamic question designs, but to inform them as far as we are able while maintaining their objectivity, and to streamline the questionnaire so that respondents don’t end up regretting taking part. We want the survey experience to be as engaging as possible, but no amount of polish will keep respondents engaged if the survey feels as though it’s asking the same question over and over from different perspectives.

The respondents you want to reach are too busy to take lengthy surveys

It’s an unfortunate irony that the respondents we most want to participate in our surveys (e.g., high-income homeowners, high-volume contractors) are the same people who have the least free time to take part in them. The shorter the survey, the more likely these respondents are to choose to participate, which in turn improves the quality of your study sample. Respondents are told in advance how long to expect a survey to take, and if it significantly exceeds their expectations they are likely to quit and not return.

Keep It Simple

When designing your study's online questionnaire, it’s important to ensure that the questions you are asking get to the heart of what you want to learn about, and that the language of those questions is simple and unambiguous. It can be misleading to analyze responses to a question that was open to interpretation by the respondents.

Plan to achieve your study’s goals on the first attempt

If your respondents differed in how they understood a question, then the responses gathered cannot be meaningfully aggregated. The only option in that scenario would be to go back to the respondents and ask what their responses meant, but on average 40% of respondents refuse to participate in follow-up surveys.

Include opt-outs everywhere

An opt-out response is a response that records that the respondent does not feel able to answer the question, such as Don’t Know, Not Applicable, or None of the Above. Some researchers feel that including opt-outs is tantamount to letting respondents skip a question; however, this ignores a couple of fundamental truths.

For one thing, if a respondent genuinely doesn’t feel comfortable with any of the responses provided, their only option for continuing the survey is to pick a response at random. This can cause additional awkwardness downstream as they are asked questions that build on the random response they selected. It also introduces “static” into the analysis on the back end, reducing the validity of any conclusions drawn from that data.

Even the best questionnaire designers cannot account for all possible personal situations as they are trying to design a questionnaire that applies to an entire population. Allowing respondents to opt out of a question prevents bad data and reduces respondent frustration.

Further, if a respondent uses opt-outs throughout the survey and isn’t providing any valid information, they are easy to identify and can be removed from the survey data with ease. If, however, they are put in a position where they feel forced to choose a response that doesn’t reflect their opinions or experiences, they are much more difficult for the research analysts to identify. Believing that offering opt-outs will encourage respondents not to answer questions is equivalent to assuming that respondents don’t want to participate in our study. If that is true of a respondent, then we shouldn’t want to include them in our data anyway.
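As a rough illustration of how easy such respondents are to spot, here is a minimal sketch in Python using pandas. The column names, opt-out codes, and the 80% threshold are all hypothetical; substitute whatever codes your survey platform actually exports.

```python
import pandas as pd

# Hypothetical export: one row per respondent, one column per question.
# "DK" (Don't Know) and "NA" (Not Applicable) stand in for opt-out codes.
responses = pd.DataFrame(
    {
        "q1": ["DK", 4, 3, "DK"],
        "q2": ["NA", 5, "DK", "NA"],
        "q3": ["DK", 2, 4, "DK"],
    },
    index=["r001", "r002", "r003", "r004"],
)

OPT_OUT_CODES = {"DK", "NA"}

# Share of questions each respondent answered with an opt-out.
opt_out_rate = responses.isin(OPT_OUT_CODES).mean(axis=1)

# Flag anyone who opted out of 80%+ of questions for review or removal.
flagged = opt_out_rate[opt_out_rate >= 0.8].index
print(flagged.tolist())  # ['r001', 'r004']
```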

Finally, we operate on a number of assumptions when taking a macroscopic view of a market. Learning what your participants don’t know is itself a finding. Denying the possibility that qualified respondents have blind spots in their knowledge can cause us to miss opportunities for customer outreach, and opportunities to make certain that we truly understand the respondents’ wants and needs.

Leverage optional responses appropriately

Leaving a question optional is sometimes seen as a gentle way to let respondents opt out of a question without including an opt-out response. But if the entire rest of the survey requires responses, there is no reason for a respondent to assume that this one question is optional. If, for some reason, a question is going to be made optional rather than given an opt-out response, it is crucial to include an instruction informing the respondent that this is the case.

In soft data surveys, we analyze the data by making some assumptions about the respondent’s intent, sometimes with insufficient information to do so. A respondent who leaves a question blank might have opted out of the question, from which we can infer an intent (e.g., no favorite store, no preference on brand); however, it is also possible that they did not see the question or had lost interest in the survey. This ambiguity is a significant shortcoming of making questions optional.
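One way to keep that ambiguity visible in the data is to code explicit opt-outs and blanks differently at cleaning time. A minimal sketch, assuming a hypothetical export where an opt-out arrives as the code "NONE" and a skipped optional question arrives as an empty cell:

```python
import numpy as np
import pandas as pd

# Hypothetical column from an optional "favorite brand" question.
raw = pd.Series(["BrandA", "NONE", "", "BrandB", ""], name="favorite_brand")

# "NONE" is a real answer (no preference) and can be aggregated;
# a blank is unknown intent and should stay missing, not be guessed at.
cleaned = raw.replace({"NONE": "No preference", "": np.nan})
print(cleaned.value_counts(dropna=False))
```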

Design for soft data and hard data

“Soft data” refers to data that is based on opinions and anecdotes: what is your preferred brand and why, how difficult was the product to install, and how would you rate your satisfaction.

“Hard data” refers to data that the respondent may need to look up: sales figures, number of employees, and percent of their work that is residential versus commercial.

It’s not a bad thing to intermingle soft data questions and hard data questions in the same survey; however, in a survey that is primarily gathering soft data, it may be best to treat all hard data gathered as approximations.
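In practice, treating hard data as approximate can be as simple as collapsing exact answers into broad bands before analysis. A minimal sketch; the question, figures, and band boundaries below are all hypothetical:

```python
import pandas as pd

# Hypothetical hard-data question asked mid-survey:
# "Roughly how many jobs did you complete last year?"
jobs = pd.Series([3, 12, 47, 110, 8, 65])

# Collapse exact recall into broad bands rather than trusting precise figures.
bands = pd.cut(
    jobs,
    bins=[0, 10, 50, 100, float("inf")],
    labels=["1-10", "11-50", "51-100", "100+"],
)
print(bands.value_counts().sort_index())
```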

Ask yourself, "Will they know this off the top of their head?"

If, on the other hand, the goal is to gather true hard data, then not only is it probably in our best interest to limit soft data questions, but the entire survey should be prefaced with an instruction informing respondents that we would like them to take the survey when they have access to documentation of those numbers.

Be consistent with orientation, direction, and magnitude of scales

When analyzing the data, it makes little difference whether your scale was presented increasing from left to right or decreasing vertically. What can have a big impact is whether the scales in your questionnaire were consistent every time.

Asking one question on a 1 to 5 scale and following it with a 1 to 7 scale can confuse respondents, as they have to learn a new scale for each question (and that assumes they are paying enough attention to notice it has changed).

Worse still would be to use a scale that increases left to right followed by a scale that decreases left to right (e.g., a 1 to 5 scale followed by a 5 to 1 scale). Did they select the left-most option in the second question because they were extremely satisfied, or because they were extremely dissatisfied and didn’t notice the scale had changed?
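If a reversed scale does slip into fielding, the standard fix at analysis time is reverse coding: map each value v to (scale_max + scale_min) − v so every item points the same way. A minimal sketch with a hypothetical column name:

```python
import pandas as pd

# Hypothetical: q_satisfaction was fielded on a reversed 5-to-1 scale while
# the rest of the survey ran 1-to-5.
df = pd.DataFrame({"q_satisfaction_reversed": [5, 1, 3, 2]})

SCALE_MIN, SCALE_MAX = 1, 5

# Reverse code: new = (max + min) - old, so 5 -> 1, 1 -> 5, 3 -> 3.
df["q_satisfaction"] = (SCALE_MAX + SCALE_MIN) - df["q_satisfaction_reversed"]
print(df)
```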

Design surveys for mobile devices

Time and again, studies have found that roughly 60% of the respondents in any online study are taking that survey on a smartphone. Smartphones have specific limitations that must be kept in mind when designing a survey. Their screen sizes are extremely limited, which means that

  • If there is a significant body of text or an image, the respondent may not be able to see the entire presentation at once and will instead have to scroll around and view it in pieces.
  • If there is an extremely large bank of attributes, the respondent may have forgotten the exact wording of the question by the time they reach the last attribute.
  • Dynamic functionality that looks good on a laptop or PC may be inconvenient, or entirely non-functional, on a smartphone.

And, no matter what we tell ourselves, virtually no one is going to type out an extensive, detailed response to an open-ended question on a smartphone keyboard. Keeping these concerns in mind when designing a questionnaire prevents respondent frustration and improves data quality by keeping the majority of your respondents engaged throughout the survey.
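One way to operationalize these checks is a small pre-field “mobile lint” over the questionnaire definition. Everything below is hypothetical (the in-house question representation and both thresholds), but it shows the idea:

```python
# Rough pre-field checks over a hypothetical questionnaire structure.
MAX_PROMPT_CHARS = 280  # longer prompts force scrolling on small screens
MAX_GRID_ROWS = 8       # longer attribute banks push the prompt off-screen

questions = [
    {"id": "Q1", "prompt": "How satisfied are you with your last purchase?", "grid_rows": 1},
    {"id": "Q2", "prompt": "Please rate each of the following attributes...", "grid_rows": 14},
]

for q in questions:
    if len(q["prompt"]) > MAX_PROMPT_CHARS:
        print(f"{q['id']}: prompt may require scrolling on smartphones")
    if q["grid_rows"] > MAX_GRID_ROWS:
        print(f"{q['id']}: consider splitting this attribute bank for mobile")
```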

Designing Your Market Research Survey

When developing a survey for your building materials company, there are several steps involved before you can even field the survey: determining your research objective, identifying your target audience, choosing a survey length, designing questions, and including termination points using screeners and red herring questions.

Studying building products customers is an entirely different ballgame from conducting a CPG study, not because of the market research processes or techniques deployed, but because of the intense effort necessary to find and vet the right respondents to keep your study results accurate. In the Home Improvement and Building Materials industry, it can be a struggle to find genuinely qualified respondents, especially if you’re conducting a study among Trade Professionals. If you’ve conducted internal market research before, you know these pains just as we do.

Over the past 30+ years, our team at The Farnsworth Group has developed and proven our system for finding and vetting Pros like Contractors and Builders as well as ways to distinguish between light-, moderate-, and heavy-DIYers. We will work with you to choose the appropriate criteria for your survey and ensure you’re getting accurate insights from the right people.

Written by Jeff Shull, Online Fieldwork Manager

Jeff brings two decades of experience in the home improvement and building products industry to his role managing all B2B data collection efforts and developing programming solutions to yield accurate answers to complex client questions. Based in Indianapolis, Jeff is the master of programming, data analysis, and data tabulations here at The Farnsworth Group. When he’s not programming studies for fielding, you will likely find Jeff tackling a new woodworking or woodturning project.  

Meet the rest of The Farnsworth Group team >>