Five tips to create more reliable responses
By Allison Meserve, Health Promotion Consultant, Health Promotion Capacity Building, Public Health Ontario
Surveys are a common data collection method for formative, process and outcome evaluations, as they are a low-cost way to obtain data in a short amount of time. Creating a survey, however, is not as simple as entering a set of questions into an online survey program. Designing surveys that accurately reflect people’s attitudes or behaviours involves multiple steps, each of which takes time and thought.[1,2] Some steps in designing a survey include determining: the purpose of your evaluation and whether a survey is the right data collection method; who to invite to take your survey; how the survey will be administered (self-administered versus interviewer-administered; paper versus online); when the data will be collected; what questions will be asked; and how the data will be analyzed and presented.[1,2] The purpose of this article is to provide five tips for creating questions for a self-administered survey. While it is not possible to discuss all of the steps in detail within this article, helpful resources for conducting them are provided at the end.
Based on the evaluation questions, you may already have ideas about what types of indicators will be useful to measure the needs of your community or the outcomes of a program. But how are those indicators turned into questions that will lead to reliable and usable results?
Tip 1. Know your respondents. Surveys should be written with the people who will complete them in mind. The wording of survey questions goes beyond writing them at a Grade 6 reading level. For example, if you are working with a newcomer community, you might consider translating the survey into the appropriate language for that community. Another consideration in survey development is the use of appropriate terminology. For example, if survey respondents are physicians, you might choose to use the term ‘myocardial infarction’ as opposed to ‘heart attack.’ Knowing which terms your respondents would use increases their ability to answer correctly, as well as their willingness to complete the survey.[2,3]
The types of questions that are perceived as sensitive or intrusive can vary as well. Asking intrusive questions, such as those regarding income, sexual behaviours, criminal activity or medical history, at the start of your survey may lead many people to stop the survey prematurely. If possible, place questions on sensitive topics towards the end of the survey; doing so allows respondents to become more comfortable with the questions being asked and to appreciate why they are being asked about a sensitive topic.
Tip 2. Consider length. The length of each individual question, along with the length of the survey as a whole, matters. The perceived length of a survey affects how many people will start and finish it. If using a paper-based format, ensure there is sufficient white space for respondents to clearly read and follow along. If using an electronic format, consider whether a new page is necessary (for example, to allow for skip patterns). Moreover, reading long sentences before answering may cause respondents to forget what was asked or to misinterpret the question. As such, it is helpful to eliminate words that do not need to be included (e.g., instead of ‘has the ability’ use ‘can’; instead of ‘take into consideration’ use ‘consider’). In interviewer-administered surveys, it is recommended to include the answers in the question: “Are you very likely, somewhat likely, somewhat unlikely or very unlikely to eat ice cream at least once this week?” Because the respondent will read (rather than just hear) all of the answer choices in a self-administered survey, it is better to shorten the question: “How likely or unlikely are you to eat ice cream at least once this week?”
Tip 3. Think about all parts of the question. A survey question is made up of three parts: the question stem, additional instructions and the answer choices or spaces. If any part does not work on its own, or the parts do not work together, the likelihood of inaccurate responses increases.
The question stem is the wording of what you are asking respondents, for example, “In which city or town did you go to high school?” One key consideration is to make sure the question is relevant to everyone who will be asked to take your survey. In this example, consider whether some respondents might not have attended high school. If this is a possibility, include an answer choice of “I did not go to high school.” Alternatively, ask all respondents whether they attended high school before asking the follow-up question about the city or town. Another important consideration is to ask only one question at a time. A question that asks two or more things at once, such as “Do you like peanut butter and jelly sandwiches and milk for lunch?”, is called a double-barreled question. A respondent may like the sandwiches but not the milk, or vice versa, and therefore have difficulty answering the question.
Additional instructions clarify how a respondent should answer, or provide definitions. For example, with the question stem above, you might add, “If you have attended high school in more than one city or town, please write the last city or town where you attended high school.” Additional instructions help avoid confusion or frustration when respondents do not understand how they should respond.
The answer choices or spaces either list the available answers or give respondents room to write in their own. If answer choices are provided, they should align with the question stem. For example, asking “How many days a week do you cook dinner at home?” and giving answer choices of “always, mostly, sometimes, never” creates confusion. Instead, provide a space for people to write the number of days, or give answer choices such as “Every day, 5–6 days, 3–4 days, 1–2 days, I hardly ever or never cook dinner at home.”
Tip 4. Include the respondents. Including respondents, or people who work with respondents on a regular basis, in the design of your survey will better enable you to achieve tips 1–3. Respondents are best placed to identify: appropriate words or phrases; whether the questions are difficult to understand; whether questions are placed in a logical order; whether all the necessary answer choices are available; and whether questions are unnecessarily intrusive.
Tip 5. Pre-test the survey. Whether or not you are able to include respondents in the design of the survey, pre-testing it with respondents is always recommended. One way to pre-test a survey is to have people similar to the respondent group read it and explain aloud how they would determine their answers (cognitive interviewing). You could also ask those who work with your respondents to review the survey for length, clarity and word choice. If possible, have someone familiar with survey design review the survey as well; they may identify problems with questions or answer choices that would not be obvious to others but could lead to inaccurate responses.
Writing and designing surveys is a complex task. Entire graduate-level courses exist to teach health promoters and researchers how to design reliable, valid surveys that produce usable results. By using some of the tips above and consulting the resources below, you can begin to create surveys that will enable you to better plan new programs and improve existing ones. If you would like help designing a survey, or would like someone to review an existing survey and provide feedback, please contact me, or other consultants on the Health Promotion Capacity Building team, at firstname.lastname@example.org.
1. Mathison S. Encyclopedia of Evaluation. Thousand Oaks, CA: Sage Publications, Inc.; 2005.
2. Newcomer KE. Using Surveys. In: Wholey J, Hatry H, Newcomer K, editors. Handbook of practical program evaluation. 3rd ed. San Francisco, CA: Jossey-Bass; 2010. p. 262-97.
3. Dillman D, Smyth J, Christian L. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 4th ed. Hoboken, NJ: John Wiley & Sons, Inc.; 2014.
The University of Wisconsin-Madison developed a short guide on survey fundamentals using “non-technical terms” available at https://oqi.wisc.edu/resourcelibrary/uploads/resources/Survey_Guide.pdf.
Burns et al. wrote an article on designing self-administered surveys for clinicians, much of which is relevant for any type of respondent: http://www.cmaj.ca/content/179/3/245.full.pdf+html.
Statistics Canada has a very detailed guide called Survey Methods and Practice, available in English and French. The guide includes chapters devoted to designing the survey.
Bernard et al. wrote an article highlighting 48 common biases found in surveys, many more than there is room to cover here: https://www.cdc.gov/pcd/issues/2005/jan/04_0050.htm.
The Centers for Disease Control and Prevention created the Checklist to Evaluate the Quality of Questions available at https://www.cdc.gov/HealthyYouth/evaluation/pdf/brief15.pdf. The checklist can be used either when reviewing your own survey, or when asking colleagues to review.
Online course platforms such as Coursera and edX offer free courses taught by researchers who specialize in survey design.