Surveys are one of the most commonly used methods of collecting and analyzing data from a targeted population (a specific group of people). Using a set of standardized questions and a variety of delivery methods, surveys can quickly capture facts, opinions, and behaviors from a large group of people, which makes them a great elicitation tool.
We love surveys!
You can find them being used in almost every industry, business, and field of study. Surveys can affect all aspects of our lives: the products that we buy, how we perform our jobs, the people we vote for, the healthcare we receive, and our lifestyles, just to name a few. We love surveys so much we even created an entire game show around them - Family Feud first aired in 1976 and is still going strong!
How Can a Survey Help Me?
A well-designed survey can provide you with actionable input to identify, prioritize, and improve products, processes, and services, as well as provide a communication channel across an organization. Here are some instances where a survey comes in handy as an elicitation technique:
- Identify user/customer perceptions and expectations
- Gauge the level of customer satisfaction with a product, process, service, etc.
- Identify "Pain points" in an application, product, process, service, etc.
- Identify potential improvements, enhancements, requirements, new features, etc.
- Prioritize proposed changes/enhancements, etc.
Surveys are versatile, relatively inexpensive, easy to administer, and can provide focused, measurable data on a wide range of topics. When administered across an organization, by role, customer demographics, etc., the results of a survey can change preconceived ideas and open a productive dialogue.
As with any elicitation technique, surveys have some disadvantages in that the results can be affected by bias (the intentional or unintentional favoring of one result, outcome, conclusion, etc. over another). Bias can be introduced into your survey by: poor survey design (questions, responses, response type, flow, etc.), poor survey administration, overall response and abandonment rates, sample/population size/makeup, and respondent apathy, just to name a few.
The good news is that most of the disadvantages of using a survey can be mitigated by:
- Clearly identifying your target population (who will receive the survey)
- Planning and designing a survey that will provide valid, usable results
- Selecting who will administer the survey, and how it will be administered
- Aggregating, analyzing, and interpreting the results using standardized procedures
Before we get into the actual survey questions, let's review the entire survey process.
10 Steps to Survey Success.
Getting the dependable results you need requires some planning. Here are ten steps to help you navigate the process of building a successful survey experience.
Step 1 - Outline the purpose of the survey. Why are you creating a survey? What is it that you want to collect and analyze? Are you trying to prove a point, identify areas of improvement, or generate new ideas? What will you do with the data and/or the results you collect? Answering these types of questions will clearly define your goals and drive the rest of the survey process.
Step 2 - Identify who should participate. Often referred to as your target population, target demographic, or respondents, these are the group(s) of people you intend to survey - for example, customers, key stakeholders, users of an application, a particular age group, or a special needs demographic. If your target population is very large, such as an online customer base or a global company, consider surveying a portion of the population using a sampling method (e.g. random sampling). Make sure you have enough participants to ensure your results are a valid representation of your target population.
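If you do sample, the draw itself is easy to automate. Here is a minimal sketch of simple random sampling using Python's standard library; the customer IDs and sample size are purely illustrative.

```python
import random

def draw_sample(population, sample_size, seed=None):
    """Draw a simple random sample of respondents (without replacement)."""
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible for audits
    return rng.sample(population, sample_size)

# Hypothetical customer IDs standing in for a large online customer base
customers = [f"customer-{i:05d}" for i in range(20_000)]
recipients = draw_sample(customers, sample_size=500, seed=42)
print(len(recipients))  # 500 survey invitations, no duplicates
```

Because the sample is drawn without replacement, no one is invited twice; the seed is optional but lets you reproduce the exact same respondent list later.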
Step 3 - Select the delivery/collection method. Selecting the appropriate delivery/collection method for your survey depends on several variables, such as which method will achieve your overall goals given your budget, timeframe, the size/type of your target population, and the resources available to conduct the survey and analysis. We recommend using online/mobile surveys whenever possible. The most common methods of survey delivery/collection include:
- Online/Mobile Surveys - provide the broadest reach and the fastest turn-around time by using a website or mobile device to deliver and collect the self-reported responses of the participants. Offering the most flexibility in survey design, online surveys can be created quickly and inexpensively. We recommend using survey software (such as SurveyMonkey, Google Forms, etc.), which lets you quickly create dynamic surveys with full branching logic, track respondent participation, offer incentives, send electronic reminders, store surveys for edit and reuse, and total and summarize response data, just to name a few features. For respondents, online surveys allow a certain degree of anonymity, encouraging more honest responses. The convenience of an online design can also let participants complete the survey at their own pace by saving a partially completed survey to finish later, and pop-up instructions can provide additional direction.
- Face-to-Face Interviews - use an interviewer who physically meets with a participant in a neutral location, public space, the participant's home, etc., and are often used where the target population is not too large. This method enables the trained interviewer to interpret a respondent's reactions and provide/obtain additional information. However, that same personal touch can introduce interviewer bias, replace true responses with socially acceptable ones, and enable interviewers working on commission to submit multiple surveys and/or manipulate the data to benefit themselves, skewing your results. Additional expenses may be incurred if interviewers must be hired and trained.
- Telephone Surveys - can be a cost-effective method depending on local/long-distance rates. There are several types of telephone interviews available, all of which may use computer-assisted dialing (e.g. an auto-dialer):
- Traditional - uses an interviewer to call the respondent and conduct the survey.
- Computer assisted telephone interviewing (CATI) - the interviewer conducts the interview over the telephone by following a script that displays on their computer screen.
- Computer assisted personal interviewing (CAPI) - the interviewer conducts an interview in person by following a script that displays on their computer screen.
- Interactive voice response (IVR) - uses a computer to deliver a pre-recorded automated script with voice recognition or touch-tone phone response. This method can take advantage of branching logic, and can capture and store respondent answers and demographic information.
Although response rates for telephone surveys are higher than for mailed surveys, abandonment rates may increase because telephone surveys are often mistaken for telemarketing calls. Also, bias can be introduced for the part of the target population that does not have a telephone.
- Mail Surveys - are a low-cost method that uses a printed, paper survey mailed to the respondent, allowing them time to think about their answers and respond at their own convenience. That same convenience, however, is why the response rate is the lowest of all the methods. The completed survey is returned by the respondent via mail, and additional steps must be taken to get these hard-copy responses entered into an application for processing (e.g. scanning, data entry, etc.).
The method you choose will have an impact on the types of questions you use in your survey.
Step 4 - Design the survey. This is the fun part! Here is where you decide what questions to ask, what type of data you want to collect, the format of the responses, how the questions should flow, whether you should use images or not, the number of questions, etc.
Step 5 - Create a little hype! Generating a little email buzz about the upcoming survey can improve your chances of a faster turn-around time. This is especially true if the purpose of your survey is to improve/revise an existing user application and/or process. It lets people know that there is a light at the end of the tunnel and it's not a train!
Step 6 - Test the survey. Conduct a "test run" with a small, select group and elicit feedback. Have them rate the survey on factors such as: ease of use, understandability, clear instructions, clear and complete responses, whether a question should be asked a different way, whether any questions or responses are redundant, whether any important piece of data is missing, whether the flow/branching logic works correctly, length of time to complete, etc. Also, if using online or mobile devices, test the link and survey on various devices and browsers! Based on the outcome of the test, revise the survey if necessary.
Step 7 - Administer the survey. Select the day and time to launch the survey (we recommend launching on a Monday morning). Provide the participants with all the survey information (via email, social media, etc.) such as: title of the survey (yes, give it a name), start date, end date, a brief explanation of the survey, delivery method information (e.g. URL, phone number, address of on-site interview, etc.), and any information they should have on hand prior to taking the survey. And don't forget to thank them, in advance, for taking time out of their busy day to complete the survey!
Step 8 - Send reminders. Monitor your response rate and send out friendly reminders regarding the importance of completing the survey (e.g. their feedback matters). Include survey end date (also, number of days left to complete) and all survey information. Also, send reminders on the last day of the survey. When the survey ends, send a big thank you - and a "can't wait to share the results and analysis with you" type of message.
Step 9 - Process, store, and analyze the results. You've put a lot of effort into getting these results so hang onto them. A copy of the raw data (actual participant results) should be labeled and kept for future comparative studies. Decide how you want to tally, process, and interpret/analyze the raw data.
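The tally step in Step 9 can be sketched in a few lines. This is a minimal example using Python's standard library; the question answers shown are hypothetical placeholders for your own raw data.

```python
from collections import Counter

def tally(responses):
    """Tally closed-ended answers, returning (count, percentage) per answer."""
    counts = Counter(responses)
    total = len(responses)
    return {answer: (n, round(100 * n / total, 1)) for answer, n in counts.items()}

# Hypothetical raw answers to one multiple-choice question
raw = ["Agree", "Agree", "Neutral", "Disagree", "Agree", "Neutral"]
summary = tally(raw)
print(summary)  # e.g. "Agree" maps to (3, 50.0)
```

Keeping the raw list (`raw`) separate from the derived summary mirrors the advice above: label and archive the raw data, and regenerate the summaries from it whenever you need a new cut of the analysis.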
Step 10 - Present the Results and Recommendations. How you present the results and how you convey your recommendations is the culmination of all your work. Whether you use a presentation and/or a report, include the purpose of the survey, the total number of participants surveyed, and the total number of participants that responded. Include the original survey questions and responses. When presenting the data itself, include numbers and percentages, and consider a graphical representation of the data points. Also, provide an appendix or separate document that contains the actual raw data.
There's More Than One Way to Ask a Question.
When it comes to selecting how you ask a question, you have plenty of options to consider and each one will allow you to collect different aspects/views of data. Here are some of the most popular types of survey questions.
- Open-Ended Questions - provide a text box for respondents to enter their own answer in their own words. Use this type of question when you want to explore a topic in detail or to discover unknown opportunities. Analyzing this type of question can be time consuming, as each response could contain multiple topics that will need to be tagged, grouped, and summarized with other respondents' answers - for that reason, use them sparingly and keep the text area short (e.g. allow a few sentences).
- Closed-Ended Questions - provide a pre-defined set of the most probable answers for the respondent to select from. There is more than one type of closed-ended question to choose from; here are a few of the most commonly used:
- Multiple Choice - provides a pre-defined set of answers that can require the respondent to select one answer or allow them to select multiple answers. Use this type of question when you are looking for the frequency of an answer - say, to prove/disprove a known point.
- Dichotomous - is a multiple-choice question with only two possible answers (e.g. Yes or No, True or False). This type of question can be used to provide a clear distinction or act as a filter between two entities in order to separate them. Once separated the two distinct groups can branch-off into different question sets (known as branching or skip logic).
- Paired Comparisons - similar to dichotomous questions, respondents must select between two alternatives (e.g. red or blue, brand name vs. features, etc.).
- Scaled Responses - provide a progressive range or scale of answers the respondent can select from. Use this type of question to obtain information that has a progressive order (e.g. age, income level, frequency, etc.) or to measure subjective data (such as agree-disagree, opinions around quality, etc.). Responses should be clearly labeled, and for ease of use and analysis, try to limit the number of available responses to between five and seven. If you are using more than one scaled-response question in your survey, make sure they are consistent (e.g. the rating choices are ordered from low to high, or high to low, for all questions). These types of questions also require instructions (e.g. "Using a scale of 1 to 5, where 1 means Strongly agree and 5 means Strongly disagree, how would you describe...").
- Ranked or Ordinal - asks respondents to rank a given set of responses based on level of importance (e.g. "Please rank all the following features you would like added to the application based on order of importance to you, using numbers 1 through 5 where 1 is the most important and 5 is the least important.").
- Semi Closed-Ended Questions - provide the structure of the closed-ended question and the flexibility of the open-ended question, by providing free text boxes that appear based on selection of a particular response. For example, giving the participant a multiple-choice question where one of the responses is "Other". When "Other" is selected a free text box may display to enable the respondent to provide additional details (don't forget to label this free text box).
- Demographic Questions - gather information about the respondent (e.g. age, income, role, etc.). In addition to segmenting your population for analysis, they can also be used to branch/skip a respondent to a specific set of questions within the survey. In general, keep demographic/identifying information to a minimum to increase your response rate.
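The branching (skip) logic mentioned for dichotomous and demographic questions is, at its core, just a lookup from an answer to the next question set. Here is a minimal sketch; the question ID, answers, and section names are all illustrative, and real survey software handles this for you behind the scenes.

```python
# Each (question, answer) pair maps to the section the respondent skips to.
branches = {
    ("Q1", "Yes"): "mobile_user_questions",
    ("Q1", "No"): "desktop_user_questions",
}

def next_section(question_id, answer, default="general_questions"):
    """Return the question set a respondent should branch to after answering."""
    return branches.get((question_id, answer), default)

print(next_section("Q1", "Yes"))    # mobile_user_questions
print(next_section("Q1", "Maybe"))  # general_questions (fallback for unmapped answers)
```

The fallback section matters: if a respondent's answer is not in the branch table (or they skip the filter question), they should still land somewhere sensible rather than dead-end.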
Tips and Techniques to Design an Effective Survey.
Keep the survey short to avoid higher abandonment rates (e.g. 7 to 12 questions that should take no more than 10 minutes to answer). If you have more questions, break them up into two surveys.
Give the respondents enough time (not too much) to answer the survey (e.g. 7 to 14 days).
Explain the purpose of the survey (keep it short - 1 to 2 paragraphs).
Keep it flowing, start with general questions, and then move to the more specific questions.
Group similar questions together.
Select a font style and size that is easy to read.
If available, provide a contact number to support respondents with survey issues.
Conclude by thanking the respondents for participating.
Questions and Answers
- Be brief and to the point - remove unnecessary words or phrases.
- Separate explanations from the actual question to avoid confusion.
- Make sure the questions tie back to the purpose of your survey.
- Use simple clear language - avoid abbreviations, acronyms, and jargon (e.g. LOL!)
- Do not use ambiguous language that can be interpreted differently by each respondent (e.g. "Are you a "regular" visitor...?").
- Explain any terms that might be confusing or misunderstood.
- Do not ask leading questions, where the question itself suggests an answer (e.g. "We think our search engine is excellent, how would you rate our search engine?").
- Do not use compound questions (e.g. two questions in one - "Rate the speed AND accuracy....?")
- Add "Other" as an answer option with a free text box, and "Not Applicable" where appropriate.
- Limit use of Open-Ended questions.
- Use spell check and proof read your surveys - for mail surveys, check for printing errors.
- Use radio buttons (circle buttons), when you want a user to select only one answer.
- Use checkboxes (square boxes), when a user can select one or more answers.
- Answers should be mutually exclusive - meaning answers do not overlap. For example:
- Response A = 1 to 5, Response B = 6 to 7
- NOT Response A = 1 to 5 and Response B = 5 to 7 (where the 5's are in both answers)
"Survey Says!" - Surveys will be around for quite awhile.
A survey is a powerful elicitation tool that can be used at the start of a project, for continuous process/product improvement, to launch new products/features, to help change the business culture, and to communicate across a spectrum of diverse customers and user communities. As a business analyst, being able to conduct a survey, analyze the results, and communicate those results along with your recommendations will become one of your go-to skills. Keep in mind that with great surveys comes an even greater expectation that you will act on the results - and if you deliver, survey says: your next survey will be a success!
If you want to learn more about survey tools, look for my cohort's next blog, where Kathleen will discuss the ins and outs of various online survey tools. If you liked this article, or are interested in other business analyst topics, leave us a comment. We'd love to hear from you!
Ready to get started? Request your free Survey Guidelines ebook here: https://thebazone.com/surveyebook