I didn’t expect tech writing to involve creating so many surveys. One day I may go back and get another degree in math or statistics and then just specialize in creating surveys and analyzing product users—I like making surveys that much. For now, my goal is to learn about Google’s survey tool and branch out from SurveyMonkey.
Several years ago, our department achieved a coup by getting permission to pass out a survey on documentation at our user conference. We stood outside one of the conference halls as people waited in line to file in, chatting and handing out surveys. I enjoyed getting to talk to actual customers.
We got about 40 responses out of all the folks who took surveys—not bad. But we had no follow-up plan for using the information. Management didn’t develop one, and I was too inexperienced to effectively push for such a plan. We made a tweak to release notes as a result, but the effort is largely remembered (unfairly) as much ado about a survey that didn’t pan out.
I would argue that we got a baseline of responses to common questions that we can ask repeatedly over time. Besides, what can you expect from one survey? One survey does not equal user analysis. It’s a baby step.
The following year some team members took another survey to the same conference, but the surveys sat in a stack on a table. Team members remember three people taking surveys; none were returned. It was the year of no surveys.
The next round of surveys came with the Help Design project. When we were ready to show a prototype of our new help system format, our director asked us to gather responses from the focus group in a written form, such as a survey. This would make it easier for us to take those tangible preferences to other stakeholders within the company in order to get our ideas implemented.
Currently, I’m involved in two new surveys: one at work to gather general information about our users, and one for the Usability and User Experience SIG of STC to gather requirements for their web site redesign. As a matter of fact, it would be helpful if you took that survey, if you are interested in usability; here it is.
I’m starting to see some helpful patterns, which I’ll share here:
- List your objectives.
Once you start sending your survey questions around to a group for review, it’s easier to talk about which questions should or shouldn’t be included if you can talk about which objectives the survey supports. You can brainstorm a list of overall objectives for user analysis, then choose which handful of objectives you are focusing on for this particular survey.
- Think about how often you will solicit input from users.
When will you do your next survey or some other form of user input? We decided to do two outreaches per year, in addition to ongoing help system analytics. It’s easier to let go of some of the objectives for this round if you know you can focus on them next time.
- Cultivate a list of contacts.
Our current survey will go to the clients who weren’t able to be included in the help design focus group. This is the result of a few courtesy emails on my part. We sent out an email soliciting clients for the focus group, but there was limited space. For the folks who weren’t included, I emailed to let them know we didn’t have space for them and to ask if I could contact them for future feedback opportunities. Not one of them said no.
- Give concrete examples to make the questions easier.
I tend to think of this as beating around the bush. If one of your objectives is to find out what types of manuals and other deliverables you should be providing, I don’t think you should ask, “What types of manuals would be helpful to you?” That’s a question for tech writers—we’re the experts on that. The survey should gather related information that helps us answer that question. Here’s something a little better: “How do you like to learn new things?” followed by some examples: “A. Instructional videos. B. From a co-worker. C. Classroom training. D. Trial and error on my own.”
- Make the questions distinct from each other and keep it short.
Please don’t make a survey that reminds people of a behavioral-style job interview. Do you really need three questions out of ten to gather demographics? Unless user demographics is one of your major objectives, cut it down and combine questions. Are your questions distinct from each other, or will respondents get annoyed, thinking they have just answered the same question twice? You might be trying to discern a subtle difference with the two questions, but that might not be apparent to someone who is quickly completing the survey.