Survey writing with STEM students at Open Minds, Open Doors

I recently had the opportunity to speak to some middle school girls about my job through the Open Minds, Open Doors initiative. Sponsored by our local area education agency, this innovative program invites young girls interested in science, technology, engineering, and math (STEM) to learn about STEM career opportunities from people employed in potential career fields. As a research analyst who took a while to find my niche, I felt it would be good for these girls to know about the field of market research because it’s such a behind-the-scenes job, unlike the ever-popular veterinarian.

Besides telling them about myself and how I arrived at this career, I showed them a little of what my work entails. I walked through the whole life cycle of our studies, from our research strategist sending a proposal to a prospective client, to presenting the results, including the analysis that helps the client understand what the results really mean for their business.

To illustrate the complexity of the study process, I created a five-question survey with questions relevant to them.

First, I had them complete the survey. Once they were done, I showed them how our software could automatically graph their responses for me. I explained that the analysis has to go beyond that, because sometimes the automatic graphs aren't quite right. There wasn't much interest in that part, but, as I'd hoped, they were much more animated when I got to what came next.

I had purposely written each question to demonstrate a different problem in survey writing. One question asked which school they attended and listed every possible school, but I randomized the order, which made it harder for them to find the school they were looking for. Another question asked how satisfied they were with various teachers, but the scale was skewed: it offered one very negative option and three gradations of positive ones.
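The two flaws above can be sketched in a few lines of Python. This is purely illustrative (the school names and scale labels are made up, not the ones from my survey): a randomized list is harder to scan than a sorted one, and a skewed scale leaves respondents fewer ways to voice dissatisfaction.

```python
import random

# Flaw 1: randomizing the option order makes respondents hunt for their answer.
schools = ["Jefferson", "Lincoln", "Roosevelt", "Washington", "Adams"]
randomized = random.sample(schools, k=len(schools))   # hard to scan
alphabetized = sorted(schools)                        # easy to scan

# Flaw 2: a skewed satisfaction scale -- one negative option, three positive.
skewed_scale = ["Very dissatisfied", "Satisfied", "Very satisfied", "Extremely satisfied"]
balanced_scale = ["Very dissatisfied", "Dissatisfied", "Satisfied", "Very satisfied"]

def negative_share(scale, negatives={"Very dissatisfied", "Dissatisfied"}):
    """Fraction of the options that let a respondent express dissatisfaction."""
    return sum(option in negatives for option in scale) / len(scale)

print(negative_share(skewed_scale))    # 0.25 -- dissatisfaction is under-represented
print(negative_share(balanced_scale))  # 0.5  -- the options split evenly
```

The point the girls spotted is exactly what `negative_share` quantifies: when only a quarter of the options are negative, the question itself is nudging the results upward.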

I went through each question and asked them what was wrong with it. Although the first guess sometimes wasn't right, after discussion they would always arrive at the answer I was aiming for. As I listened to their ideas, I realized how much of the process of taking a survey is learned. They didn't immediately understand how to answer a ranking question, for example.

Talking with them also reinforced how much of my work is about language. Not only do I have to keep constant watch on my vocabulary, but I have to remember to write clear, simple instructions so respondents understand what is expected of them.

In fact, this is a constant area of improvement for our team. As we analyze the responses to new question types, we can tell whether the survey instructions need tweaking. Client feedback is also very helpful, because clients are not as used to seeing the survey questions as we are.

The goal, ultimately, is to create a survey that gathers the information our clients need with the least amount of work for the respondents. Practices like alphabetizing a list of names, or offering the response options that will capture what respondents would otherwise have typed into a text box, make the process faster and easier. Instead of respondents dropping out of a survey or skipping an item because they couldn't find their answer, we get the data most useful for our clients.