Need a Survey? Make One!

Modern online services make creating and managing a complex online survey surprisingly easy

Ultimately it was handling open-ended questions that caused the gnashing of teeth and scratching of the head.

This story began late last year, prior to my position with RWEE. iBiquity Digital Corp. commissioned my consulting company, Rareworks LLC, to create and run a survey to gauge current use of HD Radio technology among public radio stations. I had never fielded an online survey — or any survey, really, except for asking the family if they wanted ketchup on their stadium hot dogs — but I had taken enough of them in the course of business that I imagined I could make one that wouldn’t be annoying to the public radio engineers who participated and that would return good data for iBiquity.

Well, the process was perhaps not as linear as I imagined, but the outcome returned usable information in a pleasant presentation. While the final survey results have not been released publicly by iBiquity as of this writing, I wanted to relate here the process of building an online survey and what to be aware of in that process, because you may find yourself tasked with creating and managing a survey for your station or group.

START WITH WHAT YOU’RE GIVEN

The company supplied me with a spreadsheet of public radio stations that held HD Radio licenses, and a skeleton set of survey questions that asked a station about its use of HD Radio technology, Program Service Data, HD Radio power and the like. The survey was to be run for several months, and I was charged with gathering the data, analyzing it and writing a formal report.

 You have probably taken lots of online surveys that auto-magically seem to know you’ve replied a certain way to a question and then offer more questions along the same topic. In a survey tool, these programming tricks are called “question and answer piping” and “skip logic.” Depending on one’s answers, additional questions are revealed or skipped entirely. My survey questions were no different; if a station didn’t operate auxiliary/booster/repeater HD Radio stations, there was no reason to ask the respondent for additional call letters and station information.
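Under the hood, skip logic amounts to little more than a condition attached to an answer that decides which question appears next. The sketch below illustrates the idea; the question IDs and data structure are hypothetical, not QuestionPro’s actual configuration.

```python
# Minimal sketch of skip logic, not QuestionPro's actual data model.
# Question IDs and fields here are hypothetical, for illustration only.
questions = {
    "Q5": {
        "text": "Does your station operate auxiliary/booster/repeater HD Radio stations?",
        "choices": ["Yes", "No"],
        # If the answer is "No", jump past the follow-up detail questions.
        "skip_logic": {"No": "Q9"},
        "next": "Q6",  # default: ask for the additional call letters, etc.
    },
}

def next_question(question_id: str, answer: str) -> str:
    """Return the ID of the next question, honoring any skip rule."""
    q = questions[question_id]
    return q.get("skip_logic", {}).get(answer, q["next"])

print(next_question("Q5", "No"))   # -> Q9 (skips the follow-ups)
print(next_question("Q5", "Yes"))  # -> Q6 (asks for call letters and details)
```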

Mental Note #1: Don’t annoy your survey participants.

I also had seen surveys that auto-populated survey fields from an email invitation, and this too was desirable. For instance, I wanted to send an email invitation to a station and when the engineer/general manager clicked on the link, a Web browser would open to the survey with the station’s call letters already within the “Your station call sign” answer box.
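This kind of pre-population is typically done by appending the known values to the survey link as URL parameters, which the survey engine then maps onto answer fields. A rough sketch of generating personalized links follows; the base URL and the “callsign” parameter name are made up for illustration and are not QuestionPro’s actual scheme.

```python
# Sketch of building pre-populated invitation links; the base URL and the
# "callsign" parameter name are hypothetical, not QuestionPro's actual scheme.
from urllib.parse import urlencode

BASE_URL = "https://example.questionpro.com/t/hd-radio-survey"  # placeholder

def invitation_link(call_sign: str, contact_email: str) -> str:
    """Return a survey URL that carries the station's call sign with it."""
    params = urlencode({"callsign": call_sign, "email": contact_email})
    return f"{BASE_URL}?{params}"

print(invitation_link("WXYZ", "gm@wxyz.example.org"))
# https://example.questionpro.com/t/hd-radio-survey?callsign=WXYZ&email=gm%40wxyz.example.org
```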

THE SEARCH FOR THE RIGHT ONLINE SURVEY TOOL

It seems like every research project begins with a search for existing tools, and this project was no exception.

I found what I thought was the simplest and least expensive concept: download an open source survey tool, get a free Amazon Web Services server, install the survey software on it, write the survey itself and sit back and gather the results.

Then little nagging concerns started floating up: How will I verify the results? How will I manage survey invitations? What on Earth would the data report look like? How would meaningful graphs showing valid data be constructed? Could I keep the data secure? Could respondents take the survey on any mobile and desktop platform?

Suddenly it looked like the least expensive software path would be the most expensive in time and trouble to create, run, validate, secure and maintain. Therefore, it was back to Web searches for “online survey tools.” A number of articles attempted to round up and review various commercial online survey companies and their services; while these described the facets of running an online survey, the feature and price lists of survey providers didn’t seem quite right. The references seemed … dated. And sure enough, the articles had been written years ago and the companies’ costs had changed, as had the nature of their business.

Mental Note #2: Don’t spend too much time poring over reviews and articles that are over a year old — the Internet apparently moves faster than that.

While a number of online survey companies popped up in the Web search, a closer look showed each had a range of services that might or might not fit with the project. The free online survey offerings were too limited in number of responses and analytic detail, while a number of the monthly for-pay services lacked text analysis (the open-ended questions where a user could enter anything) and the ability to neatly “pipe” data from an invitation email into the survey itself. Some companies offered everything I needed for this survey but only at a high annual rate.

More searching and testing of survey providers’ free trials narrowed the field to a company called QuestionPro (questionpro.com) as the best provider for this project; this doesn’t mean that other providers aren’t suitable for your survey projects — far from it. The provider I selected had the month-to-month arrangements and the analytical tools to make it efficient to create and run the survey I had in mind. The company also offered extensive email invitation management, which I later discovered was a sanity-saver.

IF YOU TRY TO CONTACT A STATION, WILL THEY ANSWER?

With the survey provider selected, the next step was to build out the contact database with current names, phones and emails for all of the listed stations.

This would have been a daunting task, but an examination of the spreadsheet showed that many of the listed stations shared the same licensee. Grouping the stations by licensee cut the number of contacts by more than half, and with available Internet sources, a rough contact sheet was built over several weeks.
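A few lines of scripting can do that grouping. The sketch below assumes a simple CSV with “call_sign” and “licensee” columns, which is an illustrative layout rather than the exact spreadsheet I was given.

```python
# Sketch: collapse a station list to one contact row per licensee.
# Assumes a CSV with "call_sign" and "licensee" columns (hypothetical layout).
import csv
from collections import defaultdict

stations_by_licensee = defaultdict(list)
with open("hd_stations.csv", newline="") as f:
    for row in csv.DictReader(f):
        stations_by_licensee[row["licensee"]].append(row["call_sign"])

# One outreach contact per licensee, covering all of its stations.
for licensee, calls in sorted(stations_by_licensee.items()):
    print(f"{licensee}: {', '.join(calls)}")
```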

Some interesting effects of this information-gathering became known quickly: When one is trying to email a human, the most discouraging email addresses begin with “info@” or “mail@” or “inquiry@.” Given the lack of responses I received when sending to these station addresses, they might as well have been named “NoOneReadsThis@” or “Unresponsive@.” Some stations seemed to be actively discouraging direct listener contact by making their phone number hard to find, or by not listing anyone “in charge.”

As an interesting aside, I noticed that job turnover among station managers and leadership wasn’t always reflected on the station’s public-facing portals.

Mental Note #3: Politely request public radio stations to please review and update their website and enhance their contact information.

IF YOU BUILD A SURVEY, WILL THEY COME?

Fig. 1: The opening page of the survey. If a participant clicked the email link, this page would open and the survey engine would log that they had “read” the invitation.

The survey started taking shape after repeated testing and debugging; the data piping and question-skip features worked well, and the layout was attractive with a custom logo. Fig. 1 shows part of the opening page.

A number of open-ended questions had to be included in the survey because respondents were asked in several places “Please tell us your reason for doing …” or “What is your Importer software version?” Every question also had an INFO icon the respondent could click to see further explanation of the context for that particular question. Although QuestionPro offers tutorials on using its survey engine, it always seemed faster to dig in and play with the survey creation tool, then read the documentation to understand the finer points.

On Jan. 29, the survey was announced on the PubTech list server and the first wave of email invitations was sent. These emails used contact info that had been gathered, then filtered into a comma-delimited file and uploaded in batch to the QuestionPro site. From there, a customized email message was built automatically that neatly folded in the station manager’s name, title and call letters.
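The upload file itself was nothing exotic. Something like the sketch below produces a comma-delimited file a survey engine can merge into its email template; the column names are illustrative rather than QuestionPro’s required headers.

```python
# Sketch: write the contact list as a comma-delimited file for batch upload.
# Column names are illustrative; check your survey provider's required headers.
import csv

contacts = [
    {"first_name": "Pat", "title": "General Manager",
     "call_sign": "WXYZ", "email": "gm@wxyz.example.org"},
    {"first_name": "Lee", "title": "Chief Engineer",
     "call_sign": "KABC", "email": "ce@kabc.example.org"},
]

with open("invitation_upload.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=contacts[0].keys())
    writer.writeheader()
    writer.writerows(contacts)
```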

One aspect of this process was unfamiliar: The CAN-SPAM Act for commercial email requires senders to give recipients a way to stop the messages and carries stiff penalties for violations. Accordingly, every QuestionPro survey invitation automatically included an unsubscribe link as well as the other CAN-SPAM-required information, and all email originating from the QuestionPro system is checked for CAN-SPAM compliance — or it isn’t sent.

After the initial batch of invitations was sent, the QuestionPro system began logging bounced email and those who clicked the link within the invitation to start the survey. Each bounced mail was handled by telephoning the station to ask for the correct address; often I found that the staff had changed or the person had left the station. After gleaning the correct email, it was a straightforward step to edit the survey mail list in QuestionPro and resend just that invitation. After several days of correcting email addresses, the next batch of invitations was sent, and subsequent email bounces within that batch were corrected.

REACHING OUT TO PROD THE PEOPLE

Once all the invitations were sent, there was a period of quiet, and I was thrilled to see responses start to trickle in. Then no new respondents appeared.

Mental Note #4: Everyone, not just you, is busy with activities other than your survey.

QuestionPro has tools that easily send formatted reminder emails to respondents who have not completed the survey, and a new email template was created to gently remind the stations that we were hoping for their participation. The survey tools showed the survey took an average of 18 minutes to complete — information that was folded into the reminder message. Several cycles of sending a reminder, waiting for responses and sending another reminder occurred, but participation felt like it needed a boost.

This was the cue to reach out and call stations to remind them of the survey, correct contact information and chat. Although time-consuming, it was a pleasure to reconnect with station managers and engineers, ask about their spring pledge drive, chat about HD Radio technology, hear about their specific challenges with equipment or staffing or other technology and get a feel for the public radio system in general. This contact with stations boosted their interest in the survey and gave me anecdotal material to fold into the final report.

Along the way, I discovered an issue with the way the survey asked a station for its current Importer/Exporter version. No one has that software information at hand, so when a respondent tried to save the page and return later, QuestionPro balked because every question on the page had to be answered before the page could be saved. Several stations had to repeat the survey because their information wasn’t saved, and other stations had partial answers; both issues caused delay and annoyance. In retrospect, I should have placed large notices ahead of these pages and allowed respondents to save and return later.

Mental Note #5: Hindsight is wise.

A QuestionPro trouble ticket was filed, noting this behavior and requesting that the survey tools allow responses to be saved at any time from any page. Fortunately, the QuestionPro survey controls allowed for on-the-fly editing and advanced survey management, so this challenge was overcome.

ANALYZING THE RESULTS

On March 30, the survey was closed and the data analysis began.

The first order of business was to cull empty responses. Making the survey easy to take by allowing multiple responses from the same station created a situation where (I imagined) the general manager would start the survey, see the technical questions, stop and forward the email invitation to the station engineer, who would then start a new instance of the survey. Some stations had up to seven false survey starts that had to be manually inspected and deleted. After culling the empties, completed and incomplete surveys were reconciled; some stations had one completed survey and several partially completed ones. Every partially completed survey was checked quickly to ensure nothing significant was missed. Finally, stations’ surveys that had been started but never completed were reviewed; they contained valid data that was folded into the final report, and the number of incomplete surveys was reported.
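If you export the raw responses, much of that culling and reconciliation can be scripted. The sketch below assumes each exported row carries a call sign, a completion flag and an answered-question count; that layout is an assumption for illustration, not QuestionPro’s export format.

```python
# Sketch: keep the most complete response per station from an exported list.
# The response fields here are illustrative, not QuestionPro's export format.
responses = [
    {"call_sign": "WXYZ", "completed": False, "answered": 3},
    {"call_sign": "WXYZ", "completed": True,  "answered": 42},
    {"call_sign": "KABC", "completed": False, "answered": 17},
]

best = {}
for r in responses:
    current = best.get(r["call_sign"])
    # Prefer completed surveys, then the one with the most answers.
    key = (r["completed"], r["answered"])
    if current is None or key > (current["completed"], current["answered"]):
        best[r["call_sign"]] = r

for call, r in best.items():
    status = "complete" if r["completed"] else f"partial ({r['answered']} answers)"
    print(f"{call}: {status}")
```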

One interesting issue was seeing a survey for a familiar-sounding station, “WRAR.” After opening that survey I was surprised to find it was not a radio station in Virginia, but my early test-and-debugging survey using my personalized “Rarey” call sign! I immediately deleted that record and the analysis continued. QuestionPro provided rich analytical tools, so one could browse results in a variety of graph formats and export them to Microsoft Excel, Word and Adobe PDF, or view them as raw numbers.

OPEN-ENDED QUESTIONS, WITH OPEN-ENDED ANSWERS

Fig. 2: The QuestionPro-generated tag cloud based on the first names of survey respondents. The image has been edited for publication.

While most questions in this survey were of the “click one of the following” type, there were open-ended questions that required more handling and consideration to turn into meaningful graphs. QuestionPro reporting tools automatically converted open-ended questions into tag clouds — those interesting representations that let you visualize the frequency of a repeated word in a block of text. (You can make your own tag cloud at sites like wordle.net or tagcrowd.com.)
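A tag cloud is, at heart, just a word-frequency count scaled to font size. A few lines with Python’s collections.Counter show the underlying idea, using made-up sample answers.

```python
# Sketch: the word counting behind a tag cloud.
from collections import Counter

answers = ["Rich", "Dave", "Dave", "Mike", "Rich", "Dave"]  # made-up sample
for word, count in Counter(answers).most_common():
    print(f"{word}: {count}")
# Dave: 3, Rich: 2, Mike: 1 -- the cloud just renders these counts as sizes.
```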

For my purposes, the tag cloud wasn’t suitable for displaying, say, stations’ Importer software versions — the tag cloud of software versions was a jumble of floating point numbers. This was a vexing challenge. To turn this and other open-ended questions into a “metric” required taking the answers to each question, placing them in a list one per line, and then alphabetizing the list.

This ordered list caused similar answers to group together, making them easy to count by hand and turn into a graph or table. For instance, with Importer versions, most respondents simply entered the version number, which grouped nicely into 2.x, 3.x and 4.x versions. Easily counted, easily reported.
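The same sort-and-group step is easy to script. The sketch below groups made-up Importer version answers by major version number and sets aside anything that doesn’t parse; real responses need more cleanup than this.

```python
# Sketch: group free-text Importer version answers by major version.
# The sample answers are made up; real responses need more cleanup than this.
from collections import Counter

answers = ["4.2.3", "2.1", "3.0.1", "4.0", "don't know", "3.2"]

majors = Counter()
for a in answers:
    first = a.strip().split(".")[0]
    if first.isdigit():
        majors[f"{first}.x"] += 1
    else:
        majors["unparsed"] += 1

for bucket, count in sorted(majors.items()):
    print(f"{bucket}: {count}")
# 2.x: 1, 3.x: 2, 4.x: 2, unparsed: 1
```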

The same technique also worked for equipment types and manufacturers. This extra labor made for better, more digestible data. However, a delightful side effect of QuestionPro’s reporting was the creation of a tag cloud of respondents’ first and last names. It was socially interesting to see which first name is most popular among public radio engineers, even if it wasn’t relevant to the final report.

WRAP IT UP WITH A BOW

Aggregating the pretty, pretty graphs and tables generated by QuestionPro’s system into a Word document was straightforward, and the survey tools gave state-by-state and country-by-country accounting of the respondents, as well as a breakout of which computing platform respondents used to take the survey.

I find that one of the most challenging tasks of this process is deciding on a color/font/margin/editorial theme that conveys information pleasantly and clearly but without calling attention to itself. Avoiding saturated colors and fat table borders goes a long way toward that goal. Likewise, a pie chart with too much drop shadow makes the report look frivolous.

All in all, the mechanics of creating, fielding and analyzing the data from a survey were a fascinating exercise, and for this survey, the tools were sharp and reliable. I hope you can relate the next time you encounter an open-ended survey question.

Rich Rarey is technical editor of RWEE and principal of Rareworks LLC consulting.
