
Keep it Simple

In my 25 years of doing market research, two projects have crashed and burned.  While I don't particularly want to point out failures, I believe I have learned from them.  Both projects crashed for essentially the same reason.

First, in the early days, when I was an Account Manager for one of Chicago's prominent research shops, I fielded a study for a distilled spirits company.  We were trying to identify the optimal combination of bottle shape and label design for a new super premium bourbon brand.  We recommended a conjoint analysis to identify the combination with the greatest utility and appeal in the market.

A conjoint analysis is a relatively complicated statistical technique that assigns utility values to the variables being tested.  What is even more complicated is executing the data collection.  In this case we used 50 artist renderings of different bottle and label combinations.  The interviewer presented two photos to the respondent, who selected the one that was most appealing.  Based on the result, two additional photos were presented, then two more, and so on.  The problem was that the interviewer had to work through a complicated procedure to determine which photos to present next.
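To give a sense of what the interviewers were being asked to do by hand, here is a minimal sketch in Python of one plausible pairing scheme, a simple winner-stays tournament over the 50 renderings.  This is an illustration only; the study's actual pairing rules were more involved, and every step of this branching had to be executed from paper instructions.

```python
import random

# Hypothetical winner-stays pairing over 50 bottle/label renderings.
# This illustrates the branching an interviewer had to track by hand;
# the actual study's pairing procedure was more involved.
photos = list(range(1, 51))          # rendering IDs 1..50
random.shuffle(photos)

def ask_respondent(a: int, b: int) -> int:
    """Stand-in for showing two photos; here the choice is simulated."""
    return random.choice([a, b])

champion = photos.pop()
wins = {champion: 0}
while photos:
    challenger = photos.pop()
    winner = ask_respondent(champion, challenger)
    wins[winner] = wins.get(winner, 0) + 1
    champion = winner                # winner stays for the next pairing

print("Most preferred rendering:", champion, "with", wins[champion], "wins")
```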

We fielded the study in five markets and, of course, had not budgeted for one of our staff (me) to travel to those markets to brief the field services we had contracted with.  One of the markets was Homewood, a suburb of Chicago, so I did that briefing myself.  All four of the other markets crashed; only Homewood executed the field work as designed.

The point here is that while the data collection methodology was complicated, the real complication was that travel expense for the field briefings was not budgeted, so we tried to conduct the interviewer briefings remotely (this was long before Skype and the internet).

The second failure was a study I designed for a member-based association.  The idea was to interview visitors to determine why they visited and how to attract more.  While the association was footing the bill, the members were the real beneficiaries of the study.

The data collection methodology was straightforward.  It required member organizations to simply intercept a sample of their visitors and ask whether they would participate in the survey once they returned home.  It was designed to be very low impact, both on visitors and on each organization's staff.

The key members, after much consultation with their corporate hierarchies, said no: they could not ask their guests to participate in a survey.  They gave a variety of reasons.  We then went through a series of fall-back methodologies that ended with the lawyers ruling that we could not use any incentives to get people to agree to participate.  The final methodology was completely passive and produced a success rate of about 10%.  In the end the association canceled the whole project.

The point is that in both the bourbon case 25 years ago and the visitor survey last year, the projects were sunk by complications that could have been avoided.  If, 25 years ago, I had insisted on budgeting travel for the field briefings, the project would have succeeded the first time through.  Last year, if I had insisted on the original methodology (which I had used successfully many times before) instead of a diluted process that five or six parties had to agree on, the project would have returned the results everyone was seeking.

Keep it simple.

How to Make Your Next Online Survey More Effective

The internet has spawned a number of efficient tools for conducting market research.  Surveymonkey and Zoomerang are probably the best known.  It is important to note, however, that these are tools, not the answer.

Out of professional interest, I take every survey I get the chance to; I am always looking for an idea I can borrow.  What I often find instead are errors.  Here are some things that might help with your next online survey.

  • First, really identify what you are trying to accomplish with the survey.  You should spend much more time on the front end than on the back end.  One problem with market research is that it tends to raise lots of questions after the research is complete.  Try to anticipate those so they can be included in the survey. 
  • When you structure or write your questionnaire, don’t assume.  Don’t assume that your respondent did something while visiting, is interested in your membership or subscription program, or understands what your mission is.  It is frustrating to be asked to rate something about which I have no knowledge or interest.
  • Do not use a progress bar if there are skip patterns in the questionnaire.  My surveys can be very short or very long depending on the respondent, their experiences, and how they answer, i.e., whether the questionnaire skips or not.  A progress bar or percent-complete gauge can mislead respondents and cause many terminated interviews (the sketch after this list shows why).
  • Worry less about the length of the survey and more about whether or not it achieves your objectives.  When I am presented with a “4 question survey that only takes a minute,” I tend to dismiss it as marketing, not market research.
  • Do not require a question to be answered.  This is my biggest issue with Surveymonkey and Zoomerang, not because the capability to require a response exists, but because most people using these tools don't understand when to use it or what the implications are.  What do you do when an answer is required but none of the choices reflect your attitudes or behavior?  This is something I encounter frequently.
  • Test.  I am often surprised when an online survey has skip logic that does not work and puts the respondent into an endless loop or the wrong questionnaire branch.  Have other people read and test the questionnaire before it goes to field.  Try every answer combination to make sure the questionnaire behaves the way you think it will; the sketch after this list shows one way to enumerate every path.
  • Do not require respondents to identify themselves.  If you want honest answers, the survey must be blind.
  • Understand the sampling.  You can't test awareness of a program or upcoming event if the source of the research sample is your newsletter's mailing list.  Your member database will give you your members' perspective, not the community's.  Understand the implications of who is taking the survey.
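On the progress bar and testing points above, here is a hypothetical sketch that shows both ideas at once: a tiny questionnaire expressed as a branching table, with every possible path enumerated.  The questions and skip rules are invented.  The varying path lengths show why a single progress bar misleads, and enumerating the paths is one way to verify that no branch dead-ends or loops.

```python
# Toy questionnaire with skip logic, expressed as a branching table.
# Each entry: question -> {answer: next question, or None = end of survey}.
FLOW = {
    "Q1 visited?":     {"yes": "Q2 rating", "no": "Q5 demographics"},
    "Q2 rating":       {"high": "Q3 reasons", "low": "Q4 improve"},
    "Q3 reasons":      {"any": "Q5 demographics"},
    "Q4 improve":      {"any": "Q5 demographics"},
    "Q5 demographics": {"any": None},
}

def all_paths(q="Q1 visited?", seen=()):
    """Enumerate every route through the questionnaire; a repeat = a loop bug."""
    if q in seen:
        raise ValueError(f"Endless loop at {q}")
    paths = []
    for nxt in FLOW[q].values():
        if nxt is None:
            paths.append(seen + (q,))
        else:
            paths.extend(all_paths(nxt, seen + (q,)))
    return paths

for p in all_paths():
    print(len(p), "questions:", " -> ".join(p))
# Paths run 2 to 4 questions long, so a "percent complete" gauge shown
# after Q1 would be wildly different depending on the branch taken.
```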

Finally, when in doubt, use a professional.  If the research is going to go before your Board or be presented outside your organization, always use a professional.

In-Depth Interviews

One qualitative technique that is often overlooked is the in-depth telephone interview.  Focus groups are what most people think of when they think of qualitative research.

I am in the middle of a project for a client that has me conducting 30 in-depth telephone interviews across three different respondent segments, ten interviews per segment.  We will also be doing focus groups and two separate waves of quantitative surveys.  The point here is that these 30 interviews are the beginning of the process, not the whole research program.

In ten of the interviews I am talking with nonprofits.  My client is a nonprofit and the idea is to see how other organizations have dealt with the issues we are trying to solve.  Here are the steps I used:

1. Build the list.  I have a target list of 60 organizations that meet certain criteria.  The criteria were designed to give me a broad mix of perspectives.  Some of the organizations are large, some small, some urban, some rural, etc.

2. Create the discussion guide.  Unlike a questionnaire, a discussion guide is an outline of the issues and topics I want to cover.  The challenge, as in writing a quantitative questionnaire, is to limit the content to what you really want and need to ask.  Otherwise the interview will run too long to keep the respondent engaged.

3. Send a written invitation to participate.  In my case I sent an email invitation that explained who I was, who my client is, and why I wanted to talk with them.  I listed three general topic areas I wanted to explore.

4. Schedule the interviews.  I scheduled specific dates and times to conduct each interview.  Even when someone took the initiative and called me, I asked to schedule an appointment.  The reason is step 5.

5. Research the organization before the interview.  Websites, annual reports, and news articles gave me background on each organization before I spoke with anyone.  This saved valuable telephone time since I didn't have to ask questions like "How big is your organization?" or "What areas do you serve?"

6. Conduct the interview.  My discussion guide has 16 topics listed.  In none of the interviews have I been able to cover all 16.  In one interview I covered only one of them, because we went into great depth on that single topic.  The gain far outweighed the topics I forfeited.  That is the point: an in-depth interview, like a focus group, shouldn't be a rigid, structured questionnaire.

So what am I learning from my interviewing?  Importantly, it has all been done before and we don’t have to reinvent the wheel.  I am getting very specific, relevant advice on how other organizations have approached the issues. 

Finally, and this is somewhat self-serving, have a professional do it for you, particularly if you are talking with contemporaries in your field, prospective donors, community leaders, etc.  If you conduct the interviewing yourself, you run a high risk of not really hearing the answers or of turning the conversation into a sales pitch.

Your comments, suggestions, and questions are always welcome.

Do It Yourself Surveys

People fairly frequently ask me what online survey tools they can use to conduct surveys.  I have many issues with do-it-yourself market research.  I would almost never recommend that an organization conduct a survey itself.  I view it the same way attorneys and CPAs do: a lawyer would never suggest that you prepare your Articles of Incorporation yourself, and a CPA would never recommend that you do your own taxes.  (There are many online tools available for do-it-yourself law and accounting.)

That being said, there are many tools available to conduct surveys online.  At the basic, elementary level are Zoomerang and Surveymonkey; at the professional, power-user level are products like SPSS.

One thing to consider when selecting an online survey tool is the flexibility of the questionnaire design.  My number one rule is: never let the methodology drive the questionnaire content.  For example, one online tool I tested had several limitations in questionnaire design.  An important one was that I could not have respondents enter a whole number in answer to the question "How much did you spend on…?"  Yes, I could have presented ranges, but you can't calculate a true average from a range.
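To illustrate with invented numbers, here is a quick sketch.  The dollar figures and brackets are made up; the point is that a mean computed from range midpoints can land well away from the true mean.

```python
# Illustration with made-up spending figures: why ranges can't recover an average.
exact = [12, 18, 25, 25, 30, 47, 55, 90, 140, 210]  # hypothetical whole-dollar answers

true_mean = sum(exact) / len(exact)
print(f"True mean from whole numbers: ${true_mean:.2f}")   # $65.20

# The same answers forced into brackets, then "averaged" via bracket midpoints.
brackets = [(0, 24), (25, 49), (50, 99), (100, 199), (200, 499)]

def midpoint_mean(values, brackets):
    mids = []
    for v in values:
        for lo, hi in brackets:
            if lo <= v <= hi:
                mids.append((lo + hi) / 2)
                break
    return sum(mids) / len(mids)

# The midpoint estimate comes out noticeably higher than the true mean.
print(f"Midpoint estimate from ranges:  ${midpoint_mean(exact, brackets):.2f}")
```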

Many online survey tools advertise the library of questionnaire templates they have available.  My general feeling is that if you need a template, you need to hire a researcher to conduct the study for you.  In other words, if you don't know what questions to ask, the results from a template-driven survey will probably have limited value.

One question you should ask yourself as you write your questionnaire is:  “What are we going to do with the results of this question?”  If you don’t really know or if definitive actions cannot be taken, you probably don’t need the question in the survey.

Another consideration is the ability to create skip patterns in the questionnaire.  A skip pattern is where the questionnaire branches based on the answer to a previous question.  Some skips are basic, some are very complicated.  Also, can you randomize or invert answer choices?  Can you display a previous answer as part of a new question? 

Another area to consider is how flexible the product is on the back end of the survey.  Can you control quota groups?  Can you create cross tabs with multiple banner points?  Can you filter which respondents are included in a table?  Can you export the data?  Can you export the tables and graphs?  Can you calculate basic statistics (like confidence intervals, means, medians, standard deviations, etc.)?
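As a rough illustration of the back-end work a tool should support (or that you will otherwise end up doing yourself), here is a sketch using Python's pandas library.  The column names and data are invented.

```python
import pandas as pd

# Invented respondent-level data standing in for a survey export.
df = pd.DataFrame({
    "member":    ["yes", "no", "yes", "no", "no", "yes", "no", "yes"],
    "satisfied": ["yes", "yes", "no", "no", "yes", "yes", "no", "yes"],
    "spend":     [40, 25, 60, 15, 30, 55, 10, 70],
})

# A simple cross tab: satisfaction by membership status (column percentages).
print(pd.crosstab(df["satisfied"], df["member"], normalize="columns"))

# Basic statistics on a numeric question.
print(df["spend"].mean(), df["spend"].median(), df["spend"].std())

# A rough 95% confidence interval for the mean (normal approximation).
n = len(df)
se = df["spend"].std() / n ** 0.5
mean = df["spend"].mean()
print(f"95% CI: {mean - 1.96 * se:.1f} to {mean + 1.96 * se:.1f}")
```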

All of the above should be considered before committing to an online research tool.  You may also want to review a couple of my prior posts that deal with sampling and statistics:

http://nonprofitsurvey.wordpress.com/2009/11/24/who-to-research/

http://nonprofitsurvey.wordpress.com/2010/08/24/important-numbers-confidence-interval/

http://nonprofitsurvey.wordpress.com/2010/06/25/important-numbers-incidence/

Discovering Options

Last week I spent several hours sitting in an outpatient hospital waiting room.  I was not in a particularly good mood because, among other things, I was worried.  When I am in one of these moods I tend to look for the worst in people, not the best.  Everyone in the waiting room with me seemed noisy, irritating, or intent on invading my space.

As I read the novel I had brought with me and tried to insulate myself from the people around me, the words Discovering Options crept into my consciousness.  I immediately focused my attention on four people sitting near me.  Yes, the words Discovering Options had come from a man who was talking with a young woman in the group.

This man explained to the young woman that he was a mentor for a youth mentoring program called Discovering Options.  He went on to explain how the program worked and the significant commitment in time and energy it took to be a mentor.  He then went on to articulate the value he derived from his relationship with the program and the protégé he was mentoring.  He was so upbeat about his experiences that the woman asked him to send her information about the program and how to become a mentor.  Wow! 

The question here is: how many of your volunteers, donors, members, patrons, supporters, or mentors can tell your story?  How many can sell your story?

What I am heading toward here is that it is probably time to do a survey among your volunteers, members, or patrons.  Have you done one recently?  InnovateVMS, one of the organizations I work with, conducts two surveys annually.  The first is a survey of its clients or constituents.  The second is among its mentors or volunteers.  Both provide critical information to the organization’s management that has helped the organization grow. 

If you have a survey coming up, consider including one of my favorite survey questions:  How would you describe (your organization name) to someone who has never heard of it?  The responses you get to this question speak volumes about whether the message you are communicating is the message that is being heard.  In the Discovering Options case, it was.

Your comments, questions, and suggestions are always welcome by email, Facebook, blog, or phone.  Thanks for reading.

The Second Most Important Question

Last November I posted an article about what I consider the most important survey question.  That is:  What was the main reason you visited (or became a member, donor, etc.)?  You can read the post here: http://nonprofitsurvey.wordpress.com/2009/11/06/the-most-important-survey-question/

What is the second most important survey question?  It is the suggestions for improvement question, as in: 

  • How could we improve the experience?
  • What could we do to make your visit better?
  • Was there anything bothersome, disappointing, or that could have been improved?

I have seen many surveys conducted by a wide range of nonprofits, and this question is almost never asked.  Why?  Is it because it must be presented as an open-ended question, where respondents write or type in their answers?  Open-ended questions are more difficult to deal with because they must be coded before they can be tabulated.
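Coding is usually done by a person reading each verbatim, but here is a toy sketch of the idea: mapping open-ended answers into a small set of categories that can be tabulated.  The categories and keywords are invented.

```python
# A toy keyword-based coder for open-ended "how could we improve?" responses.
# Real coding is normally done by hand; this only illustrates the idea of
# mapping verbatims into a small set of tabulatable categories.
from collections import Counter

CODES = {  # invented categories and trigger words
    "parking":   ["parking", "lot", "garage"],
    "restrooms": ["restroom", "bathroom", "toilet"],
    "signage":   ["sign", "signage", "map", "lost"],
    "pricing":   ["price", "cost", "expensive", "fee"],
}

def code_response(text):
    text = text.lower()
    hits = [code for code, words in CODES.items()
            if any(w in text for w in words)]
    return hits or ["other"]

verbatims = [
    "The parking lot was full and we got lost finding the entrance.",
    "Restrooms were dirty.",
    "Admission is too expensive for a family of four.",
]

tally = Counter(c for v in verbatims for c in code_response(v))
print(tally)  # e.g. Counter({'parking': 1, 'signage': 1, 'restrooms': 1, ...})
```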

In some cases this question is not included because the organization really doesn't want to know.  Hard to believe, but true.  What is harder to believe is when responses come back and the staff rationalizes the results.  For example, I reported to one organization that a relatively high proportion of survey respondents said its restrooms were dirty or not well maintained.  The VP of Operations then proceeded to explain to me that the restrooms had the original fixtures and were old and somewhat dated, not dirty.  I can only report what respondents say, and in most people's view there is a difference between old and dirty.

The point here is that your organization must have the will to address the suggestions and criticisms that come out of this survey question.  I would suggest that in today’s economic and political climate, we all need to be asking our constituents, customers, visitors, donors, members, and patrons how we can improve.  Importantly, we must be prepared to act on those suggestions and criticisms.

Demographics

Demographic information is an important part of any quantitative survey.  Much of the demographic information gathered in a survey, at some point, finds its way into a comparison to the U.S. Census.  The 2010 U.S. Census is wrapping up, and if past Censuses are any guide, it will be a couple of years before the data are available.

Most of my surveys utilize a somewhat standard demographic block consisting of:

  • Head of Household Age
  • Head of Household Education
  • Home Ownership
  • Number in Household
  • Number of Children in Household
  • Number of Adults Employed
  • Race or Ethnic Origin
  • Household Income

In each case I deliberately structure the response choices to match those used in the U.S. Census.  That way, direct comparisons can be made.

Of all the demographic data we collect, race or ethnic origin causes my clients the most discomfort.  I have had clients exclude it from surveys, presumably so they don’t have to address the result.  Keep in mind that many of my clients are cultural institutions that receive public support from a variety of sources.  Many are charged with serving their entire market, not just the more affluent, White areas. 

On the other hand, I have clients who use the result of the race question to demonstrate their penetration into the African American, Asian, or Latino communities. 

Race or ethnic origin can also be challenging to present because the Census has many, many categories.  I tend to use White, African American, Hispanic, Asian, Native American, Other, and Multiple or mixed.  To draw direct comparisons to the Census I must do some manipulation of the Census data; for example, the Census has categories for White Hispanic and Non-White Hispanic.  A sketch of that kind of collapsing follows.
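Here is a hypothetical sketch of the collapsing step.  The detailed labels are simplified stand-ins, not the Census Bureau's exact category names, and the counts are invented.

```python
from collections import defaultdict

# Hypothetical collapsing of detailed Census-style race/ethnicity categories
# into the seven buckets used in the survey. The detailed labels below are
# simplified stand-ins, not the Census Bureau's exact category names.
COLLAPSE = {
    "White alone, not Hispanic":                "White",
    "Black or African American alone":          "African American",
    "Hispanic or Latino (any race)":            "Hispanic",
    "Asian alone":                              "Asian",
    "American Indian and Alaska Native alone":  "Native American",
    "Two or more races":                        "Multiple or mixed",
}

def to_survey_bucket(census_label):
    return COLLAPSE.get(census_label, "Other")

# Collapse a table of (label, count) pairs into survey buckets.
census_counts = [("White alone, not Hispanic", 5200),
                 ("Hispanic or Latino (any race)", 1800),
                 ("Some other race alone", 400)]

buckets = defaultdict(int)
for label, count in census_counts:
    buckets[to_survey_bucket(label)] += count
print(dict(buckets))  # {'White': 5200, 'Hispanic': 1800, 'Other': 400}
```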

Finally, I field many questions from clients about asking respondents to reveal their income.  When we ask, they tell us (about 80% of the time).  If we are using a face-to-face or telephone interview, we train the interviewer to present the entire demographic block of questions in a somewhat impersonal manner.  The interviewer needs to convey that he or she doesn't personally care about the answers.  That also usually works.

Your comments are always welcome.

Survey Length

One of the client issues I frequently deal with is whether the survey is too long.  This issue is particularly important when the survey is an online survey or a self-administered paper questionnaire.

In a telephone interview or an intercept there is a live person (the interviewer) who keeps the respondent engaged.  Interviewers are trained to keep the respondent moving forward through long and sometimes tedious surveys.  Having worked as an interviewer (telephone and mall intercepts) in my distant past, I can tell you the worst scenario is to be within a couple of questions of the end and have the respondent get frustrated and hang up or walk away.  In self-administered questionnaires there is no interviewer to encourage the respondent to hang with it.

Generally, in surveys for nonprofit organizations where the respondent was randomly recruited, I feel the questionnaire can be on the long side.  Why?  My experience has been that the reason a respondent agrees to participate in a survey for a nonprofit is that he/she is interested in the nonprofit.  Viewed another way, the recruiting is the difficult part.  Once someone agrees to take the survey I try to drag everything I can out of them.

My experience has been that I achieve a very high completion rate (95% and higher), even with long surveys. 

How long is too long?  It depends on the subject.  A survey asking a respondent to describe their experiences at a museum, zoo, or theater can be really long because people like talking about or responding to questions about their leisure experiences.  A survey that analyzes the impact of a social services program in an underserved market probably needs to be shorter.

If the group of respondents was provided by an online sampling supplier, the survey can be as long as you want.  That is because the respondent is being compensated for their time and they know the estimated survey length before they start.

What is too short?  Personally, if you are going to go to the trouble of fielding a questionnaire, I believe a four-question survey is a waste of time.  The survey needs to address your questions and issues, and that usually includes the demographic characteristics of the respondent.

When in doubt, test it.  If you field a questionnaire and eight of the first ten respondents don't complete it, i.e., they terminate midway, that is a good indication it is too long.  A quick way to check is sketched below.
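Here is a toy sketch of that check, assuming you can export each respondent's last answered question from an early fielding.  The data and the 30-question length are invented.

```python
# Toy drop-off check on an early fielding: where do respondents quit?
# 'last_q' is the number of the last question each respondent answered;
# both the data and the 30-question length are invented for illustration.
from collections import Counter

TOTAL_QUESTIONS = 30
last_q = [30, 12, 30, 11, 12, 30, 13, 12, 11, 30]  # first ten respondents

completes = sum(1 for q in last_q if q == TOTAL_QUESTIONS)
print(f"Completion rate: {completes / len(last_q):.0%}")  # 40%

# A cluster of terminations around the same question suggests a problem
# with that question, not just with overall length.
drop_points = Counter(q for q in last_q if q < TOTAL_QUESTIONS)
print(drop_points.most_common())  # [(12, 3), (11, 2), (13, 1)]
```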
