“And the question is….?”  In the game show Jeopardy, contestants must phrase their response to an answer clue in the form of a question.  This backward thinking may be difficult in the game show scenario, but CMOR members and others can easily formulate all kinds of questions about respondent cooperation, ranging from the simple to the most esoteric.

In 2003, the respondent cooperation questions were consistent with previous years, except that more of them related to the Do Not Call list.  Survey researchers are still curious about the “nuts and bolts” of respondent cooperation and how to educate their staffs and clients about the issues.  The top ten questions and requests for information are as follows:

1)       Information or Summary Data on Trends in the Industry and Survey Rates

Trends, trends, and more trends are requested for preparing proposals for new business opportunities, presenting industry issues to management or clients, training new research staff, and benchmarking a company’s performance against industry data.  CMOR maintains some of the most comprehensive trend data in its Respondent Cooperation & Industry Image Study, begun in 1978 by Walker Research.  Refusal rate trends, trends in respondent attitudes and behaviors toward survey research and polls, and the use of telephone technologies to screen calls are among the most sought-after data.  The study was most recently completed in the spring of 2003 and is available by contacting the CMOR office.

2)       Definition of Survey Rates

What is the difference between response rate and cooperation rate?  How are refusal rates calculated?  What is considered a terminate, break-off, or incomplete?  What is the definition of an ineligible respondent?  Does an ineligible include language barriers, senility, etc.?  Each company may define call results slightly differently, which adds to the confusion.

Other industry associations provide varied definitions.  The calculations are theoretically correct, but depend on how disposition codes are defined and which ones are included.  Some calculations yield a more optimistic rate, while others are more conservative.  The American Association for Public Opinion Research (AAPOR) has recently revised its very comprehensive report on how to define and calculate survey rates, which can be found on its website, www.aapor.org.

A number of years ago, CMOR adopted the definitions and calculations created by a small group of statisticians working within the ARF, which use the AAPOR definitions as their base.
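To make the distinctions above concrete, here is a minimal Python sketch of AAPOR-style rate calculations using hypothetical disposition counts.  The formulas shown are the “minimum” (most conservative) variants of the AAPOR response, cooperation, and refusal rates; actual calculations depend on which disposition codes a company includes, as the text notes.

```python
# Illustrative AAPOR-style survey rate calculations (hypothetical counts).
# i  = complete interviews        p  = partial interviews
# r  = refusals and break-offs    nc = non-contacts
# o  = other eligible non-interviews
# u  = cases of unknown eligibility

def response_rate(i, p, r, nc, o, u):
    """Minimum response rate: completes over all potentially eligible cases."""
    return i / (i + p + r + nc + o + u)

def cooperation_rate(i, p, r, o):
    """Minimum cooperation rate: completes over all contacted eligible cases."""
    return i / (i + p + r + o)

def refusal_rate(r, i, p, nc, o, u):
    """Refusal rate: refusals over all potentially eligible cases."""
    return r / (i + p + r + nc + o + u)

# Hypothetical call dispositions for a telephone study
i, p, r, nc, o, u = 400, 50, 300, 150, 40, 60

print(f"Response rate:    {response_rate(i, p, r, nc, o, u):.1%}")
print(f"Cooperation rate: {cooperation_rate(i, p, r, o):.1%}")
print(f"Refusal rate:     {refusal_rate(r, i, p, nc, o, u):.1%}")
```

Note how the cooperation rate excludes non-contacts and unknown-eligibility cases from the denominator, which is why it is always at least as high as the response rate for the same study.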

3)       General Questions About the Cooperation Tracking System Data

The CMOR Cooperation Tracking System collects data for the purpose of observing trends and understanding the impact of industry and environmental events on refusal, cooperation, and response rates.  The data provide insight into what types of studies are ineffective and what steps may need to be taken to help improve these rates.  The disposition form that companies use is comprehensive, yet easy to implement.  Data collected include response, refusal, and cooperation rates, length of survey, methodology, subject matter, and sample type.  However, the data are only as representative as the number of companies that participate and provide them.  This is an invaluable tool often used by survey researchers to help in bidding a project or understanding their own company’s performance.  For more information about participating and getting the most value out of this system, contact the CMOR office.
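The fields listed above can be pictured as a simple record per study.  The sketch below is purely illustrative; the field names are assumptions for demonstration, not CMOR’s actual disposition form.

```python
# Hypothetical sketch of one study record like those a cooperation tracking
# system might collect.  Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class StudyRecord:
    methodology: str        # e.g. "telephone", "mail", "online"
    subject_matter: str     # survey topic category
    sample_type: str        # e.g. "RDD", "listed", "panel"
    survey_length_min: int  # interview length in minutes
    response_rate: float
    refusal_rate: float
    cooperation_rate: float

record = StudyRecord("telephone", "consumer products", "RDD",
                     12, 0.40, 0.30, 0.51)
print(record.methodology, f"{record.response_rate:.0%}")
```

Aggregating many such records by methodology, subject matter, or survey length is what allows the benchmarking described above.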

4)       How to Calculate Survey Rates for Online/ Mail Surveys

The survey rate definitions that CMOR provides on its website refer to telephone and in-person interviews.  Online and mail surveys are unique because so many variables can affect response.  (Response rate is typically considered the percentage of completed questionnaires that are returned.)  The AAPOR website, www.aapor.org, provides a detailed definition of how to calculate mail survey rates.  Other resources include the Interactive Marketing Research Organization (IMRO) for questions about online research.
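The parenthetical definition above can be sketched in a few lines.  This is a simplified illustration with hypothetical counts; AAPOR’s full definitions distinguish many more disposition codes than this.

```python
# Simplified mail-survey response rate: completed questionnaires returned
# as a share of questionnaires presumed delivered (hypothetical counts).

def mail_response_rate(completed, mailed, undeliverable):
    """Completed returns divided by deliverable mailings."""
    return completed / (mailed - undeliverable)

print(f"{mail_response_rate(completed=320, mailed=1000, undeliverable=60):.1%}")
```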

5)       What Is the Growth/Decline of Various Survey Methodologies

CMOR relies on various industry sources, in addition to the studies and trends we compile.  In the recent 2003 Respondent Cooperation & Industry Image Study, we again asked respondents their preference in methodology for a future study.  Not surprisingly, telephone studies are less appealing, and Internet studies are becoming more appealing.  Mail studies are the second choice, probably because they can be completed at the respondent’s convenience.

6)       Questions About Do Not Call and Its Effects on Response

CMOR’s governmental affairs staff has provided much-needed information about the regulations, legislation, and court activity on Do Not Call and its implications for the survey research industry.  In addition, the government affairs and respondent cooperation staff worked with MRA staff to develop standard responses for interviewers to use when respondents pose questions about DNC.

7)       Information on Effectiveness of Incentives

CMOR has collected some information on incentives.  In the Respondent Cooperation & Industry Image Study, we asked respondents whether they received an incentive for the last survey in which they participated and in what form.  Overall, a very small percentage of respondents reported receiving an incentive when participating in a survey.  Incentives are more common in online, in-person, or mail surveys.  In CMOR’s Respondent Satisfaction Study, respondents showed a higher level of satisfaction with the survey experience when they received an incentive of either product or cash.

Then there’s the issue of promised incentives versus no incentive, and its effect on cooperation.  Numerous research studies have been conducted on this topic, primarily by University of Michigan staff.  Over the years, AAPOR’s Public Opinion Quarterly has also featured excellent articles on the topic of incentives.  The issue of using incentives, however, is much more complex than determining what amount of money or what type of gift will produce the “best results.”

8)       Are There Industry “Standards” for Survey Rates

This is highly requested information, and it probably should be first on this list because the question comes up in regular phone conversations as well as formal requests.  In the world of government or non-profit research, requests for proposals often state an optimal response, cooperation, contact, or similar rate.  Of course, if it were possible to obtain consent from all respondents, that would be best.

There are no easy answers because there are so many variables to consider when conducting survey research.  Factors such as identification of the sponsor, length of survey, subject matter, and mode all affect cooperation rates.  Other factors, such as telephone screening devices, privacy concerns, and lack of time or convenience, can affect response rates, but many of these we cannot control.  CMOR’s Cooperation Tracking System is one benchmark that can be used, since it looks at survey rates by several variables.  The Respondent Cooperation & Industry Image data also provide valuable trend data that can be used as a benchmark.

9)       Are There Other Industry “Standards”

Is there a standard for the number of call attempts that should be made to a household?  What are the standard times to call on an RDD study?  What are the standard incentives given to physicians?  What are the standards in interviewing children?  What are the standards for monitoring interviews?  What are the standards for leaving a voice message when conducting a telephone interview?  The list goes on.  CMOR provides guidelines and best practices in such areas as how to gain cooperation and models for effective survey introductions.  CMOR has conducted a Survey Practices Study that yielded valuable insight into how telephone centers operate and what practices are currently in use.  For example, we covered items such as the percent of studies monitored, hours spent on initial interviewer training, and the percent of centers that leave an answering machine message.  CMOR task forces have also formulated guidelines for interviewer training and motivation programs.  We continue to work on relevant issues deemed important by the industry, and our log of questions asked is a prime source of issues to research and answer.

10)   Where Can I Find Information For My Research Study

CMOR has long been a resource for graduate students around the world needing information about the survey research industry.  The topics vary greatly, but often there are current research data that can be shared to help students become researchers committed to excellence in quality research.

Throughout the year, CMOR members as well as non-members pose questions about respondent cooperation and related topics.  Oftentimes the questions posed are not within the purview of CMOR’s mission, but with “sleeves rolled up” and some knowledge of industry resources, we make a concerted effort to provide inquirers with good leads to other associations, websites, research studies, articles, or data.  Sometimes the questions posed are relevant, critical questions about the industry, its standards, methods, or procedures, that have no answers now but form a pool of issues worthy of pursuing.