In 1995 CMOR designed and conducted the original Research Profession Image study to:

  • Serve as an industry benchmark for future measurement of cooperation levels
  • Measure the effect of specific variables on the public’s willingness to participate in surveys
  • Generate data that might be used to formulate guidelines on best practices
  • Inform the development of tools & solutions to improve respondent cooperation

In addition, in order to understand the ways in which the findings were reflective of time trends, many of the questions used in the original CMOR survey were taken from previous surveys on the public image of the research profession, conducted by Walker Research.

The 2006 Image Study

In 2006, CMOR conducted the survey a fifth time, using the same questionnaire as in 1995 (with some revisions and updates). This report provides an analysis of the current data from this extensive research effort. The key areas explored in depth are:

  • Public willingness to cooperate in opinion research studies
  • Public behavior and attitudes with regard to public opinion research (including the impact of telemarketing)
  • Public attitudes toward confidentiality and privacy protections within the research profession
  • The penetration of telephone technology, and its impact on telephone surveys
  • The current impact of the Internet as a research tool


Please note, the RDD samples are the only data trended over time. When historical comparisons (trend timelines) are noted, they refer to the 2006 RDD sample, not the Panel or Intercept samples.

Chapter 1 – Reasons for Refusing

  • From historical RDD samples, the proportion of people who have indicated refusing to complete a study in the past year remains stable (since 1999) at about 4 out of 10.
  • Refusers are fairly similar across demographic variables (e.g., race, age, gender). However, education appears to be positively correlated with refusals: the greater the education level, the more likely a person is to refuse.
  • The greatest motivating factor for refusing to participate in a study (across all 3 samples) was that the respondent ‘didn’t have the time to participate.’
  • Other strong motivators for refusing were a lack of interest in the subject, a belief that the survey was a sales pitch (‘sugging’), and a concern that personal information would be shared.
  • Panel sample members were much more likely than RDD and Intercept respondents to refuse to participate due to a lack of adequate incentives.

Chapter 2 – Reasons for Participating

  • From historical RDD samples, the proportion of respondents claiming to have participated in the past year (62%) has grown by 20 percentage points since 1992 (42%).
  • From the RDD samples, a greater percentage (83%) of respondents with at least some college education indicated having participated in the past year than those with less education (68%).
  • The greatest motivating factors for participation, across all 3 samples, were that the study would not take a long time to complete, that results would be used to improve products/services, and that results would be used by the government for policy decisions.
  • Additionally, Panel sample respondents were highly motivated to participate by incentives. This corresponds to the Chapter 1 finding that a lack of incentives was a reason for refusal among this group.
  • Looking at the RDD Sample, demographic variables illustrate differences in willingness to participate.
  • Improving products, services, and government programs/policy are strong motivators for both males and females. However, females are more likely than males to participate if the survey results will assist government in programming and planning.
  • Incentives tend to play a greater role in the decision to participate for African Americans and younger respondents.
  • Study length tends to play a very strong role in the decision of whether to participate among respondents who have at least some graduate education.
  • There are no significant differences in participation between persons with listed and unlisted phone numbers.

Chapter 3 – Attitudes Toward Research

Opinions of Survey Research

  • All 3 Image samples agreed most strongly with the ideas that opinion research is useful in helping produce better products/services, that research gives people an opportunity to provide feedback on government policy decisions, and that the research profession serves a useful purpose.
  • Panel survey participants strongly agreed with the idea that research is a ‘good way to make some extra money,’ whereas RDD and Intercept respondents agreed less strongly with this statement.
  • Panel sample respondents were more likely to agree with the idea that research organizations do not give away personal information.
  • The most agreed-upon negative statements about studies (from all 3 samples) were that ‘(opinion research studies) are often used to disguise a sales pitch’ and ‘some questions asked in (studies) are too personal.’

Favorite Mode of Data Collection

  • RDD sample respondents named mail (31%), telephone (28%), and Internet (20%) as their favorite modes of data collection.
  • Both Panel (50%) and Intercept (43%) respondents also favored mail as their favorite mode of data collection. Additionally, sizable proportions of both panel (21%) and intercept (12%) selected online focus groups as a preferred method of data collection.
  • The least preferred mode of data collection by respondents from all 3 samples was in‐person surveys.

Privacy Concerns

  • Privacy concerns have been stable over time. Historical RDD comparisons illustrate that, since 1990, around one-quarter of respondents have felt that ‘(opinion research studies) are an invasion of privacy.’
  • Similarly, since 2001, between 25‐30% of respondents have felt that survey organizations can be trusted to protect their ‘rights to privacy.’
  • Panel respondents are more likely to trust (51%) that research organizations do not give away/share their personal information than are RDD (32%) or intercept (25%).
  • Fifty-seven percent (57%) of 2006 Image Study RDD respondents were ‘very concerned’ about threats to their personal privacy. An additional 25% were ‘somewhat concerned.’

Answering Machine/Voicemail

Looking at historical RDD trends over the past 16 years, answering machine ownership (with voicemail added in 2006) has increased over time and leveled off at about three-quarters of respondents.

Call Screening

  • About 2.5 out of every 10 RDD respondents (27%) indicated screening more than 75% of their telephone calls.
  • About 3 out of 10 Panel respondents (33%) and 4 out of 10 Intercept respondents (40%) indicated screening more than 75% of their calls.

Do Not Call List

  • Over half of respondents from the RDD (55%) and Intercept (61%) samples subscribed to either a state or Federal “Do Not Call List.”
  • About two-thirds of Panel sample respondents (68%) subscribed to either a state or Federal “Do Not Call List.”

Special Services: Caller ID, Call Blocking, Distinctive Ringing

  • Historical RDD Trends show increases in all privacy services.
  • Caller ID ownership (58%) grew over previous figures, a 32% relative increase in subscription over 2003 (44%).
  • Call blocking experienced 38% relative growth, from 16% in 2003 to 22% in 2006.
  • Distinctive ringing also grew, jumping 44% in relative terms, from 9% of 2003 RDD respondents to 13% of 2006 RDD respondents.
  • Privacy manager was added as an option for the first time in 2006. Thirteen percent (13%) of RDD respondents indicated that they subscribe to privacy manager.
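Note that the growth figures above are relative increases over the 2003 base rates, not percentage-point differences. A quick sketch (in Python, purely for illustration) reproduces the reported numbers:

```python
def relative_growth(old_pct: float, new_pct: float) -> float:
    """Percent change relative to the earlier figure (not percentage points)."""
    return (new_pct - old_pct) / old_pct * 100

# Caller ID: 44% (2003) -> 58% (2006): +14 points, ~32% relative growth
print(round(relative_growth(44, 58)))  # 32
# Call blocking: 16% -> 22%
print(round(relative_growth(16, 22)))  # 38
# Distinctive ringing: 9% -> 13%
print(round(relative_growth(9, 13)))   # 44
```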

Chapter 4 – Challenges to Opinion Research


Historical RDD trends illustrate that incidences of ‘selling under the guise of research’ (sugging) have remained relatively stable over the past 5 years: between about 3 and 3.5 out of every 10 respondents indicate having been sugged in the previous 12 months.


Around one‐fourth to one‐half (25‐47%) of respondents, depending on the Image Study sample, believe that ‘organizations that conduct (opinion research studies) are objective and unbiased.’


Around 1 out of every 3 RDD respondents (35%) expresses significant uncertainty about the results of opinion research studies. This is somewhat higher than in 2003 (25%) and 2001 (28%).

Only 16-18% of respondents (depending on Image Study sample) believe in (or understand) the ability of statistical sampling to produce generalizable data.

Challenges to Internet Panel Research

There is reason to believe that some respondents participate in Internet panels with unacceptably high frequency; this may be an issue requiring further inquiry. Over 42% of Internet panel members had participated in 16 or more surveys over the past year. Within this group, there is evidence that 40% had participated 90 or more times over the past year. These panel members likely belong to numerous Internet panels, and it is unclear whether they introduce bias into survey results.

Chapter 5 – The Interview Experience

  • About three-fourths of RDD and Panel respondents (76% and 77%, respectively) noted that their last opinion research study experience was either “very pleasant” or “somewhat pleasant.” By comparison, less than half of Intercept respondents (46%) felt the same way.
  • Historical RDD trends illustrate that over the past 25 years, between 76‐83% of respondents have felt that their opinion research study experience has been pleasant.

Interviewer‐Administered Modes (Telephone & In‐person)

  • In all 3 samples, the most strongly rated positive aspects of past telephone/in-person interviews were that the interviewer was ‘courteous/pleasant’ and that ‘questions/instructions were easy to understand.’ The least highly rated positive quality was that the ‘subject matter was interesting.’
  • The strongest negative quality, seen from all 3 samples, was that the ‘questions were too personal.’

Mail Surveys

  • From all 3 samples, strong positive qualities of the last mail survey experience were that the respondent was given a reasonable amount of time to complete and return the survey, the questionnaire was professional in appearance, and the questions/instructions were easy to understand.
  • The strongest negative quality was ‘the length of the survey was too long.’

Internet Surveys

  • From all 3 samples, results were similar to those of respondents whose last opinion research study was a telephone, in-person, or mail study. Strong positive qualities of the last Internet survey experience were that the respondent was given a reasonable amount of time to respond, the questions/instructions were easy to understand, and the instructions were courteous and pleasant.
  • Likewise, the strongest negative quality was ‘the length of the survey was too long.’

Chapter 6 ‐ Incentives

About 1 to 1.5 out of every 10 RDD and Intercept respondents received incentives for their last opinion research study. Fifty-one percent (51%) of Panel respondents received incentives.

Historical RDD samples illustrate that, over the past 16 years, between 1 and 3 out of every 10 respondents received incentives for their last interview.

There appears to be a relationship between interview length and incentive amount. A greater percentage of longer studies (21+ minutes) offered large incentives ($11+) than did shorter studies (0 to 20 minutes). Limited sample sizes hinder projections on this variable.

Incentives were seen as a much greater motivating factor for survey participation by members of the Panel sample (80%) than by RDD (32%) or Intercept (49%) respondents.


The Image Study explores the different qualities and characteristics of surveys from the perspective of the respondent.

1) What is important to the public? People are most likely to participate when:

They feel they have the time to participate
Research studies will obtain greater cooperation from respondents when they offer flexibility in responding. If a respondent does not have time to participate at the moment, they can be accommodated by offering an alternative time for participation.

Many mail & Internet surveys are flexible in this way.

Telephone & in‐person surveys may achieve greater participation with well‐trained interviewers who can be assertive, yet respectful and accommodating to respondents. There are also new and emerging focus group techniques (such as online focus groups) that allow the participant flexibility in response timing.

The study will not take a long time to complete
A strong disincentive to participation is a long survey. It is important to include only necessary items and questions. Moreover, since data confidentiality is a concern, an appropriate way to reduce burden may be to remove demographic variables that are not essential to analysis or weighting.

They know their input will make a difference
Respondents are highly motivated by the knowledge that their time and input will result in changes for the better. This is entirely logical; time is an important factor in the research participation decision. It is important to communicate the idea, to the respondent, that their input will affect decision‐making.

The topic is interesting to them
A study is more likely to achieve high participation when the topic is of interest to the respondent. However, appealing to respondents on this basis may be hazardous to the validity of the study, as it may attract only the most interested people (i.e., people with significantly different opinions).

A good alternative may be to design more interesting surveys in terms of process. Unique, exciting designs and question formats may help to boost the appeal of studies. Internet surveys (and focus groups) offer a particularly promising capacity for these types of designs (with the ability to include pictures, audio, and other unique qualities).

2) Privacy. Privacy concerns are substantial among the public.

Around one‐quarter (1/4) of Image RDD respondents feel that opinion research is an invasion of privacy. These concerns have remained stable since 1990, and are something that the research profession should be able to counteract.

Notably, about 8 in 10 respondents (from all Image samples‐ RDD, Panel, & Intercept) are concerned over threats to their personal privacy in America.

It is vital that all survey research organizations maintain a privacy policy, disclose the confidential nature of the research process, and clearly respect the rights of the public.

3) Favorite Mode. Although all research modes potentially have unique appeal to the respondent, mail surveys are seen as a favorite mode of data collection. This is logical as they offer time flexibility and avoid any intrusiveness that may arise out of interviewer‐administered modes.

A mail follow‐up survey may be a viable option for boosting cooperation rates (in surveys). This may be plausible when rich sampling frames are available where addresses are known, or by offering the respondent the choice at initial contact through another mode (upon sensing hesitation to participate).

4) Challenges. The research profession has a poor image in the eyes of the public. Large proportions of respondents feel:

  • they have been ‘sugged,’
  • research organizations are not objective, and
  • survey data are sometimes questionable in terms of validity

It is vital that the profession address these negative views through self-regulation, communication with the public, and the pursuit of illegitimate and fraudulent entities that imitate research organizations.

5) Incentives. Incentives are much more important to Internet panelists than to respondents from the RDD or Intercept samples. This is logical, as panel companies have likely created an expectation of incentives among the public. Similarly, the regularity of participation required of panelists may demand greater incentives.

Other forms of samples (e.g., RDD) may achieve greater cooperation by communicating that the respondent’s feedback will affect decision-making.

Special Thanks: Sample, interviewing, programming, and tabulation services provided at no charge by: Braun Research, Dynamic Logic, Redata Inc., Survey Sampling Int. & Western Wats, Inc. The Research Profession Image Study research committee was composed of Paul Braun, Donna Gillin, Patrick Glaser, Tom Kelly, Ed Ledek, Kathy Pilhuj, and Jacki Spear. Additional assistance provided by Howard Gershowitz, Terri Hansee, Chris Adams, Elyse Gammer, Howard Fienberg, LaToya Rembert-Lang, and Anndel Martin.