What role does panelist engagement play in online survey data quality? Many industry and academic leaders continue to debate the answer (Baker et al., 2010). While there are many indicators of data quality in online surveys, such as item nonresponse (Rao & Gravelle, 2008) and breakoffs (Heerwegh & Loosveldt, 2006; Peytchev, 2009), survey completion time has recently risen in prominence. Online panels face scrutiny today because some panelists participate in numerous surveys within short time periods, suggesting respondents who seek maximum returns (i.e., incentives) for minimal survey effort. Colloquially termed speeders, respondents who complete a survey much more rapidly than the norm are the target of investigation in this study. We investigate speeding behavior in a multi-mode survey that included online and mobile versions of the same questionnaire. Respondents reported the type of mobile device (Android, BlackBerry, iPhone) they used to take the survey, and we used that information to compare speeding behavior by device type.

On one hand, speeders may be professional respondents motivated solely by the incentives offered for survey completion and unconcerned with the survey itself or the answers they provide. On the other, speeders may be well-intentioned respondents who become frustrated with a survey (too long, a boring topic, lengthy grid items, required answers for every item, etc.) and react by speeding through it. In either case, speeding is considered problematic survey behavior because respondents are not providing thoughtful, accurate answers. Consequently, the data they provide may be of poor quality and, in turn, may have to be discarded so that survey estimates are not adversely affected. A number of studies have found that speeding in a survey has data quality implications (Hartmann, 2011; Henning, 2008). Speeders are associated with straight-lining response behavior (Beckers, Siegers, & Kuntz, 2011; Gittelman & Trimarchi, 2009; Roßmann, 2010; Walker, Pettit, & Rubinson, 2009) and with providing shorter responses to open-ended questions (Galesic & Bosnjak, 2009; Roßmann, 2010).

1. Methods

1.1 Sample

The data in this study come from a survey fielded to a large, national sample of online panelists from a probability-based online panel maintained by Knowledge Networks (a GfK company). The sample was restricted to adult panelists in Internet households who were also smartphone users and were willing to take a survey on their smartphone. Eligible panelists were pre-screened and randomly assigned to take the survey either on their smartphone via a mobile app or online on a computer (as they usually do).

1.2 Questionnaire Design

For the mobile phone survey, we used the Survey on Demand App (SODA), developed by Techneos (a Confirmit company). The app was programmed and optimized for all major smartphone operating systems and, as an added advantage, did not require a continuous Internet connection throughout the survey. The same survey was fielded to those assigned to the online mode. The questionnaire contained 24 questions on consumer behavior, Internet usage, and TV viewing habits. These three sections were presented in random order, although question order within each section was fixed. The mobile and online questionnaires were made to look as similar as possible: each question appeared on a separate page, and the survey featured short questions, short response lists, no grid items, minimal need for vertical scrolling, and a relatively short overall length. The wording, sequence, response categories, and skip patterns were identical in both modes.

Table 1. Median Completion Times (in Minutes) by Survey Mode

Survey Mode          n      Median Time
Mobile app           674    5.6
  Android app        313    5.6
  BlackBerry app      70    5.6
  iPhone app         291    5.5
Online               504    5.3
All                1,178    5.5

Table 2. Percentage of Respondents Flagged as Speeders, by Survey Mode (Total n=1,172)

Survey Mode          Speeder Index
Mobile app           11.1%
  Android app         9.9%
  BlackBerry app     11.4%
  iPhone app         12.4%
Online                8.5%
All                  10.0%

Table 3. Comparison of Timing Measures Between Speeders and Non-Speeders

Timing Measure                                                  Speeders   Non-Speeders
Percent of respondents                                          10%        90%
Median survey completion time (minutes)                         3.0        5.7
Median question completion time (seconds)                       5          8
Answered a question in 1 second                                 20%        3%
Questions answered under the question-specific median (range)   17-24      0-23

Table 4. Survey-Taking Behavior on Open-Ended Questions

Q1: You stated that you [love/like/don’t like/hate] shopping. Why is that?
Q2: What activities do you use the Internet for?
Q3: How would life be different for you if there was no television?

            Percent Missing          Mean Characters Typed     Median Completion Time (sec.)
Question    Speeders  Non-Speeders   Speeders  Non-Speeders    Speeders  Non-Speeders
Q1          9.3       3.3*           26.9      44.0^           13.5      32.0
Q2          7.6       1.9*           26.6      41.3^           15.0      34.5
Q3          11.0      3.3*           25.1      39.0^           13.0      28.0

* Chi-square tests for differences between speeders and non-speeders are statistically significant at p < .05.
^ t-tests for differences between speeders and non-speeders are statistically significant at p < .05.

1.3 Data Collection and Measures

Those assigned to the mobile survey were emailed instructions to download and install the survey app on their smartphone and were provided with a survey code to start the survey. This second step ensured that only those assigned to the mobile survey app could access it. Those assigned to the online survey were sent email invitations containing a link to the survey and were instructed to complete it on a PC or laptop. Table 1 shows the number of completed surveys by mode and mobile device type. The mobile and online surveys were fielded simultaneously for nearly two weeks in November 2011. A total of 732 panelists responded to the mobile app survey and 725 responded to the online survey, representing survey-specific completion rates (COMR) of 58 percent and 61 percent, respectively. For the analysis of speeders, a portion of the completed surveys was removed due to data issues (e.g., missing timestamps), reducing the effective analytical sample from 1,457 to 1,178.

To identify speeders, we used the Speeder Index approach (Roßmann, 2010), which captures speeding behavior on a page-by-page basis. First, the median completion time is calculated for each page of the survey. Each respondent then receives a value of 1 for a page if their completion time is greater than or equal to the page median, or a fraction less than 1 otherwise, obtained by dividing the respondent’s completion time by the median. These values are summed across all pages and divided by the number of pages to which the respondent was exposed. The resulting summary measure, the Speeder Index, can assume values between 0 and 1; the lower the index value, the higher the overall incidence of speeding in the survey. As the final step, the 10 percent of respondents with the lowest index values are flagged as speeders.
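
As a concrete illustration, the sketch below implements the index as just described. This is a minimal sketch, not the study’s actual code: the DataFrame `page_times` (rows = respondents, columns = survey pages, values = completion times in seconds, NaN for pages a respondent did not see) and the function names are assumptions made for illustration.

```python
import pandas as pd

def speeder_index(page_times: pd.DataFrame) -> pd.Series:
    """Speeder Index (Roßmann, 2010) from per-page completion times."""
    page_medians = page_times.median(axis=0)        # median time per page
    ratios = page_times.div(page_medians, axis=1)   # respondent time / page median
    scores = ratios.clip(upper=1.0)                 # 1 at/above median, fraction below
    return scores.mean(axis=1)                      # average over pages seen (NaN skipped)

def flag_speeders(index: pd.Series, share: float = 0.10) -> pd.Series:
    """Flag the `share` of respondents with the lowest index values."""
    return index <= index.quantile(share)
```

A respondent at or above the median on every page scores 1.0, while a respondent twice as fast as the median on every page scores 0.5; flagging at the 10th percentile of the index reproduces the final step of the approach.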

2. Results

2.1 Survey Completion Time

The median completion times by survey mode and mobile device type are presented in Table 1. The overall median completion time was approximately 5.5 minutes and did not vary significantly by mode or by device type within the mobile mode.

2.2 Speeders

Table 2 presents the percentage of respondents flagged as speeders, by survey mode. Using the Speeder Index, we identified 10 percent of survey respondents as speeders. Online respondents were somewhat less likely to be flagged as speeders (8.5 percent), while iPhone respondents were the most likely (12.4 percent), though, as discussed below, these differences across modes were not statistically significant.

Table 3 compares various survey timing measures for speeders and non-speeders, independent of survey mode. The evidence of speeding behavior lies in completion time: the median completion time for speeders was 3.0 minutes, compared to 5.7 minutes for non-speeders, meaning the overall median for speeders was almost half that of other respondents. Similarly, the median question completion time was five seconds for speeders versus eight seconds for non-speeders. In addition, 20 percent of speeders answered at least one survey question in one second. Finally, those flagged by the Speeder Index answered at least 17 of the 24 questions in less than the question-specific median time. In short, across several timing measures, the Speeder Index appears to perform its intended function: flagging those who speed through the survey (or at least through parts of it).
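
The paper does not spell out exactly how each of these measures was computed; the sketch below shows one plausible operationalization of the Table 3 measures, reusing the hypothetical `page_times` DataFrame and a boolean `is_speeder` flag from the index step above.

```python
import pandas as pd

def timing_diagnostics(page_times: pd.DataFrame, is_speeder: pd.Series) -> pd.DataFrame:
    """One plausible operationalization of the Table 3 measures (times in seconds)."""
    measures = pd.DataFrame({
        "total_min": page_times.sum(axis=1) / 60.0,    # survey completion time in minutes
        "median_q_sec": page_times.median(axis=1),     # per-respondent question median
        "any_one_sec": page_times.le(1).any(axis=1),   # answered some question in ~1 second
        "n_under_median": page_times.lt(page_times.median(axis=0), axis=1).sum(axis=1),
    })
    return measures.groupby(is_speeder).agg(
        median_total_min=("total_min", "median"),
        median_question_sec=("median_q_sec", "median"),
        share_one_second=("any_one_sec", "mean"),       # share answering a question in 1 second
        min_under_median=("n_under_median", "min"),     # range reported in Table 3
        max_under_median=("n_under_median", "max"),
    )
```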

2.3 Effect on Data Quality

What was the effect of speeding on data quality? One very noticeable difference between speeders and non-speeders lay in their responses to open-ended items. The survey contained three open-ended questions with text boxes provided for qualitative responses. As shown in Table 4, speeders were significantly more likely to skip the open-ended questions without providing a response. In addition, when they did respond, speeders provided shorter responses than non-speeders, consistent with findings from Galesic and Bosnjak (2009) and Roßmann (2010). Due to both factors, speeders completed each of the open-ended questions 15 to 20 seconds faster than non-speeders, accounting for a difference of about 50 seconds in overall median survey completion time.
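
Consistent with the tests footnoted in Table 4 (chi-square for skipping, t-tests for answer length), a comparison of this kind could be run as sketched below. The column names (`answers`, `is_speeder`) are hypothetical, and the use of Welch’s t-test is an assumption; the paper does not state which variant was applied.

```python
import pandas as pd
from scipy import stats

def open_ended_tests(answers: pd.Series, is_speeder: pd.Series) -> dict:
    """Compare item nonresponse (chi-square) and answer length (t-test)
    between speeders and non-speeders for one open-ended question.
    `answers` holds the typed text, with NaN where the item was skipped."""
    # Chi-square test: is skipping the question associated with speeding?
    contingency = pd.crosstab(is_speeder, answers.isna())
    chi2, p_missing, _, _ = stats.chi2_contingency(contingency)

    # t-test: among those who answered, do speeders type fewer characters?
    lengths = answers.str.len()
    t, p_length = stats.ttest_ind(
        lengths[is_speeder].dropna(),
        lengths[~is_speeder].dropna(),
        equal_var=False,  # Welch's t-test; an assumption, not stated in the paper
    )
    return {"p_missing": p_missing, "p_length": p_length}
```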

3. Discussion and Conclusion

In this study, we investigated speeding in a multi-mode survey in an effort to understand more about this behavior, particularly how it manifests across modes and mobile device types. Overall, we identified 1 in 10 survey respondents as speeders and found that their median completion time was almost half that of non-speeders. Across modes, we found no significant differences in the likelihood of speeding between mobile and online survey takers. These findings provide some evidence that surveys can be designed to perform similarly for mobile and online respondents (assuming that surveys are kept short and that survey content is similar across modes). Finally, and perhaps most importantly, we demonstrated that speeding does affect data quality and the survey responses received. Speeders were more likely to skip open-ended questions entirely and, when they did answer, tended to provide shorter responses. In sum, we find that speeders (both mobile and online) do not take the time needed to fully consider a question and provide a thoughtful response. While these findings apply to this particular survey, we hope this work encourages further research on speeding behavior in online and mobile surveys.

References

Baker, R., Blumberg, S. J., Brick, J. M., Couper, M. P., Courtright, M., Dennis, J. M., . . . Zahs, D. (2010). Research Synthesis: AAPOR Report on Online Panels. Public Opinion Quarterly, 74(4), 711-781.

Beckers, T., Siegers, P., & Kuntz, A. (2011). Speeders in Online Value Research. Paper presented at the GOR 11, Düsseldorf, Germany.

Heerwegh, D., & Loosveldt, G. (2006). An Experimental Study on the Effects of Personalization, Survey Length Statements, Progress Indicators, and Survey Sponsor Logos in Web Surveys. Journal of Official Statistics, 22(2), 191-210.

Galesic, M., & Bosnjak, M. (2009). Effects of Questionnaire Length on Participation and Indicators of Response Quality in a Web Survey. Public Opinion Quarterly, 73(2), 349-360.

Gittelman, S., & Trimarchi, E. (2009). Variance Between Purchasing Behavior Profiles in a Wide Spectrum of Online Sample Sources. http://www.mktginc.com/pdf/Short_%20Variance.pdf

Hartmann, S. (2011). In the market for an online panel? What clients need to know. http://www.quirks.com/articles/2011/20110825-2.aspx

Henning, J. (2008). What’s the Catch? Does Sample Sourcing Matter? Retrieved September 27, 2011, from http://blog.vovici.com/blog/bid/17843/What-s-the-Catch-Does-Sample-Sourcing-Matter

Peytchev, A. (2009). Survey Breakoff. Public Opinion Quarterly, 73(1), 74-97.

Rao, K., & Gravelle, T. B. (2008). The Role of Survey Response Timing on Web Survey Results: Cross-Sectional and Longitudinal Analyses. Paper presented at the Midwest Association of Public Opinion Research, Chicago, IL.

Roßmann, J. (2010). Data Quality in Web Surveys of the German Longitudinal Election Study 2009. Paper presented at the ECPR Graduate Conference, Dublin, Ireland.

Walker, R., Pettit, R., & Rubinson, J. (2009). The Foundations of Quality Initiative: A Five-Part Immersion into the Quality of Online Research. Journal of Advertising Research.
