Investigating Respondent Multitasking and Distraction Using Self-reports and Interviewers’ Observations in a Dual-frame Telephone Survey

Eva Aizpurua, Center for Social and Behavioral Research, University of Northern Iowa & School of Law, Trinity College Dublin
Erin O. Heiden, Center for Social and Behavioral Research, University of Northern Iowa
Ki H. Park, Center for Social and Behavioral Research, University of Northern Iowa
Jill Wittrock, Center for Social and Behavioral Research, University of Northern Iowa
Mary E. Losch, Center for Social and Behavioral Research, University of Northern Iowa


Previous research has shown that people often engage in other activities while responding to surveys and that respondents’ multitasking generally has no effect on indicators of data quality (e.g., item non-response, non-differentiation). One of the limitations of these studies is that they have mostly used self-reported measures of respondents’ multitasking. We build on prior research by combining self-reported measures of multitasking with interviewers' observations of respondents' distractions recorded after each interview. The dataset comes from a statewide dual-frame random digit dial telephone survey of adults in a Midwestern state (n = 1,006) who were queried on topics related to awareness …



Evaluation of Gaining Cooperation Methods for IVR Surveys in Low- and Middle-income Countries

Ashley Amaya, RTI International
Charles Lau, RTI International
Yaa Owusu-Amoah, VOTO Mobile
Jocelyn Light, VOTO Mobile


Interactive voice response (IVR) is gaining popularity as a data collection method for survey research. In low- and middle-income countries, IVR is used as a primary data collection mode. The system places an outbound call; when the individual answers, he/she hears a recorded greeting and an invitation to begin the survey. This approach has the benefit of reducing labor costs, but without an interviewer, there is no one to help gain cooperation, answer questions, or identify the appropriate language in which to continue, resulting in low production outcome rates (e.g., cooperation rate, response rate). In this paper, we use experiments embedded …



Collecting Multiple Data Linkage Consents in a Mixed-mode Survey: Evidence from a large-scale longitudinal study in the UK

Marie Thornby, formerly UCL Institute of Education, UK
Lisa Calderwood, UCL Institute of Education, UK
Mehul Kotecha, NatCen Social Research, UK
Kelsey Beninger, Kantar Public, formerly NatCen Social Research, UK
Alessandra Gaia, City, University of London, formerly UCL Institute of Education, UK


Linking survey responses with administrative data is a promising practice for increasing the range of research questions that can be explored while limiting interview burden for both respondents and interviewers. We describe the protocol for asking for consent to link data from nine different sources in a large-scale, nationally representative longitudinal survey of young adults in England: the Next Steps Age 25 Survey. We present empirical evidence on consent to data linkage from qualitative interviews, a pilot study, and the mainstage survey. To the best of our knowledge, this is the first study that discusses the practicalities of implementing a data …



Response Rates in the European Social Survey: Increasing, Decreasing, or a Matter of Fieldwork Efforts?

Koen Beullens, KU Leuven, Belgium
Geert Loosveldt, KU Leuven, Belgium
Caroline Vandenplas, KU Leuven, Belgium
Ineke Stoop, SCP The Hague, The Netherlands


Response rates are declining, increasing the risk of nonresponse error. The reasons for this decline are multiple: the rise of online surveys, mobile phones, and information requests; societal changes; greater awareness of privacy issues; etc. To combat this decline, fieldwork efforts have become increasingly intensive: widespread use of respondent incentives, advance letters, and an increased number of contact attempts. In addition, complex fieldwork strategies such as adaptive call scheduling or responsive designs have been implemented. These additional efforts to counterbalance nonresponse complicate the measurement of the increased difficulty of contacting potential respondents and convincing them to cooperate. To observe developments …



‘Don’t Know’ Responses to Survey Items on Trust in Police and Criminal Courts: A Word of Caution

Marloes Callens, Public Governance Institute, KU Leuven, Belgium
Geert Loosveldt, Centre for Sociological Research, KU Leuven, Belgium


In 2010, the European Social Survey included a module on public trust in the national police and criminal courts. The included questions were especially susceptible to item nonresponse. This study examines interviewer and country variability in responding “I don’t know” to these questions using a beta-binomial logistic mixed model, controlling for demographic background variables. The results show that there are large differences between interviewers and countries that are not due to underlying demographic differences between the respondents. These differences in data quality between interviewers and countries make (inter)national comparisons more difficult. More importantly, we could assume that these missing values …



Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution 4.0 International License.