Evaluation of Gaining Cooperation Methods for IVR Surveys in Low- and Middle-income Countries

Ashley Amaya, RTI International
Charles Lau, RTI International
Yaa Owusu-Amoah, VOTO Mobile
Jocelyn Light, VOTO Mobile


Interactive voice response (IVR) is gaining popularity as a data collection method for survey research. In low- and middle-income countries, IVR is used as a primary data collection mode. The system places an outbound call; when the individual answers, they hear a recorded greeting and an invitation to begin the survey. This approach reduces labor costs, but without an interviewer there is no one to help gain cooperation, answer questions, or identify the appropriate language in which to continue, resulting in low production outcome rates (e.g., cooperation rate, response rate). In this paper, we use experiments embedded …
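To make the call flow concrete, here is a minimal sketch of the kind of outbound IVR sequence the abstract describes. All function names, prompts, and language options are hypothetical placeholders, not VOTO Mobile's actual platform API.

    # Minimal sketch of one outbound IVR survey contact. The `play` and
    # `get_keypress` callables stand in for a real telephony platform;
    # every name here is illustrative, not an actual API.
    LANGUAGE_MENU = {"1": "en", "2": "fr"}  # hypothetical language options

    def run_ivr_contact(play, get_keypress):
        play("greeting")                        # recorded greeting, no live interviewer
        choice = get_keypress("language_menu")  # e.g., "Press 1 for English..."
        if choice not in LANGUAGE_MENU:
            play("goodbye")                     # invalid or no input: contact ends
            return "noncooperation"
        lang = LANGUAGE_MENU[choice]
        play(f"invitation_{lang}")              # invitation to begin, in chosen language
        if get_keypress(f"consent_{lang}") == "1":
            return "survey_started"             # counts toward the cooperation rate
        return "noncooperation"

Because every branch is prerecorded and nothing can be improvised mid-call, the wording of the greeting and invitation is a natural lever for gaining cooperation, which is plausibly why experiments are embedded at that stage.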



Collecting Multiple Data Linkage Consents in a Mixed-mode Survey: Evidence from a large-scale longitudinal study in the UK

Marie Thornby, formerly UCL Institute of Education, UK
Lisa Calderwood, UCL Institute of Education, UK
Mehul Kotecha, NatCen Social Research, UK
Kelsey Beninger, Kantar Public, formerly NatCen Social Research, UK
Alessandra Gaia, City, University of London, formerly UCL Institute of Education, UK


Linking survey responses with administrative data is a promising way to broaden the range of research questions that can be explored while limiting interview burden for both respondents and interviewers. We describe the protocol for requesting consent to data linkage with nine different sources in a large-scale, nationally representative longitudinal survey of young adults in England: the Next Steps Age 25 Survey. We present empirical evidence on consent to data linkage from qualitative interviews, a pilot study, and the mainstage survey. To the best of our knowledge, this is the first study that discusses the practicalities of implementing a data …
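As a rough illustration of what collecting multiple consents implies for the data structure, here is a sketch of how per-source linkage consents might be recorded in a mixed-mode survey. The source names and fields are placeholders, not the actual Next Steps consent domains or instrument layout.

    # Illustrative sketch of recording multiple linkage consents per
    # respondent; names and fields are assumptions, not the study's design.
    from dataclasses import dataclass, field

    @dataclass
    class LinkageConsent:
        source: str      # administrative source the consent question covers
        consented: bool  # respondent's yes/no answer
        mode: str        # mode in which consent was collected, e.g. "web", "f2f"

    @dataclass
    class RespondentRecord:
        respondent_id: str
        consents: list = field(default_factory=list)

        def consent_rate(self):
            """Share of the requested linkages this respondent agreed to."""
            if not self.consents:
                return None
            return sum(c.consented for c in self.consents) / len(self.consents)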



Response Rates in the European Social Survey: Increasing, Decreasing, or a Matter of Fieldwork Efforts?

Koen Beullens, KU Leuven, Belgium
Geert Loosveldt, KU Leuven, Belgium
Caroline Vandenplas, KU Leuven, Belgium
Ineke Stoop, SCP The Hague, The Netherlands


Response rates are declining, increasing the risk of nonresponse error. The reasons for this decline are manifold: the rise of online surveys, mobile phones, and information requests; societal changes; greater awareness of privacy issues; and so on. To combat the decline, fieldwork efforts have become increasingly intensive, with widespread use of respondent incentives, advance letters, and more contact attempts. In addition, complex fieldwork strategies such as adaptive call scheduling and responsive designs have been implemented. These additional efforts to counterbalance nonresponse complicate the measurement of how much harder it has become to contact potential respondents and convince them to cooperate. To observe developments …
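As a point of reference for the rates discussed here, the sketch below computes AAPOR Response Rate 1, the strictest standard definition. This is general background only; the ESS's own outcome coding and rate definitions may differ in detail.

    # AAPOR Response Rate 1 (RR1): completed interviews divided by all
    # eligible cases plus cases of unknown eligibility. General background;
    # not necessarily the exact definition used in the ESS.
    def aapor_rr1(interviews, partials, refusals, noncontacts,
                  other_nonresponse, unknown_eligibility):
        denominator = (interviews + partials + refusals + noncontacts
                       + other_nonresponse + unknown_eligibility)
        return interviews / denominator

    # Example: aapor_rr1(1200, 50, 400, 300, 50, 100) -> ~0.571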



‘Don’t Know’ Responses to Survey Items on Trust in Police and Criminal Courts: A Word of Caution

Marloes Callens, Public Governance Institute, KU Leuven, Belgium
Geert Loosveldt, Centre for Sociological Research, KU Leuven, Belgium


In 2010 the European Social Survey included a module on public trust in national police and criminal courts. The questions in this module were especially susceptible to item nonresponse. This study examines interviewer and country variability in responding “I don’t know” to these questions using a beta-binomial logistic mixed model, controlling for demographic background variables. The results show large differences between interviewers and countries that are not due to underlying demographic differences between respondents. These differences in data quality between interviewers and countries make (inter)national comparisons more difficult. More importantly, we could assume that these missing values …
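For readers unfamiliar with the model class, one plausible specification of a beta-binomial logistic mixed model for “don’t know” counts is sketched below; the authors’ exact parameterization may differ.

    y_{cij} \mid \pi_{cij} \sim \mathrm{Beta\text{-}Binomial}(n, \pi_{cij}, \rho),
    \qquad \operatorname{logit}(\pi_{cij}) = \mathbf{x}_{cij}^{\top}\boldsymbol{\beta} + u_{ci} + v_{c},
    \qquad u_{ci} \sim N(0, \sigma_u^2), \quad v_c \sim N(0, \sigma_v^2)

Here y_{cij} is the number of “don’t know” answers among the n trust items given by respondent j of interviewer i in country c, rho is the overdispersion parameter of the beta-binomial, x contains the demographic controls, and u and v are interviewer and country random effects whose estimated variances quantify the variability the abstract describes.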



The Need to Account for Complex Sampling Features when Analyzing Establishment Survey Data: An Illustration using the 2013 Business Research and Development and Innovation Survey (BRDIS)

Brady T. West, Survey Research Center, Institute for Social Research, University of Michigan-Ann Arbor, USA, bwest@umich.edu
Joseph W. Sakshaug, Institute for Employment Research, Germany, joe.sakshaug@iab.de


The importance of correctly accounting for complex sampling features when generating finite population inferences from complex sample survey data sets has now been clearly established across a variety of fields, both statistical and non-statistical. Unfortunately, recent studies of analytic error suggest that many secondary analysts of survey data ultimately do not account for these sampling features when analyzing their data, for a variety of possible reasons (e.g., poor documentation, or a data producer not releasing the information in a public-use data set). The research in this area has focused exclusively on analyses of …
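To illustrate what “accounting for complex sampling features” means in practice, here is a generic sketch of a design-weighted mean with a Taylor-linearized standard error for a stratified, clustered sample (with-replacement PSU approximation). The column names are assumptions, and this is not the estimator used in the BRDIS analysis itself.

    # Design-weighted mean with Taylor-linearized SE for a stratified,
    # clustered sample (with-replacement PSU approximation). Generic
    # illustration; column names ("weight", "stratum", "psu") are assumed.
    import numpy as np
    import pandas as pd

    def weighted_mean_se(df, y, weight="weight", stratum="stratum", psu="psu"):
        w = df[weight].to_numpy(float)
        yv = df[y].to_numpy(float)
        wsum = w.sum()
        est = (w * yv).sum() / wsum                # design-weighted mean
        df = df.assign(_z=w * (yv - est) / wsum)   # linearized scores
        var = 0.0
        for _, s in df.groupby(stratum):           # variance from between-PSU
            psu_totals = s.groupby(psu)["_z"].sum()  # variation within strata
            n_h = len(psu_totals)
            if n_h > 1:
                dev = psu_totals - psu_totals.mean()
                var += n_h / (n_h - 1) * (dev ** 2).sum()
        return est, np.sqrt(var)

Ignoring the weights, strata, and clusters here would typically understate the standard error and overstate statistical significance, which is exactly the analytic error the abstract is concerned with.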


