Response Rates in the European Social Survey: Increasing, Decreasing, or a Matter of Fieldwork Efforts?

Koen Beullens, KU Leuven, Belgium
Geert Loosveldt, KU Leuven, Belgium
Caroline Vandenplas, KU Leuven, Belgium
Ineke Stoop, SCP The Hague, The Netherlands

Response rates are declining, increasing the risk of nonresponse error. The reasons for this decline are multiple: the rise of online surveys, mobile phones, and information requests; societal changes; greater awareness of privacy issues; etc. To combat this decline, fieldwork efforts have become increasingly intensive: widespread use of respondent incentives, advance letters, and an increased number of contact attempts. In addition, complex fieldwork strategies such as adaptive call scheduling or responsive designs have been implemented. The additional efforts to counterbalance nonresponse complicate the measurement of the increased difficulty of contacting potential respondents and convincing them to cooperate. To observe developments …


‘Don’t Know’ Responses to Survey Items on Trust in Police and Criminal Courts: A Word of Caution

Marloes Callens, Public Governance Institute, KU Leuven, Belgium
Geert Loosveldt, Centre for Sociological Research, KU Leuven, Belgium

In 2010 the European Social Survey included a module on public trust in national police and criminal courts. The included questions were especially susceptible to item nonresponse. This study examines the interviewer and country variability in responding “I don’t know” to these questions using a beta-binomial logistic mixed model, controlling for demographic background variables. The results show that there are large differences between interviewers and countries which are not due to underlying demographic differences between the respondents. These differences in data quality between interviewers and countries make (inter)national comparisons more difficult. More importantly, we could assume that these missing values …
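The descriptive first step behind such an analysis can be sketched as follows. This is a minimal illustration on hypothetical data (the record layout, interviewer IDs, and counts are invented for the example; the paper's actual model is a beta-binomial logistic mixed model, not this simple aggregation):

```python
from collections import defaultdict

# Hypothetical records: (country, interviewer_id, n_trust_items, n_dont_know)
records = [
    ("BE", "int_01", 9, 1),
    ("BE", "int_01", 9, 0),
    ("BE", "int_02", 9, 4),
    ("DE", "int_03", 9, 0),
    ("DE", "int_03", 9, 2),
]

def dk_rates(records, key_index):
    """Aggregate 'don't know' proportions by country (index 0)
    or by interviewer (index 1)."""
    items = defaultdict(int)
    dks = defaultdict(int)
    for rec in records:
        key = rec[key_index]
        items[key] += rec[2]   # trust items administered
        dks[key] += rec[3]     # 'don't know' answers given
    return {k: dks[k] / items[k] for k in items}

by_interviewer = dk_rates(records, 1)
by_country = dk_rates(records, 0)
```

Large spreads in `by_interviewer` within the same country are the kind of pattern the mixed model then attributes to interviewer-level variance rather than respondent demographics.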


The Need to Account for Complex Sampling Features when Analyzing Establishment Survey Data: An Illustration using the 2013 Business Research and Development and Innovation Survey (BRDIS)

Brady T. West, Survey Research Center, Institute for Social Research, University of Michigan-Ann Arbor, USA
Joseph W. Sakshaug, Institute for Employment Research, Germany

The importance of correctly accounting for complex sampling features when generating finite population inferences based on complex sample survey data sets has now been clearly established in a variety of fields, both statistical and non-statistical. Unfortunately, recent studies of analytic error have suggested that many secondary analysts of survey data ultimately do not account for these sampling features when analyzing their data, for a variety of possible reasons (e.g., poor documentation, or a data producer not providing the information in a public-use data set). The research in this area has focused exclusively on analyses of …
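One consequence of ignoring sampling features is easy to demonstrate with point estimates alone. The sketch below uses invented numbers (the values and weights are hypothetical, not BRDIS data) to show how an unweighted mean can be badly biased when, as is typical in establishment surveys, large firms are sampled with certainty and small firms carry large weights:

```python
# Hypothetical establishments: (rd_spend_musd, sampling_weight)
# Certainty cases (weight 1) are large R&D performers; each sampled
# small firm represents many similar firms in the population.
sample = [
    (120.0, 1.0),   # certainty stratum: large R&D performer
    (95.0, 1.0),
    (2.0, 50.0),    # small firm representing ~50 population units
    (1.5, 50.0),
    (0.5, 80.0),
]

# Naive estimate, as if the sample were a simple random sample
unweighted_mean = sum(y for y, _ in sample) / len(sample)

# Design-weighted (Hajek-type) estimate of the population mean
weighted_mean = sum(y * w for y, w in sample) / sum(w for _, w in sample)
```

Here the unweighted mean is an order of magnitude larger than the weighted one, because the certainty cases dominate the raw sample but represent only themselves in the population. Variance estimation would additionally require the stratum and cluster identifiers.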


Why were there three? – Determinants of the presence of an intimate partner during face-to-face interviews

Richard Preetz, Institute of Social Science, University of Oldenburg, Germany
Malte Langeheine, Leibniz Institute for Prevention Research and Epidemiology - BIPS, Bremen, Germany

This study analyses determinants of the presence of an intimate partner during face-to-face interviews. Based on theoretical assumptions about opportunity structure, social control, social support, and companionship, we investigated partner presence using data from the first wave of the German Family Panel (pairfam). Descriptive results revealed that an intimate partner was present in every seventh interview. Multivariate results using separate logistic regression models for the presence of the female (n = 3,272) and the male partner (n = 2,348) revealed that the opportunity structure, such as the couple’s living arrangements or their employment status, had the greatest influence on the …
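The modelling step described above, regressing partner presence on opportunity-structure indicators, can be sketched in miniature. Everything below is hypothetical (invented data, two toy predictors, plain gradient descent instead of the authors' actual estimation); it only illustrates the form of a logistic regression for partner presence:

```python
import math

# Hypothetical data: ((cohabiting, partner_not_employed), partner_present)
data = [
    ((1, 1), 1), ((1, 1), 0), ((1, 0), 1), ((1, 0), 0), ((1, 0), 1),
    ((0, 1), 1), ((0, 1), 0), ((0, 0), 0), ((0, 0), 0), ((0, 0), 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit intercept + two coefficients by full-batch gradient descent
# on the logistic log-likelihood.
w = [0.0, 0.0, 0.0]
for _ in range(5000):
    grad = [0.0, 0.0, 0.0]
    for (x1, x2), y in data:
        err = sigmoid(w[0] + w[1] * x1 + w[2] * x2) - y
        grad[0] += err
        grad[1] += err * x1
        grad[2] += err * x2
    w = [wi - 0.1 * gi for wi, gi in zip(w, grad)]

# Predicted presence probability for a cohabiting couple,
# partner employed (toy numbers only)
p_cohabiting = sigmoid(w[0] + w[1])
```

In this toy data the cohabitation coefficient `w[1]` comes out positive, mirroring the abstract's finding that the couple's living arrangement is a strong predictor of partner presence.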


A Review of Reporting Standards in Academic Journals – A Research Note

Hagen von Hermanni, University of Leipzig, Leipzig, Germany
Johannes Lemcke, Robert Koch Institute (RKI), Berlin, Germany

Response rates can be calculated in various ways: researchers may apply different disposition codes, which in turn can result in vastly different response rates for the same survey. One of the most comprehensive reporting conventions is the ‘Standard Definitions’ of the American Association for Public Opinion Research (AAPOR), which defines disposition codes and various outcome rates in great detail, leaving only marginal room for variation in the resulting rates. In this inquiry, we aim to document the reporting of response rates and other survey characteristics in recent publications of scientific journals. Our analyses are based on …
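To make the variability concrete, two of the AAPOR outcome rates can be written out directly from the Standard Definitions. The formulas below follow AAPOR's RR1 and RR3 (the case counts in the usage example are invented for illustration):

```python
def aapor_rr1(I, P, R, NC, O, UH, UO):
    """AAPOR Response Rate 1: complete interviews (I) over all eligible
    cases -- interviews (I), partials (P), refusals (R), non-contacts (NC),
    other (O) -- plus ALL cases of unknown eligibility (UH, UO)."""
    return I / ((I + P) + (R + NC + O) + (UH + UO))

def aapor_rr3(I, P, R, NC, O, UH, UO, e):
    """AAPOR Response Rate 3: as RR1, but unknown-eligibility cases are
    discounted by e, the estimated proportion of them that is eligible."""
    return I / ((I + P) + (R + NC + O) + e * (UH + UO))

# Hypothetical dispositions for one survey
rr1 = aapor_rr1(I=600, P=50, R=200, NC=100, O=20, UH=30, UO=0)
rr3 = aapor_rr3(I=600, P=50, R=200, NC=100, O=20, UH=30, UO=0, e=0.5)
```

The same fieldwork outcome yields a lower RR1 than RR3, since RR1 treats every unknown-eligibility case as eligible; this is exactly why a reported "response rate" is hard to interpret unless the definition used is stated.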


Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution 4.0 International License.