{"id":817,"date":"2013-01-23T20:01:52","date_gmt":"2013-01-23T19:01:52","guid":{"rendered":"http:\/\/fors07.unil.ch\/surveyinsights\/?p=817"},"modified":"2014-01-23T13:57:43","modified_gmt":"2014-01-23T12:57:43","slug":"measuring-interviewer-characteristics-pertinent-to-social-surveys-a-conceptual-framework","status":"publish","type":"post","link":"https:\/\/surveyinsights.org\/?p=817","title":{"rendered":"Measuring Interviewer Characteristics Pertinent to Social Surveys: A Conceptual Framework"},"content":{"rendered":"<h1><strong>Introduction<\/strong><\/h1>\n<p>In all interviewer-mediated surveys interviewers play a crucial role during the entire data collection process. They make contact with and gain cooperation from the sample unit, ask survey questions, conduct measurements, record answers and measures, and maintain respondents\u2019 motivation throughout the interview (Schaeffer, Dykema, &amp; Maynard, 2010). As such, the job of an interviewer encompasses a diversity of roles and requires a variety of skills. Especially with the rise of computer-assisted interviewing, which permits the collection of even more complex data, a well-trained staff of interviewers has become indispensable.<\/p>\n<p>When examining survey data we frequently find interviewer effects on all of these interviewer survey tasks indicating that there is variation in how interviewers handle their various responsibilities. Yet often, researchers are far removed from the interviewers and the actual survey operations (Koch, Blom, Stoop, &amp; Kappelhof, 2009) and have little or no information about what determines these interviewer effects. 
In fact, in the majority of surveys data collection is contracted out, and thus researchers have no influence on which interviewers work on their study or on how they are trained.<\/p>\n<p>The literature <em>describing<\/em> interviewer effects on various aspects of the survey process is substantial (for an overview see Schaeffer et al., 2010, chapter 13). However, only a few studies have succeeded in <em>explaining<\/em> the interviewer effects found (cf. J\u00e4ckle, Lynn, Sinibaldi, &amp; Tipping, 2013). One possible reason for this research gap is the lack of information at the interviewer level, which is necessary for identifying determinants of interviewer effects. In recent years, paradata (Couper &amp; Lyberg, 2005) have been used increasingly to explain interviewer effects. Another potentially powerful source of auxiliary data is interviewer characteristics collected through an interviewer survey.<\/p>\n<p>This paper presents the conceptual framework of a new international interviewer questionnaire designed to explain interviewer effects. We specifically focus on interviewer effects other than interviewer falsification, since we believe that the latter cannot be explained by means of interviewer surveys. Furthermore, we developed an interviewer questionnaire for researchers who contract out fieldwork. Survey agencies aiming to identify suitable interviewers through an assessment might find a different questionnaire more appropriate. The questionnaire was developed in cooperation with researchers across various survey projects and will thus be relevant to survey projects across countries and disciplines.<\/p>\n<p>This paper consists of three parts. First, a <em><a href=\"#_sec_2\">theoretical background and literature review<\/a><\/em> outlines the main aspects of the data collection process affected by interviewer effects. 
The subsequent <em><a href=\"#_sec_3\">conceptual framework<\/a><\/em> constitutes the core of the paper, where the motivation for surveying various interviewer characteristics is laid out. Finally, the <em><a href=\"#_sec_4\">last section<\/a><\/em> presents findings on the variation of interviewer characteristics collected with the new interviewer questionnaire, which was implemented in the Survey of Health, Ageing and Retirement in Europe (SHARE) in Germany in 2011. These results show that the questionnaire discriminates well between interviewers, which is a prerequisite for explaining interviewer effects in survey data.<\/p>\n<h1 id=\"_sec_2\"><strong>Theoretical background and literature<\/strong><\/h1>\n<p>The presence of interviewer effects implies that outcomes of sample units assigned to the same interviewer are more similar than would be expected if variation were random.<a id=\"_ftnref1\" title=\"\" href=\"#_ftn1\">[1]<\/a> Three main types of interviewer effects can be distinguished: interviewer effects on the unit nonresponse process, on item nonresponse and on the actual measurement (<em><a href=\"#_fig_1\">Figure 1<\/a><\/em>).<\/p>\n<p><em>Figure 1: Types of interviewer effects in surveys<\/em><\/p>\n<p><a href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure_1_.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-1521\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure_1_.jpg\" alt=\"\" width=\"483\" height=\"229\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure_1_.jpg 483w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure_1_-300x142.jpg 300w\" sizes=\"auto, (max-width: 483px) 100vw, 483px\" \/><\/a><\/p>\n<h2>Interviewer effects on unit nonresponse<\/h2>\n<p>When considering the unit nonresponse process, we find that interviewers are differentially successful at recruiting sample units, leading
to differential unit response rates. A growing literature has examined the role of the interviewer in the nonresponse process, and attention has been paid to interviewer attributes, such as experience (Durbin &amp; Stuart, 1951; Couper &amp; Groves, 1992; Singer, Frankel, &amp; Glassman, 1983; Snijkers, Hox, &amp; de Leeuw, 1999; Olson &amp; Peytchev, 2007; Lipps &amp; Pollien, 2011), interviewer skills (Morton-Williams, 1993; Campanelli, Sturgis, &amp; Purdon, 1997), interviewer-respondent interaction (Groves &amp; Couper, 1998), as well as survey design characteristics, such as interviewer burden (Japec, 2008) and interviewer payment (de Heer, 1999; Durrant, Groves, Staetsky, &amp; Steele, 2010).<\/p>\n<p>To explain differential response rates across interviewers, survey methodologists have examined interviewer attitudes and motivation (Campanelli et al., 1997; Groves &amp; Couper, 1998; Hox &amp; de Leeuw, 2002; Durrant et al., 2010; Blom, de Leeuw, &amp; Hox, 2011). This strand of research was inspired by the work of Lehtonen (1996), who developed a short interviewer attitudes scale and showed that attitudes correlate with attained response rates. Another line of studies focuses on interviewer behaviour and interviewer-respondent interaction (Couper &amp; Groves, 1992; Campanelli et al., 1997; Groves &amp; Couper, 1998; Snijkers et al., 1999). This started with the pioneering work of Morton-Williams (1993), who analysed tape recordings of survey introductions and identified successful interviewer strategies, such as using professional and social skills and adapting these to the doorstep situation.<\/p>\n<h2>Interviewer effects on item nonresponse<\/h2>\n<p>In addition, interviewers have an influence on item nonresponse, i.e. on the respondents\u2019 willingness to answer each question in the survey and on their consent to providing additional information. 
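The clustering that defines interviewer effects \u2013 outcomes more alike within an interviewer\u2019s workload than chance would predict \u2013 is commonly summarized as an intraclass correlation (ICC). The following sketch is purely illustrative and not part of the original study: it simulates binary response outcomes for hypothetical interviewers and estimates the ICC with the balanced one-way ANOVA estimator.

```python
# Illustrative only: all interviewers, workloads, and propensities below are
# simulated, not data from the SHARE interviewer survey.
import random

random.seed(42)

# 50 interviewers with 30 sample units each; each interviewer has their own
# underlying cooperation probability (the "interviewer effect").
outcomes = []
for _ in range(50):
    p = random.uniform(0.4, 0.8)  # interviewer-specific response propensity
    outcomes.append([1 if random.random() < p else 0 for _ in range(30)])

k = len(outcomes)      # number of interviewers
n = len(outcomes[0])   # units per interviewer (balanced design)
grand_mean = sum(sum(g) for g in outcomes) / (k * n)

# Between- and within-interviewer mean squares (one-way ANOVA decomposition).
ms_between = n * sum((sum(g) / n - grand_mean) ** 2 for g in outcomes) / (k - 1)
ms_within = sum((x - sum(g) / n) ** 2 for g in outcomes for x in g) / (k * (n - 1))

# ICC: share of outcome variance attributable to interviewers.
icc = (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)
print(f"estimated interviewer ICC: {icc:.3f}")
```

In analyses like those cited in this section, such clustering would typically be estimated with a multilevel model rather than this balanced ANOVA shortcut; the snippet only illustrates the definition.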
Consent to the collection of additional information can take diverse forms; typical examples are consent to record linkage (Lessof, 2009; Calderwood &amp; Lessof, 2009; Sakshaug,\u00a0Couper, Ofstedal, &amp; Weir, 2012; Sala, Burton, &amp; Knies, 2012; Korbmacher &amp; Schr\u00f6der, <em>forthcoming<\/em>) and consent to the collection of biomarkers in health surveys (Sakshaug, Couper, &amp; Ofstedal, 2009).<\/p>\n<p>Traditionally, the literature on interviewer effects on item response rates describes a clustering effect of item nonresponse within interviewers and tries to model these interviewer effects by demographic characteristics of the interviewer (Singer et al., 1983). Another strand of research looks into collecting additional information about the interviewers, for example on their expectations, by means of interviewer questionnaires (Singer &amp; Kohnke-Aguirre, 1979; Singer et al., 1983).<\/p>\n<h2><strong>Interviewer effects on measurement<\/strong><\/h2>\n<p>Finally, interviewers can influence the measurement itself, i.e. which answer a respondent provides, through their observable characteristics and their actions. Theory related to this third type of interviewer effect typically stems from the literature on respondents\u2019 cognitive processes when answering survey questions (Tourangeau, Rips, &amp; Rasinski, 2000). This process is complex and iterates through various stages, which may be influenced by the interviewers (Cannell, Miller, &amp; Oksenberg, 1981; Tourangeau et al., 2000). Since survey questions differ widely in content and structure and since interviewer effects are estimate-specific, they can be different for different questions and topics (Schaeffer et al., 2010) and cannot be generalized for all measurements within a survey. Covering all of these different types of interviewer effects on measurement goes beyond the scope of the conceptual framework developed in this paper. 
Instead, we focus on identifying interviewer characteristics potentially associated with interviewer effects on unit and item nonresponse.<\/p>\n<p>As described, there have been several previous attempts at explaining interviewer effects in survey data by means of interviewer surveys. However, these studies found that the predictive power of the variables collected on the interviewer questionnaires was low, explaining only part of the observed variance (e.g. Hox &amp; de Leeuw, 2002; Durrant et al., 2010; Blom et al., 2011). The conceptual framework of the interviewer questionnaire presented in this paper builds on this previous work with an important extension. Instead of focusing on interviewer demographics, which seldom prove significant in explaining interviewer effects (cf. Singer et al., 1983), and self-reported doorstep behaviour, the questionnaire covers four dimensions of interviewer characteristics: interviewers\u2019 attitudes towards the survey process, their own behaviour regarding data collection requests, their experiences with conducting certain types of surveys and measurements, and their expectations regarding the survey outcome.<\/p>\n<h1 id=\"_sec_3\"><strong>Conceptual framework<\/strong><\/h1>\n<p>The goal of the new questionnaire is to implement an instrument measuring a wide range of interviewer characteristics that have been shown to be relevant in previous studies (see the literature review in <em><a href=\"#_sec_2\">Theoretical background and literature<\/a><\/em>). In particular, we aim to find correlates of interviewer effects on various types of unit and item nonresponse.<\/p>\n<p>The questionnaire covers all four dimensions of interviewer characteristics: interviewer attitudes towards the survey process, interviewers\u2019 own behaviour regarding data collection requests, interviewers\u2019 experience with measurements, and interviewers\u2019 expectations regarding the survey outcome in terms of response rates. It consists of two parts. 
First, a battery of general items covers characteristics assumed to be associated with unit and\/or item nonresponse across a variety of social surveys. Second, various blocks of questions aim at explaining interviewer effects specific to the fourth wave of SHARE Germany. These blocks may or may not apply to other surveys with a different survey design and a focus on different research questions. The full questionnaire collects information on interviewer characteristics to explain five groups of interviewer effects: on unit nonresponse in general, on income nonresponse (as an example of item nonresponse), on unit nonresponse across different incentive groups of an experiment, on consent to the collection of four types of biomarkers, and on consent to record linkage. Some items in this questionnaire obviously focus on the SHARE survey, but the questionnaire is not restricted to it. Segmenting the questionnaire along the five groups of interviewer effects also allows other surveys to implement the questionnaire by adopting the relevant elements.<\/p>\n<p>The conceptual framework draws on our own experiences at interviewer trainings on a diversity of studies, on findings from previous analyses of interviewer effects, and on consultations with survey methodologists on various European and US surveys. When aiming to explain interviewer effects by means of characteristics collected in an interviewer survey, the underlying assumption is that interviewers differentially impact on the data collection process, that this differential impact is related to their \u2013 conscious and subconscious \u2013 appearance and actions, and that these can be explained by characteristics collected in an interviewer survey.<\/p>\n<p><em><a href=\"#_tab_1\">Table 1<\/a><\/em> displays the four dimensions measured in the interviewer questionnaire (rows) and the interviewer effects they aim to explain (columns). 
We expect the first three dimensions \u2013 attitudes towards the survey process, own behaviour with regard to data collection requests, and experience with relevant types of measurements \u2013 to impact independently on the survey outcomes. The fourth dimension \u2013 interviewers\u2019 expectations regarding the survey outcome \u2013 is expected to be influenced by attitudes, behaviours, and experiences.<\/p>\n<p>The concepts covered by these four dimensions are described in the following. In addition, the interviewer survey collects general interviewer demographics and measures of interviewing experience. The question numbers cited in the following refer to the questions in the SHARE interviewer questionnaire (see\u00a0<em><a href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/Appendix-A.pdf\">Appendix A<\/a><\/em> and <em><a href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/Appendix-B.pdf\">Appendix B<\/a><\/em>).<\/p>\n<p><em><span class=\"breakBefore\">Table 1: Conceptual framework of the interviewer questionnaire<\/span><\/em><\/p>\n<table width=\"100%\" border=\"1\" cellspacing=\"0\" cellpadding=\"0\" align=\"left\">\n<tbody>\n<tr>\n<td rowspan=\"2\" valign=\"top\" width=\"15%\"><strong>\u00a0<\/strong><\/td>\n<td colspan=\"2\" valign=\"top\" width=\"33%\">\n<p align=\"center\"><strong>General part<\/strong><\/p>\n<\/td>\n<td colspan=\"3\" valign=\"top\" width=\"50%\">\n<p align=\"center\"><strong>SHARE-DE specific part<\/strong><\/p>\n<\/td>\n<\/tr>\n<tr>\n<td valign=\"top\" width=\"16%\"><strong>Unit non- response<\/strong><strong>\u00a0<\/strong><\/td>\n<td valign=\"top\" width=\"16%\"><strong>Item non- response<br \/>\n(income)<\/strong><\/td>\n<td valign=\"top\" width=\"16%\"><strong>Unit non- response (incentives)<\/strong><\/td>\n<td valign=\"top\" width=\"16%\"><strong>Consent to<br \/>\nbiomarker collection<\/strong><\/td>\n<td valign=\"top\" width=\"16%\"><strong>Consent to<br \/>\nrecord 
linkage<\/strong><\/td>\n<\/tr>\n<tr>\n<td valign=\"top\" width=\"15%\"><strong>Attitudes<\/strong><\/td>\n<td valign=\"top\" width=\"16%\">Q3: reasons for being an interviewer<br \/>\nQ4: how to conduct standardized interviews<br \/>\nQ5: how to achieve response<br \/>\nQ6, Q11, Q12: trust, data protection concerns<\/td>\n<td valign=\"top\" width=\"16%\">Q4: how to conduct standardized interviews<br \/>\nQ6, Q11, Q12: trust, data protection concerns<\/td>\n<td valign=\"top\" width=\"16%\"><\/td>\n<td valign=\"top\" width=\"16%\">Q6, Q11, Q12: trust, data protection concerns<\/td>\n<td valign=\"top\" width=\"16%\">Q6, Q11, Q12: trust, data protection concerns<\/td>\n<\/tr>\n<tr>\n<td valign=\"top\" width=\"15%\"><strong><\/strong><strong>Own behaviour<\/strong><\/td>\n<td valign=\"top\" width=\"16%\">Q8, Q9: own survey participation<br \/>\nQ27, Q28: use of internet social networks \/ online banking<\/td>\n<td valign=\"top\" width=\"16%\">Q27: use of internet social networks \/ online banking<br \/>\nQ34: income response<\/td>\n<td valign=\"top\" width=\"16%\">Q10: incentives received<\/td>\n<td valign=\"top\" width=\"16%\">Q22: consent to biomarkers, hypothetical<br \/>\nQ24: blood donation<\/td>\n<td valign=\"top\" width=\"16%\">Q13: data disclosure, hypothetical<br \/>\nQ14, Q16: data linkage, hypothetical<br \/>\nQ17: \u201cpension records cleared\u201d<br \/>\nQ27, Q28: use of internet social networks \/ online banking<\/td>\n<\/tr>\n<tr>\n<td valign=\"top\" width=\"15%\"><strong><\/strong><strong>Experience with measure- ments<\/strong><\/td>\n<td valign=\"top\" width=\"16%\">Q1, Q2: experience working as an interviewer<br \/>\nQ18: SHARE experience<\/td>\n<td valign=\"top\" width=\"16%\">Q1, Q2: experience working as an interviewer<br \/>\nQ18: SHARE experience<\/td>\n<td valign=\"top\" width=\"16%\">Q1, Q2: experience working as an interviewer<br \/>\nQ18: SHARE experience<\/td>\n<td valign=\"top\" width=\"16%\">Q23: experience with collecting 
bloodspots<\/td>\n<td valign=\"top\" width=\"16%\"><\/td>\n<\/tr>\n<tr>\n<td valign=\"top\" width=\"15%\"><strong>Expecta- tions <\/strong><\/td>\n<td valign=\"top\" width=\"16%\">Q19: effect of incentives on unit response<\/td>\n<td valign=\"top\" width=\"16%\">Q20: income response<\/td>\n<td valign=\"top\" width=\"16%\">Q19: effect of incentives on unit response<\/td>\n<td valign=\"top\" width=\"16%\">Q21: consent to biomarker<\/td>\n<td valign=\"top\" width=\"16%\">Q15: consent to data linkage<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p><em>Note: The question numbering refers to the questions in the SHARE Germany interviewer survey (see\u00a0<a href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/Appendix-A.pdf\">Appendix A<\/a> and <a href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/Appendix-B.pdf\">Appendix B<\/a>). Questions on the interviewers\u2019 demographic background are not displayed in the framework.<\/em><\/p>\n<h2><strong>Interviewer attitudes towards the survey process <\/strong><\/h2>\n<p>Interviewers who are good at making contact and gaining cooperation from the sample unit are usually good at tailoring their approach to the situation they find at the visited address (Morton-Williams, 1993). However, tailoring takes more effort and skill than repeating the same routine with each sample unit. The extent to which interviewers make the effort of tailoring their approach might be related to their general attitudes towards their job as interviewers and towards life in general. In addition, interviewers\u2019 own concerns about data protection and their trust in other people might shape the way they approach sample units and ask their respondents for sensitive information.<\/p>\n<p>This first dimension of general interviewer attitudes in the conceptual framework covers these aspects. Some of the attitudes collected in the interviewer questionnaire are related to the questions on previous interviewer questionnaires (e.g. 
de Leeuw &amp; Hox, 2009). However, in addition to questions on the contacting and cooperation process, i.e. unit nonresponse, the SHARE interviewer questionnaire also collects information that might be related to item nonresponse and non-consent. The attitudes addressed are reasons for being an interviewer (Q3), views on the circumstances under which it is legitimate to deviate from the standard interviewing protocols (Q4), opinions on how best to achieve unit response (Q5), and general questions regarding trust and data protection concerns (Q6, Q11 and Q12) that might be particularly effective in explaining non-consent and item nonresponse on income.<\/p>\n<h2><strong>Interviewers\u2019 own behaviour regarding data collection requests<\/strong><\/h2>\n<p>The maxim \u2018do as you would be done by\u2019 runs as a common theme through many cultures. Therefore, it is not difficult to imagine that survey requests that interviewers themselves would not comply with are difficult for them to sell to respondents. The second dimension of the conceptual framework thus assumes that the way interviewers behave, or would behave when faced with a situation similar to the respondent\u2019s, influences the way they interact with the respondent. If interviewers participate in surveys themselves and supply all of the information asked of them, they are likely to be better at eliciting such information from their respondents.<\/p>\n<p>A series of questions in the interviewer questionnaire covers interviewers\u2019 own behaviour. These questions cover, for example, whether interviewers have taken part in surveys and, if so, what kind of surveys these were and whether they received any incentives (Q8, Q9 and Q10). Along a more general line, we examine how easily interviewers divulge information about themselves in their daily lives by asking about their membership in social networks like Facebook, Myspace or Twitter and their use of online banking (Q27, Q28). 
The questionnaire also asks about their income (Q34), to see whether item nonresponse on income on the interviewer questionnaire is correlated with item nonresponse among respondents to the SHARE survey. For measures of consent to the collection of biomarkers and consent to record linkage, we inspect interviewers\u2019 actions in similar situations. The questionnaire asks whether interviewers donate blood (Q24) and whether they have cleared their pension records (\u201cKontenkl\u00e4rung\u201d), a process German citizens are asked to go through to ensure that the pension records that the state holds are correct (Q17). Finally, the questionnaire contains hypothetical questions on whether interviewers would disclose sensitive information (Q13), consent to record linkage (Q14 and Q16) and consent to the collection of biomarkers (Q22) if asked in an interview situation.<\/p>\n<h2><strong><span class=\"breakBefore\">Interviewers\u2019 experience with measurements<\/span><\/strong><\/h2>\n<p>Interviewers\u2019 familiarity with different types of surveys and measurements may influence their confidence in conducting these. This, in turn, may shape the professionalism with which they interact with the respondents. Interviewer training levels out some of the differences in experience with measurements, but only up to a certain degree. If interviewers, for example, have previously worked on SHARE, they have more background knowledge about the content of the study, knowledge they may employ in their introduction. Likewise, if interviewers have experience with pricking a small needle into someone\u2019s finger for collecting blood spots in blood sugar tests, they are likely to feel more confident about collecting dried blood spots for biomarkers and to convey this confidence during the interview. The SHARE interviewers are diverse in the experiences that they have gathered on their job and in their life in general. 
Some wave 4 SHARE interviewers have worked on all of the previous SHARE waves and are well used to the type of sample and the instrument. Others have conducted surveys that cover aspects similar to those in SHARE.<\/p>\n<p>The third dimension of the interviewer questionnaire, therefore, investigates interviewers\u2019 experiences with working as an interviewer (Q1 and Q2), with SHARE (Q18), and with conducting blood sugar tests for diabetics (Q23).<\/p>\n<h2><strong>Interviewers\u2019 expectations regarding survey outcome<\/strong><\/h2>\n<p>Anecdotal evidence from interviewer trainings suggests that interviewers\u2019 perceptions about the viability of a survey are related to fieldwork outcomes. While it would be far-fetched to imply a causal effect of interviewers\u2019 expectations on fieldwork outcomes, in the context of explaining interviewer effects it is informative to test empirically whether interviewers who are confident about the success of a survey are also more likely to reach high response rates.<\/p>\n<p>The final dimension in the conceptual framework covers interviewers\u2019 expectations of unit nonresponse rates, consent rates and item nonresponse rates. The survey asks interviewers what response and consent rates they expect for the different incentive groups (Q19), for the various biomarker measurements (Q21), for consent to record linkage (Q15), and for the survey questions on income (Q20).<\/p>\n<h2><strong>Alternative conceptualization<\/strong><\/h2>\n<p>When developing the interviewer questionnaire we opted for a general conceptualization of just four dimensions. We believe that dimensions one to three influence both the expectations interviewers hold about their performance and their actual performance. 
As depicted in <em><a href=\"#_tab_1\">Table 1<\/a><\/em>, we expect certain items within each dimension to be correlated with only one of the survey outcomes \u2013 unit nonresponse, item nonresponse or biomarker \/ record-linkage non-consent \u2013 while others are expected to be associated with all types of nonresponse.<\/p>\n<p>Our framework for explaining interviewer effects is just one of many possible conceptualizations. One interesting recent conceptualization, while not directly comparable to our approach, can be found in J\u00e4ckle et al. (2013). In their complex framework they model interviewers\u2019 influence on a sample person\u2019s likelihood of cooperation in a survey as the interplay of household psychological predispositions, interviewer observable attributes, and interviewer behaviour. All of these are in turn influenced by a complex system of personality traits, interpersonal skills, expectations, experience, and socio-demographic characteristics.<\/p>\n<p>An alternative conceptualization of our framework might also go into more detail on the interrelatedness of interviewers\u2019 demographic characteristics, psychological predispositions, social environment, survey design, and the dimensions measured in the interviewer questionnaire. However, unlike previous researchers working with interviewer questionnaires, we consider various types of interviewer effects together. With the complexity of a more detailed conceptual framework, one might miss the wood for the trees. 
Nonetheless, when analysing processes leading to unit nonresponse, item nonresponse, non-consent to biomarkers or non-consent to record-linkage and considering interviewer effects thereupon, we recommend developing a specific and detailed conceptual framework for each process.<\/p>\n<h1 id=\"_sec_4\"><strong>Variation across interviewers: Results from the 2011 SHARE interviewer survey<\/strong><\/h1>\n<p>In early 2011 an interviewer questionnaire based on the conceptual framework described in this paper was implemented at the end of the interviewer training sessions for SHARE Germany. In total, 197 interviewers were trained. Participation in the interviewer survey was voluntary, and interviewers did not receive any incentive for participating. Of these interviewers, 163 completed the questionnaire, yielding an 83% response rate. There was a negligible amount of item nonresponse and answers that were not codeable.<\/p>\n<p>In addition, other large-scale social surveys implemented this interviewer questionnaire. After we presented and further developed the conceptual framework at the 2010 International Workshop on Household Survey Nonresponse in Nuremberg, Germany, several other studies showed interest, fostering cooperation with survey methodologists across surveys and countries. At the end of 2010, the German PASS study (Panel Arbeitsmarkt und soziale Sicherung) at the Institute for Employment Research (IAB)<a id=\"_ftnref2\" title=\"\" href=\"#_ftn2\">[2]<\/a> implemented the questionnaire online, with a 10 Euro conditional incentive and well before their interviewer trainings. 
By 2012, the core of this interviewer survey had been implemented in at least three further large data collections: (1) a survey aimed at measuring the methodological effect of filter questions at the IAB, (2) the German part of the Programme for the International Assessment of Adult Competencies (PIAAC) at GESIS<a id=\"_ftnref3\" title=\"\" href=\"#_ftn3\">[3]<\/a>, and (3) the recruitment interview of the German Internet Panel (GIP), a longitudinal internet survey based on a face-to-face recruited probability sample of the general population conducted by Mannheim University<a id=\"_ftnref4\" title=\"\" href=\"#_ftn4\">[4]<\/a>. In addition, several other studies have shown an interest in implementing interviewer questionnaires based on the conceptual framework in this paper.<\/p>\n<p>Variation in the interviewer data is a prerequisite for explaining interviewer effects in survey data. The following subsections show that there is considerable variation in key variables in the 2011 SHARE Germany interviewer survey. We focus on variables from the core of our conceptual framework, i.e. those related to item and unit nonresponse, rather than those applicable to the collection of biomarkers and consent to record linkage.<\/p>\n<h2><strong>Variation in attitudes towards the survey process<\/strong><\/h2>\n<p>The first dimension of the conceptual framework is the attitudes that interviewers hold regarding survey interviews. In question 3, interviewers were asked for their reasons for working as an interviewer. <em><a href=\"#_fig_2\">Figure 2<\/a><\/em> shows that while many interviewers gave importance to most of the reasons presented, there was considerable variation. 
For example, while about 45% of interviewers gave importance scores of six or seven to the opportunity of interacting with people (socializing) and to gaining insight into other people\u2019s social circumstances, about 80% of interviewers rated the possibility to determine their own working hours and having interesting work as this important.<\/p>\n<p><em><span class=\"breakBefore\">Figure 2: Attitudes \u2013 reasons for working as an interviewer<\/span><\/em><\/p>\n<p><a id=\"_fig_2\" href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure2.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-991\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure2.png\" alt=\"\" width=\"678\" height=\"294\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure2.png 969w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure2-300x130.png 300w\" sizes=\"auto, (max-width: 678px) 100vw, 678px\" \/><\/a><\/p>\n<p><em>\u201cThere are different reasons for working as an interviewer. How important are the following aspects to you?\u201d (Q3)<\/em><\/p>\n<p>The survey also contains an item battery inquiring about interviewers\u2019 attitudes towards sticking to the prescribed interviewing protocols. Since interviewers are regularly trained and know what they are supposed to do, we were concerned that interviewers\u2019 attitudes towards the protocols would only reflect their training. Therefore, all items were phrased such that it would be legitimate for interviewers to admit that they deviate from the protocols. As <em><a href=\"#_fig_3\">Figure 3<\/a><\/em> portrays, there is large variation across items and interviewers. 
For example, interviewers widely differed in their answers to the statement \u201cIf the respondent doesn&#8217;t understand a question, I explain what is actually meant by the question.\u201d Approximately 30% of interviewers answered that this statement does not apply to them at all, while almost 40% said that it applied to them perfectly. Similarly, there is great variation across interviewers as to whether they speak faster when they notice that the respondent is in a hurry. Interviewers answered other statements more homogeneously. Almost all interviewers stated that they \u201calways exactly stick to the interviewer instructions, even if [they] don\u2019t consider them sensible\u201d and all agreed that if they \u201cnotice that the respondent has difficulties understanding the question, [they] speak more slowly\u201d.<\/p>\n<p><em>Figure 3: Attitudes \u2013 following the standardized interview protocols<\/em><\/p>\n<p><a id=\"_fig_3\" href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure3.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-979\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure3.png\" alt=\"\" width=\"675\" height=\"253\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure3.png 964w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure3-300x112.png 300w\" sizes=\"auto, (max-width: 675px) 100vw, 675px\" \/><\/a><\/p>\n<p><em><span class=\"breakBefore\">\u201cBelow follows a series of statements about difficult respondents and contact attempts. We would like to know from you how you react in the following situations. The statement applies to me \u2026\u201d (Q4)<\/span><\/em><\/p>\n<p>We also investigated interviewers\u2019 data protection concerns, asking them how concerned they were about the safety of their personal data. 
As described above, we assume that this might indicate how much trust in data protection they can instill in the respondent during the interview. Again, the results from the survey demonstrate variation in data protection concerns across interviewers (<em><a href=\"#_fig_4\">Figure 4<\/a><\/em>), with between 17% and 40% of answers in each of the four categories.<\/p>\n<p><em>Figure 4: Attitudes \u2013 data protection concerns<\/em><\/p>\n<p><a id=\"_fig_4\" href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure4.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-978\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure4.png\" alt=\"\" width=\"588\" height=\"315\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure4.png 840w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure4-300x160.png 300w\" sizes=\"auto, (max-width: 588px) 100vw, 588px\" \/><\/a><\/p>\n<p><em>\u201cHow concerned are you about the safety of your personal data?\u201d (Q11)<\/em><\/p>\n<h2><strong>Variation in interviewer behaviour regarding data collection requests<\/strong><\/h2>\n<p>The second dimension of the conceptual framework measures interviewers\u2019 own behaviour in survey situations or similar contexts. The items displayed in <em><a href=\"#_fig_5\">Figure 5<\/a><\/em> indirectly capture whether interviewers are concerned about their private data: we asked them whether they use social networks and online banking. The figure illustrates that interviewers are by no means a homogeneous group when it comes to their behaviour on the Internet.
While about 35% of interviewers use social networks, 63% have sufficient trust in the safety of the Internet to use it for online banking.<\/p>\n<p><em><span class=\"breakBefore\">Figure 5: Own behaviour \u2013 social networks and online banking<\/span><\/em><\/p>\n<p><a href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/own_behaviour.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-1384\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/own_behaviour.png\" alt=\"\" width=\"540\" height=\"324\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/own_behaviour.png 750w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/own_behaviour-300x180.png 300w\" sizes=\"auto, (max-width: 540px) 100vw, 540px\" \/><\/a><\/p>\n<p><em>\u201cDo you use social networks in the internet like Facebook, Myspace or Twitter?\u201d (Q27)<br \/>\n\u201cDo you use the internet for online banking?\u201d (Q28)<\/em><\/p>\n<h2><strong>Variation in experience<\/strong><\/h2>\n<p>The interviewer survey contains several items measuring interviewers\u2019 experience with various measurements, including their experience working as an interviewer, working on previous waves of SHARE, and collecting blood spots. <em><a href=\"#_fig_6\">Figure 6<\/a><\/em> displays their general experience working as an interviewer.
The results show that the interviewers working on the fourth SHARE wave varied in their experience: while 23% had less than one year of experience, 27% had been doing this work for more than 10 years.<\/p>\n<p><em>Figure 6: Experience with measurements \u2013 working as an interviewer<\/em><\/p>\n<p><a id=\"_fig_6\" href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure6.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-981\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure6.png\" alt=\"\" width=\"557\" height=\"315\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure6.png 795w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure6-300x169.png 300w\" sizes=\"auto, (max-width: 557px) 100vw, 557px\" \/><\/a><\/p>\n<p><em>\u201cHow long in total have you been working as an interviewer?\u201d (Q1)<\/em><\/p>\n<h2><strong><span class=\"breakBefore\">Variation in expectations<\/span><\/strong><\/h2>\n<p>In 2011, the refresher sample of the SHARE survey in Germany was allocated to an incentives experiment with four treatment groups receiving unconditional incentives of \u20ac0, \u20ac10, \u20ac20, or \u20ac40. In addition, all respondents were promised \u20ac10 for completing the interview. In the interviewer survey we asked about interviewers\u2019 expectations regarding their unit response rate under each of the experimental conditions. The results show that interviewers differed substantially in their confidence in achieving high response rates (<em><a href=\"#_fig_7\">Figure 7<\/a><\/em>). When no unconditional incentive is sent with the advance letter, the SHARE interviewers on average expected a unit response rate of 43%. However, as the boxplots in <em><a href=\"#_fig_7\">Figure 7<\/a><\/em> illustrate, the variation around the median is considerable.
Furthermore, interviewers were confident that the higher the value of the incentive, the more successful they would be in recruiting respondents. According to the interviewers\u2019 expectations, the \u20ac40 unconditional household incentive paired with the \u20ac10 conditional individual incentive would on average yield a 23% increase in the unit response rate compared to a setting in which no unconditional incentive is sent.<\/p>\n<p><em>Figure 7: Expectations \u2013 response rates at different incentive levels<\/em><\/p>\n<p><a id=\"_fig_7\" href=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure7.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-982\" src=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure7.png\" alt=\"\" width=\"578\" height=\"420\" srcset=\"https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure7.png 826w, https:\/\/surveyinsights.org\/wp-content\/uploads\/2013\/01\/SMIF20121001_figure7-300x217.png 300w\" sizes=\"auto, (max-width: 578px) 100vw, 578px\" \/><\/a><\/p>\n<p><em>\u201cStudies vary as to whether they reward respondents for their survey participation and how much respondents receive. Please imagine that your respondents receive the following incentives. What do you expect: which percentage of your sample persons will agree to the interview, if&#8230;\u201d (Q19)<\/em><\/p>\n<h1><strong>Discussion and conclusion<\/strong><\/h1>\n<p>In this paper we propose a conceptual framework for measuring interviewer characteristics to explain interviewer effects on unit and item response, including consent to the collection of biomarkers, consent to record linkage, and item response on income measures.
The conceptual framework encompasses four dimensions of interviewer characteristics:<\/p>\n<ul>\n<li><em>Interviewer attitudes<\/em> towards the survey process that might shape the way interviewers approach sample units and ask their respondents for sensitive information, such as attitudes towards their job as interviewers, concerns about data protection, and trust in other people.<\/li>\n<li><em>Interviewers\u2019 own behaviour<\/em> regarding data collection requests and hypothetical behaviour when faced with survey requests or similar measurements.<\/li>\n<li><em>Interviewers\u2019 experience with measurements<\/em>, for example, experience with conducting specific surveys or with collecting specific measurements like biomarkers or consent to record linkage.<\/li>\n<li><em>Interviewers\u2019 expectations<\/em> about the unit and item response rates they will achieve on a given survey.<\/li>\n<\/ul>\n<p>This conceptual framework formed the basis of an interviewer questionnaire implemented during the interviewer trainings for the fourth wave of SHARE Germany in early 2011. Exploratory analyses show that the survey distinguishes well between interviewers on the measures implemented along these four dimensions. This is a prerequisite for explaining interviewer effects.<\/p>\n<p>The theory, conceptual framework, and findings presented in this paper are merely a starting point for analyses of interviewer effects. Once the data cleaning process is completed, the interviewer data can be linked with paradata and survey data, allowing a multitude of analyses of interviewer effects in SHARE Germany. Furthermore, parts of the interviewer questionnaire were also implemented in other surveys.
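<\/p>
<p>Footnote [1] defines an interviewer effect as an intraclass correlation coefficient (ICC), the share of total variance attributable to interviewers. As a minimal sketch of how such an estimate can be obtained, the following simulates answers clustered within interviewers and applies the classical one-way ANOVA variance-component estimator for a balanced design; the data and all parameter values are invented for illustration and are not SHARE results:<\/p>

```python
import random
from statistics import mean

random.seed(3)

# Hypothetical clustered data: each interviewer shifts all of his or her
# respondents' answers by a shared amount (the "interviewer effect").
n_interviewers, k = 50, 20                      # 50 interviewers, 20 respondents each
groups = []
for _ in range(n_interviewers):
    shift = random.gauss(0, 2.0)                # between-interviewer sd 2 -> variance 4
    groups.append([shift + random.gauss(0, 4.0)  # residual sd 4 -> variance 16
                   for _ in range(k)])

# One-way ANOVA mean squares for a balanced design.
grand = mean(x for g in groups for x in g)
msb = k * sum((mean(g) - grand) ** 2 for g in groups) / (n_interviewers - 1)
msw = sum((x - mean(g)) ** 2 for g in groups for x in g) / (n_interviewers * (k - 1))

# ICC = interviewer variance / (interviewer variance + residual variance).
var_interviewer = max((msb - msw) / k, 0.0)
icc = var_interviewer / (var_interviewer + msw)
print(round(icc, 3))                            # population value here: 4 / (4 + 16) = 0.20
```

<p>For unbalanced interviewer workloads, or when covariates are included, the same quantity is usually estimated from a multilevel model rather than from this balanced ANOVA decomposition.<\/p>
<p>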
Cross-survey analyses will make it possible to investigate whether findings are survey-specific or hold generally across large-scale social surveys.<\/p>\n<p>This paper aims to contribute to the literature on interviewer effects by stimulating the development, collection, and analysis of new measures of interviewer characteristics to explain and ultimately adjust for interviewer effects in survey data. We make our conceptual framework and the interviewer questionnaire available to the public to encourage the continuous development of both and to enable analyses of interviewer effects across surveys and countries. We thereby hope to foster research and insight into interviewer effects in interviewer-mediated data collection.<\/p>\n<div>\n<hr align=\"left\" size=\"1\" width=\"33%\" \/>\n<div>\n<p><a id=\"_ftn1\" title=\"\" href=\"#_ftnref1\">[1]<\/a> An interviewer effect is typically estimated by an intraclass correlation coefficient (ICC), i.e. the ratio of the interviewer variance to the sum of all variance components in the model (e.g. Anderson &amp; Aitkin, 1985; Groves &amp; Magilavy, 1986). The ICC allows us to estimate to what extent the variation across respondents in a survey estimate is clustered within the interviewers conducting the survey.<\/p>\n<\/div>\n<div>\n<p><a id=\"_ftn2\" title=\"\" href=\"#_ftnref2\">[2]<\/a> http:\/\/www.iab.de\/780\/section.aspx<\/p>\n<\/div>\n<div>\n<p><a id=\"_ftn3\" title=\"\" href=\"#_ftnref3\">[3]<\/a> http:\/\/www.gesis.org\/en\/piaac\/piaac-home\/<\/p>\n<\/div>\n<div>\n<p><a id=\"_ftn4\" title=\"\" href=\"#_ftnref4\">[4]<\/a> http:\/\/reforms.uni-mannheim.de\/english\/internet_panel\/home\/index.html<\/p>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Introduction In all interviewer-mediated surveys interviewers play a crucial role during the entire data collection process.
They make contact with and gain cooperation from the sample unit, ask survey questions, conduct measurements, record answers and measures, and maintain respondents\u2019 motivation throughout the interview (Schaeffer, Dykema, &amp; Maynard, 2010). As such, the job of an interviewer [&hellip;]<\/p>\n","protected":false},"author":8,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[80,49],"tags":[59,58,62,61,63,60],"class_list":["post-817","post","type-post","status-publish","format-standard","hentry","category-interviewer-characteristics-2","category-survey-design-2","tag-conceptual-framework","tag-interviewer-characteristics","tag-interviewer-questionnaire","tag-item-nonresponse","tag-paradata","tag-unit-nonresponse"],"acf":[],"_links":{"self":[{"href":"https:\/\/surveyinsights.org\/index.php?rest_route=\/wp\/v2\/posts\/817","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/surveyinsights.org\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/surveyinsights.org\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/surveyinsights.org\/index.php?rest_route=\/wp\/v2\/users\/8"}],"replies":[{"embeddable":true,"href":"https:\/\/surveyinsights.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=817"}],"version-history":[{"count":146,"href":"https:\/\/surveyinsights.org\/index.php?rest_route=\/wp\/v2\/posts\/817\/revisions"}],"predecessor-version":[{"id":3937,"href":"https:\/\/surveyinsights.org\/index.php?rest_route=\/wp\/v2\/posts\/817\/revisions\/3937"}],"wp:attachment":[{"href":"https:\/\/surveyinsights.org\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=817"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/surveyinsights.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=817"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/surveyins
ights.org\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=817"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}