{"id":13357,"date":"2020-07-23T08:00:39","date_gmt":"2020-07-23T07:00:39","guid":{"rendered":"https:\/\/surveyinsights.org\/?p=13357"},"modified":"2020-07-23T06:53:21","modified_gmt":"2020-07-23T05:53:21","slug":"assessing-nonresponse-bias-by-permitting-individuals-to-opt-out-of-a-survey","status":"publish","type":"post","link":"https:\/\/surveyinsights.org\/?p=13357","title":{"rendered":"Assessing Nonresponse Bias by Permitting Individuals to Opt Out of a Survey"},"content":{"rendered":"<h1>Background<\/h1>\n<p>Relative to other survey modes, Web surveys offer the potential to collect higher quality data at a lower cost in a narrower field period (Bethlehem and Biffignandi, 2011), albeit oftentimes with a lower response rate (Manfreda et al., 2008; Shih and Fan, 2008).\u00a0 This is naturally of concern to applied researchers because, all else equal, a lower response rate increases the risk of nonresponse bias (Groves and Couper, 1998).\u00a0 Although there are numerous strategies one can pursue in the design and execution of Web-based surveys to increase response rates (e.g., Heerwegh and Loosveldt, 2006; Couper, 2008; Keusch, 2012; Dillman, Smyth, and Christian, 2014), because few of these surveys are compulsory, there is virtually always a need to compensate for nonresponse and assess the residual risk of bias it poses to point estimates.<\/p>\n<p>A variety of techniques for assessing nonresponse bias have been proposed in the survey methodology literature.\u00a0 These techniques can be divided into two classes.\u00a0 The first class exploits auxiliary information on the sampling frame or from external sources. 
A critical limitation of these techniques is that they only indirectly measure nonresponse bias. A disparate demographic distribution, say, between respondents and known population figures will introduce bias only if those variables are also correlated with survey outcome variables (Little and Vartivarian, 2005), beyond what can be remedied by the missing data compensation strategy employed (Little and Rubin, 2019). A second class of nonresponse bias assessment techniques seeks to extract information from a portion of nonrespondents in order to make inferences about the larger pool of nonrespondents. One strategy within this class is to follow up with (often a sample of) nonrespondents using a different mode or data collection protocol altogether (e.g., Criqui, Barrett-Connor, and Austin, 1978; Dallosso et al., 2003; Ingels et al., 2004; Stoop, 2004; Voogt, 2004; Groves et al., 2005). An inherent limitation of this second class of techniques is that the process of identifying nonrespondents for follow-up is itself subject to nonresponse bias: not all nonrespondents can be contacted, and not all who are contacted comply with the subsequent data collection request. Moreover, the requisite weighting of the follow-up cohort may increase the unequal weighting effect (Kish, 1992), which can, in turn, decrease the precision of point estimates.</p>
<p>The purpose of this article is to report results from an experiment fielded in a self-administered Web survey to evaluate a novel strategy for assessing nonresponse bias that combines elements of both classes of techniques described above and addresses some of their limitations. The idea is to offer individuals a way to opt out of the survey by embedding a link to effectively “unsubscribe” from further reminder emails. Before being allowed to opt out, however, the individual must first indicate the primary reason for deciding not to participate. Hence, 
the strategy taps directly into nonrespondents’ sentiments to learn why certain types of individuals decline the survey request. Additionally, though it may seem counterintuitive, introducing the opportunity to opt out also has the potential to increase the likelihood of participating, as it is believed to engender trust and empathy with the researcher (Sudman, 1985; Mullen et al., 1987).</p>
<h1>Survey and Experimental Design</h1>
<p>The survey data discussed in this paper were collected as part of the 2017 Federal Employee Viewpoint Survey (FEVS) (<a href="http://www.opm.gov">www.opm.gov</a>/fevs). First administered in 2002, the FEVS—formerly the Federal Human Capital Survey—is an annual organizational climate survey administered to civilian employees of the U.S. federal government by the U.S. Office of Personnel Management (OPM). The survey instrument is chiefly attitudinal in nature, measuring various dimensions of employee satisfaction, such as one’s level of enjoyment with the work performed, perceptions of management’s leadership skills, and the availability of career progression opportunities.</p>
<p>Aside from a few minor exceptions, the FEVS sampling frame is derived from the Statistical Data Mart of the Enterprise Human Resources Integration (EHRI-SDM), a large-scale database of U.S. federal government personnel maintained by OPM. The FEVS target population is full- or part-time, permanently employed (i.e., non-seasonal and non-political) civilian personnel who have been employed by their agency for at least six months prior to the start of the survey. As described in U.S. 
Office of Personnel Management (2017), a stratified sample of 1,139,882 employees from 85 agencies was selected for the 2017 FEVS, which was fielded between May 2 and June 22, 2017. Agencies had staggered start and end dates, but each agency had a field period lasting exactly six weeks. On launch day, sampled employees were sent an email invitation to participate that contained a unique URL. Five weekly reminder emails were sent to nonrespondents thereafter, with the final reminder noting that the survey would close at the end of the business day.</p>
<p>A total of 112,576 employees, roughly 10% of those sampled for the 2017 FEVS, were randomly selected to participate in the opt out experiment; small and independent agencies, often with fewer than 200 employees, were excluded. Figure 1 shows the prototypical email body in which the opportunity to opt out was offered via a link labeled “Click here if you are considering not participating in the FEVS.” For those designated to be part of the opt out experiment, the link appeared in the initial invitation and all subsequent reminders. The link was absent from emails to employees not designated for the opt out experiment.</p>
<p><strong>Figure 1</strong>: Example Email Invitation Body for the 2017 Federal Employee Viewpoint Survey Containing a Link to Opt Out.</p>
<p><a href="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig1.png"><img loading="lazy" decoding="async" class="alignnone wp-image-13366" src="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig1-300x257.png" alt="Example FEVS email invitation containing an opt out link" width="517" height="443" srcset="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig1-300x257.png 300w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig1.png 718w" sizes="auto, (max-width: 517px) 100vw, 517px" /></a></p>
<p>A short 
survey was launched when the link to opt out was clicked. The first question posed was “Would you say that you are unsure about participating in the FEVS, or that you do not wish to participate?” The intent of this question was to gauge the individual’s level of conviction about not participating—essentially, to enable a proxy distinction between a soft and a hard initial refusal. The second question in the opt out survey ascertained the single most influential reason for not wanting to participate in the 2017 FEVS. A screenshot of this question is given in Figure 2. Six options were offered (e.g., being too busy, data confidentiality concerns), drawn from responses to an open-ended question included in a nonrespondent follow-up study fielded at the conclusion of the 2004 administration of the survey (U.S. Office of Personnel Management, 2006). Also included was a write-in option for respondents to specify an unlisted reason. Two research team members independently recoded 176 write-in responses into new or existing categories, with 28 initially discordant recodes requiring reconciliation. Reasons unaccounted for by the original set of response options include a recent or pending employment status change, technical issues accessing the survey, and a belief that one had already taken the survey. In all, 126 of the write-in responses were grouped into new or existing categories, while the other 50 remained classified as “Other.”</p>
<p><strong>Figure 2</strong>: Screenshot of the Opt Out Survey Question Regarding One’s Primary Reason for Refusing to Participate in the 2017 Federal Employee Viewpoint Survey.</p>
<p><a href="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig2.png"><img loading="lazy" decoding="async" class="alignnone wp-image-13368" src="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig2-300x81.png" alt="Opt out survey question asking for the primary reason for not participating" width="656" height="177" srcset="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig2-300x81.png 300w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig2-768x208.png 768w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig2.png 855w" sizes="auto, (max-width: 656px) 100vw, 656px" /></a></p>
<p>Upon answering the question shown in Figure 2, a randomly predetermined 75% of individuals received a last-moment appeal in the form of a succinct, bulleted list of 3-4 assurances of which the individual may not have been aware. These assurances were tailored to the response given (e.g., a brief itemization of the protections in place for those expressing confidentiality concerns). The complementary 25% of individuals were routed to a page with a short message stating that no further email reminders to participate would be sent by the FEVS administration team, but that the survey link would remain active in case the individual changed his or her mind before the end of the field period. The purpose of this randomization—and of the first opt out survey question regarding nonresponse conviction level—was to assess the effectiveness of the last-moment appeal, which we viewed as an automated refusal conversion procedure of sorts. Results pertinent to that research objective can be found in Lewis et al. 
(2019). The express purpose of this article is to report on the other research objective underpinning the experiment: to assess whether those who opt out are representative of the larger pool of 2017 FEVS nonrespondents.</p>
<h1>Results</h1>
<p>The flowchart in Figure 3 summarizes the core dispositions and associated counts for individuals randomly selected for the 2017 FEVS opt out experiment. Of the original count of 112,576 individuals, 105,319 were deemed eligible for inclusion in the present analyses. We excluded from consideration individuals who had no chance to participate: those for whom a valid email address could not be obtained, those indicated in EHRI-SDM as having departed their position between the time of sample selection and survey administration, and those on an approved absence encompassing the full field period.</p>
<p><strong>Figure 3</strong>: Flowchart of Key Dispositions and Counts for Individuals Randomly Selected for the 2017 FEVS Opt Out Experiment.</p>
<p><a href="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig3.png"><img loading="lazy" decoding="async" class="alignnone wp-image-13369" src="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig3-251x300.png" alt="Flowchart of dispositions and counts for the opt out experiment" width="324" height="387" srcset="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig3-251x300.png 251w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig3.png 536w" sizes="auto, (max-width: 324px) 100vw, 324px" /></a></p>
<p>Considering that the overall response rate to the 2017 FEVS was 45.5%, we were surprised to find that only 1,533 individuals (roughly 1.5% of those eligible) clicked on the link to launch the opt out survey. A similarly designed pilot study conducted during the 2016 FEVS resulted in a comparably low click rate, but this was attributed at the time to placing the link too far toward the bottom of 
the email body and to introducing the opt out link only at the midpoint of the field period. Evidently, a more prominent placement and inclusion from the outset of the field period fail to entice a greater proportion of individuals, at least for this survey and target population.</p>
<p>Of the 1,533 individuals clicking on the opt out survey link, 485, or 31.6%, opted out of the 2017 FEVS. On the other hand, 831 individuals, or 54.2%, decided to take the survey. This was notably larger than the response rate of those designated for the experiment who never clicked on the link: 46,897/103,786 = 45.2%. The increase observed is a credit to the automated refusal conversion strategy targeting approximately 75% of these individuals, as discussed in Lewis et al. (2019). The remaining 217 individuals who viewed the opt out survey neither opted out nor completed the survey.</p>
<p>Table 1 summarizes the distribution of the primary reasons the 485 opters out cited for declining to participate in the 2017 FEVS. At 29.2%, the most frequently cited reason was the belief that results would not be used to enact any substantive workplace changes. This is understandable, considering that the survey is administered annually and that it takes several months after field period closeout for data processing, weighting for unit nonresponse, and report generation to occur. Results must then be interpreted, action plans created, and interventions approved by management prior to implementation. Logistically, it can be difficult to make noticeable cultural changes in response to one year’s survey results before the commencement of the subsequent year’s FEVS administration.
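</p>
<p>As a back-of-the-envelope check on the response-rate gap reported above (54.2% among those who clicked the opt out link versus 45.2% among those who never did), one can compute a standard two-proportion z statistic. The sketch below is illustrative only; link clickers are self-selected, so this is not a comparison of randomized arms.</p>

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Counts from the text: 831 of 1,533 link clickers responded, versus
# 46,897 of 103,786 designated individuals who never clicked the link.
z = two_prop_z(831, 1533, 46897, 103786)
print(round(z, 1))  # roughly 7, far beyond the 1.96 critical value
```

<p>A gap of this size is very unlikely to be chance alone, which is consistent with attributing it to the last-moment appeal.</p>
<p>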
The second most frequently cited reason was concern about the confidentiality of one’s responses. This, too, is understandable, because the survey instrument includes sensitive items regarding one’s perception of an immediate supervisor’s performance, and reports are generated for work units with as few as 10 respondents. A public-use analysis file with individual-level responses is also released. For both formats, a rigorous sequence of confidentiality checks and protections is applied prior to release, yet the perceived risk of disclosure apparently remains high. Lastly, it is interesting to note how the distribution of reasons for not participating has changed relative to the nonrespondent follow-up study conducted after the 2004 administration. As reported in U.S. Office of Personnel Management (2006), the number one reason cited at that time was being too busy, at 46.9%. The belief that results would not be used to change anything was a distant second, at 9.5%. Confidentiality concerns were cited only 6.7% of the time.</p>
<p><strong>Table 1</strong>: Distribution of Primary Reason for Opting Out of the 2017 Federal Employee Viewpoint Survey.</p>
<table>
<tbody>
<tr><td><strong>Nonresponse Reason</strong></td><td style="text-align: right;"><strong>Percentage</strong></td></tr>
<tr><td>Survey results are not used to change anything in my workplace</td><td style="text-align: right;">29.2</td></tr>
<tr><td>I am concerned about the confidentiality of my responses</td><td style="text-align: right;">24.0</td></tr>
<tr><td>I am too busy to take the survey</td><td style="text-align: right;">15.2</td></tr>
<tr><td>I receive too many requests to take surveys</td><td style="text-align: right;">10.3</td></tr>
<tr><td>Dislike format / technical issues</td><td style="text-align: right;">4.4</td></tr>
<tr><td>Recent employment change</td><td style="text-align: right;">3.7</td></tr>
<tr><td>Survey results are never shared with employees</td><td style="text-align: right;">3.1</td></tr>
<tr><td>Participation in the survey is not supported by leadership in my agency</td><td style="text-align: right;">1.6</td></tr>
<tr><td>Indifference</td><td style="text-align: right;">1.2</td></tr>
<tr><td>Believed they had already completed the survey</td><td style="text-align: right;">0.9</td></tr>
<tr><td>Other</td><td style="text-align: right;">6.5</td></tr>
</tbody>
</table>
<p>Figure 4 contrasts the distributions of 7 demographic variables derived from EHRI-SDM for those who opted out relative to respondents and nonrespondents designated for the opt out experiment. Ideally, the distributions of nonrespondents and opters out would resemble one another, as this would bolster the case for treating opters out—and the primary nonresponse reasons they cited—as representative of the larger pool of nonrespondents. Results in Figure 4 are encouraging in that regard. The distributions are nearly identical for work location, supervisory status, gender, and minority status. The distributions appear to diverge slightly with respect to age and tenure with the U.S. 
Federal Government, with opters out tending to be older and longer tenured. For these two demographics, opters out look more like respondents. For the others, however, the distributions of opters out more closely resemble those of nonrespondents than of respondents.</p>
<p><strong>Figure 4</strong>: Distribution of Opters Out versus Nonrespondents and Respondents in the 2017 Federal Employee Viewpoint Survey.</p>
<p><a href="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev.jpg"><img loading="lazy" decoding="async" class="alignnone wp-image-14458" src="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev-215x300.jpg" alt="Distributions of demographic variables for opters out, nonrespondents, and respondents" width="455" height="635" srcset="https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev-215x300.jpg 215w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev-735x1024.jpg 735w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev-768x1070.jpg 768w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev-1103x1536.jpg 1103w, https://surveyinsights.org/wp-content/uploads/2020/01/Lewis_fig4_rev.jpg 1229w" sizes="auto, (max-width: 455px) 100vw, 455px" /></a></p>
<p>To test these distributional hypotheses more formally, we fit two multivariable logistic regression models, both using the same 7 EHRI-SDM covariates from Figure 4 as independent variables. In the first, we subset the data and defined the dependent variable as an indicator of being an opter out versus a nonrespondent. In the second, we subset the data and defined the dependent variable as an indicator of being an opter out versus a respondent. With respect to the counts shown in Figure 3, the goal was to take a holistic view of the auxiliary variables and investigate their simultaneous ability to predict whether an individual is 
likely to be one of the 485 opters out versus one of the 217 + 56,889 = 57,106 nonrespondents or one of the 831 + 46,897 = 47,728 respondents, respectively.</p>
<p>Results of the tests for main effects from these two models are provided in Table 2. The results confirm what Figure 4 suggested: opters out more closely resemble nonrespondents than respondents. With the exception of employee age, the Wald chi-square test statistic is always smaller for the nonrespondent model, implying more compatible demographic distributions. Additionally, at the conventional <em>α</em> = 0.05 level, 5 of the 7 auxiliary variables are nonsignificant in the nonrespondent model, whereas only 2 of the 7 are nonsignificant in the respondent model.</p>
<p><strong>Table 2</strong>: Tests of Main Effects in Multivariable Logistic Regression Models of Opters Out versus Nonrespondents and Respondents in the 2017 Federal Employee Viewpoint Survey.</p>
<table>
<thead>
<tr><th style="text-align: left;" rowspan="2">Variable</th><th style="text-align: right;" rowspan="2">DF</th><th colspan="2" style="text-align: center;"><em>Opters Out vs. Nonrespondents</em></th><th colspan="2" style="text-align: center;"><em>Opters Out vs. Respondents</em></th></tr>
<tr><th style="text-align: right;">Wald Chi-Square</th><th style="text-align: right;"><em>p</em> value</th><th style="text-align: right;">Wald Chi-Square</th><th style="text-align: right;"><em>p</em> value</th></tr>
</thead>
<tbody>
<tr><td>Work Location</td><td style="text-align: right;">1</td><td style="text-align: right;">1.78</td><td style="text-align: right;">0.18</td><td style="text-align: right;">2.53</td><td style="text-align: right;">0.11</td></tr>
<tr><td>Supervisory Status</td><td style="text-align: right;">2</td><td style="text-align: right;">4.28</td><td style="text-align: right;">0.12</td><td style="text-align: right;">14.42</td><td style="text-align: right;">&lt; 0.01</td></tr>
<tr><td>Gender</td><td style="text-align: right;">1</td><td style="text-align: right;">0.03</td><td style="text-align: right;">0.86</td><td style="text-align: right;">13.37</td><td style="text-align: right;">&lt; 0.01</td></tr>
<tr><td>Minority Status</td><td style="text-align: right;">2</td><td style="text-align: right;">3.95</td><td style="text-align: right;">0.14</td><td style="text-align: right;">6.44</td><td style="text-align: right;">0.04</td></tr>
<tr><td>Employee Age</td><td style="text-align: right;">4</td><td style="text-align: right;">90.45</td><td style="text-align: right;">&lt; 0.01</td><td style="text-align: right;">59.72</td><td style="text-align: right;">&lt; 0.01</td></tr>
<tr><td>Federal Tenure</td><td style="text-align: right;">3</td><td style="text-align: right;">1.50</td><td style="text-align: right;">0.68</td><td style="text-align: right;">6.94</td><td style="text-align: right;">0.07</td></tr>
<tr><td>Income Level</td><td style="text-align: right;">3</td><td style="text-align: right;">15.44</td><td style="text-align: right;">&lt; 0.01</td><td style="text-align: right;">37.92</td><td style="text-align: right;">&lt; 0.01</td></tr>
</tbody>
</table>
<h1>Discussion</h1>
<p>This article presented results from an experiment fielded during a Web-based organizational climate survey, the 2017 Federal Employee Viewpoint Survey, in which a portion of sampled individuals were given the opportunity to opt out, or effectively “unsubscribe,” from the sequence of weekly follow-up email reminders sent to nonrespondents over the course of its six-week field period. Before being able to do so, however, the individual was asked to cite the primary reason for deciding not to participate. Using auxiliary information from the sampling frame, we demonstrated that the covariate distributions of those who opt out align well with those of the larger pool of 2017 FEVS nonrespondents. Hence, at least for the given 
survey population, the practice appears to be a promising way to learn more about nonrespondents and potentially improve response rates in future administrations. One example use would be to model the relative propensities of the particular reasons cited and subsequently tailor the messaging of invitations or other communications, in the spirit of strategies discussed in Lynn (2016).</p>
<p>The biggest challenge to the approach is enticing enough individuals to click on the opt out link. The fact that roughly 45% of individuals clicked on the link to launch and complete the survey, yet a mere 1.5% clicked on the opt out link just a few lines below it, was somewhat puzzling and discouraging. Indeed, Couper (2008, p. 325) acknowledges that getting respondents to open, read, and act upon stimuli in email invitations remains a critical challenge for Web survey practitioners. Therein lies the biggest potential avenue for further research into improvements, however. Focus groups could be conducted with sampled individuals to learn more about their processes for viewing and acting upon survey email requests.
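</p>
<p>The reason-propensity modeling idea mentioned above could start as simply as tabulating cited reasons within demographic groups, so that invitation wording can address the dominant concern of each group. The Python sketch below uses synthetic records; the group labels and reason codes are invented stand-ins, not FEVS data.</p>

```python
from collections import Counter, defaultdict

# Synthetic (group, reason) records standing in for opt out survey data.
opt_outs = [
    ("HQ", "confidentiality"), ("Field", "too_busy"),
    ("HQ", "confidentiality"), ("HQ", "no_change"),
    ("Field", "no_change"), ("Field", "too_busy"),
]

# Tally reasons within each demographic group.
by_group = defaultdict(Counter)
for group, reason in opt_outs:
    by_group[group][reason] += 1

# Convert counts to within-group shares to surface the dominant concern.
shares = {
    g: {r: c / sum(cnt.values()) for r, c in cnt.items()}
    for g, cnt in by_group.items()
}
print(shares["HQ"])
```

<p>With real data, these shares would feed a propensity model (e.g., a multinomial logit on the frame covariates) rather than a raw cross-tabulation, but the tailoring logic is the same.</p>
<p>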
Utilizing and analyzing patterns in email read receipts could also be useful, as could exploring tools such as the Right Inbox add-on (<a href="https://www.rightinbox.com/features/email-tracking">https://www.rightinbox.com/features/email-tracking</a>) where possible. A laboratory experiment with complementary cognitive interviewing and eye tracking (Neuert and Lenzner, 2016) or an investigation of mouse movements (Horwitz, Kreuter, and Conrad, 2017) could also be pursued to shed more light on the issue.</p>