Journals and Issues

Volume 39 (2023): Issue 1 (March 2023)

Volume 38 (2022): Issue 4 (December 2022)
Special Issue on Respondent Burden

Volume 38 (2022): Issue 3 (September 2022)

Volume 38 (2022): Issue 2 (June 2022)

Volume 38 (2022): Issue 1 (March 2022)
Special Issue on Price Indices in Official Statistics

Volume 37 (2021): Issue 4 (December 2021)

Volume 37 (2021): Issue 3 (September 2021)
Special Issue on Population Statistics for the 21st Century

Volume 37 (2021): Issue 2 (June 2021)
Special Issue on New Techniques and Technologies for Statistics

Volume 37 (2021): Issue 1 (March 2021)

Volume 36 (2020): Issue 4 (December 2020)

Volume 36 (2020): Issue 3 (September 2020)
Special Issue on Nonresponse

Volume 36 (2020): Issue 2 (June 2020)

Volume 36 (2020): Issue 1 (March 2020)

Volume 35 (2019): Issue 4 (December 2019)
Special Issue on Measuring LGBT Populations

Volume 35 (2019): Issue 3 (September 2019)

Volume 35 (2019): Issue 2 (June 2019)

Volume 35 (2019): Issue 1 (March 2019)

Volume 34 (2018): Issue 4 (December 2018)

Volume 34 (2018): Issue 3 (September 2018)
Special Section on Responsive and Adaptive Survey Design

Volume 34 (2018): Issue 2 (June 2018)
Special Issue on Establishment Surveys (ICES-V)

Volume 34 (2018): Issue 1 (March 2018)

Volume 33 (2017): Issue 4 (December 2017)

Volume 33 (2017): Issue 3 (September 2017)
Special Issue on Responsive and Adaptive Survey Design

Volume 33 (2017): Issue 2 (June 2017)
Special Issue on Total Survey Error (TSE)

Volume 33 (2017): Issue 1 (March 2017)

Volume 32 (2016): Issue 4 (December 2016)
Special Section on the Role of Official Statistics in Statistical Capacity Building

Volume 32 (2016): Issue 3 (September 2016)

Volume 32 (2016): Issue 2 (June 2016)

Volume 32 (2016): Issue 1 (March 2016)

Volume 31 (2015): Issue 4 (December 2015)

Volume 31 (2015): Issue 3 (September 2015)
Special Issue on Coverage Problems in Administrative Sources

Volume 31 (2015): Issue 2 (June 2015)
Special Issue on New Techniques and Technologies for Statistics

Volume 31 (2015): Issue 1 (March 2015)

Volume 30 (2014): Issue 4 (December 2014)
Special Issue on Establishment Surveys

Volume 30 (2014): Issue 3 (September 2014)

Volume 30 (2014): Issue 2 (June 2014)
Special Issue on Surveying the Hard-to-Reach

Volume 30 (2014): Issue 1 (March 2014)

Volume 29 (2013): Issue 4 (December 2013)

Volume 29 (2013): Issue 3 (September 2013)

Volume 29 (2013): Issue 2 (June 2013)

Volume 29 (2013): Issue 1 (March 2013)

Journal Details
Format
Journal
eISSN
2001-7367
First Published
01 Oct 2013
Publication timeframe
4 issues per year
Languages
English

Volume 36 (2020): Issue 3 (September 2020)
Special Issue on Nonresponse


12 Articles
Open Access

Preface

Published online: 24 Jul 2020
Page range: 463 - 468

Open Access

Survey Nonresponse Trends and Fieldwork Effort in the 21st Century: Results of an International Study across Countries and Surveys

Published online: 24 Jul 2020
Page range: 469 - 487

Abstract

For more than three decades, declining response rates have been of concern to both survey methodologists and practitioners. Still, international comparative studies have been scarce. In one of the first international trend analyses, for the period 1980–1997, De Leeuw and De Heer (2002) report that response rates declined over the years and that countries differed in response rates and nonresponse trends. In this article, we continue where De Leeuw and De Heer (2002) left off and present trend data for the next period, 1998–2015, from National Statistical Institutes. When we looked at trends over time in this new data set, we found that response rates are still declining over the years. Furthermore, nonresponse trends do differ over countries, but not over surveys. Some countries show a steeper decline in response than others, but all types of surveys show the same downward trend. The differences in (non)response trends over countries can be partly explained by differences in survey design between the countries. Finally, cost indicators were available for some countries; these showed that costs increased over the years and are negatively correlated with noncontact rates.

Keywords

  • Response trend
  • noncontact
  • refusal
  • survey design
  • fieldwork
  • costs
Open Access

Continuing to Explore the Relation between Economic and Political Factors and Government Survey Refusal Rates: 1960–2015

Published online: 24 Jul 2020
Page range: 489 - 505

Abstract

In the United States, government surveys’ refusal rates have been increasing at an alarming rate, despite traditional measures for mitigating nonresponse. Given this phenomenon, now is a good time to revisit the work of Harris-Kojetin and Tucker (1999). In that study, the authors explored the relation between economic and political conditions and Current Population Survey (CPS) refusal rates over the period 1960–1988.

They found evidence that economic and political factors are associated with survey refusals and acknowledged the need to extend this work as more data became available. In this study, our aim was to continue their analysis. First, we replicated their findings. Next, we ran the assumed underlying model on an extended time-period (1960–2015). Last, since we found that the model was not an ideal fit for the extended period, we revised it using available time series and incorporating information about the CPS sample design. In the extended, refined model, presidential approval, census year, number of jobs and not-in-labor-force rate were all significant predictors of survey refusal.

Keywords

  • Refusal rates
  • response rates
  • nonresponse
  • time series
Open Access

Evolution of the Initially Recruited SHARE Panel Sample Over the First Six Waves

Published online: 24 Jul 2020
Page range: 507 - 527

Abstract

Attrition is a frequently observed phenomenon in panel studies. The loss of panel members over time can hamper the analysis of panel survey data. Based on data from the Survey of Health, Ageing and Retirement in Europe (SHARE), this study investigates changes in the composition of the initially recruited first-wave sample in a multi-national face-to-face panel survey of an older population over waves. By inspecting retention rates and R-indicators, we found that, despite declining retention rates, the composition of the initially recruited panel sample in Wave 1 remained stable after the second wave. Thus, after the second wave there is no further large decline in representativeness with regard to the first wave sample. Changes in the composition of the sample after the second wave over time were due mainly to mortality-related attrition. Non-mortality-related attrition had a slight effect on the changes in sample composition with regard to birth in survey country, area of residence, education, and social activities. Our study encourages researchers to investigate further the impact of mortality- and non-mortality-related attrition in multi-national surveys of older populations.

Keywords

  • R-indicator
  • wave nonresponse
  • mortality- and non-mortality-related attrition
  • panel sample composition
Open Access

The Action Structure of Recruitment Calls and Its Analytic Implications: The Case of Disfluencies

Published online: 24 Jul 2020
Page range: 529 - 559

Abstract

We describe interviewers’ actions in phone calls recruiting sample members. We illustrate (1) analytic challenges of studying how interviewers affect participation and (2) actions that undergird the variables in our models. As a case study, we examine the impact of the interviewer’s disfluencies on whether a sample member accepts or declines the request for an interview. Disfluencies are potentially important if they communicate the competence or humanity of the interviewer to the sample member in a way that affects the decision to participate. Using the Wisconsin Longitudinal Study, we find that although calls that become declinations begin similarly to those that become acceptances, they soon take different paths. Considering all recruitment actions together, we find that the ratio of disfluencies to words does not predict acceptance of the request for an interview, although the disfluency ratio before the turning point of the call – a request to participate or a declination – does. However, after controlling for the number of actions, the disfluency ratio no longer predicts participation. Instead, when we examine actions before and after the first turning point separately, we find that the number of actions has a positive relationship with participation before and a negative relationship after.

Keywords

  • Participation
  • nonresponse
  • disfluencies
  • recruitment
  • survey introduction
  • interviewer-respondent interaction
Open Access

Measurement of Interviewer Workload within the Survey and an Exploration of Workload Effects on Interviewers’ Field Efforts and Performance

Published online: 24 Jul 2020
Page range: 561 - 588

Abstract

Interviewer characteristics are usually assumed fixed over the fieldwork period. The number of sample units that require the interviewers’ attention, however, can vary strongly over the fieldwork period. Different workload levels produce different constraints on the time interviewers have available to contact, recruit and interview each target respondent, and may also induce different motivational effects on interviewers’ behavior as they perform their different tasks. In this article we show that fine-grained, time-varying operationalizations of project-specific workload can be useful to explain differences in interviewers’ field efforts and achieved response outcomes over the fieldwork period. We derive project-specific workload for each interviewer on each day of fieldwork in two rounds of the European Social Survey in Belgium from contact history and assignment paradata. Project-specific workload is measured as (1) the number of sample units which have been and remain assigned on any day t (assigned case workload), and (2) the number of sample units for which interviewer activity has started and not yet ceased on any day t (active case workload). Because they capture temporal variation in interviewers’ workloads directly, the time-varying operationalizations are better predictors than the interviewer-level operationalizations of typical (active or potential) workload derived from them, as well as the traditional total-count workload operationalization.

Keywords

  • Nonresponse
  • interviewer effort
  • interviewer effects
  • time-varying interviewer characteristics
  • paradata
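The two time-varying workload measures defined in the abstract are day-level counts over intervals. A minimal sketch, with invented assignment and contact-history records standing in for the authors' actual paradata:

```python
from datetime import date

# Hypothetical records for one interviewer: (start, end) date pairs.
# `assignments` spans assignment to closure of each sample unit;
# `activity` spans first to last interviewer contact attempt.
# All values are invented for illustration only.
assignments = [
    (date(2020, 1, 1), date(2020, 1, 10)),
    (date(2020, 1, 3), date(2020, 1, 5)),
    (date(2020, 1, 4), date(2020, 1, 12)),
]
activity = [
    (date(2020, 1, 2), date(2020, 1, 9)),
    (date(2020, 1, 3), date(2020, 1, 4)),
    (date(2020, 1, 6), date(2020, 1, 11)),
]

def assigned_workload(day, records):
    """Assigned case workload: units assigned and not yet closed on `day`."""
    return sum(start <= day <= end for start, end in records)

def active_workload(day, records):
    """Active case workload: units with activity started but not ceased on `day`."""
    return sum(start <= day <= end for start, end in records)

print(assigned_workload(date(2020, 1, 4), assignments))  # 3
print(active_workload(date(2020, 1, 4), activity))       # 2
```

Evaluating these counts for each interviewer on each fieldwork day yields time-varying covariates, rather than the single total-count workload traditionally attached to each interviewer.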
Open Access

Assessing Interviewer Performance in Approaching Reissued Initial Nonrespondents

Published online: 24 Jul 2020
Page range: 589 - 607

Abstract

Nonresponse is a repeatedly reported concern in survey research. In this article, we investigate the technique of reissuing nonrespondents to another interviewer and attempting to convert them into respondents, using data from Rounds 7 and 8 of the European Social Survey (ESS) in Belgium. The results show no marked differences between respondents interviewed by the more and the less successful interviewers, indicating that the former are not more successful in persuading more reluctant respondents to participate. Sample units that were unsuccessfully approached in the initial phase by an interviewer with a high response rate are more difficult to convert during the reissue phase. Sample units that were unsuccessfully approached in the initial phase by an interviewer with a low response rate are easier to convert during the reissue phase.

Keywords

  • Nonresponse
  • European social survey
  • reissuing
Open Access

Implementing Adaptive Survey Design with an Application to the Dutch Health Survey

Published online: 24 Jul 2020
Page range: 609 - 629

Abstract

Adaptive survey design has attracted great interest in recent years, but the body of case studies describing actual implementations is still thin. Reasons for this may be the gap between survey methodology and data collection, practical complications in differentiating effort across sample units, and lack of flexibility in survey case management systems. Currently, adaptive survey design is a standard option in redesigns of person and household surveys at Statistics Netherlands, and it was implemented for the Dutch Health Survey in 2018. In this article, the implementation of static adaptive survey designs is described and motivated, with a focus on practical feasibility.

Open Access

The Effects of Nonresponse and Sampling Omissions on Estimates on Various Topics in Federal Surveys: Telephone and IVR Surveys of Address-Based Samples

Published online: 24 Jul 2020
Page range: 631 - 645

Abstract

With declining response rates and challenges of using RDD sampling for telephone surveys, collecting data from address-based samples has become more attractive. Two approaches are (1) doing telephone interviews at telephone numbers matched to addresses and (2) asking those at sampled addresses to call into an Interactive Voice Response (IVR) system to answer questions. This study used in-person interviewing to evaluate the effects of nonresponse and problems matching telephone numbers when telephone and IVR were used as the initial modes of data collection. The survey questions were selected from major US federal surveys covering a variety of topics. Both nonresponse and, for telephone, inability to find matches result in important nonresponse error for nearly half the measures across all topics, even after adjustments to fit the known demographic characteristics of the residents. Producing credible estimates requires using supplemental data collection strategies to reduce error from nonresponse.

Keywords

  • Mixed modes
  • address-based samples
Open Access

Working with Response Probabilities

Published online: 24 Jul 2020
Page range: 647 - 674

Abstract

Sample surveys are often affected by nonresponse. These surveys have in common that their outcomes depend at least partly on a human decision whether or not to participate. If it were completely clear how this decision mechanism works, estimates could be corrected. An often-used approach is to introduce the concept of the response probability. Of course, these probabilities are a theoretical concept and therefore unknown. The idea is to estimate them from the available data. If good estimates of the response probabilities can be obtained, they can be used to improve estimators of population characteristics.

Estimating response probabilities relies heavily on the use of models. An often-used model is the logit model; in this article, it is compared with the simple linear model.

Estimating response probability models requires the individual values of the auxiliary variables to be available for both the respondents and the nonrespondents of the survey. Unfortunately, this is often not the case. This article explores some approaches for estimating response probabilities that have lighter data requirements. The estimated response probabilities are also used to measure possible deviations from representativity of the survey response. The indicator used is the coefficient of variation (CV) of the response probabilities.

Keywords

  • Nonresponse
  • adjustment weighting
  • response propensity
  • representativity
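As a rough illustration of the idea (toy data invented for the example, not the article's models), the sketch below estimates response probabilities within classes of a single categorical auxiliary variable and computes the coefficient of variation of the estimated probabilities:

```python
import numpy as np

# Invented data: response indicator r and a categorical auxiliary
# variable x (e.g., an age class) observed for respondents and
# nonrespondents alike.
x = np.array([0, 0, 0, 0, 1, 1, 1, 1, 2, 2])
r = np.array([1, 1, 0, 0, 1, 1, 1, 0, 1, 1])

# Estimate each unit's response probability as the response rate within
# its auxiliary-variable class — a simple model with light data
# requirements, in the spirit of the article.
class_rates = np.array([r[x == c].mean() for c in np.unique(x)])
rho = class_rates[x]

def cv(p):
    """Coefficient of variation of the response probabilities.
    CV = 0 means equal probabilities, i.e., a representative response."""
    return p.std() / p.mean()

print(rho)      # per-unit estimated response probabilities
print(cv(rho))  # larger values indicate less representative response
```

A nonzero CV flags variation in response probabilities over the auxiliary classes, and hence a potential deviation from representativity with respect to that variable.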
Open Access

A Validation of R-Indicators as a Measure of the Risk of Bias using Data from a Nonresponse Follow-Up Survey

Published online: 24 Jul 2020
Page range: 675 - 701

Abstract

R-indicators are increasingly used as nonresponse bias indicators. However, their effectiveness depends on the auxiliary data used in their estimation. Because of this, it is not always clear for practitioners what the magnitude of the R-indicator implies for bias in other survey variables, or how adjustment on auxiliary variables will affect it. In this article, we investigate these potential limitations of R-indicators in a case study using data from the Swiss European Social Survey (ESS5), which included a nonresponse follow-up (NRFU) survey. First, we analyse correlations between estimated response propensities based on auxiliary data from the register-based sampling frame, and responses to survey questions also included in the NRFU. We then examine how these relate to bias detected by the NRFU, before and after adjustment, and to predictions of the risk of bias provided by the R-indicator. While the results lend support for the utility of R-indicators as summary statistics of bias risk, they suggest a need for caution in their interpretation. Even where auxiliary variables are correlated with target variables, more bias in the former (resulting in a larger R-indicator) does not automatically imply more bias in the latter, nor does adjustment on the former necessarily reduce bias in the latter.

Keywords

  • Nonresponse
  • R-indicator
  • propensity score weighting
  • nonresponse survey
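A commonly used sample-based form of the R-indicator is R = 1 − 2·S(ρ̂), where S is the standard deviation of the estimated response propensities. A minimal sketch (toy propensity values, not the ESS5 data; unweighted for simplicity):

```python
import numpy as np

def r_indicator(propensities):
    """Sample-based R-indicator: R = 1 - 2 * S(rho_hat).
    R = 1 indicates equal propensities (fully representative response);
    lower values indicate more propensity variation and a higher risk
    of nonresponse bias with respect to the auxiliary variables used."""
    rho = np.asarray(propensities, dtype=float)
    return 1.0 - 2.0 * rho.std(ddof=1)

print(r_indicator([0.6, 0.6, 0.6, 0.6]))  # 1.0 — equal propensities
print(r_indicator([0.2, 0.4, 0.6, 0.8]))  # lower — unequal propensities
```

As the abstract cautions, the indicator summarizes propensity variation over the chosen auxiliary variables only; it says nothing directly about bias in target variables that are weakly related to them.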
Open Access

Proxy Pattern-Mixture Analysis for a Binary Variable Subject to Nonresponse

Published online: 24 Jul 2020
Page range: 703 - 728

Abstract

Given increasing survey nonresponse, good measures of the potential impact of nonresponse on survey estimates are particularly important. Existing measures, such as the R-indicator, make the strong assumption that missingness is missing at random, meaning that it depends only on variables that are observed for respondents and nonrespondents. We consider assessment of the impact of nonresponse for a binary survey variable Y subject to nonresponse when missingness may be not at random, meaning that missingness may depend on Y itself. Our work is motivated by missing categorical income data in the 2015 Ohio Medicaid Assessment Survey (OMAS), where whether or not income is missing may be related to the income value itself, with low-income earners more reluctant to respond. We assume there is a set of covariates observed for nonrespondents and respondents, which for the item nonresponse (as in OMAS) is often a rich set of variables, but which may be potentially limited in cases of unit nonresponse. To reduce dimensionality and for simplicity we reduce these available covariates to a continuous proxy variable X, available for both respondents and nonrespondents, that has the highest correlation with Y, estimated from a probit regression analysis of respondent data. We extend the previously proposed proxy-pattern mixture (PPM) analysis for continuous outcomes to the binary outcome using a latent variable approach for modeling the joint distribution of Y and X. Our method does not assume data are missing at random but includes it as a special case, thus creating a convenient framework for sensitivity analyses. Maximum likelihood, Bayesian, and multiple imputation versions of PPM analysis are described, and robustness of these methods to model assumptions is discussed. Properties are demonstrated through simulation and with the 2015 OMAS data.

Keywords

  • Missing data
  • nonignorable nonresponse
  • nonresponse bias
  • survey data
  • Bayesian methods
12 Artikel
Uneingeschränkter Zugang

Preface

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 463 - 468

Zusammenfassung

Uneingeschränkter Zugang

Survey Nonresponse Trends and Fieldwork Effort in the 21st Century: Results of an International Study across Countries and Surveys

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 469 - 487

Zusammenfassung

Abstract

For more than three decades, declining response rates have been of concern to both survey methodologists and practitioners. Still, international comparative studies have been scarce. In one of the first international trend analyses for the period 1980–1997, De Leeuw and De Heer (2002) describe that response rates declined over the years and that countries differed in response rates and nonresponse trends. In this article, we continued where De Leeuw and De Heer (2002) stopped, and present trend data for the next period 1998–2015 from National Statistical Institutes. When we looked at trends over time in this new data set, we found that response rates are still declining over the years. Furthermore, nonresponse trends do differ over countries, but not over surveys. Some countries show a steeper decline in response than others, but all types of surveys show the same downward trend. The differences in (non)response trends over countries can be partly explained by differences in survey design between the countries. Finally, for some countries cost indicators were available, these showed that costs increased over the years and are negatively correlated with noncontact rates.

Schlüsselwörter

  • Response trend
  • noncontact
  • refusal
  • survey design
  • fieldwork
  • costs
Uneingeschränkter Zugang

Continuing to Explore the Relation between Economic and Political Factors and Government Survey Refusal Rates: 1960–2015

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 489 - 505

Zusammenfassung

Abstract

In the United States, government surveys’ refusal rates have been increasing at an alarming rate, despite traditional measures for mitigating nonresponse. Given this phenomenon, now is a good time to revisit the work of Harris-Kojetin and Tucker (1999). In that study, the authors explored the relation between economic and political conditions on Current Population Survey (CPS) refusal rates over the period 1960–1988.

They found evidence that economic and political factors are associated with survey refusals and acknowledged the need to extend this work as more data became available. In this study, our aim was to continue their analysis. First, we replicated their findings. Next, we ran the assumed underlying model on an extended time-period (1960–2015). Last, since we found that the model was not an ideal fit for the extended period, we revised it using available time series and incorporating information about the CPS sample design. In the extended, refined model, presidential approval, census year, number of jobs and not-in-labor-force rate were all significant predictors of survey refusal.

Schlüsselwörter

  • Refusal rates
  • response rates
  • nonresponse
  • time series
Uneingeschränkter Zugang

Evolution of the Initially Recruited SHARE Panel Sample Over the First Six Waves

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 507 - 527

Zusammenfassung

Abstract

Attrition is a frequently observed phenomenon in panel studies. The loss of panel members over time can hamper the analysis of panel survey data. Based on data from the Survey of Health, Ageing and Retirement in Europe (SHARE), this study investigates changes in the composition of the initially recruited first-wave sample in a multi-national face-to-face panel survey of an older population over waves. By inspecting retention rates and R-indicators, we found that, despite declining retention rates, the composition of the initially recruited panel sample in Wave 1 remained stable after the second wave. Thus, after the second wave there is no further large decline in representativeness with regard to the first wave sample. Changes in the composition of the sample after the second wave over time were due mainly to mortality-related attrition. Non-mortality-related attrition had a slight effect on the changes in sample composition with regard to birth in survey country, area of residence, education, and social activities. Our study encourages researchers to investigate further the impact of mortality- and non-mortality-related attrition in multi-national surveys of older populations.

Schlüsselwörter

  • R-indicator
  • wave nonresponse
  • mortality- and non-morality-related attrition
  • panel sample composition
Uneingeschränkter Zugang

The Action Structure of Recruitment Calls and Its Analytic Implications: The Case of Disfluencies

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 529 - 559

Zusammenfassung

Abstract

We describe interviewers’ actions in phone calls recruiting sample members. We illustrate (1) analytic challenges of studying how interviewers affect participation and (2) actions that undergird the variables in our models. We examine the impact of the interviewer’s disfluencies on whether a sample member accepts or declines the request for an interview as a case study. Disfluencies are potentially important if they communicate the competence or humanity of the interviewer to the sample member in a way that affects the decision to participate. Using the Wisconsin Longitudinal Study, we find that although as they begin, calls that become declinations are similar to those that become acceptances, they soon take different paths. Considering all recruitment actions together, we find that the ratio of disfluencies to words does not predict acceptance of the request for an interview, although the disfluency ratio before the turning point – request to participate or a declination – of the call does. However, after controlling for the number of actions, the disfluency ratio no longer predicts participation. Instead, when we examine actions before and after the first turning point separately, we find that the number of actions has a positive relationship with participation before and a negative relationship after.

Schlüsselwörter

  • Participation
  • nonresponse
  • disfluencies
  • recruitment
  • survey introduction
  • interviewer-respondent interaction
Uneingeschränkter Zugang

Measurement of Interviewer Workload within the Survey and an Exploration of Workload Effects on Interviewers’ Field Efforts and Performance

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 561 - 588

Zusammenfassung

Abstract

Interviewer characteristics are usually assumed fixed over the fieldwork period. The number of sample units that require the interviewers’ attention, however, can vary strongly over the fieldwork period. Different workload levels produce different constraints on the time interviewers have available to contact, recruit and interview each target respondent, and may also induce different motivational effects on interviewers’ behavior as they perform their different tasks. In this article we show that fine-grained, time-varying operationalizations of project-specific workload can be useful to explain differences in interviewers’ field efforts and achieved response outcomes over the fieldwork period. We derive project-specific workload for each interviewer on each day of fieldwork in two rounds of the European Social Survey in Belgium from contact history and assignment paradata. Project-specific workload is measured as (1) the number of sample units which have been and remain assigned on any day t (assigned case workload), and (2) the number of sample units for which interviewer activity has started and not yet ceased on any day t (active case workload). Capturing temporal variation in interviewers’ workloads in a direct way, the time-varying operationalizations, are better predictors than are the interviewer-level operationalizations of typical (active or potential) workload that are derived from them, as well as the traditional total-count workload operationalization.

Schlüsselwörter

  • Nonresponse
  • interviewer effort
  • interviewer effects
  • time-varying interviewer characteristics
  • paradata
Uneingeschränkter Zugang

Assessing Interviewer Performance in Approaching Reissued Initial Nonrespondents

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 589 - 607

Zusammenfassung

Abstract

Nonresponse is a repeatedly reported concern in survey research. In this article, we investigate the technique of reissuing nonrespondents to another interviewer and attempting to convert them into respondents, using data of Rounds 7 and 8 of the European Social Survey (ESS) in Belgium. The results show no marked differences between respondents interviewed by the more and the less successful interviewers, indicating that the latter are not more successful in persuading more reluctant respondents to participate. Sample units that were unsuccessfully approached in the initial phase by an interviewer with a high response rate are more difficult to convert during the reissue phase. Sample units that were unsuccessfully approached in the initial phase by an interviewer with a low response rate are easier to convert during the reissue phase.

Schlüsselwörter

  • Nonresponse
  • European social survey
  • reissuing
Uneingeschränkter Zugang

Implementing Adaptive Survey Design with an Application to the Dutch Health Survey

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 609 - 629

Zusammenfassung

Abstract

Adaptive survey design has attracted great interest in recent years, but the number of case studies describing actual implementation is still thin. Reasons for this may be the gap between survey methodology and data collection, practical complications in differentiating effort across sample units and lack of flexibility of survey case management systems. Currently, adaptive survey design is a standard option in redesigns of person and household surveys at Statistics Netherlands and it has been implemented for the Dutch Health survey in 2018. In this article, the implementation of static adaptive survey designs is described and motivated with a focus on practical feasibility.

Uneingeschränkter Zugang

The Effects of Nonresponse and Sampling Omissions on Estimates on Various Topics in Federal Surveys: Telephone and IVR Surveys of Address-Based Samples

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 631 - 645

Zusammenfassung

Abstract

With declining response rates and challenges of using RDD sampling for telephone surveys, collecting data from address-based samples has become more attractive. Two approaches are doing telephone interviews at telephone numbers matched to addresses and asking those at sampled addresses to call into an Interactive Voice Response (IVR) system to answer questions. This study used in-person interviewing to evaluate the effects of nonresponse and problems matching telephone numbers when telephone and IVR were used as the initial modes of data collection. The survey questions were selected from major US federal surveys covering a variety of topics. Both nonresponse and, for telephone, inability to find matches result in important nonresponse error for nearly half the measures across all topics, even after adjustments to fit the known demographic characteristics of the residents. Producing credible estimates requires using supplemental data collection strategies to reduce error from nonresponse.

Schlüsselwörter

  • Mixed modes
  • address-based samples
Uneingeschränkter Zugang

Working with Response Probabilities

Online veröffentlicht: 24 Jul 2020
Seitenbereich: 647 - 674

Zusammenfassung

Abstract

Sample surveys are often affected by nonresponse. These surveys have in common that their outcomes depend at least partly on a human decision whether or not to participate. If it would be completely clear how this decision mechanism works, estimates could be corrected. An often used approach is to introduce the concept of the response probability. Of course, these probabilities are a theoretical concept and therefore unknown. The idea is to estimate them by using the available data. If it is possible to obtain good estimates of the response probabilities, they can be used to improve estimators of population characteristics.

Estimating response probabilities relies heavily on the use of models. An often-used model is the logit model. In this article, it is compared with the simple linear model.

Estimating response probability models requires the individual values of the auxiliary variables to be available for both the respondents and the nonrespondents of the survey. Unfortunately, this is often not the case. This article explores some approaches to estimating response probabilities that have lighter data requirements. The estimated response probabilities are also used to measure possible deviations from representativity of the survey response. The indicator used for this is the coefficient of variation (CV) of the response probabilities.
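As a minimal sketch of the two quantities the abstract names, assuming the response probabilities have already been estimated from some model (the propensity values and respondent data below are hypothetical):

```python
import statistics

def cv_of_propensities(p):
    """Coefficient of variation of estimated response probabilities.
    A CV of 0 means all units are equally likely to respond; larger
    values signal a greater risk of nonresponse bias."""
    return statistics.stdev(p) / statistics.fmean(p)

def ipw_mean(y, p):
    """Inverse-probability-weighted mean: each respondent value y_i is
    weighted by 1 / p_i, its estimated response probability, to adjust
    for unequal response propensities."""
    return sum(yi / pi for yi, pi in zip(y, p)) / sum(1.0 / pi for pi in p)

# Hypothetical respondents: the low responders (p = 0.5, 0.4) report
# higher values, so the unweighted mean (16.0) understates the mean.
probs = [0.9, 0.8, 0.5, 0.4]
values = [10.0, 12.0, 20.0, 22.0]
print(round(cv_of_propensities(probs), 3))  # 0.366
print(round(ipw_mean(values, probs), 2))    # 17.65
```

The linear and logit models the article compares differ only in how the p_i are obtained from the auxiliary variables; both feed into the same adjustment step.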

Keywords

  • Nonresponse
  • adjustment weighting
  • response propensity
  • representativity
Open Access

A Validation of R-Indicators as a Measure of the Risk of Bias using Data from a Nonresponse Follow-Up Survey

Published online: 24 Jul 2020
Page range: 675–701

Abstract

R-indicators are increasingly used as nonresponse bias indicators. However, their effectiveness depends on the auxiliary data used in their estimation. Because of this, it is not always clear for practitioners what the magnitude of the R-indicator implies for bias in other survey variables, or how adjustment on auxiliary variables will affect it. In this article, we investigate these potential limitations of R-indicators in a case study using data from the Swiss European Social Survey (ESS5), which included a nonresponse follow-up (NRFU) survey. First, we analyse correlations between estimated response propensities based on auxiliary data from the register-based sampling frame, and responses to survey questions also included in the NRFU. We then examine how these relate to bias detected by the NRFU, before and after adjustment, and to predictions of the risk of bias provided by the R-indicator. While the results lend support to the utility of R-indicators as summary statistics of bias risk, they suggest a need for caution in their interpretation. Even where auxiliary variables are correlated with target variables, more bias in the former (resulting in a larger R-indicator) does not automatically imply more bias in the latter, nor does adjustment on the former necessarily reduce bias in the latter.
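As context for readers unfamiliar with the indicator: the R-indicator is commonly defined from the variability of estimated response propensities as R = 1 - 2*S(rho). A minimal sketch (the propensity values are hypothetical, and the population standard deviation is used for simplicity; sample-based estimators add bias corrections):

```python
import statistics

def r_indicator(propensities):
    """R-indicator: R = 1 - 2 * S(rho), where S(rho) is the standard
    deviation of the response propensities. R = 1 means everyone is
    equally likely to respond (a 'representative' response); smaller
    values indicate a greater risk of nonresponse bias."""
    return 1.0 - 2.0 * statistics.pstdev(propensities)

print(r_indicator([0.6, 0.6, 0.6, 0.6]))  # 1.0
print(r_indicator([0.9, 0.3]))            # ≈ 0.4
```

As the article stresses, R is only as informative as the auxiliary variables behind the estimated propensities: propensities fitted on weak auxiliary data vary little and yield a reassuring R regardless of the true bias risk.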

Keywords

  • Nonresponse
  • R-indicator
  • propensity score weighting
  • nonresponse survey
Open Access

Proxy Pattern-Mixture Analysis for a Binary Variable Subject to Nonresponse

Published online: 24 Jul 2020
Page range: 703–728

Abstract

Given increasing survey nonresponse, good measures of the potential impact of nonresponse on survey estimates are particularly important. Existing measures, such as the R-indicator, make the strong assumption that the data are missing at random, meaning that missingness depends only on variables that are observed for respondents and nonrespondents. We consider assessing the impact of nonresponse for a binary survey variable Y when missingness may be not at random, meaning that it may depend on Y itself. Our work is motivated by missing categorical income data in the 2015 Ohio Medicaid Assessment Survey (OMAS), where whether or not income is missing may be related to the income value itself, with low-income earners more reluctant to respond. We assume there is a set of covariates observed for nonrespondents and respondents, which for item nonresponse (as in OMAS) is often a rich set of variables but may be limited in cases of unit nonresponse. To reduce dimensionality, and for simplicity, we reduce these available covariates to a continuous proxy variable X, available for both respondents and nonrespondents, that has the highest correlation with Y, estimated from a probit regression analysis of respondent data. We extend the previously proposed proxy pattern-mixture (PPM) analysis for continuous outcomes to the binary outcome using a latent variable approach for modeling the joint distribution of Y and X. Our method does not assume data are missing at random but includes it as a special case, thus creating a convenient framework for sensitivity analyses. Maximum likelihood, Bayesian, and multiple imputation versions of PPM analysis are described, and the robustness of these methods to model assumptions is discussed. Properties are demonstrated through simulation and with the 2015 OMAS data.
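As a minimal sketch of the dimension-reduction step the abstract describes, assuming a probit regression of Y on the covariates has already been fitted to respondent data (the coefficients and covariate values below are hypothetical, not from the article):

```python
from math import erf, sqrt

def probit_proxy(covariates, coefs, intercept):
    """Continuous proxy X: the linear predictor of a probit regression
    of binary Y on covariates Z, fitted to respondent data. Because X
    depends only on the covariates, it is computable for respondents
    and nonrespondents alike."""
    return [intercept + sum(b * z for b, z in zip(coefs, row))
            for row in covariates]

def probit_cdf(x):
    """Standard normal CDF Phi(x): the probit link function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Hypothetical fitted coefficients applied to two units' covariates:
xs = probit_proxy([[1.0, 2.0], [0.0, 1.0]], coefs=[0.5, -0.2], intercept=0.1)
fitted = [probit_cdf(x) for x in xs]  # implied P(Y = 1) for each unit
```

In the article's PPM analysis, X is then modeled jointly with a latent variable underlying Y; this sketch covers only the construction of the proxy itself.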

Keywords

  • Missing data
  • nonignorable nonresponse
  • nonresponse bias
  • survey data
  • Bayesian methods
