Abraham, K.G., S. Helms, and S. Presser. 2009. “How social processes distort measurement: The impact of survey nonresponse on estimates of volunteer work in the United States.” American Journal of Sociology 114 (4): 1129–1165. DOI: https://doi.org/10.1086/595945.
Adua, L., and J.S. Sharp. 2010. “Examining survey participation and response quality: The significance of topic salience and incentives.” Survey Methodology 36 (1): 95–109.
Bates, N. 2017. “The Morris Hansen Lecture. Hard-to-Survey Populations and the US Census: Making Use of Social Marketing Campaigns.” Journal of Official Statistics 33 (4): 873–885. DOI: https://doi.org/10.1515/jos-2017-0040.
Below, T., B. Khamaldin, D. Mutabazi, D. Kirschke, C. Franke, S. Sieber, R. Siebert, and K. Tscherning. 2012. “Can farmers’ adaptation to climate change be explained by socioeconomic household-level variables?” Global Environmental Change 22 (1): 223–235. DOI: https://doi.org/10.1016/j.gloenvcha.2011.11.012.
Berinsky, A.J. 2008. “Survey non-response.” In The SAGE Handbook of Public Opinion Research: 309–322. DOI: https://doi.org/10.4135/9781848607910.n29.
Bickart, B.A., J.M. Phillips, and J. Blair. 2006. “The Effects of Discussion and Question Wording on Self and Proxy Reports of Behavioral Frequencies.” Marketing Letters 17: 167–180. DOI: https://doi.org/10.1007/s11002-006-5232-1.
Borlin, S. 2019. “Data quality of proxy reports: Inconsistent educational information in the German Microcensus Panel.” Paper presented at the 8th Conference of the European Survey Research Association, July 16, Zagreb, Croatia. Available at: https://www.europeansurveyresearch.org/conf2019/uploads/654/2802/99/Data_Quality_Proxy_Reports_Boerlin.pdf (accessed November 2022).
Bradburn, N. 1978. “Respondent burden.” In Proceedings of the Survey Research Methods Section of the American Statistical Association: 35–40. Alexandria, VA, USA: American Statistical Association. Available at: http://www.asasrms.org/Proceedings/papers/1978_007.pdf (accessed November 2022).
Branden, L. 1995. The Effect of Interview Length on Attrition in the National Longitudinal Survey of Youth. National Longitudinal Surveys Discussion Paper. Available at: https://files.eric.ed.gov/fulltext/ED406399.pdf (accessed November 2022).
Brick, J.M., and R. Tourangeau. 2017. “Responsive survey designs for reducing nonresponse bias.” Journal of Official Statistics 33 (3): 735–752. DOI: https://doi.org/10.1515/jos-2017-0034.
Brick, J.M., and D. Williams. 2013. “Explaining rising nonresponse rates in cross-sectional surveys.” The ANNALS of the American Academy of Political and Social Science 645 (1): 36–59. DOI: https://doi.org/10.1177/0002716212456834.
Brust, O.A., S. Häder, and M. Häder. 2016. “Is the Short Version of the Big Five Inventory (BFI-S) Applicable for Use in Telephone Surveys?” Journal of Official Statistics 32 (3): 601. DOI: https://doi.org/10.1515/jos-2016-0031.
Cattell, R.B. 1966. “The scree test for the number of factors.” Multivariate Behavioral Research 1 (2): 245–276. DOI: https://doi.org/10.1207/s15327906mbr0102_10.
Cobb, C. 2018. “Answering for someone else: proxy reports in survey research.” In The Palgrave Handbook of Survey Research: 87–93. Palgrave Macmillan, Cham. DOI: https://doi.org/10.1007/978-3-319-54395-6_12.
Coder, J., and L. Scoon-Rogers. 1996. Evaluating the quality of income data collected in the annual supplement to the March Current Population Survey and the Survey of Income and Program Participation. Working Paper. U.S. Bureau of the Census. Available at: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.283.6445&rep=rep1&type=pdf (accessed November 2022).
Copeland, K., and J.M. Rothgeb. 1990. “Testing alternative questionnaires for the Current Population Survey.” In Proceedings of the Section on Survey Research Methods 1990: 63–71. Available at: http://www.asasrms.org/Proceedings/papers/1990_010.pdf (accessed November 2022).
Costello, A.B., and J. Osborne. 2005. “Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis.” Practical Assessment, Research, and Evaluation 10 (1): 7. DOI: https://doi.org/10.7275/jyj1-4868.
Couper, M., A. Kapteyn, M. Schonlau, and J. Winter. 2007. “Noncoverage and nonresponse in an Internet survey.” Social Science Research 36 (1): 131–148. DOI: https://doi.org/10.1016/j.ssresearch.2005.10.002.
Couper, M.P., E. Singer, F.G. Conrad, and R.M. Groves. 2008. “Risk of Disclosure, Perceptions of Risk, and Concerns about Privacy and Confidentiality as Factors in Survey Participation.” Journal of Official Statistics 24(2): 255–275. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/risk-of-disclosure-perceptions-of-risk-and-concerns-about-privacy-and-confidentiality-as-factors-in-survey-participation.pdf.
Crawford, S.D., M.P. Couper, and M.J. Lamias. 2001. “Web surveys: Perceptions of burden.” Social Science Computer Review 19 (2): 146–162. DOI: https://doi.org/10.1177/089443930101900202.
Cunningham, J.A., D. Ansara, T.C. Wild, T. Toneatto, and A. Koski-Jännes. 1999. “What is the price of perfection? The hidden costs of using detailed assessment instruments to measure alcohol consumption.” Journal of Studies on Alcohol 60 (6): 756–758. DOI: https://doi.org/10.15288/jsa.1999.60.756.
DeMaio, T.J. 1980. “Refusals: Who, where and why.” Public Opinion Quarterly 44 (2): 223–233. DOI: https://doi.org/10.1086/268586.
Earp, M., M. Mitchell, J. McCarthy, and F. Kreuter. 2014. “Modeling Nonresponse in Establishment Surveys: Using an Ensemble Tree Model to Create Nonresponse Propensity Scores and Detect Potential Bias in an Agricultural Survey.” Journal of Official Statistics 30 (4): 701–719. DOI: https://doi.org/10.2478/jos-2014-0044.
Earp, M., D. Toth, P. Phipps, and C. Oslund. 2018. “Assessing Nonresponse in a Longitudinal Establishment Survey Using Regression Trees.” Journal of Official Statistics 34 (2): 463–481. DOI: https://doi.org/10.2478/jos-2018-0021.
Edwards, N.E., and P.S. Scheetz. 2002. “Predictors of burden for caregivers of patients with Parkinson’s disease.” Journal of Neuroscience Nursing 34 (4): 184. Available at: https://www.proquest.com/scholarly-journals/predictors-burden-cregivers-patients-with/docview/219178824/se-2?accountid=26724 (accessed November 2022).
Finch, H. 2006. “Comparison of the performance of varimax and promax rotations: Factor structure recovery for dichotomous items.” Journal of Educational Measurement 43 (1): 39–52. DOI: https://doi.org/10.1111/j.1745-3984.2006.00003.x.
Fricker, S.S. 2007. The relationship between response propensity and data quality in the Current Population Survey and the American Time Use Survey. University of Maryland, College Park. Available at: https://www.proquest.com/dissertations-theses/relationship-between-response-propensity-data/docview/304850876/se-2?accountid=26724 (accessed November 2022).
Fricker, S., C. Kreisler, and L. Tan. 2012. “An exploration of the application of PLS path modeling approach to creating a summary index of respondent burden.” In JSM Proceedings: 4141–4155. Available at: http://www.asasrms.org/Proceedings/y2012/Files/304802_73760.pdf (accessed November 2022).
Fricker, S., T. Yan, and S. Tsai. 2014. “Response burden: What predicts it and who is burdened out.” In JSM Proceedings: 4568–4577. Available at: http://www.asasrms.org/Proceedings/y2014/files/400298_500838.pdf (accessed November 2022).
Fuchs, M. 2005. “Children and Adolescents as Respondents. Experiments on Question Order, Response Order, Scale Effects and the Effect of Numeric Values Associated with Response Options.” Journal of Official Statistics 21 (4): 701–725. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/children-and-adolescents-as-respondents.-experiments-on-question-order-response-order-scale-effects-and-the-effect-of-numeric-values-associated-with-response-options.pdf (accessed November 2022).
Galesic, M. 2006. “Dropouts on the Web: Effects of Interest and Burden Experienced During an Online Survey.” Journal of Official Statistics 22 (2): 313–328. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/dropouts-onthe-web-effects-of-interest-and-burden-experienced-during-an-online-survey.pdf (accessed November 2022).
Groves, R.M. 2006. “Nonresponse rates and nonresponse bias in household surveys.” Public Opinion Quarterly 70 (5): 646–675. DOI: https://doi.org/10.1093/poq/nfl033.
Groves, R.M., R.B. Cialdini, and M.P. Couper. 1992. “Understanding the decision to participate in a survey.” Public Opinion Quarterly 56 (4): 475–495. DOI: https://doi.org/10.1086/269338.
Groves, R.M., S. Presser, and S. Dipko. 2004. “The role of topic interest in survey participation decisions.” Public Opinion Quarterly 68 (1): 2–31. DOI: https://doi.org/10.1093/poq/nfh002.
Groves, R.M., E. Singer, and A. Corning. 2000. “Leverage-saliency theory of survey participation: description and an illustration.” The Public Opinion Quarterly 64 (3): 299–308. Available at: https://www.jstor.org/stable/3078721 (accessed November 2022).
Haraldsen, G. 2004. “Identifying and Reducing Response Burdens in Internet Business Surveys.” Journal of Official Statistics 20 (2): 393–410. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/identifying-and-reducing-response-burdens-in-internet-business-surveys.pdf (accessed November 2022).
Hinkin, T.R. 1995. “A review of scale development practices in the study of organizations.” Journal of Management 21 (5): 967–988. DOI: https://doi.org/10.1177/014920639502100509.
Hinkin, T.R. 1998. “A brief tutorial on the development of measures for use in survey questionnaires.” Organizational Research Methods 1 (1): 104–121. DOI: https://doi.org/10.1177/109442819800100106.
Howard, M.C. 2016. “A review of exploratory factor analysis decisions and overview of current practices: What we are doing and how can we improve?” International Journal of Human-Computer Interaction 32 (1): 51–62. DOI: https://doi.org/10.1080/10447318.2015.1087664.
Kaiser, H.F. 1960. “The application of electronic computers to factor analysis.” Educational and Psychological Measurement 20 (1): 141–151. Available at: https://journals.sagepub.com/doi/pdf/10.1177/001316446002000116 (accessed November 2022).
Kaplan, R., and J. Holzberg. 2020. Measuring subjective perceptions of burden over time. Paper presented at the American Association for Public Opinion Research Conference, June 11.
Kojetin, B.A., and P. Mullin. 1995. “The quality of proxy reports on the current population survey (CPS).” In 50th Annual Conference of the American Association for Public Opinion Research. Available at: http://www.asasrms.org/Proceedings/papers/1995_193.pdf (accessed November 2022).
Kolenikov, S., and G. Angeles. 2009. “Socioeconomic status measurement with discrete proxy variables: Is principal component analysis a reliable answer?” Review of Income and Wealth 55 (1): 128–165. DOI: https://doi.org/10.1111/j.1475-4991.2008.00309.x.
Lee, S., N.A. Mathiowetz, and R. Tourangeau. 2004. “Perceptions of Disability: The Effect of Self- and Proxy Response.” Journal of Official Statistics 20 (4): 671–686. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/perceptions-of-disability-the-effect-of-self–and-proxy-response.pdf (accessed November 2022).
Loosveldt, G., J. Pickery, and J. Billiet. 2002. “Item Nonresponse as a Predictor of Unit Nonresponse in a Panel Survey.” Journal of Official Statistics 18 (4): 545–557. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/item-nonresponse-as-a-predictor-of-unit-nonresponse-in-a-panel-survey.pdf (accessed November 2022).
Marcus, B., M. Bosnjak, S. Lindner, S. Pilischenko, and A. Schütz. 2007. “Compensating for low topic interest and long surveys: a field experiment on nonresponse in web surveys.” Social Science Computer Review 25 (3): 372–383. DOI: https://doi.org/10.1177/0894439307297606.
Mechanic, D., and M. Newton. 1965. “Some problems in the analysis of morbidity data.” Journal of Chronic Diseases 18 (6): 569–580. DOI: https://doi.org/10.1016/0021-9681(65)90078-0.
Montgomery, M.R., M. Gragnolati, K.A. Burke, and E. Paredes. 2000. “Measuring living standards with proxy variables.” Demography 37 (2): 155–174. DOI: https://doi.org/10.2307/2648118.
Morgan, J.N., and J.A. Sonquist. 1963. “Problems in the analysis of survey data, and a proposal.” Journal of the American Statistical Association 58 (302): 415–434. DOI: https://doi.org/10.1080/01621459.1963.10500855.
Norris, M., and L. Lecavalier. 2010. “Evaluating the use of exploratory factor analysis in developmental disability psychological research.” Journal of Autism and Developmental Disorders 40 (1): 8–20. DOI: https://doi.org/10.1007/s10803-009-0816-2.
Paperwork Reduction Act Guide. 2011. Office of Personnel Management. Available at: https://www.opm.gov/about-us/open-government/digital-government-strategy/fitara/-paperwork-reduction-act-guide.pdf (accessed November 2022).
Perruccio, A.V., and E.M. Badley. 2004. “Proxy reporting and the increasing prevalence of arthritis in Canada.” Canadian Journal of Public Health 95 (3): 168–173. DOI: https://doi.org/10.1007/BF03403641.
Phipps, P., and D. Toth. 2012. “Analyzing establishment nonresponse using an interpretable regression tree model with linked administrative data.” The Annals of Applied Statistics 6 (2): 772–794. DOI: https://doi.org/10.1214/11-AOAS521.
Polivka, A.E. 1996. “Data watch: The redesigned Current Population Survey.” Journal of Economic Perspectives 10 (3): 169–180. Available at: https://www.jstor.org/stable/41713473 (accessed November 2022).
Rohe, W.M., S. van Zandt, and G. McCarthy. 2002. “Home ownership and access to opportunity.” Housing Studies 17 (1): 51–61. DOI: https://doi.org/10.1080/02673030120105884.
Rolstad, S., J. Adler, and A. Rydén. 2011. “Response burden and questionnaire length: is shorter better? A review and meta-analysis.” Value in Health 14 (8): 1101–1108. DOI: https://doi.org/10.1016/j.jval.2011.06.003.
Serfling, O. 2004. The interaction between unit and item nonresponse in view of the reverse cooperation continuum. WWZ Discussion Paper No. 2004/02. DOI: https://doi.org/10.5451/unibas-ep61271.
Sharp, L.M., and J. Frankel. 1983. “Respondent burden: A test of some common assumptions.” Public Opinion Quarterly 47 (1): 36–53. DOI: https://doi.org/10.1086/268765.
Sitzia, J., and N. Wood. 1998. “Response rate in patient satisfaction research: an analysis of 210 published studies.” International Journal for Quality in Health Care 10 (4): 311–317. Available at: https://academic.oup.com/intqhc/article-pdf/10/4/311/5083940/100311.pdf (accessed November 2022).
Stanley McCarthy, J., D.G. Beckler, and S.M. Qualey. 2006. “An Analysis of the Relationship Between Survey Burden and Nonresponse: If We Bother Them More, Are They Less Cooperative?” Journal of Official Statistics 22 (1): 97–112. Available at: https://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/an-analysis-ofthe-relationship-between-survey-burden-and-nonresponse-if-we-bother-them-more-are-they-less-cooperative.pdf (accessed November 2022).
Sudman, S., B. Bickart, J. Blair, and G. Menon. 1994. “The effect of participation level on reports of behavior and attitudes by proxy reporters.” In Autobiographical memory and the validity of retrospective reports: 251–265. Springer, New York, NY. DOI: https://doi.org/10.1007/978-1-4612-2624-6_17.
Tabachnick, B.G., and L.S. Fidell. 2001. “Principal components and factor analysis.” Using Multivariate Statistics 4 (1): 582–633.
Tennant, A., E.M. Badley, and M. Sullivan. 1991. “Investigating the proxy effect and the saliency principle in household based postal questionnaires.” Journal of Epidemiology & Community Health 45 (4): 312–316. DOI: https://doi.org/10.1136/jech.45.4.312.
Tourangeau, R., and T. Yan. 2007. “Sensitive questions in surveys.” Psychological Bulletin 133 (5): 859–883. DOI: https://doi.org/10.1037/0033-2909.133.5.859.
Toth, D. 2019. Recursive Partitioning for Modeling Survey Data. R package version 0.4.0. Available at: https://CRAN.R-project.org/package=rpms (accessed November 2022).
Toth, D. 2020. “A Permutation Test on Complex Sample Data.” Journal of Survey Statistics and Methodology 8 (4): 772–791. DOI: https://doi.org/10.1093/jssam/smz018.
Toth, D., and P. Phipps. 2014. “Regression tree models for analyzing survey response.” In Proceedings of the Government Statistics Section: 339–351. American Statistical Association.
U.S. Bureau of Labor Statistics Handbook of Methods. 2018. Consumer Expenditure and Income. Available at: https://www.bls.gov/opub/hom/cex/home.htm (accessed November 2022).
U.S. Bureau of Labor Statistics Handbook of Methods. 2018. Current Population Survey. Available at: https://www.bls.gov/opub/hom/cps/home.htm (accessed November 2022).
U.S. Census Bureau. 2021. Current Population Survey Technical Documentation: Methodology. Available at: https://www.census.gov/programs-surveys/cps/technical-documentation/methodology.html.
Van Liere, K.D., and R.E. Dunlap. 1980. “The social bases of environmental concern: A review of hypotheses, explanations and empirical evidence.” Public Opinion Quarterly 44 (2): 181–197. DOI: https://doi.org/10.1086/268583.
Yan, T., and R. Curtin. 2010. “The relation between unit nonresponse and item nonresponse: A response continuum perspective.” International Journal of Public Opinion Research 22 (4): 535–551. DOI: https://doi.org/10.1093/ijpor/edq037.
Yan, T., S. Fricker, and S. Tsai. 2020. “Response Burden: What Is It and What Predicts It?” In Advances in Questionnaire Design, Development, Evaluation and Testing, edited by P. Beatty, D. Collins, L. Kaye, J. Luis Padilla, G. Willis, and A. Wilmot: 193–212. Wiley & Sons. DOI: https://doi.org/10.1002/9781119263685.ch8.
Yan, T., and R. Tourangeau. 2008. “Fast times and easy questions: The effects of age, experience and question complexity on web survey response times.” Applied Cognitive Psychology: The Official Journal of the Society for Applied Research in Memory and Cognition 22 (1): 51–68. DOI: https://doi.org/10.1002/acp.1331.
Zhang, C., S. Lonn, and S.D. Teasley. 2017. “Understanding the impact of lottery incentives on web survey participation and response quality: A leverage-salience theory perspective.” Field Methods 29 (1): 42–60. DOI: https://doi.org/10.1177/1525822X16647932.