Antoun, C., J. Katz, J. Argueta, and L. Wang. 2018. “Design Heuristics for Effective Smartphone Questionnaires.” Social Science Computer Review 36(5): 557–574. DOI: https://doi.org/10.1177/0894439317727072.
Brosnan, K., B. Grün, and S. Dolnicar. 2021. “Cognitive Load Reduction Strategies in Questionnaire Design.” International Journal of Market Research 63(2): 125–133. DOI: https://doi.org/10.1177/1470785320986797.
Callegaro, M., J. Shand-Lubbers, and J.M. Dennis. 2009. “Presentation of a Single Item versus a Grid: Effects on the Vitality and Mental Health Scales of the SF-36v2 Health Survey.” 64th Annual Conference of the American Association for Public Opinion Research (AAPOR), May 14, 2009: 5887–5897. Hollywood, Florida. Available at: http://www.asasrms.org/Proceedings/y2009/Files/400045.pdf (accessed July 2022).
Cohen, J. 1960. “A Coefficient of Agreement for Nominal Scales.” Educational and Psychological Measurement 20(1): 37–46. DOI: https://doi.org/10.1177/001316446002000104.
Couper, M.P. 2008. Designing Effective Web Surveys. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511499371.
Couper, M.P., R. Tourangeau, F.G. Conrad, and C. Zhang. 2013. “The Design of Grids in Web Surveys.” Social Science Computer Review 31(3): 322–345. DOI: https://doi.org/10.1177/0894439312469865.
Couper, M.P., M.W. Traugott, and M.J. Lamias. 2001. “Web Survey Design and Administration.” Public Opinion Quarterly 65(2): 230–253. DOI: https://doi.org/10.1086/322199.
DeBell, M., C. Wilson, S. Jackman, and L. Figueroa. 2021. “Optimal Response Formats for Online Surveys: Branch, Grid, or Single Item?” Journal of Survey Statistics and Methodology 9(1): 1–24. DOI: https://doi.org/10.1093/jssam/smz039.
De Leeuw, E.D., Z.T. Suzer-Gurtekin, and J.J. Hox. 2018. “The Design and Implementation of Mixed-mode Surveys.” In Advances in Comparative Survey Methods: Multinational, Multiregional, and Multicultural Contexts (3MC), edited by T.P. Johnson, B. Pennell, I.A.L. Stoop, and B. Dorer: 387–409. Hoboken: Wiley.
Dillman, D.A., J.D. Smyth, and L.M. Christian. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th edition). Hoboken: Wiley.
Galesic, M., and T. Yan. 2011. “Use of Eye Tracking for Studying Survey Response Processes.” In Social and Behavioral Research and the Internet, edited by M. Das, P. Ester, and L. Kaczmirek: 349–370. New York: Taylor and Francis. DOI: https://doi.org/10.4324/9780203844922-14.
GLES. 2019. “Longterm-Online-Tracking, Cumulation 2009–2017 (GLES).” GESIS Data Archive, Cologne. ZA6832 Data file Version 1.1.0. DOI: https://doi.org/10.4232/1.13416.
Grady, R.H., R.L. Greenspan, and M. Liu. 2019. “What Is the Best Size for Matrix-Style Questions in Online Surveys?” Social Science Computer Review 37(3): 435–445. DOI: https://doi.org/10.1177/0894439318773733.
Gummer, T., F. Quoß, and J. Roßmann. 2019. “Does Increasing Mobile Device Coverage Reduce Heterogeneity in Completing Web Surveys on Smartphones?” Social Science Computer Review 37(3): 371–384. DOI: https://doi.org/10.1177/0894439318766836.
Heerwegh, D. 2009. “Mode Differences Between Face-to-Face and Web Surveys: An Experimental Investigation of Data Quality and Social Desirability Effects.” International Journal of Public Opinion Research 21(1): 111–121. DOI: https://doi.org/10.1093/ijpor/edn054.
ISSP Research Group. 2016. “International Social Survey Programme: Citizenship II – ISSP 2014.” GESIS Data Archive, Cologne. ZA6670 Data file Version 2.0.0. DOI: https://doi.org/10.4232/1.12590.
Jenkins, C.R., and D.A. Dillman. 1995. “Towards a Theory of Self-Administered Questionnaire Design.” In Survey Measurement and Process Quality, edited by L. Lyberg, P. Biemer, M. Collins, E. de Leeuw, C. Dippo, N. Schwarz, and D. Trewin: 165–196. New York: Wiley.
Just, M.A., and P.A. Carpenter. 1980. “A Theory of Reading: From Eye Fixations to Comprehension.” Psychological Review 87: 329–354. DOI: https://doi.org/10.1037/0033-295X.87.4.329.
Kaczmirek, L. 2005. “A Framework for the Collection of Universal Client Side Paradata (UCSP).” Available at: http://kaczmirek.de/ucsp/ucsp.html (accessed January 2021).
Kamoen, N., B. Holleman, P. Mak, T. Sanders, and H. van den Bergh. 2017. “Why are Negative Questions Difficult to Answer? On the Processing of Linguistic Contrasts in Surveys.” Public Opinion Quarterly 81(3): 613–635. DOI: https://doi.org/10.1093/poq/nfx010.
Koffka, K. 1935. Principles of Gestalt Psychology. New York: Harcourt.
Krosnick, J.A. 1991. “Response Strategies for Coping with the Cognitive Demands of Attitude Measures in Surveys.” Applied Cognitive Psychology 5(3): 213–236. DOI: https://doi.org/10.1002/acp.2350050305.
Krosnick, J.A., and D.F. Alwin. 1988. “A Test of the Form-Resistant Correlation Hypothesis: Ratings, Rankings, and the Measurement of Values.” Public Opinion Quarterly 52(4): 526–538. DOI: https://doi.org/10.1086/269128.
Krosnick, J.A., and S. Presser. 2010. “Question and Questionnaire Design.” In Handbook of Survey Research, edited by P.V. Marsden, and J.D. Wright: 263–314. Emerald Group Publishing.
Landis, J.R., and G.G. Koch. 1977. “The Measurement of Observer Agreement for Categorical Data.” Biometrics 33(1): 159–174. DOI: https://doi.org/10.2307/2529310.
Liu, M., and A. Cernat. 2018. “Item-by-Item versus Matrix Questions: A Web Survey Experiment.” Social Science Computer Review 36(6): 690–706. DOI: https://doi.org/10.1177/0894439316674459.
Mavletova, A., and M.P. Couper. 2015. “A Meta-Analysis of Breakoff Rates in Mobile Web Surveys.” In Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies, edited by D. Toninelli, R. Pinter, and P. de Pedraza: 81–98. London: Ubiquity Press. DOI: https://doi.org/10.5334/bar.f.
Mavletova, A., M.P. Couper, and D. Lebedev. 2018. “Grid and Item-by-Item Formats in PC and Mobile Web Surveys.” Social Science Computer Review 36(6): 647–668. DOI: https://doi.org/10.1177/0894439317735307.
Mayerl, J. 2013. “Response Latency Measurement in Surveys. Detecting Strong Attitudes and Response Effects.” Survey Methods: Insights from the Field. Available at: https://surveyinsights.org/?p=1063 (accessed December 2020). DOI: https://doi.org/10.13094/SMIF-2013-00005.
McCarty, J.A., and L.J. Shrum. 2000. “The Measurement of Personal Values in Survey Research: A Test of Alternative Rating Procedures.” Public Opinion Quarterly 64(3): 271–298. DOI: https://doi.org/10.1086/317989.
Neuert, C.E., and T. Lenzner. 2017. “Incorporating Eye Tracking into Cognitive Interviewing to Pretest Survey Questions.” International Journal of Social Research Methodology 19(5): 501–519. DOI: https://doi.org/10.1080/13645579.2015.1049448.
Peytchev, A. 2005. “How Questionnaire Layout Induces Measurement Error.” Paper presented at the 60th annual meeting of the American Association for Public Opinion Research, May 2005. Miami Beach, FL, USA. Available at: http://www.websm.org/db/12/3636/Bibliography/Causes%20of%20Context%20Effects:%20How%20Questionnaire%20Layout%20Induces%20Measurement%20Error/.
Rammstedt, B., and O.P. John. 2007. “Measuring Personality in One Minute or Less: A 10-Item Short Version of the Big Five Inventory in English and German.” Journal of Research in Personality 41: 203–212. DOI: https://doi.org/10.1016/j.jrp.2006.02.001.
Rayner, K. 1998. “Eye Movements in Reading and Information Processing: 20 Years of Research.” Psychological Bulletin 124: 372–422. DOI: https://doi.org/10.1037/0033-2909.124.3.372.
Romano Bergstrom, J., and A. Schall. 2014. Eye Tracking in User Experience Design. San Francisco, CA: Morgan Kaufmann.
Roßmann, J., T. Gummer, and H. Silber. 2018. “Mitigating Satisficing in Cognitively Demanding Grid Questions: Evidence from Two Web-Based Experiments.” Journal of Survey Statistics and Methodology 6(3): 376–400. DOI: https://doi.org/10.1093/jssam/smx020.
Roßmann, J., and H. Silber. 2020. “Satisficing and Measurement Error.” In SAGE Research Methods Foundations, edited by P. Atkinson, S. Delamont, A. Cernat, J.W. Sakshaug, and R.A. Williams. London: SAGE Publications. DOI: https://doi.org/10.4135/9781526421036912794.
Silber, H., J. Roßmann, and T. Gummer. 2018. “When Near Means Related: Evidence from Three Web Survey Experiments on Inter-Item Correlations in Grid Questions.” International Journal of Social Research Methodology 21(3): 275–288. DOI: https://doi.org/10.1080/13645579.2017.1381478.
Staub, A., and K. Rayner. 2007. “Eye Movements and On-line Comprehension Processes.” In The Oxford Handbook of Psycholinguistics, edited by G. Gaskell: 327–342. Oxford, UK: Oxford University Press. DOI: https://doi.org/10.1093/oxfordhb/9780198568971.013.0019.
Toepoel, V., M. Das, and A. van Soest. 2009. “Design of Web Questionnaires: The Effects of the Number of Items per Screen.” Field Methods 21(2): 200–213. DOI: https://doi.org/10.1177/1525822X08330261.
Tourangeau, R., M.P. Couper, and F. Conrad. 2004. “Spacing, Position, and Order: Interpretive Heuristics for Visual Features of Survey Questions.” Public Opinion Quarterly 68(3): 368–393. DOI: https://doi.org/10.1093/poq/nfh035.
Tourangeau, R., and K. Rasinski. 1988. “Cognitive Processes Underlying Context Effects in Attitude Measurement.” Psychological Bulletin 103(3): 299–314. DOI: https://doi.org/10.1037/0033-2909.103.3.299.
Tourangeau, R., L.J. Rips, and K. Rasinski. 2000. The Psychology of Survey Response. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511819322.
Wertheimer, M. 1923. “Laws of Organization in Perceptual Forms.” In A Source Book of Gestalt Psychology. London: Routledge.
W3C: World Wide Web Consortium. 2018. “Web Content Accessibility Guidelines (WCAG) 2.1.” Available at: https://www.w3.org/TR/WCAG21/ (accessed June 2021).
Zhang, C., and F. Conrad. 2014. “Speeding in Web Surveys: The Tendency to Answer Very Fast and Its Association with Straightlining.” Survey Research Methods 8(2): 127–135. DOI: https://doi.org/10.18148/srm/2014.v8i2.5453.