Volume 2020 (2020): Issue 2 (April 2020)
Journal Details
First published: 16 Apr 2015
Publication timeframe: 4 times per year
Access type: Open Access

Multiple Purposes, Multiple Problems: A User Study of Consent Dialogs after GDPR

Published online: 08 May 2020
Page range: 481–498
Received: 31 Aug 2019
Accepted: 16 Dec 2019

The European Union’s General Data Protection Regulation (GDPR) requires websites to ask for consent to the use of cookies for specific purposes. This enlarges the relevant design space for consent dialogs. Websites could try to maximize click-through rates and positive consent decisions, even at the risk of users agreeing to more purposes than intended. We evaluate a practice observed on popular websites by conducting an experiment with one control and two treatment groups (N = 150 university students in two countries). We hypothesize that users’ consent decisions are influenced by (1) the number of options, connecting to the theory of choice proliferation, and (2) the presence of a highlighted default button (“select all”), connecting to theories of social norms and deception in consumer research. The results show that participants who see a default button accept cookies for more purposes than the control group, while being less able to correctly recall their choice. After being reminded of their choice, they regret it more often and perceive the consent dialog as more deceptive than the control group. Whether users are presented with one or three purposes has no significant effect on their decisions and perceptions. We discuss the results and outline policy implications.

