Journal & Issues

Volume 15 (2023): Issue 1 (May 2023)
Marketing Dashboards – The Next Generation

Volume 14 (2022): Issue 2 (November 2022)
MarTech and SalesTech

Volume 14 (2022): Issue 1 (May 2022)
Conscious Consumption

Volume 13 (2021): Issue 2 (November 2021)
Brand Activism

Volume 13 (2021): Issue 1 (May 2021)
The Dark Sides of Digital Marketing

Volume 12 (2020): Issue 2 (November 2020)
The Reputation Economy

Volume 12 (2020): Issue 1 (May 2020)
Crowd Innovation: Hype or Help

Volume 11 (2019): Issue 2 (November 2019)
AI and the Machine Age of Marketing

Volume 11 (2019): Issue 1 (May 2019)
The Future of Retailing

Volume 10 (2018): Issue 2 (October 2018)
IoT - Consumers and the Internet of Things

Volume 10 (2018): Issue 1 (May 2018)
Brand Risk Matters

Volume 9 (2017): Issue 2 (November 2017)
The Connected Consumer

Volume 9 (2017): Issue 1 (May 2017)
Digital Transformation

Volume 8 (2016): Issue 2 (November 2016)
Marketing and Data Science

Volume 8 (2016): Issue 1 (May 2016)
Responsible Marketing

Volume 7 (2015): Issue 2 (November 2015)
Marketing Meets Product Design

Volume 7 (2015): Issue 1 (May 2015)
Truly Accountable Marketing

Volume 6 (2014): Issue 2 (November 2014)
Social Brand Engagement

Volume 6 (2014): Issue 1 (May 2014)
Emotions in Marketing

Volume 5 (2013): Issue 2 (November 2013)

Volume 5 (2013): Issue 1 (May 2013)

Volume 4 (2012): Issue 2 (November 2012)

Volume 4 (2012): Issue 1 (May 2012)

Volume 3 (2011): Issue 2 (November 2011)

Volume 3 (2011): Issue 1 (May 2011)

Volume 2 (2010): Issue 2 (November 2010)

Volume 2 (2010): Issue 1 (May 2010)

Volume 1 (2009): Issue 2 (November 2009)

Volume 1 (2009): Issue 1 (May 2009)

Journal Details
Format: Journal
eISSN: 2628-166X
First Published: 30 May 2019
Publication timeframe: 2 times per year
Languages: English

Volume 13 (2021): Issue 1 (May 2021)
The Dark Sides of Digital Marketing

Open Access

Editorial

Published Online: 05 May 2021
Page range: 3 - 3

Open Access

Illuminating the Dark: Exploring the Unintended Consequences of Digital Marketing

Published Online: 05 May 2021
Page range: 10 - 17

Abstract

Our relationship to technology is deeply paradoxical. On the one hand, we buy and constantly use more devices and apps, leaving our traces in the digital space. On the other hand, we increasingly fear the dark sides of being dependent on technology and of data abuse. Inadequate knowledge and errors make it difficult to predict unintended consequences, and often problems emerge due to deliberate choices to pursue some interests while ignoring others. Hot topics include data privacy, potentially biased or discriminating algorithms, the tension between free choice and manipulation, and the optimization of questionable outputs while ignoring broader effects.

Fighting unintended consequences should get to the roots of the problems. As for personal data, users should get more control over what they share. Further, more transparency can help avoid dystopian outcomes; this concerns, in particular, how data are used by algorithms. The high concentration of power in the hands of a few global players should also be watched closely, and societies need to be critical towards their actions and objectives. Even seemingly noble motives come at a price, and this price needs to be negotiable.

Keywords

  • Digital Marketing
  • Algorithms
  • Marketing Utopia
  • Marketing Dystopia
  • Unintended Consequences
Open Access

Marketing Automation: Marketing Utopia or Marketing Dystopia?

Published Online: 05 May 2021
Page range: 18 - 23

Abstract

Automated and personalized interactions may increase the relevance of marketing offers, but they also have less positive economic and psychological consequences for consumers. Machine learning-based prediction algorithms can approximate individuals’ preferences and their willingness to pay with ever greater precision, and companies can use this knowledge to charge higher individual prices. Typically, consumers freely hand over all the information necessary to reveal their preferences, and they seem to underestimate the value of their personal data. There is another discomforting aspect of giving away personal data: it means giving up privacy and, as a result, losing autonomy.

Preventing negative outcomes is typically a task for regulators, but finding solutions can be difficult. Therefore, companies need to address consumer concerns in their policies as well. To avoid dystopia, managers need to take consumer psychology into account and resist the temptation to maximize short-term profits at the cost of consumers. Avoiding marketing dystopia is in the best interest of all market participants – at least from a longer-term perspective.
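
To make the mechanism concrete, here is a minimal, illustrative Python sketch of the logic described above: a regression model approximates willingness to pay from behavioral features, and its predictions are turned into individual prices. The data, feature set and the 90 % pricing rule are invented for illustration and are not taken from the article.

```python
# Illustrative sketch only: how a predicted willingness to pay (WTP)
# could be turned into personalized prices. Synthetic data, hypothetical features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical behavioral features a platform might observe per consumer
# (e.g. past spend, browsing intensity, device type) - all made up here.
X = rng.normal(size=(1000, 3))
true_wtp = 50 + 10 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=2, size=1000)

# Train a standard regression model to approximate individual WTP.
model = GradientBoostingRegressor().fit(X, true_wtp)

# Personalized price: charge a fraction of the predicted WTP, with a floor at cost.
unit_cost = 20.0
predicted_wtp = model.predict(X)
personal_price = np.maximum(unit_cost, 0.9 * predicted_wtp)

print("average true WTP          :", round(true_wtp.mean(), 2))
print("average personalized price:", round(personal_price.mean(), 2))
```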

Keywords

  • AI
  • Algorithms
  • Free Choice
  • Marketing Automation
  • Personalization
Open Access

Algorithm-Based Advertising: Unintended Effects and the Tricky Business of Mitigating Adverse Outcomes

Published Online: 05 May 2021
Page range: 24 - 29

Abstract

Some algorithms may exhibit discriminatory tendencies similar to those of humans. The study presented here investigates gender bias in social media advertising in the context of STEM careers. Results suggest that advertising algorithms are not gender-biased as such, but that economic forces in the background can lead to unintended, uneven outcomes. Spillover effects across industries make reaching some consumer segments more likely than others. A gender-neutral strategy is less likely to reach women because women are more likely to react to advertising; targeting them is therefore more expensive, and economic forces unintentionally favor men. One potential solution could be running separate campaigns for men and women to target both demographic groups equally. However, anti-discrimination legislation in many countries does not allow companies to target employment ads at only one gender. So, ironically, laws designed to prevent discrimination rule out a fairly simple way to correct the bias in online targeting on Facebook and other platforms, illustrating the need for further policy guidance in this area.
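
The economic mechanism can be illustrated with a deliberately simplified simulation. The budget, the CPM values and the "cheapest impressions first" rule below are hypothetical assumptions, not figures or methods from the study; the point is only that one shared budget combined with unequal prices yields unequal reach.

```python
# Toy simulation (not the study's model): under one shared budget, higher
# auction prices for reaching women can skew delivery toward men even
# when the targeting itself is gender-neutral.
budget = 10_000.0          # hypothetical campaign budget
cpm_women = 12.0           # assumed cost per 1,000 impressions for women
cpm_men = 8.0              # assumed cost per 1,000 impressions for men

def cheapest_first_delivery(budget, cpm_a, cpm_b, share_a=0.5):
    # Spend the budget proportionally to inverse price - a crude stand-in
    # for an auction-driven optimizer that favors cheaper impressions.
    weight_a = (1 / cpm_a) * share_a
    weight_b = (1 / cpm_b) * (1 - share_a)
    spend_a = budget * weight_a / (weight_a + weight_b)
    spend_b = budget - spend_a
    return spend_a / cpm_a * 1000, spend_b / cpm_b * 1000  # impressions

women, men = cheapest_first_delivery(budget, cpm_women, cpm_men)
print(f"women reached: {women:,.0f}  men reached: {men:,.0f}")

# Two separate campaigns with equal budgets would even this out - which is
# exactly the remedy that employment-ad rules may not permit.
```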

Keywords

  • Algorithms
  • Ad Auctions
  • Discrimination
  • Gender
  • Bias
  • STEM
Open Access

Metrics Gone Wrong: What Managers Can Learn from the 2016 US Presidential Election

Published Online: 05 May 2021
Page range: 30 - 35

Abstract

In the 2016 US presidential election, the vast majority of available polls showed a comfortable lead for Hillary Clinton throughout the race, but in the end she lost. Campaign managers could have known better if they had taken a closer look at other data sources and variables that – like polls – reflect voter engagement and preferences. In the political arena, donations, media coverage, social media followership, engagement and sentiment may similarly indicate how well a candidate is doing, and most of these variables are available for free.

Validating the bigger picture with alternative data sources is not limited to politics. The latest marketing research shows that online consumer-behavior metrics can enrich, and sometimes replace, traditional funnel metrics. Trusting a single ‘silver bullet’ metric does not just lead to surprises; it can also mislead managerial decision-making. Econometric models can help disentangle a complex web of dynamic interactions and show the immediate and lagged effects of marketing or political events.
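
As a minimal sketch of what such an econometric model can look like, the snippet below fits a simple distributed-lag regression on synthetic data using statsmodels. The variables and coefficients are invented, and the specification is far simpler than anything the article itself would propose.

```python
# Minimal sketch of a distributed-lag regression on synthetic data -
# the kind of model that separates immediate from lagged effects.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
media_coverage = rng.normal(size=n)

# Synthetic outcome: responds to coverage today and, more weakly, one period ago.
support = (0.6 * media_coverage
           + 0.3 * np.roll(media_coverage, 1)
           + rng.normal(scale=0.5, size=n))

df = pd.DataFrame({
    "support": support,
    "coverage": media_coverage,
    "coverage_lag1": np.roll(media_coverage, 1),
}).iloc[1:]  # drop the first row, whose lag wraps around

X = sm.add_constant(df[["coverage", "coverage_lag1"]])
model = sm.OLS(df["support"], X).fit()
print(model.params)  # estimated immediate vs. lagged effects
```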

Keywords

  • Metrics
  • Dashboards
  • Decision-Making
  • Polls
  • Probabilistic Models
  • User-Generated Data
Open Access

Ghosts in the Dark: How to Prepare for Times of Hyper-Privacy

Published Online: 05 May 2021
Page range: 36 - 41

Abstract

Even the dark web has its bright sides, because it can be used as an unregulated testbed for technologies that will eventually appear on the surface. It is also a useful place to study consumer privacy and to glimpse what the surface world might look like under an extreme level of consumer data protection. In such a world, even our best customers might look like never-before-seen individuals until they decide to reveal themselves. If there is trust, and a worthwhile value exchange, consumers might be willing to share their data and not enact all of the hyper-privacy available to them. To seize the opportunities, companies should take stock of their customer relationships, hone their data needs, and learn what information is critical, advantageous or irrelevant in their context. They should implement initiatives that drive choice carefully, within a trusting relationship.
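
A toy version of the privacy calculus mentioned in the keywords can be written in a few lines; the function and its 0-to-1 scales are purely hypothetical and only illustrate the idea that trust and a worthwhile value exchange tip the decision to share.

```python
# Toy privacy-calculus sketch (an assumption-laden illustration, not from the
# article): a consumer reveals data only when the benefit of the value exchange
# outweighs the perceived privacy risk, discounted by trust.
def shares_data(perceived_benefit, perceived_risk, trust):
    """All inputs on a 0..1 scale; higher trust discounts the perceived risk."""
    return perceived_benefit > perceived_risk * (1 - trust)

# A returning "ghost" customer with little trust looks like a stranger...
print(shares_data(perceived_benefit=0.6, perceived_risk=0.8, trust=0.1))  # False
# ...but reveals themselves once trust and a worthwhile exchange are in place.
print(shares_data(perceived_benefit=0.6, perceived_risk=0.8, trust=0.6))  # True
```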

Keywords

  • Dark Web
  • Privacy
  • Privacy Calculus
  • Privacy Paradox
  • Personalization
Open Access

Instead of People Using Technology, Technology Is Using People

Published Online: 05 May 2021
Page range: 42 - 45

Abstract

The progress of artificial intelligence and new technologies triggers heated debates about the future of human life. While fans of the singularity say that artificial intelligence will become smarter than human beings and should take over the world, for others such a vision is a sheer nightmare. Douglas Rushkoff is clearly part of the second group and takes a passionate pro-human stance. He explains why ceding too much ground to technologies is a mistake and why humans deserve a place in the digital future. Already today, technologies have a much stronger impact on our lives than most of us would believe. For him, being human is a team sport, and he calls for a more conscious use of technologies while keeping rapport with other people. To safeguard humanness in a tech world, he advises carefully selecting the values we embed in our algorithms. Rather than serving perpetual growth, technologies ought to help people reconnect with each other and with their physical surroundings.

Open Access

The Illusion of Free Choice in the Age of Augmented Decisions

Published Online: 05 May 2021
Page range: 46 - 51

Abstract

In our augmented world, many decision situations are designed by smart technologies. Artificial intelligence helps reduce information overload, filter relevant information and limit an otherwise overwhelming abundance of choices. While such algorithms make our lives more convenient, they also fulfill various organizational objectives that users may not be aware of and that may not be in their best interest. We do not know whether algorithms truly optimize the benefits of their users or rather a company’s return on investment. They are designed not only for convenience but also to be addictive, and this opens the door to manipulation. Augmented decision-making therefore undermines freedom of choice. To limit the threats of augmented decisions and enable humans to be critical towards the outcomes of artificial intelligence–driven recommendations, everybody should develop “algorithmic literacy.” It involves a basic understanding of artificial intelligence and how algorithms work in the background. Algorithmic literacy also requires that users understand the role and value of the personal data they sacrifice in exchange for decision augmentation.
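
A stylized example of this tension between user benefit and company objectives: the hypothetical ranking rule below blends predicted relevance for the user with the company’s margin, and a single hidden weight decides whose interest dominates. It illustrates the argument only and is not any platform’s actual algorithm.

```python
# Hypothetical ranking rule: a recommender can blend what is relevant to the
# user with what is profitable for the company. The weight `alpha` decides
# whose objective dominates - and users rarely see it.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    relevance: float   # predicted usefulness to the user (0..1)
    margin: float      # contribution margin for the company (0..1)

def rank(items, alpha):
    """alpha = 1.0 ranks purely by user relevance; alpha = 0.0 purely by margin."""
    return sorted(items,
                  key=lambda i: alpha * i.relevance + (1 - alpha) * i.margin,
                  reverse=True)

catalog = [Item("A", 0.9, 0.2), Item("B", 0.6, 0.9), Item("C", 0.4, 0.5)]

print([i.name for i in rank(catalog, alpha=1.0)])   # user-first ordering
print([i.name for i in rank(catalog, alpha=0.3)])   # margin-heavy ordering
```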

Keywords

  • Augmented Intelligence
  • Decision-Making
  • AI
  • Algorithms
  • Free Choice
Open Access

Young but not Naive: Leaders of Tomorrow Expect Limits to Digital Freedom to Preserve Freedom

Published Online: 05 May 2021
Page range: 52 - 57

Abstract

In a recent survey, about 900 “Leaders of Tomorrow” from more than 90 countries shared their opinions about “the impact of new technologies on human freedom of choice.” They take a very clear stance against unlimited freedom of speech on the Internet. The majority thinks that platforms which until now have often taken a “hands-off” approach, rejecting content filtering by claiming they are “just the messenger,” should be obliged to prevent and censor hate speech and fake news on the Internet. Platforms are expected to work hand in hand with state institutions to better prevent online manipulation and abuse and to protect personal data. The Leaders of Tomorrow also advocate that personal data should be controlled by its owners when it is used by online platforms. Applications that lack transparency and cannot be influenced by the customer meet with the strongest objections.

Keywords

  • Freedom of Choice
  • Internet
  • Social Media
  • Algorithms
  • Survey
  • Leaders of Tomorrow