
Eva Ascarza is the Jakurski Family Associate Professor of Business Administration in the Marketing Unit. She is the co-founder of the Customer Intelligence Lab at the D^3 Institute at Harvard Business School. She teaches the Marketing core in the MBA required curriculum and an elective course titled Managing Customers for Growth.
As a marketing modeler, Professor Ascarza uses tools from statistics, economics, and machine learning to answer relevant marketing questions. Her main research areas are customer management (with special attention to the problem of customer retention), personalization and targeting, marketing AI, and algorithmic decision making. She uses field experimentation (e.g., A/B testing) as well as econometric modeling and machine learning tools not only to understand and predict patterns of behavior, but also to optimize the impact of firms’ interventions.
Her research has appeared in leading journals including Marketing Science, the Journal of Marketing Research, and the Proceedings of the National Academy of Sciences. She received the 2014 Frank M. Bass Award, given to the best marketing paper derived from a Ph.D. thesis published in an INFORMS-sponsored journal. Her research was a Paul E. Green Award finalist (2016 and 2017) and winner (2018); the award recognizes the article in the Journal of Marketing Research with the greatest potential to contribute significantly to the practice of marketing research. Her research was also a Weitz-Winer-O'Dell Award finalist (2021) and winner (2023); that award recognizes research that has made the most significant long-term contribution to marketing theory, methodology, and/or practice. She was named a Marketing Science Institute (MSI) Young Scholar in 2017, received the Erin Anderson Award for an Emerging Female Marketing Scholar and Mentor in 2019, and was named an MSI Scholar in 2020. She serves on the editorial review boards of several top marketing journals, including Marketing Science, the Journal of Marketing Research, the Journal of Marketing, and Quantitative Marketing and Economics.
Professor Ascarza earned a Ph.D. in marketing from London Business School, a B.S. in mathematics from the Universidad de Zaragoza (Spain), and an M.S. in economics and finance from the Universidad de Navarra (Spain). Prior to joining HBS, she was an associate professor in the marketing department at Columbia Business School.
- Featured Work

An inherent risk of algorithmic personalization is the disproportionate targeting of individuals from certain groups (defined by demographic characteristics such as gender or race), even when the decision maker does not intend to discriminate based on those “protected” attributes. This unintended discrimination is often caused by underlying correlations in the data between protected attributes and other observed characteristics used by the algorithm (or machine learning (ML) tool) to create predictions and target individuals optimally. Because these correlations are hidden in high-dimensional data, removing protected attributes from the database does not solve the discrimination problem; instead, removing those attributes often exacerbates the problem by making it undetectable and, in some cases, even increases the bias generated by the algorithm.
We propose BEAT (Bias-Eliminating Adapted Trees) to address these issues. This approach allows decision makers to target individuals based on differences in their predicted behavior—hence capturing value from personalization—while ensuring a balanced allocation of resources across individuals, guaranteeing both group and individual fairness. Essentially, the method extracts only the heterogeneity in the data that is unrelated to protected attributes. To do so, we build on the Generalized Random Forest (GRF) framework (S. Athey et al., Ann. Stat. 47, 1148–1178 (2019)) and develop a targeting allocation that is “balanced” with respect to protected attributes. We validate BEAT using simulations and an online experiment with N=3,146 participants. This approach can be applied to any type of allocation decision that is based on prediction algorithms, such as medical treatments, hiring decisions, product recommendations, or dynamic pricing.
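BEAT itself modifies the split criterion inside generalized random forests, and the published algorithm is not reproduced here. As a minimal, hypothetical sketch of the underlying idea (keeping only the heterogeneity that is unrelated to protected attributes), the snippet below residualizes simulated features on a protected attribute before fitting an off-the-shelf forest with scikit-learn; all data and variable names are invented for illustration.

```python
# Illustrative sketch only (NOT the authors' BEAT implementation):
# one simple way to keep targeting heterogeneity that is unrelated to a
# protected attribute is to residualize each feature on that attribute
# before fitting the targeting model, then check that the resulting
# scores carry little information about the attribute.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 5_000
protected = rng.integers(0, 2, size=(n, 1))           # hypothetical binary protected attribute
features = rng.normal(size=(n, 5)) + 0.8 * protected  # features correlated with it
response = features[:, 0] + rng.normal(scale=0.5, size=n)  # outcome used for targeting

# Step 1: strip the protected-attribute signal out of every feature.
residualized = features - LinearRegression().fit(protected, features).predict(protected)

# Step 2: fit the targeting model on the residualized features only.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(residualized, response)
scores = model.predict(residualized)

# Step 3: balance check — scores should be nearly uncorrelated with the protected attribute.
print("corr(score, protected):", np.corrcoef(scores, protected.ravel())[0, 1])
```

In the paper, this balance is enforced inside the tree-building step itself rather than by pre-processing the features; the residualization above is only a rough proxy for that behavior.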
The success of customer relationship management (CRM) programs ultimately depends on the firm’s ability to identify and leverage differences across customers—a difficult task when firms attempt to manage new customers, for whom only the first purchase has been observed. The lack of repeated observations for these customers poses a structural challenge for firms trying to infer unobserved differences across them. This is what the authors call the “cold start” problem of customer relationship management, whereby companies have difficulty leveraging existing data when they attempt to make inferences about customers at the beginning of their relationship. The authors propose a solution to the cold start problem by developing a probabilistic machine learning framework that leverages the information collected at the moment of acquisition. The key feature of the model is that, using deep exponential families, it flexibly captures the latent dimensions that govern both the behaviors observed at acquisition and future propensities to buy and to respond to marketing actions. The model can be integrated with a variety of demand specifications and is flexible enough to capture a wide range of heterogeneity structures. The authors validate their approach in a retail context and empirically demonstrate the model’s ability to identify high-value customers as well as those most sensitive to marketing actions right after their first purchase.
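The paper's deep exponential family machinery is not reproduced here; the short simulation below is only a hedged illustration of the structural assumption behind the cold-start solution: a shared latent customer trait drives both the behaviors observed at acquisition and later purchase propensity, so a model fit on acquisition signals alone already carries predictive information. All distributions and names are invented for illustration.

```python
# Hypothetical simulation sketch (not the authors' model): a shared latent
# trait generates both acquisition-time signals and future purchases, which
# is why acquisition data alone can inform predictions for new customers.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
n_customers, n_acq_signals = 2_000, 4

latent = rng.normal(size=(n_customers, 2))                 # unobserved customer traits
acq_loadings = rng.normal(size=(2, n_acq_signals))
acquisition = latent @ acq_loadings + rng.normal(scale=0.3, size=(n_customers, n_acq_signals))

# Future purchasing depends on the same latent traits.
propensity = 1 / (1 + np.exp(-(latent[:, 0] - 0.5 * latent[:, 1])))
future_purchases = rng.binomial(5, propensity)

# Because acquisition signals and future behavior share latent drivers,
# a simple count model fit on acquisition data already predicts future value.
fit = PoissonRegressor().fit(acquisition, future_purchases)
pred = fit.predict(acquisition)
print("corr(predicted, actual purchases):", np.corrcoef(pred, future_purchases)[0, 1])
```

The authors' framework replaces the linear factor structure above with deep exponential families and integrates it with a full demand specification, but the shared-latent-trait logic is the same.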
Fewer than 40% of companies that invest in AI see gains from it, usually because of one or more of these errors: (1) They don’t ask the right question, and end up directing AI to solve the wrong problem. (2) They don’t recognize the differences between the value of being right and the costs of being wrong, and assume all prediction mistakes are equivalent. (3) They don’t leverage AI’s ability to make far more frequent and granular decisions, and keep following their old practices. If marketers and data science teams communicate better and take steps to avoid these pitfalls, they’ll get much higher returns on their AI efforts.
- Journal Articles
- Ascarza, Eva, and Ayelet Israeli. "Eliminating Unintended Bias in Personalized Policies Using Bias-Eliminating Adapted Trees (BEAT)." Proceedings of the National Academy of Sciences 119, no. 11 (March 8, 2022): e2115126119.
- Padilla, Nicolas, and Eva Ascarza. "Overcoming the Cold Start Problem of CRM Using a Probabilistic Machine Learning Approach." Journal of Marketing Research (JMR) 58, no. 5 (October 2021): 981–1006.
- Ascarza, Eva. "Research: When A/B Testing Doesn't Tell You the Whole Story." Harvard Business Review Digital Articles (June 23, 2021).
- Ascarza, Eva, Michael Ross, and Bruce G.S. Hardie. "Why You Aren't Getting More from Your Marketing AI." Harvard Business Review 99, no. 4 (July–August 2021): 48–54.
- Ascarza, Eva, Scott A. Neslin, Oded Netzer, Zachery Anderson, Peter S. Fader, Sunil Gupta, Bruce Hardie, Aurelie Lemmens, Barak Libai, David T. Neal, Foster Provost, and Rom Schrift. "In Pursuit of Enhanced Customer Retention Management: Review, Key Issues, and Future Directions." Special Issue on 2016 Choice Symposium. Customer Needs and Solutions 5, nos. 1-2 (March 2018): 65–81.
- Ascarza, Eva. "Retention Futility: Targeting High-Risk Customers Might Be Ineffective." Journal of Marketing Research (JMR) 55, no. 1 (February 2018): 80–98.
- Ascarza, Eva, Oded Netzer, and Bruce G.S. Hardie. "Some Customers Would Rather Leave Without Saying Goodbye." Marketing Science 37, no. 1 (January–February 2018): 54–77.
- Ascarza, Eva, Peter Ebbes, Oded Netzer, and Matthew Danielson. "Beyond the Target Customer: Social Effects in CRM Campaigns." Journal of Marketing Research (JMR) 54, no. 3 (June 2017): 347–363.
- Ascarza, Eva, Raghuram Iyengar, and Martin Schleicher. "The Perils of Proactive Churn Prevention Using Plan Recommendations: Evidence from a Field Experiment." Journal of Marketing Research (JMR) 53, no. 1 (February 2016): 46–60.
- Ascarza, Eva, and Bruce G.S. Hardie. "A Joint Model of Usage and Churn in Contractual Settings." Marketing Science 32, no. 4 (July–August 2013): 570–590.
- Ascarza, Eva, Anja Lambrecht, and Naufel Vilcassim. "When Talk Is 'Free': The Effect of Tariff Structure on Usage Under Two- and Three-Part Tariffs." Journal of Marketing Research (JMR) 49, no. 6 (December 2012): 882–900.
- Working Papers
- Huang, Ta-Wei, and Eva Ascarza. "Doing More with Less: Overcoming Ineffective Long-Term Targeting Using Short-Term Signals." Harvard Business School Working Paper, No. 23-023, October 2022. (Revised April 2023.)
- Dew, Ryan, Eva Ascarza, Oded Netzer, and Nachum Sicherman. "Detecting Routines in Ridesharing: Implications for Customer Management." Harvard Business School Working Paper, No. 23-060, March 2023.
- Ascarza, Eva, Oded Netzer, and Julian Runge. "The Twofold Effect of Customer Retention in Freemium Settings." Harvard Business School Working Paper, No. 21-062, November 2020.
- Padilla, Nicolas, Eva Ascarza, and Oded Netzer. "The Customer Journey as a Source of Information." Working Paper, June 2019.
- Cases and Teaching Materials
- Ascarza, Eva, Ayelet Israeli, and Celine Chammas. "Retail Media Networks." Harvard Business School Background Note 523-029, August 2022.
- Ascarza, Eva. "Managing Customers in the Digital Era." Harvard Business School Module Note 522-066, March 2022. (Revised March 2022.)
- Ascarza, Eva. "Allianz Customer Centricity: Is Simplicity the Way Forward?" Harvard Business School PowerPoint Supplement 522-086, March 2022.
- Ascarza, Eva. "Allianz Customer Centricity: Is Simplicity the Way Forward?" Harvard Business School Spreadsheet Supplement 522-713, March 2022.
- Ascarza, Eva. "Allianz Customer Centricity: Is Simplicity the Way Forward?" Harvard Business School Teaching Note 522-060, March 2022.
- Ascarza, Eva. "Interview with Julio Bruno (Time Out)." Harvard Business School Multimedia/Video Supplement 522-707, September 2021.
- Ascarza, Eva. "Time Out: The Evolution from Media to Markets." Harvard Business School Teaching Note 522-036, August 2021.
- Ascarza, Eva. "Melissa Wood Health: How to Win in the Creator Economy." Harvard Business School Teaching Note 522-024, August 2021. (Revised February 2022.)
- Ascarza, Eva, and Ayelet Israeli. "Amazon Shopper Panel: Paying Customers for Their Data." Harvard Business School Teaching Note 522-011, July 2021. (Revised January 2022.)
- Ascarza, Eva, and Emilie Billaud. "Allianz Customer Centricity: Is Simplicity the Way Forward?" Harvard Business School Case 522-008, July 2021. (Revised October 2021.)
- Ascarza, Eva. "Melissa Wood Health: How to Win in the Creator Economy." Harvard Business School Case 521-086, May 2021. (Revised August 2021.)
- Ascarza, Eva, and Ayelet Israeli. "Amazon Shopper Panel: Paying Customers for Their Data." Harvard Business School Case 521-058, January 2021. (Revised May 2021.)
- Barasz, Kate, and Eva Ascarza. "Time Out: The Evolution from Media to Markets." Harvard Business School Case 520-128, June 2020. (Revised August 2021.)
- Ascarza, Eva, and Ayelet Israeli. "Spreadsheet Supplement to Artea Teaching Note." Harvard Business School Spreadsheet Supplement 521-705, September 2020. (Revised July 2022.)
- Ascarza, Eva, and Ayelet Israeli. "Artea (A), (B), (C), and (D): Designing Targeting Strategies." Harvard Business School Teaching Note 521-041, September 2020. (Revised July 2022.)
- Ascarza, Eva, and Ayelet Israeli. "Artea (D): Discrimination through Algorithmic Bias in Targeting." Harvard Business School Exercise 521-043, September 2020. (Revised July 2022.)
- Ascarza, Eva, and Ayelet Israeli. "Artea (C): Potential Discrimination through Algorithmic Targeting." Harvard Business School Exercise 521-037, September 2020. (Revised July 2022.)
- Ascarza, Eva, and Ayelet Israeli. "Artea (B): Including Customer-level Demographic Data." Harvard Business School Exercise 521-022, September 2020. (Revised July 2022.)
- Ascarza, Eva, and Ayelet Israeli. Spreadsheet Supplement to "Artea: Designing Targeting Strategies." Harvard Business School Spreadsheet Supplement 521-703, September 2020. (Revised July 2022.)
- Israeli, Ayelet, and Eva Ascarza. "Algorithmic Bias in Marketing." Harvard Business School Teaching Note 521-035, September 2020. (Revised July 2022.)
- Israeli, Ayelet, and Eva Ascarza. "Algorithmic Bias in Marketing." Harvard Business School Technical Note 521-020, September 2020. (Revised July 2022.)
- Ascarza, Eva, and Ayelet Israeli. "Artea: Designing Targeting Strategies." Harvard Business School Exercise 521-021, September 2020. (Revised April 2021.)
- Ascarza, Eva, and Keith Wilcox. "Kate Spade New York: Will Expansion Deepen or Dilute the Brand?" Teaching Note, 2015.
- Ascarza, Eva, Tomomichi Amano, and Sunil Gupta. "Othellonia: Growing a Mobile Game." Harvard Business School Teaching Note 520-041, November 2019. (Revised January 2022.)
- Ascarza, Eva, Tomomichi Amano, and Sunil Gupta. "Othellonia: Growing a Mobile Game." Harvard Business School Case 520-016, September 2019. (Revised June 2020.)
- Ascarza, Eva, and Keith Wilcox. "EPILOGUE: Kate Spade New York: Will Expansion Deepen or Dilute the Brand?" Columbia CaseWorks Series. 2015.
- Wilcox, Keith, and Eva Ascarza. "Kate Spade New York: Will Expansion Deepen or Dilute the Brand?" Columbia CaseWorks Series. 2015.
- Research Summary
Professor Ascarza’s research focuses on giving researchers and marketers a better understanding of how to manage customer retention so as to reduce churn and increase firms’ profitability. She addresses these questions by building empirical models of customer relationship management (CRM) centered on understanding and managing customer retention (i.e., reducing customer churn). While previous CRM literature has predominantly relied on secondary data, she investigates most of these questions through the lens of causal inference, for example by running field experiments. Some of her findings are counterintuitive at first glance but compelling once she pins down the underlying mechanisms. For example, some of her recent work challenges the common practice of treating ‘risk of churning’ as the key metric for proactive churn management. Combining two field experiments in different industries, Professor Ascarza shows that, when the goal is to select customers for proactive or preventive retention efforts, identifying customers with a high risk of churning can miss the point: the customers at the highest risk of churning and those who should be targeted are not necessarily the same. In another field study, she investigates the role of social influence in retention campaigns, examining how the (telecommunications) network shapes usage and retention decisions among customers who did not receive a marketing campaign but were connected to those who were targeted. She finds a social multiplier of 1.28; that is, the effect of the campaign on first-degree connections of targeted customers is an additional 28% of the effect on the targeted customers themselves.
- Awards & Honors
Winner of the 2023 Weitz-Winer-O'Dell Award for “Retention Futility: Targeting High-Risk Customers Might Be Ineffective” (Journal of Marketing Research (JMR), 2018).
Selected as an INFORMS Doctoral Consortium Fellow at the University of Miami in 2023.
Selected as an AMA-Sheth Foundation Doctoral Consortium Faculty Fellow by the American Marketing Association in 2015, 2018, 2019, 2020, 2022, and 2023.
Finalist for the 2021 Weitz-Winer-O'Dell Award for “The Perils of Proactive Churn Prevention Using Plan Recommendations: Evidence from a Field Experiment” (Journal of Marketing Research (JMR), 2016) with Raghuram Iyengar and Martin Schleicher.
Selected as a Marketing Science Institute Scholar in 2020.
Winner of the 2019 Erin Anderson Award for Emerging Female Marketing Scholar and Mentor from the American Marketing Association.
Finalist for the 2019 MSI Robert D. Buzzell Award from the Marketing Science Institute for “In Pursuit of Enhanced Customer Retention Management: Review, Key Issues, and Future Directions” (Customer Needs and Solutions, 2018) with Scott A. Neslin, Oded Netzer, Zachery Anderson, Peter S. Fader, Sunil Gupta, Bruce G.S. Hardie, Aurélie Lemmens, Barak Libai, David Neal, Foster Provost, and Rom Schrift.
Winner of the 2018 Paul E. Green Award from the Journal of Marketing Research for "Retention Futility: Targeting High-Risk Customers Might Be Ineffective" (February 2018).
Finalist for the 2017 Paul E. Green Award from the Journal of Marketing Research for “Beyond the Target Customer: Social Effects in CRM campaigns” (June 2017) with Peter Ebbes, Oded Netzer and Matthew Danielson.
Selected as a Marketing Science Institute Young Scholar in 2017.
Finalist for the 2016 Paul E. Green Award from the Journal of Marketing Research for “The Perils of Proactive Churn Prevention using Plan Recommendations: Evidence from a Field Experiment” (February 2016) with Raghuram Iyengar and Martin Schleicher.
Winner of the 2014 Frank M. Bass Dissertation Paper Award for “A Joint Model of Usage and Churn in Contractual Settings.”
Selected as an INFORMS Doctoral Consortium Fellow at the University of British Columbia in 2008.
Selected as an AMA-Sheth Foundation Doctoral Consortium Fellow by the American Marketing Association in 2007.