This research examines whether consumers ascribe racial stereotypes to artificially intelligent (AI; nonhuman) agents and whether these stereotypes affect ratings of satisfaction, perceptions of competence and humanness, and outcomes of negotiated transactions. Drawing on the stereotype content model, expectation violation theory, and the humanness-value-loyalty framework, we investigate how consumers apply racial stereotype judgments in interactions with AI agents in a controlled negotiation experiment. Results reveal that although Black people, in general, are more likely than Asian or White people to be stereotyped as less competent, the opposite is true for Black AI bots. Furthermore, perceptions of the competence and humanness of Black AI bots exceed those of Asian and White AI bots, leading to higher ratings of overall satisfaction and some evidence of more favorable negotiation behaviors. Implications for AI applications in marketing are discussed.
Davis, Nicole, Nils Olsen, Vanessa G. Perry, Marcus M. Stewart, and Tiffany B. White (2023), "I'm Only Human? The Role of Racial Stereotypes, Humanness and Satisfaction in Transactions with Anthropomorphic Sales Bot Agents," Journal of the Association for Consumer Research, 8(1), 47-58.