Trust in Humans, Robots, and Cyborgs: Treated the Same, but Experienced Differently
Posted: 10 Nov 2018
Date Written: October 17, 2018
Human-robot and human-cyborg interactions requiring trust are increasingly common in the marketplace, the workplace, on the road, and in the home, yet little is known about human willingness to make trust-based investments with non-human agents acting alone (i.e., “robots”) or bound to the welfare of non-deciding humans (i.e., “cyborgs”). Even less is known about the emotional reactions these interactions elicit. While other-regarding models of social preferences predict more trust-based investment in interactions that can benefit others, we find no difference in investment across conditions — only differences in emotional reactions to the outcomes of the trust-based interactions. The Recalibrational model of emotions predicts whether particular emotions are reported following trust-game interactions with people. Here we extend those emotion predictions to analogous trust-based interactions with robots and cyborgs that violate certain expectations of human-human relationships. Using a between-subjects design, we compare investment and emotions in human-human trust games to investment and emotions in nearly identical trust games (a.k.a. “risk games”) that humans play with a robot or with a cyborg. Across conditions we find different emotional reactions but fail to find differences in investment behavior. These results highlight a unique emotional facet of human interaction while providing support for the Recalibrational model of emotions.
Keywords: Trust, Robots, Cyborgs, Recalibrational Emotion, Experiment
JEL Classification: C72, C90, D63, D64, L51