TY - JOUR
AB - Trust is a crucial guide in interpersonal interactions, helping people to navigate through social decision-making problems and cooperate with others. In human–computer interaction (HCI), trustworthy computer agents foster appropriate trust by supporting a match between their perceived and actual characteristics. As computers are increasingly endowed with capabilities for cooperation and intelligent problem-solving, it is critical to ask under which conditions people discern and distinguish trustworthy from untrustworthy technology. We present an interactive cooperation game framework allowing us to capture human social attributions that indicate trust in continued and interdependent human–agent cooperation. Within this framework, we experimentally examine the impact of two key dimensions of social cognition, warmth and competence, as antecedents of behavioral trust and self-reported trustworthiness attributions of intelligent computers. Our findings suggest that, first, people infer warmth attributions from unselfish vs. selfish behavior and competence attributions from competent vs. incompetent problem-solving. Second, warmth statistically mediates the relation between unselfishness and behavioral trust as well as between unselfishness and perceived trustworthiness. We discuss the possible role of human social cognition for human–computer trust.
DA - 2018
DO - 10.3389/fdigh.2018.00014
LA - eng
PY - 2018
T2 - Frontiers in Digital Humanities. Human-Media Interaction
TI - A social cognition perspective on human–computer trust. The effect of perceived warmth and competence on trust in decision-making with computers
UR - https://nbn-resolving.org/urn:nbn:de:0070-pub-29201642
Y2 - 2024-11-22T05:36:57
ER -