School-age children are more skeptical of inaccurate robots than adults

Cognition. 2024 Aug;249:105814. doi: 10.1016/j.cognition.2024.105814. Epub 2024 May 18.

Abstract

We expect children to learn new words, skills, and ideas from various technologies. When learning from humans, children prefer people who are reliable and trustworthy, yet they also forgive people's occasional mistakes. Are the dynamics of children learning from technologies, which can also be unreliable, similar to those of learning from humans? We tackle this question by focusing on early childhood, an age at which children are expected to master foundational academic skills. In this project, 168 4- to 7-year-old children (Study 1) and 168 adults (Study 2) played a word-guessing game with either a human or a robot. The partner first gave a sequence of correct answers, then a sequence of wrong answers, reacting after each error. Reactions varied by condition: the partner framed each error as an accident, as an accident accompanied by an apology, or as intentionally unhelpful. We found that older children were less trusting than both younger children and adults and became even more skeptical after errors. Trust decreased most rapidly when errors were intentional, but only children (and especially older children) outright rejected help from intentionally unhelpful partners. As an exception to this general trend, older children maintained their trust for longer when a robot (but not a human) apologized for its mistakes. Our work suggests that educational technology design cannot be one size fits all but rather must account for developmental changes in children's learning goals.

Keywords: Cognitive development; Educational robotics; Human-robot interaction; Selective trust.

MeSH terms

  • Adult
  • Age Factors
  • Child
  • Child Development / physiology
  • Child, Preschool
  • Female
  • Humans
  • Learning / physiology
  • Male
  • Robotics*
  • Trust*
  • Young Adult