Robots are increasingly being considered for deployment in highly tense civilian encounters to minimize person-to-person contact and reduce danger to peacekeeping personnel. Trust, along with a robot's physical qualities and cultural considerations, is an essential factor in the effectiveness of these robotic peacekeepers. New research to be presented at the HFES 2015 Annual Meeting in Los Angeles in October examines the importance of social cues in evaluating the role of trust in human-robot interaction.
Joachim Meyer, coauthor of “Manners Matter: Trust in Robotic Peacekeepers” and a professor at Tel Aviv University’s Department of Industrial Engineering, notes that “interactions between machines and people should follow rules of behavior similar to the rules used in human-to-human interaction. Robots are not seen as mindless technology; rather, they are considered agents with intentions.”