Presentation
43. How Robots Might Trust Humans in Mixed Motive Situations
Session
Poster Session 2
Description
Ideally, team members ought to work towards goals that benefit the team. There are, however, situations in which individual goals become significant, e.g., the individual might need to respond to a threat. This choice (of individual rather than team goals) could affect the ‘trust’ that other team members have in the individual. Much of the research on human-robot trust sees ‘trust’ as a disposition or attitude held by the human and directed toward the robot. We assume that ‘trust’ is context-dependent and will vary over the course of a mission. In order to consider trust in ways that are applicable to both humans and robots, we define trust using three dimensions: capability, predictability, and integrity. In this paper, we operationalise ‘integrity’ in terms of individual or cooperative goals and how these change during missions. A broader aim of our research is to address the moral and ethical dimensions of integrity in trustworthy autonomy.
Event Type
Poster
Time
Thursday, September 12th, 5:30pm - 6:30pm MST
Location
McArthur Ballroom