Our HRI research goal is to develop capabilities that allow humans to supervise and task large, heterogeneous robot teams and to interact directly with robotic teammates.

Remote robot deployment significantly complicates the presentation of meaningful, timely, and relevant information to human operators and supervisors. As the number and types of robots increase, humans’ ability to understand robot-provided information and to react appropriately decreases disproportionately. Our HRI research focuses on developing a system of interfaces that provides advanced interaction and visualization capabilities to address these complexities, while supporting human decision making and situation awareness.

Our team has developed a number of visualization capabilities that permit a single operator (e.g., the unmanned vehicle specialist) to effectively supervise and command large robot teams. Robot autonomy is improving; however, autonomy can introduce an “out-of-the-loop syndrome” that further complicates the human’s responsibilities. Thus, new methods of tasking robots and visualizing robot-provided information are required.

We are currently developing a system of interfaces that integrates the unmanned vehicle specialist interface with interfaces supporting a human command hierarchy. The higher levels of the command hierarchy incorporate information provided by robots and responders to support overall response decision making; those decisions are communicated to the lower hierarchy levels and result in the allocation of tasks to heterogeneous robot teams. The cognitive and work analyses and the CIFA results have been fundamental to designing and developing this system of interfaces.
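To make the information flow concrete, the minimal sketch below models a two-level command hierarchy: an upper-level response decision is decomposed into tasks, which are greedily allocated to a heterogeneous robot team at the lower level. All names here (Robot, Task, allocate_tasks, the capability strings) are illustrative assumptions for exposition, not the project's actual software.

```python
# Hypothetical sketch of hierarchical task allocation; names and
# structure are illustrative, not the project's actual interfaces.
from dataclasses import dataclass


@dataclass
class Robot:
    name: str
    capabilities: set  # e.g., {"aerial_survey", "chem_sensing"}
    tasked: bool = False


@dataclass
class Task:
    description: str
    required_capability: str


def allocate_tasks(tasks, robots):
    """Greedily assign each task to an idle robot that has the
    required capability (the lower command-hierarchy level)."""
    assignments = {}
    for task in tasks:
        for robot in robots:
            if not robot.tasked and task.required_capability in robot.capabilities:
                robot.tasked = True
                assignments[task.description] = robot.name
                break
    return assignments


# Upper hierarchy level: a response decision decomposed into tasks.
decision_tasks = [
    Task("map the incident perimeter", "aerial_survey"),
    Task("sample air near the source", "chem_sensing"),
]

# Heterogeneous robot team at the lower level.
team = [
    Robot("uav-1", {"aerial_survey"}),
    Robot("ugv-1", {"chem_sensing", "manipulation"}),
]

print(allocate_tasks(decision_tasks, team))
# -> {'map the incident perimeter': 'uav-1',
#     'sample air near the source': 'ugv-1'}
```

In a deployed system the allocation step would of course weigh many more factors (robot state, location, operator workload); the greedy capability match is only a stand-in for that decision process.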

Our research also focuses on human-robot team partnerships and on understanding how humans interact directly with robotic partners. Specifically, we have investigated how human performance differs when teaming with another human versus a robot. This research is also developing human performance metrics for real-world use, where human-robot teams must complete tasks outside of constrained laboratory environments.

Current Projects

System of human-robot interfaces
Multi-modal interaction with robot teams
General Visualization Abstraction Algorithm
Human-robot teams informed by human performance moderator functions

Prior Projects

Compass Visualizations for Human-Robotic Interaction
A Human Eye Like Perspective for Remote Vision
Multiple Robot Interfaces Scalability and the Halo Concept
Picture-in-Picture Interface
Common Interface Software Architecture
Visualization of Multiple Robots During Team Activities
Task Lists for Human-Multiple Robot Interaction
PDA-based teleoperation
Robotic Tasks for CBRNE Incident Response
CBRNE User Levels
Decision Information Abstracted to a Relevant Encapsulation concept
Thinking about and interacting with living and mechanical agents
Integration of Image Stitching and HRI Visualizations
Visualization of Object Detection Information
Effectively Co-located GIS Map Items