PDA-Based Teleoperation Interface for a Mobile Robot

Faculty: Dr. Julie A. Adams

Student: Hande Kaymaz-Keskinpala

Overview:

Teleoperation of a mobile robot is a basic operation that requires human-robot interaction. This project employs a Personal Digital Assistant (PDA) interface to teleoperate a mobile robot. The interface provides touch-based interaction and requires no stylus.

The system provides a common touch-based interface that employs large transparent buttons.  The buttons are large enough to accommodate fingers even when the user is wearing thick, heavy gloves.  Since the buttons cover a portion of the screen, their transparency allows the user to view the information displayed behind them.

The common teleoperation interface allows the user to command the robot to move forward, backward, left, or right, or to combine forward or backward motion with turning.  The interface also provides a large stop button.  The intention is that the PDA would be attached to the user's forearm so that the user's hands remain free.  Therefore, the interface screens are rotated 90° counter-clockwise.
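As a rough illustration of the button-to-motion mapping described above, the sketch below translates a set of pressed buttons into a linear and angular velocity command. The function name, button labels, and velocity magnitudes are all assumptions made for illustration; this summary does not describe the actual PDA software's command interface.

```python
# Hypothetical sketch of mapping touch-button presses to velocity commands.
# Button labels and velocity magnitudes are illustrative assumptions, not
# values taken from the actual system.

FORWARD_SPEED = 0.3   # m/s (assumed)
TURN_RATE = 0.5       # rad/s (assumed)

def button_to_velocity(pressed):
    """Map a set of pressed buttons to a (linear, angular) velocity pair."""
    if "stop" in pressed:              # the large stop button overrides all others
        return (0.0, 0.0)
    linear = angular = 0.0
    if "forward" in pressed:
        linear += FORWARD_SPEED
    if "backward" in pressed:
        linear -= FORWARD_SPEED
    if "left" in pressed:
        angular += TURN_RATE           # positive = counter-clockwise turn
    if "right" in pressed:
        angular -= TURN_RATE
    return (linear, angular)
```

Pressing "forward" and "left" together yields a combined forward-and-turning command, matching the combinations the interface supports.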

This particular system is composed of three interfaces that provide different levels of information for varying conditions.  Such a design is required due to the limited screen real-estate provided by PDA devices. 

Objective:

The objective of this work was to develop a very simple, usable, small, and portable teleoperation device for a mobile robot.  A user evaluation was performed to determine which of the three interface screens was the most usable and required the least amount of workload. 

Project Description:

Three touch-based PDA screens were developed to teleoperate a mobile robot.  The robot used in this work is Scooter, the center's ATRV-Jr. robot, shown in Figure 1. Details regarding the screen design considerations and implementations are provided in [1, 3].

Figure 1. Scooter, IRL’s modified ATRV-Jr. by iRobot.

The Vision-Only screen provides the operator with the view from the robot’s forward-facing camera and is useful when the robot and its environment cannot be directly viewed. The screen is shown in Figure 2.

Figure 2. The Vision-Only screen.

The Sensor-Only screen includes only the ultrasonic and laser range finder information.  The hypothesis behind this screen was that it would allow the user to navigate small environments containing a large number of obstacles, since it provides feedback about the area surrounding the robot. The screen displays the sensory information relative to the robot's main body.  Figure 3 provides an example screen shot.

Figure 3. The Sensor-Only screen.
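Displaying range readings relative to the robot's main body, as this screen does, amounts to a polar-to-screen coordinate conversion. The sketch below shows a hypothetical version of that conversion; the function and its parameters are assumptions for illustration, not the actual drawing code.

```python
# Hypothetical polar-to-screen conversion for drawing a range reading
# relative to a robot icon at (cx, cy); all names are illustrative.
import math

def range_to_screen(bearing_deg, range_m, cx, cy, scale):
    """Convert a range reading (bearing in degrees relative to the robot's
    heading, distance in meters) to pixel coordinates, with the robot drawn
    at (cx, cy), screen y increasing downward, and `scale` pixels per meter."""
    theta = math.radians(bearing_deg)
    x = cx + scale * range_m * math.sin(theta)   # right of the robot is +x
    y = cy - scale * range_m * math.cos(theta)   # ahead of the robot is up
    return (round(x), round(y))
```

An obstacle one meter straight ahead would then be drawn directly above the robot icon, giving the user the surrounding-area feedback the screen is designed to provide.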

The Vision with Sensory Overlay screen overlays the sonar and laser range finder information onto the robot's camera image, thus providing simultaneous viewing of all available sensory data.  Figure 4 provides an example screen shot.

Figure 4. The Vision with Sensory Overlay screen. 
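One simple way to place range readings over a camera image, consistent with the overlay described above, is to map each reading's bearing to an image column across the camera's horizontal field of view. The sketch below assumes a linear angle-to-pixel mapping; the actual overlay method used in the system is not described in this summary.

```python
# Hypothetical bearing-to-column mapping for overlaying range data on a
# camera image; a linear angle-to-pixel model is an assumption here.

def bearing_to_column(bearing_deg, img_width, fov_deg):
    """Map a reading's bearing (0 = straight ahead, positive = left) to an
    image column, assuming a linear mapping across the camera's horizontal
    field of view.  Returns None for readings outside the camera's view."""
    half = fov_deg / 2.0
    if abs(bearing_deg) > half:
        return None                       # not visible in the camera image
    # -half maps to column 0, +half maps to the last column
    frac = (bearing_deg + half) / fov_deg
    return round(frac * (img_width - 1))
```

Readings outside the camera's field of view have no column to land in, which is one reason the Sensor-Only screen remains useful: it can show obstacles beside and behind the robot that the overlay cannot.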

A user evaluation was conducted to compare the three screens. Thirty novice participants from the Vanderbilt University community took part in the study.  Generally, participants found that the Vision-Only screen was the easiest to use and required the least workload when teleoperating the robot from a remote location.  They also rated the Vision with Sensory Overlay screen better than the Sensor-Only screen when both were used from a remote location. The results changed when participants were permitted to use the Sensor-Only screen while in the same environment as the robot.  In this case, the Sensor-Only screen was rated easier to use and required less workload than the other two screens. The results related to perceived mental workload are presented in [2], and complete details of the evaluation and results are provided in [3].


Acknowledgements:

This work was supported by an E-Teams sponsored grant and by internal Vanderbilt University funding.

References:

[1] H. Kaymaz-Keskinpala, J. A. Adams, and K. Kawamura, "PDA-Based Human-Robotic Interface," 2003 IEEE International Conference on Systems, Man, and Cybernetics, Washington, DC, October 2003.

[2] J. A. Adams and H. Kaymaz-Keskinpala, "Analysis of Perceived Workload when Using a PDA for Mobile Robot Teleoperation," submitted to the 2004 IEEE International Conference on Robotics and Automation, April 2004.

[3] H. Kaymaz-Keskinpala, PDA-Based Teleoperation Interface for a Mobile Robot, Master's Thesis, Department of Electrical Engineering and Computer Science, Vanderbilt University, May 2004.