Volume 23 Issue 2 - Publication Date: 1 February 2004
Human-Centered Robotics and Interactive Haptic Simulation
Oussama Khatib, Robotics Laboratory, Department of Computer Science, Stanford University, Stanford, CA 94305, USA
Oliver Brock, Laboratory for Perceptual Robotics, Department of Computer Science, University of Massachusetts, Amherst, MA 01003, USA
Kyong-Sok Chang, Diego Ruspini, Luis Sentis and Sriram Viji, Robotics Laboratory, Department of Computer Science, Stanford University, Stanford, CA 94305, USA

A new field of robotics is emerging. Robots are today moving beyond the structured environment of the manufacturing plant and into the everyday world that people inhabit. This paper focuses on the models, strategies, and algorithms associated with the autonomous behaviors needed for robots to work, assist, and cooperate with humans. In addition to the new capabilities they bring to the physical robot, these models and algorithms, and more generally the body of developments in robotics, are having a significant impact on the virtual world. Haptic interaction with an accurate dynamic simulation provides unique insights into the real-world behaviors of physical systems. The potential applications of this emerging technology include virtual prototyping, animation, surgery, robotics, cooperative design, and education, among many others. Haptics is one area where the computational requirements associated with resolving, in real time, the dynamics and contact forces of the virtual environment are particularly challenging. This paper describes methodologies and algorithms that address the computational challenges associated with interactive simulations involving multiple contacts and impacts between human-like structures.

Example One: Task and posture control sequence. The task involves the control of the position of the hands and the orientation of the head. The posture is designed to minimize the gravity torques at the knee joint. (4.9MB)
Example Two: The motion of a human skeleton is generated by dynamic simulation. The user interactively exerts a force onto the skeleton at a location indicated by the mouse pointer at times 0:04 and 0:07. The dynamic response of the skeleton is shown in the video. (2.1MB)
Example Three: A humanoid figure on skis is dropped on a ski jump. The only actuation is provided by gravitational forces. All subsequent interactions with the environment are determined using the described framework for dynamic multicontact simulation. (1.8MB)
Example Four: A dynamically simulated sequence involving two humanoids and many objects in the environment. (3.88MB)
Example Five: Real-time path modification using elastic strips. To avoid the obstacles, the mobile manipulator deviates significantly from the task, which consists of following the red line with the end-effector. (3.4MB)
Example Six: Task-consistent path modification using elastic strips. The obstacles perform the same motion as in the previous video. Obstacle avoidance is performed in the null space of the task so that task execution is not interrupted. (3.4MB)
Example Seven: Obstacle motion renders task-consistent path modification impossible, due to kinematic limitations of the mechanism. Based on the elastic strip framework, the task is automatically suspended and resumed once the second obstacle is avoided. (3.4MB)
Example Eight: The experiments shown in Examples Five and Six are executed on a real robot. (13.6MB)
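The task-consistent behavior shown in Examples Six and Seven rests on projecting secondary (posture or obstacle-avoidance) motion into the null space of the task Jacobian, so that it cannot disturb the task. A minimal sketch of this projection is given below; the Jacobian, task velocity, and posture motion are illustrative values, not taken from the paper, and a plain pseudoinverse stands in for the dynamically consistent inverse used in operational-space control.

```python
import numpy as np

def task_consistent_velocity(J, dx_task, dq_posture):
    """Combine a task velocity with a posture motion projected into the
    null space of the task Jacobian J, so the posture motion cannot
    disturb the task. (Sketch: a plain pseudoinverse is used here; the
    operational-space formulation would use a mass-weighted inverse.)"""
    J_pinv = np.linalg.pinv(J)
    N = np.eye(J.shape[1]) - J_pinv @ J   # null-space projector
    return J_pinv @ dx_task + N @ dq_posture

# Illustrative 2-DOF example: a one-dimensional task (e.g., end-effector x).
J = np.array([[1.0, 0.5]])        # task Jacobian at some configuration
dx_task = np.array([0.2])         # desired task velocity
dq_posture = np.array([0.3, -0.1])  # secondary (posture) joint motion

dq = task_consistent_velocity(J, dx_task, dq_posture)

# The projected posture motion leaves the task velocity unchanged:
print(np.allclose(J @ dq, dx_task))  # True
```

Because the projector `N` annihilates any component along the task directions, the secondary motion can be chosen freely (e.g., to avoid obstacles) without interrupting task execution, which is precisely the behavior contrasted in Examples Five and Six.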