TB1 Sensor-Based Control for Robotic Systems (invited)

Organized by: Tsakiris D., ICS-FORTH, Greece



Solutions for Visual Control of Motion: Active Tracking Applications

Authors:

Barreto J., University of Coimbra-Polo II, Portugal

Batista J., University of Coimbra-Polo II, Portugal

Araujo H., University of Coimbra-Polo II, Portugal

ABSTRACT

This paper deals with active tracking of 3D moving targets. The performance and robustness of visual control of motion depend on both the vision algorithms and the control structure, where dynamical aspects cannot be neglected. Visual tracking is presented as a regulation control problem. Both the system architecture and the controller design are discussed. The performance of visually guided systems is substantially degraded by delays in the control loop. Interpolation is used to cope with the visual processing delay. Model predictive control strategies are proposed to compensate for the mechanical latency and improve the overall system performance.
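A minimal sketch of the delay-compensation idea (my own illustration, not the authors' controller): a stale visual measurement is rolled forward through a simple model of the camera axis, using the commands issued during the delay, before the regulation error is computed. The frame rate DT, the delay DELAY_STEPS, the rate-commanded plant model, and the gain kp are all assumptions made for this example.

DT = 1.0 / 25.0      # assumed frame rate of the vision loop
DELAY_STEPS = 3      # assumed visual-processing delay, in frames

def plant_step(x, u):
    """Rate-commanded camera axis: integrate the velocity command over one frame."""
    return x + DT * u

def predict_ahead(x_delayed, commands_since):
    """Roll the model forward over the commands issued during the delay,
    turning the stale measurement into an estimate of the current position."""
    x = x_delayed
    for u in commands_since:
        x = plant_step(x, u)
    return x

def control(target, x_delayed, commands_since, kp=2.0):
    """Regulate the predicted tracking error to zero."""
    return kp * (target - predict_ahead(x_delayed, commands_since))

if __name__ == "__main__":
    x_true, states, cmds = 0.0, [0.0], []
    for k in range(200):
        # the vision pipeline reports the position as it was DELAY_STEPS frames ago
        x_delayed = states[max(k - DELAY_STEPS, 0)]
        u = control(1.0, x_delayed, cmds[max(k - DELAY_STEPS, 0):k])
        cmds.append(u)
        x_true = plant_step(x_true, u)
        states.append(x_true)
    print("final axis position:", round(x_true, 3))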

tb1-1



Robot Formations: Learning Minimum Length Paths on Uneven Terrain

Authors:

Hristu D., University of Maryland, USA

ABSTRACT

We discuss a prototype problem involving terrain exploration and learning by formations of autonomous vehicles. We investigate an algorithm for coordinating multiple robots whose task is to find the shortest path between a fixed pair of start and target locations, without access to a global map containing those locations. Odometry information alone is not sufficient for minimizing path length if the terrain is uneven or if it includes obstacles. We generalize existing results on a simple control law, also known as "local pursuit", which is appropriate in the context of formations and which requires limited interaction between vehicles. Our algorithm is iterative and converges to a locally optimal path. We include simulations and experiments illustrating the performance of the proposed strategy.
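A toy illustration of one local-pursuit iteration on a flat plane (no uneven terrain or obstacles, which the paper addresses): the leader replays the path found on the previous iteration while the follower always heads toward the leader's current position, and the traced path shortens from one iteration to the next. The functions pursue and path_length, the step sizes, and the initial path are all invented for this sketch.

import numpy as np

def pursue(previous_path):
    """Path traced by a follower pursuing a leader that replays previous_path,
    both covering the same arc length per step; start and goal stay fixed."""
    start, goal = previous_path[0], previous_path[-1]
    follower, prev_leader = start.copy(), previous_path[0]
    new_path = [follower.copy()]
    for leader in previous_path[1:]:
        step = np.linalg.norm(leader - prev_leader)   # leader's advance this step
        prev_leader = leader
        heading = leader - follower
        dist = np.linalg.norm(heading)
        if dist > 1e-9:
            follower = follower + min(step, dist) * heading / dist
        new_path.append(follower.copy())
    new_path.append(goal.copy())                      # terminate at the common goal
    return np.array(new_path)

def path_length(path):
    return float(np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1)))

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 200)
    path = np.stack([10.0 * t, np.sin(4 * np.pi * t)], axis=1)   # wiggly initial path
    for it in range(5):                                          # successive iterations
        path = pursue(path)
        print(f"iteration {it + 1}: length = {path_length(path):.3f}")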

tb1-2



Using Fast Statistical Dynamic Contours for Grasping Occluding Contours

Authors:

Perrin D., University of Minnesota, USA

Masoud O., University of Minnesota, USA

Smith C., University of Colorado at Denver, USA

Papanikolopoulos N., University of Minnesota, USA

ABSTRACT

Object grasping is one of the basic functions required for many manipulator tasks. In particular, the grasping of unknown objects is often a desired functionality in manipulator system applications ranging from space exploration to factory automation. Due to the amount of object and environment data typically required to execute an unknown object grasp, computer vision is the sensor modality of choice. This paper presents a method for the automatic determination of plausible grasp axes on unknown objects using an eye-in-hand robotic system and a novel deformable contour model. The system finds potential grasp point pairs, ranks all the possible pairs/axes based upon measurements taken from the contour, and executes a vision-guided grasp of the object using the highest ranked grasp point pair to determine the gripper alignment constraints.

Key Words. Robotic Grasping, Statistical Dynamic Contours, Eye-in-Hand Robotic Systems.
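A rough sketch of the final selection step only, under strong simplifications of my own (a polygonal contour, an antipodal-normal score, and a maximum gripper opening max_opening); it does not implement the statistical dynamic contour model itself, and every name and parameter below is illustrative.

import numpy as np

def outward_normals(contour):
    """Unit outward normals of a closed, counter-clockwise polygonal contour."""
    tangents = np.roll(contour, -1, axis=0) - np.roll(contour, 1, axis=0)
    normals = np.stack([tangents[:, 1], -tangents[:, 0]], axis=1)
    return normals / np.linalg.norm(normals, axis=1, keepdims=True)

def rank_grasp_pairs(contour, max_opening):
    """Score every point pair that fits in the gripper; 1.0 means the two
    outward normals are exactly opposed (a perfectly antipodal grasp)."""
    normals = outward_normals(contour)
    scored = []
    for i in range(len(contour)):
        for j in range(i + 1, len(contour)):
            width = np.linalg.norm(contour[i] - contour[j])
            if 1e-6 < width <= max_opening:
                scored.append((-float(np.dot(normals[i], normals[j])), i, j))
    return sorted(scored, reverse=True)

if __name__ == "__main__":
    # a coarse elliptical contour standing in for an extracted occluding contour
    theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
    contour = np.stack([30.0 * np.cos(theta), 15.0 * np.sin(theta)], axis=1)
    score, i, j = rank_grasp_pairs(contour, max_opening=40.0)[0]
    print("best grasp pair:", contour[i], contour[j], "score:", round(score, 3))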

tb1-3



Corridor Following by Mobile Robots Equipped with Panoramic Cameras

Authors:

Tsakiris D., ICS-FORTH, Greece

Argyros A., ICS-FORTH, Greece

ABSTRACT

The present work considers corridor-following maneuvers for nonholonomic mobile robots, guided by sensory data acquired by panoramic cameras. The panoramic vision system provides information from an environment with textured walls to the motion control system, which drives the robot along a corridor. Panoramic cameras have a 360-degree visual field, a capability that the proposed control methods exploit. In our sensor-based control scheme, optical flow information from several distinct viewing directions in the entire field of view of the panoramic camera is used directly in the control loop, without the need for state reconstruction. The interest of this approach lies in the fact that the optical flow information is not sufficient to reconstruct the state of the system; it is, however, sufficient for the proposed control law to accomplish the desired task. Driving the robot along a corridor amounts to the asymptotic stabilization of a subsystem of the robot's kinematics, and the proposed control schemes are shown to achieve this goal.
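A minimal sketch in the spirit of flow-balancing corridor centering, not the control law derived in the paper: the translational flow measured at the two side-looking directions of the panoramic camera is inversely proportional to the distance to each wall, so balancing the two lateral flows steers a unicycle-type robot toward the corridor midline. The heading-damping term stands in for the additional viewing directions the paper exploits, and every gain and dimension below is an assumption.

import math

def lateral_flow(v, distance):
    """Magnitude of the translational optical flow seen at +/-90 degrees."""
    return v / distance

def steering(flow_left, flow_right, heading, k_flow=0.8, k_head=1.5):
    # turn away from the wall producing the larger flow, damping the heading
    return k_flow * (flow_right - flow_left) - k_head * heading

def simulate(y0=0.6, half_width=1.0, v=0.5, dt=0.05, steps=600):
    y, heading = y0, 0.0            # lateral offset (left positive) and heading
    for _ in range(steps):
        d_left, d_right = half_width - y, half_width + y
        omega = steering(lateral_flow(v, d_left), lateral_flow(v, d_right), heading)
        heading += omega * dt
        y += v * math.sin(heading) * dt
    return y, heading

if __name__ == "__main__":
    y, heading = simulate()
    print(f"final lateral offset {y:.3f} m, heading {heading:.3f} rad")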

tb1-4



A New Partitioned Approach to Image-Based Visual Servo Control

Authors:

Corke P., CSIRO Manufacturing Science & Technology, Australia

Hutchinson S., Univ. of Illinois at Urbana-Champaign, USA

ABSTRACT

In image-based visual servo control, since control is effected with respect to the image, there is no direct control over the Cartesian velocities of the robot end effector. As a result, trajectories that the robot executes, while producing image trajectories that are pleasing, can be quite contorted in the Cartesian space. In this paper we introduce a new partitioned approach to visual servo control that overcomes this problem. In particular, we decouple the z-axis rotational and translational components of the control from the remaining degrees of freedom. Then, to guarantee that all features remain in the image throughout the entire trajectory, we incorporate a potential function that repels feature points from the boundary of the image plane. We illustrate our new control scheme with a variety of results.
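A sketch of a partitioned image-based law of the kind described, with illustrative image cues and gains rather than the paper's exact formulation, and without the boundary-repelling potential: translation along and rotation about the optical axis are computed from two scalar image cues (an apparent-size measure and the orientation of a line between two features), and the remaining four degrees of freedom from the classical image Jacobian, after subtracting the image motion the z-axis components will induce.

import numpy as np

def interaction_matrix(x, y, Z):
    """Classical 2x6 image Jacobian for a normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def partitioned_control(points, desired, Z, lam=0.5, gam_tz=0.5, gam_wz=0.5):
    pts, des = np.asarray(points, float), np.asarray(desired, float)

    # z-axis components from two global cues: apparent size drives Tz, the
    # orientation of the line through the first two features drives omega_z
    size, size_des = pts.std(axis=0).sum(), des.std(axis=0).sum()
    tz = gam_tz * np.log(size_des / size)
    dx, dy = pts[1] - pts[0]
    ddx, ddy = des[1] - des[0]
    ang_err = np.arctan2(dy, dx) - np.arctan2(ddy, ddx)
    # with this Jacobian convention the feature angle changes at rate -omega_z
    wz = gam_wz * np.arctan2(np.sin(ang_err), np.cos(ang_err))

    # remaining four DOF: remove the image motion induced by (Tz, omega_z)
    # from the desired feature velocity, then solve with the other columns
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in pts])
    s_dot = lam * (des - pts).reshape(-1) - L[:, [2, 5]] @ np.array([tz, wz])
    vx, vy, wx, wy = np.linalg.pinv(L[:, [0, 1, 3, 4]]) @ s_dot
    return np.array([vx, vy, tz, wx, wy, wz])

if __name__ == "__main__":
    current = [(0.10, 0.12), (-0.08, 0.11), (-0.09, -0.10), (0.11, -0.09)]
    goal = [(0.15, 0.15), (-0.15, 0.15), (-0.15, -0.15), (0.15, -0.15)]
    print("camera velocity (vx, vy, vz, wx, wy, wz):",
          partitioned_control(current, goal, Z=1.0))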

tb1-5
