ABSTRACT
This paper deals with active tracking of 3D moving targets. The performance and robustness of visual motion control depend on both the vision algorithms and the control structure, in which dynamical aspects cannot be neglected. Visual tracking is presented as a regulation control problem. Both the system architecture and the controller design are discussed. The performance of visually guided systems is substantially degraded by delays in the control loop. Interpolation is used to cope with the visual processing delay. Model predictive control strategies are proposed to compensate for the mechanical latency and improve the overall system performance.
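To make the delay-compensation idea concrete, the following is a minimal sketch (ours, not the controllers proposed in the paper) in which a delayed target measurement is extrapolated forward over the known visual processing latency using a finite-difference velocity estimate:

```python
def predict_target(samples, latency):
    """Extrapolate the most recent (time, position) measurement forward by
    `latency` seconds, using a finite-difference velocity estimate from the
    last two samples. Illustrative only: the paper combines interpolation
    with model predictive control rather than this exact law."""
    (t0, p0), (t1, p1) = samples[-2], samples[-1]
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * latency
```

For a target moving at constant velocity this prediction is exact; for maneuvering targets a model-based predictor, such as the MPC schemes discussed above, performs better.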
ABSTRACT
We discuss a prototype problem involving terrain exploration and learning by formations of autonomous vehicles. We investigate an algorithm for coordinating multiple robots whose task is to find the shortest path between a fixed pair of start and target locations, without access to a global map containing those locations. Odometry information alone is not sufficient for minimizing path length if the terrain is uneven or if it includes obstacles. We generalize existing results on a simple control law, also known as "local pursuit", which is appropriate in the context of formations and which requires limited interaction between vehicles. Our algorithm is iterative and converges to a locally optimal path. We include simulations and experiments illustrating the performance of the proposed strategy.
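A minimal planar sketch of the local-pursuit iteration (the step sizes, separation, and termination details below are illustrative choices, not the paper's): in each traversal a follower starts at the same origin and continuously heads toward a leader replaying the previous path; the traced path replaces the old one, and its length decreases toward a locally optimal (here, straight-line) path.

```python
import math

def path_length(path):
    return sum(math.dist(p, q) for p, q in zip(path, path[1:]))

def resample(path, step):
    """Resample a polyline at roughly uniform arc-length spacing."""
    total = path_length(path)
    n = max(int(total / step), 1)
    targets = [i * total / n for i in range(n + 1)]
    out, cum, j = [], 0.0, 0
    for p, q in zip(path, path[1:]):
        seg = math.dist(p, q)
        while j <= n and targets[j] <= cum + seg + 1e-12:
            t = 0.0 if seg == 0 else (targets[j] - cum) / seg
            out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
            j += 1
        cum += seg
    while j <= n:            # guard against float round-off at the end
        out.append(path[-1])
        j += 1
    return out

def pursuit_iteration(prev_path, step=0.05, delay=10):
    """One local-pursuit traversal: the leader replays prev_path while the
    follower, starting `delay` ticks behind, moves `step` toward the
    leader's current position at every tick."""
    leader = resample(prev_path, step)
    pos, traced, t = leader[0], [leader[0]], 0
    while math.dist(pos, leader[-1]) > step:
        target = leader[min(delay + t, len(leader) - 1)]
        d = math.dist(pos, target)
        if d < 1e-9:         # caught the leader: end this pass
            break
        pos = (pos[0] + step * (target[0] - pos[0]) / d,
               pos[1] + step * (target[1] - pos[1]) / d)
        traced.append(pos)
        t += 1
    traced.append(leader[-1])
    return traced
```

Starting from a detour such as [(0, 0), (5, 4), (10, 0)], repeated application of `pursuit_iteration` rounds off the corner a little more on each pass, and the path length decreases toward the straight-line distance between the endpoints.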
ABSTRACT
Object grasping is one of the basic functions required for many manipulator tasks. In particular, the grasping of unknown objects is often a desired functionality in manipulator system applications ranging from space exploration to factory automation. Due to the amount of object and environment data typically required to execute an unknown object grasp, computer vision is the sensor modality of choice. This paper presents a method for the automatic determination of plausible grasp axes on unknown objects using an eye-in-hand robotic system and a novel deformable contour model. The system finds potential grasp point pairs, ranks all the possible pairs/axes based upon measurements taken from the contour, and executes a vision-guided grasp of the object using the highest ranked grasp point pair to determine the gripper alignment constraints.
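To illustrate what ranking grasp point pairs from contour measurements can look like, here is a toy scoring heuristic (entirely our illustrative assumption, not the paper's ranking function): prefer pairs whose surface normals are anti-parallel and aligned with the grasp axis, and reject pairs wider than the gripper opening.

```python
import math

def grasp_pair_score(p_i, n_i, p_j, n_j, max_width=0.1):
    """Score a candidate grasp point pair (points with unit outward normals)
    on a closed contour. Higher is better; -inf means infeasible.
    Illustrative heuristic, not the paper's exact measurements."""
    ax = (p_j[0] - p_i[0], p_j[1] - p_i[1])
    w = math.hypot(*ax)
    if w == 0.0 or w > max_width:       # wider than the gripper opening
        return float('-inf')
    ax = (ax[0] / w, ax[1] / w)
    anti = -(n_i[0] * n_j[0] + n_i[1] * n_j[1])   # 1 when normals oppose
    align = abs(ax[0] * n_i[0] + ax[1] * n_i[1])  # axis along the normals
    return anti + align
```

For a square object, two points on opposite parallel sides score 2.0 (perfectly opposed, perfectly aligned), while a point paired with one on an adjacent side scores lower and would be ranked below it.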
Key Words. Robotic Grasping, Statistical Dynamic Contours, Eye-in-Hand Robotic Systems.
ABSTRACT
The present work considers corridor-following maneuvers for nonholonomic mobile robots, guided by sensory data acquired by panoramic cameras. The panoramic vision system provides information from an environment with textured walls to the motion control system, which drives the robot along a corridor. Panoramic cameras have a 360° field of view, a capability that the proposed control methods exploit. In our sensor-based control scheme, optical flow information from several distinct viewing directions within the entire field of view of the panoramic camera is used directly in the control loop, without the need for state reconstruction. The interest of this lies in the fact that the optical flow information is not sufficient to reconstruct the state of the system, yet it is sufficient for the proposed control law to accomplish the desired task. Driving the robot along a corridor amounts to the asymptotic stabilization of a subsystem of the robot's kinematics, and the proposed control schemes are shown to achieve this goal.
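The idea of using optical flow directly in the loop can be illustrated with a toy unicycle simulation (the geometry, gains, and the particular flow-balancing law below are our illustrative choices, not the controllers derived in the paper): the lateral flow induced by a wall at distance d scales as v/d, so steering against the flow difference between the two sides centers the robot without ever reconstructing its pose.

```python
import math

def simulate_corridor(y0=0.5, width=2.0, v=1.0, k_flow=0.3, k_head=2.0,
                      dt=0.01, steps=3000):
    """Unicycle driven along a corridor (walls at y = 0 and y = width) by
    balancing left/right lateral optical flow, with heading damping.
    The flow magnitude from a wall at distance d is modeled as v*cos(theta)/d.
    Returns the final lateral position and heading."""
    y, theta = y0, 0.0
    for _ in range(steps):
        flow_right = v * math.cos(theta) / y            # wall at y = 0
        flow_left = v * math.cos(theta) / (width - y)   # wall at y = width
        # Turn away from the side with larger flow; damp the heading.
        omega = k_flow * (flow_right - flow_left) - k_head * theta
        y += v * math.sin(theta) * dt                   # Euler integration
        theta += omega * dt
    return y, theta
```

Started off-center on either side, the simulated robot converges to the corridor centerline (y = width/2) with heading aligned to the corridor, even though the flow measurements alone never determine y and theta separately.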
ABSTRACT
In image-based visual servo control, since control is effected with respect to the image, there is no direct control over the Cartesian velocities of the robot end effector. As a result, the trajectories that the robot executes, while producing pleasing image trajectories, can be quite contorted in Cartesian space. In this paper we introduce a new partitioned approach to visual servo control that overcomes this problem. In particular, we decouple the z-axis rotational and translational components of the control from the remaining degrees of freedom. Then, to guarantee that all features remain in the image throughout the entire trajectory, we incorporate a potential function that repels feature points from the boundary of the image plane. We illustrate our new control scheme with a variety of results.
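A repulsive potential of this kind can be sketched as follows (the image size, influence distance, gain, and the particular barrier shape are illustrative assumptions, not the paper's exact function): each feature point feels an inward force that is zero in the interior of the image and grows without bound as the point approaches a border within an influence distance d0.

```python
def boundary_repulsion(x, y, width=640.0, height=480.0, d0=40.0, eta=1.0e4):
    """Repulsive force pushing an image point away from nearby image borders.
    Uses a barrier potential U(d) = 0.5*eta*(1/d - 1/d0)**2 for d < d0
    (zero otherwise) on the distance d to each edge; returns -grad U,
    which always points into the image."""
    fx = fy = 0.0
    edges = [(x, 1.0, "x"),            # left edge: push toward +x
             (width - x, -1.0, "x"),   # right edge: push toward -x
             (y, 1.0, "y"),            # near edge:  push toward +y
             (height - y, -1.0, "y")]  # far edge:   push toward -y
    for d, sign, axis in edges:
        if 0.0 < d < d0:
            mag = eta * (1.0 / d - 1.0 / d0) / (d * d)  # |dU/dd|
            if axis == "x":
                fx += sign * mag
            else:
                fy += sign * mag
    return fx, fy
```

In a partitioned scheme, such a term would typically be added to the image-plane velocity command for the decoupled translational degrees of freedom, so features are steered away from the border while the z-axis terms act independently.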