Complementary data fusion in vision-guided control of robotic tracking
Published online by Cambridge University Press: 17 January 2001
Abstract
We present a data fusion control scheme for the hand-held camera of the SCORBOT-ER VII robot arm for learning visual tracking and interception. The control scheme consists of two modules: the first generates candidate actions to drive the end-effector as accurately as possible directly above a moving target, so that the second module can readily take over to intercept it. The desired camera-joint coordinate mappings are generalized by Elman neural networks in the tracking module. The intercept module then determines a suitable intercept trajectory for the robot under the required conditions. Simulation results support the claim that the scheme can be successfully applied to track and intercept a moving target.
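As a rough illustration of the tracking module described above (not the authors' implementation), the sketch below shows an Elman-style recurrent network mapping image-plane target coordinates to candidate joint-space corrections. The layer sizes, the two-joint output, and the random initialization are illustrative assumptions; in the paper the network would be trained to generalize the camera-joint coordinate mapping.

```python
# Minimal Elman-network sketch for the tracking module (illustrative only).
import numpy as np

class ElmanTracker:
    def __init__(self, n_in=2, n_hidden=8, n_out=2, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))       # input -> hidden
        self.W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context (previous hidden) -> hidden
        self.W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output
        self.b_h = np.zeros(n_hidden)
        self.b_o = np.zeros(n_out)
        self.context = np.zeros(n_hidden)   # Elman context units hold the previous hidden state

    def step(self, image_error):
        """One control step: image-plane target offset (pixels) -> joint correction."""
        x = np.asarray(image_error, dtype=float)
        h = np.tanh(self.W_in @ x + self.W_ctx @ self.context + self.b_h)
        self.context = h                     # feed hidden state back as context for the next step
        return self.W_out @ h + self.b_o     # candidate joint-space action

# Example: propose a joint correction for a target offset of (12, -5) pixels.
tracker = ElmanTracker()
dq = tracker.step([12.0, -5.0])
print("joint correction:", dq)
```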
- Type: Research Article
- Copyright: © 2001 Cambridge University Press