LI Ming-fu, FU Yan, LI Shi-qi, ZHU Wen-ge, ZHAO Di. Robotic Hand-Eye Coordination Control Based on Binocular Disparity and Active Contour[J]. ROBOT, 2008, 30(3): 248-253.
Abstract: A robotic hand-eye coordination control technique based on binocular disparity and active contours is proposed. The active contour idea is used to dynamically approximate and track the external contours of the robot hand and the target object, and a control method that drives the binocular disparity between them to zero is proposed so that the robot can reach and grasp the object. First, a geometric parameter model of the robot finger contour and the corresponding probability-density observation model are established, and the CONDENSATION algorithm is applied to approximate and track the contour dynamically. Then, based on the geometric characteristics of the approximated contours, a hand-eye coordination control strategy based on binocular disparity is discussed. Finally, ball-grasping experiments are carried out using the proposed hand-eye coordination control method. The results show that the method is robust and insensitive to image noise, and that it can accomplish vision-guided tracking and grasping tasks even against cluttered backgrounds with complex textures.
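The abstract names two computational ingredients: CONDENSATION (factored-sampling / particle-filter) tracking of a parameterized contour, and a depth command that drives the relative binocular disparity between hand and object to zero. The NumPy sketch below only illustrates these two generic ideas; it is not the paper's implementation. The particle parameterization, the edge-based observation model stood in by likelihood_fn, the function names, and the gains are all assumptions made for illustration.

import numpy as np

def condensation_step(particles, weights, dynamics_noise, likelihood_fn, rng):
    """One CONDENSATION (factored-sampling) iteration over contour-parameter particles.

    particles : (N, d) array; each row is a hypothesized contour parameter vector
                (e.g. translation, rotation, scale of a finger/ball contour template).
    weights   : (N,) normalized importance weights from the previous frame.
    """
    n = len(particles)
    # 1. Select: resample hypotheses according to the previous weights.
    idx = rng.choice(n, size=n, p=weights)
    resampled = particles[idx]
    # 2. Predict: propagate with a simple random-walk dynamics model (assumed here).
    predicted = resampled + rng.normal(scale=dynamics_noise, size=resampled.shape)
    # 3. Measure: weight each hypothesis by the image observation likelihood,
    #    e.g. how well image edges fit the hypothesized contour.
    new_weights = np.array([likelihood_fn(p) for p in predicted])
    new_weights /= new_weights.sum()
    return predicted, new_weights

def zero_disparity_servo(disparity_hand, disparity_ball, gain=0.5, tol=1.0):
    """Depth command that drives the hand's disparity toward the ball's disparity.

    When the two binocular disparities coincide, hand and object lie at the same
    depth, so the gripper can close; this is the zero-relative-disparity idea.
    """
    error = disparity_ball - disparity_hand
    if abs(error) < tol:
        return 0.0, True        # depths aligned: ready to grasp
    return gain * error, False  # proportional depth command

# Toy usage (stand-in likelihood; the paper's observation model is not reproduced):
rng = np.random.default_rng(0)
particles = rng.normal(size=(200, 4))              # [tx, ty, theta, scale] hypotheses
weights = np.full(200, 1 / 200)
toy_likelihood = lambda p: np.exp(-np.sum(p ** 2)) # placeholder for edge-based model
particles, weights = condensation_step(particles, weights, 0.05, toy_likelihood, rng)
estimate = weights @ particles                     # weighted-mean contour estimate

The particle-filter step and the proportional disparity command are deliberately decoupled here: in a full loop, the tracked contours of hand and ball in both camera images would supply the two disparity measurements fed to the servo function at each frame.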
[1] Marchand E, Chaumette F. Feature tracking for visual servoing purposes[J]. Robotics and Autonomous Systems, 2005, 52(1): 53-70.
[2] Hager G D, Toyama K. X Vision: A portable substrate for real-time vision applications[J]. Computer Vision and Image Understanding, 1998, 69(1): 23-37.
[3] Marchand E. ViSP: A software environment for eye-in-hand visual servoing[A]. Proceedings of the IEEE International Conference on Robotics and Automation[C]. Piscataway, NJ, USA: IEEE, 1999. 3224-3229.
[4] Sundareswaran V, Behringer R. Visual servoing-based augmented reality[A]. Proceedings of the First International Workshop on Augmented Reality[C]. Natick, MA, USA: A K Peters, 1998. 193-200.
[5] Sullivan M J, Papanikolopoulos N P. Using active deformable models to track deformable objects in robotic visual servoing experiments[A]. Proceedings of the IEEE International Conference on Robotics and Automation[C]. Piscataway, NJ, USA: IEEE, 1996. 2929-2934.
[6] Porta J M, Verbeek J J, Krose B J A. Active appearance-based robot localization using stereo vision[J]. Autonomous Robots, 2005, 18(1): 59-80.
[7] Xia L M, Gu S W, Luo D Y, et al. Robot visual servo control based on active contours[J]. Journal of National University of Defense Technology, 2000, 22(1): 60-64. (in Chinese)
[8] Xia L M, Gu S W, Luo D Y, et al. Robotic visual servoing based on snakes[A]. Proceedings of the 3rd World Congress on Intelligent Control and Automation[C]. Piscataway, NJ, USA: IEEE, 2000. 1317-1320.
[9] Pressigout M, Marchand E. Real time planar structure tracking for visual servoing: A contour and texture approach[A]. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems[C]. Piscataway, NJ, USA: IEEE, 2005. 251-256.
[10] Isard M, Blake A. CONDENSATION - Conditional density propagation for visual tracking[J]. International Journal of Computer Vision, 1998, 29(1): 5-28.
[11] Ferre M, Aracil R, Navas M. Stereoscopic video images for telerobotic applications[J]. Journal of Robotic Systems, 2005, 22(3): 131-146.
[12] Hager G D, Chang W C, Morse A S. Robot hand-eye coordination based on stereo vision[J]. IEEE Control Systems Magazine, 1995, 15(1): 30-39.