A 3D Information Measuring Method of Underwater Targets Based on Line-Structured Light

Abstract: To address the shortcomings of existing underwater detection methods, such as poor accuracy, large calibration errors, and low efficiency, a 3D information measuring method for underwater targets based on line-structured light is designed. First, the principle of cross-ratio invariance is used to solve for the relative positions of feature points in the world coordinate system; their real coordinates in the camera coordinate system are then solved from the direction vectors of the image-plane feature points together with this position information, and the structured-light plane is calibrated from multiple such feature points. The feasibility and effectiveness of the method are verified by target-scanning experiments. Calibrating the line-structured-light plane requires only a single scan of a specially designed, simply structured stereo target, with no need to move the target repeatedly, which avoids human-induced errors. While the structured-light plane is being calibrated, the displacement velocity of each degree of freedom of the displacement stage can be calibrated at the same time. This effectively reduces the systematic error of the experimental platform, improves the system accuracy, and greatly increases the efficiency of the whole calibration procedure, making the method well suited to in-situ underwater calibration of large-scale line-structured-light systems. Experiments show that the errors of the 3D point-cloud coordinates obtained by scanning an underwater target are less than 2% along the X and Y axes; the error along the Z axis is slightly larger, with an average of 2.77%.
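As a rough illustration of the geometry summarized in the abstract, the sketch below (Python, not taken from the paper) shows two of the ingredients it names: recovering the world-frame position of a feature point on the light stripe from the cross-ratio invariance of four collinear points, and least-squares fitting of the structured-light plane to several feature points expressed in the camera frame. All function names and numeric values are hypothetical placeholders; the authors' actual procedure, including how the camera-frame coordinates are obtained from the image-plane direction vectors, is not reproduced here.

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross ratio (A,B;C,D) = (AC/BC) / (AD/BD) of four collinear points,
    given as scalar coordinates along their common line. It is invariant
    under perspective projection, so it can be measured in the image."""
    return ((c - a) / (c - b)) / ((d - a) / (d - b))

def solve_unknown_point(k, A, B, D):
    """Given the cross ratio k measured in the image and the known world
    coordinates A, B, D of three collinear target features, recover the
    world coordinate C of the fourth point from CR(A,B;C,D) = k."""
    return (A * (D - B) - k * B * (D - A)) / ((D - B) - k * (D - A))

def fit_light_plane(points_cam):
    """Least-squares plane through N >= 3 feature points in the camera frame.
    Returns (normal, d) with the plane written as normal . x + d = 0."""
    pts = np.asarray(points_cam, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the direction of least variance of the centred
    # points, i.e. the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal, -normal @ centroid

# Hypothetical usage with made-up numbers:
k = cross_ratio(102.0, 161.0, 214.0, 298.0)     # image positions a, b, c, d (px)
C = solve_unknown_point(k, 0.0, 30.0, 90.0)     # known world positions A, B, D (mm)
stripe_points = [(12.1, 3.4, 502.0), (48.7, 3.9, 515.2),
                 (85.2, 4.1, 528.9), (121.8, 4.6, 542.3)]   # camera frame (mm)
normal, d = fit_light_plane(stripe_points)
print(C, normal, d)
```

The SVD-based fit is a standard way to use more than three feature points at once: using all recovered points to constrain the plane averages out per-point noise, which is consistent with the abstract's statement that multiple feature points are used to calibrate the structured-light plane.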

     
