Abstract
To address the problems of the traditional Census transform in binocular ranging systems, namely its excessive dependence on the central pixel and the long time required for cost computation, an improved Census transform combined with the sum of absolute differences (SAD) is proposed. Eight fixed pixels around the central pixel are selected and their values are compared pairwise in sequence to obtain a one-byte bit string; the initial Census and SAD costs are then unified by normalization into the initial cost of the corresponding support window, and the resulting disparity map is used to measure distance. Experiments were developed in C++ in the Visual Studio 2017 integrated environment to verify the effectiveness of the improved SAD-Census algorithm. The results show that, as the support window grows, the cost-computation time of the proposed algorithm stabilizes at 6.0 s for high-resolution images and 0.2 s for low-resolution images, which is 1/5, 1/4 and 1/3 of the time taken by the AD-Census algorithm, the traditional Census transform and the SAD algorithm respectively, a substantial improvement in stereo matching speed. When the measured object is 800-2950 mm from the optical center of the camera, the absolute value of the relative error between the distance obtained by the proposed algorithm and the actual distance is less than 5%, an accuracy that meets the ranging requirements.
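The cost construction summarized in the abstract can be sketched in C++ as follows. This is a minimal illustration, not the paper's implementation: the ring-order pairing of the 8 neighbours, the SAD window shape, and the exponential normalization ρ(c, λ) = 1 − e^(−c/λ) (borrowed from the AD-Census style; the abstract only says the two costs are unified by normalization) are all assumptions.

```cpp
#include <array>
#include <bitset>
#include <cmath>
#include <cstdint>
#include <cstdlib>

// Improved Census descriptor: the 8 fixed neighbours of the centre pixel are
// compared pairwise in sequence (here in ring order, a hypothetical ordering),
// so the one-byte bit string no longer depends on the centre pixel's value.
uint8_t censusByte(const std::array<uint8_t, 8>& n) {
    uint8_t bits = 0;
    for (int i = 0; i < 8; ++i)
        bits = static_cast<uint8_t>((bits << 1) | (n[i] >= n[(i + 1) % 8]));
    return bits;
}

// Census matching cost: Hamming distance between two one-byte bit strings.
int censusCost(uint8_t left, uint8_t right) {
    return static_cast<int>(std::bitset<8>(left ^ right).count());
}

// SAD cost over the same 8 neighbours (the window shape is illustrative).
int sadCost(const std::array<uint8_t, 8>& l, const std::array<uint8_t, 8>& r) {
    int sum = 0;
    for (int i = 0; i < 8; ++i) sum += std::abs(int(l[i]) - int(r[i]));
    return sum;
}

// Map both costs into [0, 1) before summing, so neither term dominates.
// The exponential form and the lambda values are assumptions.
double combinedCost(int cCensus, int cSad,
                    double lambdaCensus = 4.0, double lambdaSad = 30.0) {
    return (1.0 - std::exp(-cCensus / lambdaCensus))
         + (1.0 - std::exp(-cSad / lambdaSad));
}
```

Once the combined cost selects the best disparity d for each pixel, depth follows from the standard binocular relation Z = f·B/d (focal length f, baseline B), which is how the disparity map yields the measured distance.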
Authors
YAN Xiaoyu (闫小宇), LU Fanfan (陆凡凡), GE Lusheng (葛芦生)
School of Electrical & Information Engineering, Anhui University of Technology, Maanshan 243032, China
Source
Journal of Anhui University of Technology (Natural Science) (《安徽工业大学学报(自然科学版)》), indexed in CAS
2022, No. 4, pp. 463-470 (8 pages)
Funding
National Natural Science Foundation of China (61873002).