WO2018103152A1 - Three-dimensional digital imaging sensor, and three-dimensional scanning system and scanning method thereof - Google Patents

Three-dimensional digital imaging sensor, and three-dimensional scanning system and scanning method thereof

Info

Publication number
WO2018103152A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
module
stripe
image
measured
Application number
PCT/CN2016/112118
Other languages
French (fr)
Chinese (zh)
Inventor
赵晓波
王文斌
刘增艺
Original Assignee
杭州先临三维科技股份有限公司
Application filed by 杭州先临三维科技股份有限公司
Publication of WO2018103152A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2545 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

Provided are a three-dimensional scanning system and a scanning method thereof. The three-dimensional scanning system is used to acquire three-dimensional point cloud data of a measured object (106) and comprises: a light source (101) for projecting a plurality of stripe patterns onto the measured object; left and right cameras (103, 104) for acquiring a 2D left image and a 2D right image of the measured object (106); a three-dimensional module (102) for acquiring a depth map of the measured object (106); a stripe matching module for guiding stripe matching between the left and right images according to the depth map; and a three-dimensional reconstruction module for taking the matched corresponding stripes of the left and right images, finding the correspondence between individual points on the corresponding stripe center lines using the epipolar geometric constraint between the left and right cameras (103, 104), and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters of the three-dimensional scanning system.

Description

Three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method thereof

Technical Field
The invention relates to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method, and in particular to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method for a handheld multi-stripe three-dimensional scanning system.
Background Art
Three-dimensional digitization is an emerging interdisciplinary field that has been actively researched internationally in recent years, and it is widely applied in areas such as reverse engineering, cultural heritage preservation, industrial inspection and virtual reality. Handheld portable 3D scanners are widely used in the 3D scanning field because of their convenience and flexibility. Existing handheld 3D scanners are mainly based on active stereo vision with structured light. The structured light can take various forms, such as infrared laser speckle, DLP (Digital Light Processing) projected speckle, DLP-projected simulated laser stripes, and laser stripes. Among these structured-light modes, handheld 3D scanners that use DLP-projected simulated laser stripes or laser stripes as the structured light achieve the highest accuracy and the best scanning detail.
Taking DLP-projected simulated laser stripes or laser stripes as the structured light, the basic workflow is:
(1) Fit a plane to each projected stripe;

(2) Extract the marker points and the stripe centers from the acquired stripe images;

(3) Segment the stripe centers into connected domains and match corresponding stripe points on the left and right camera images according to the plane equations;

(4) Find the corresponding marker point centers on the left and right camera images using the epipolar constraint between the two cameras;

(5) According to the calibration parameters of the scanning system, perform three-dimensional reconstruction of the matched corresponding stripes and marker point centers with a three-dimensional reconstruction algorithm;

(6) Stitch by marker points and apply the rotation-translation to the three-dimensional stripe points to realize handheld 3D scanning.
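For reference, the sketch below illustrates how the light-plane guidance in step (3) resolves a correspondence: the viewing ray of a left-image stripe pixel is intersected with the calibrated stripe plane, and the resulting 3D point is re-projected into the right image. This is an illustrative reconstruction of the prior-art scheme, not code from the patent; the pinhole model, the plane parameterization and all names are assumptions.

```python
import numpy as np

def lift_to_stripe_plane(K_left, uv, plane):
    """Intersect the viewing ray of left-image pixel uv with a calibrated
    stripe plane n.X + d = 0 (plane expressed in the left-camera frame)."""
    n, d = np.asarray(plane[:3]), plane[3]
    ray = np.linalg.inv(K_left) @ np.array([uv[0], uv[1], 1.0])  # ray direction, camera center at origin
    scale = -d / float(n @ ray)            # point on the ray that satisfies the plane equation
    return scale * ray                     # 3D point in the left-camera frame

def project_to_right(K_right, R, T, X_left):
    """Project a 3D point from the left-camera frame into the right image,
    assuming X_right = R @ X_left + T."""
    x = K_right @ (R @ X_left + T)
    return x[:2] / x[2]

# A left stripe center is lifted onto "its" plane and re-projected into the
# right image, where the nearest stripe center is accepted as the match.
# With more than roughly 15 stripes, neighboring planes become ambiguous and
# the matching error rate rises, which is the problem addressed below.
```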
However, in this workflow the matching of corresponding stripes on the left and right camera images is guided mainly by the light-plane (stripe-plane) equations. With this method, once the number of stripes exceeds 15, the matching error rate of the corresponding stripes on the camera images rises significantly, which adds noise and reduces the accuracy of the scanned data. When the number of stripes is below 15, the scanning efficiency cannot be effectively improved. Under the inherent limit on the scanning frame rate, an effective way to raise the scanning efficiency is therefore to increase the number of stripes while improving the accuracy of stripe matching.
Summary of the Invention
In view of this, it is necessary to provide a handheld multi-stripe three-dimensional scanning system and a scanning method thereof, to solve the problem that existing handheld three-dimensional scanning systems cannot achieve both high scanning efficiency and high accuracy of the scanned data.
The present invention provides a three-dimensional scanning system for acquiring three-dimensional point cloud data of a measured object, comprising: a light source for projecting a plurality of stripe patterns onto the measured object; left and right cameras for synchronously acquiring a 2D left image and a 2D right image of the measured object; a three-dimensional module for synchronously acquiring a depth map of the measured object; a stripe matching module for guiding stripe matching between the left and right images according to the depth map; and a three-dimensional reconstruction module for taking the matched corresponding stripes of the left and right images, finding the correspondence of individual points on the corresponding stripe center-line segments using the epipolar geometric constraint between the left and right cameras, and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters of the three-dimensional scanning system.
The light source may be a laser or a projector; when the light source is a projector, the projector is a digital projector, and the stripe pattern includes a simulated laser stripe pattern, a laser stripe pattern, and the like.
The number of stripes in the stripe pattern is greater than 15.
The three-dimensional module is a low-resolution three-dimensional scanning module.
A three-dimensional digital imaging sensor for scanning a measured object comprises: a light source for projecting a plurality of stripe patterns onto the measured object; left and right cameras for acquiring a 2D left image and a 2D right image of the measured object; and a three-dimensional module for acquiring a depth image of the measured object. The spatial relationship between the left and right cameras and the three-dimensional module is known and fixed, and the stripe pattern and the depth image appear in the acquired images.
A three-dimensional scanning method comprises the following steps: (1) device construction: building a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source, with the relative positions of the three-dimensional module, the two cameras and the light source fixed; (2) system calibration: calibrating the left and right cameras and the three-dimensional module to obtain calibration parameters; (3) projection and image acquisition: generating a stripe pattern and projecting it onto the measured object with the light source, where the stripe pattern is deformed by the height modulation of the measured object to produce a modulated stripe pattern, the left and right cameras synchronously acquire the modulated stripe pattern to obtain the left and right images, and the three-dimensional module synchronously acquires the depth map of the measured object; (4) stripe matching: guiding the matching of the left and right image stripes according to the depth map; (5) three-dimensional reconstruction: taking the matched corresponding stripes of the left and right images, finding the correspondence of individual points on the corresponding stripe center-line segments using the epipolar geometric constraint of the left and right cameras, and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters.
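As a concrete illustration of the depth map used in steps (3) and (4), the following minimal sketch converts a depth map into a point cloud in the three-dimensional module's own coordinate system; it assumes a pinhole model with intrinsic matrix K and a depth map holding metric depth along the optical axis (the names are illustrative, not from the patent).

```python
import numpy as np

def depth_to_points(depth, K):
    """Convert an H x W depth map into an N x 3 point cloud p_i in the
    3D module's own coordinate frame (pinhole model)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0                          # keep only pixels with a measurement
    z = depth[valid]
    x = (u[valid] - K[0, 2]) * z / K[0, 0]     # X = (u - cx) * Z / fx
    y = (v[valid] - K[1, 2]) * z / K[1, 1]     # Y = (v - cy) * Z / fy
    return np.column_stack([x, y, z])
```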
The system calibration further comprises the following step: calibrating the left and right cameras to obtain the intrinsic and extrinsic parameters of the cameras and the rotation-translation matrix Mc corresponding to the relative position between the two cameras, while also calibrating the rotation-translation matrix Ms corresponding to the relative position between the three-dimensional module and the left camera.
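The two calibration results can be handled as homogeneous rotation-translation matrices; the sketch below shows one way to compose and apply them, assuming Ms maps points from the three-dimensional module frame into the left-camera frame and Mc maps points from the left-camera frame into the right-camera frame (the composition convention is an assumption, not stated in the patent).

```python
import numpy as np

def rt_matrix(R, t):
    """Pack a 3x3 rotation R and a translation 3-vector t into a 4x4 matrix."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

def transform(M, points):
    """Apply a 4x4 rotation-translation matrix to an N x 3 point array."""
    homogeneous = np.column_stack([points, np.ones(len(points))])
    return (homogeneous @ M.T)[:, :3]

# With the assumed conventions, Mc @ Ms maps depth-map points from the
# 3D module frame directly into the right-camera frame.
```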
The three-dimensional module is a three-dimensional scanning module.
When the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further comprise the following steps: the light source projects a stripe pattern onto the measured object and the left and right cameras acquire the left and right images; the light source is then turned off, the three-dimensional module emits light toward the measured object, and the three-dimensional depth image is acquired.
When the wavelength of the light emitted by the three-dimensional scanning module differs from that of the light source, the projection and image acquisition further comprise the following step: the light source and the three-dimensional module project onto the measured object at the same time, and the left and right cameras and the three-dimensional module simultaneously acquire the left and right images and the three-dimensional depth image.

The stripe matching further comprises the following steps: extracting the center lines of the stripes on the left and right camera images, and segmenting each connected center-line domain into a number of independent line segments; converting the depth map acquired by the three-dimensional module into three-dimensional point cloud coordinates (pi) in its own coordinate system according to the corresponding calibrated intrinsic parameters; converting (pi) into three-dimensional point cloud coordinates (qi) in the left camera coordinate system according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera; back-projecting the three-dimensional point cloud coordinates (qi) onto the left and right images according to the respective intrinsic parameters of the left and right cameras, where each projected point carries a sequence number, so that a lookup table of corresponding left and right image coordinates is formed; and traversing the sequence numbers of the points of each stripe line segment on the left image, so that the matching stripe line segment on the right image can be looked up directly in the table, thereby achieving accurate matching of the left and right image line segments or stripes.
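A numpy sketch of this lookup-table matching is given below: the point cloud qi (already in the left-camera frame) is projected into both images, each projected point keeps its sequence number in a per-image table, and left/right center-line segments that share the most sequence numbers are paired. The voting rule, the assumption that both images have the same size, and all helper names are illustrative choices, not requirements of the patent.

```python
import numpy as np

def project(K, R, t, pts):
    """Pinhole projection of an N x 3 point array to integer pixel coordinates."""
    cam = pts @ R.T + t
    pix = cam @ K.T
    return np.round(pix[:, :2] / pix[:, 2:3]).astype(int)

def build_lookup(q, K_l, K_r, R_lr, t_lr, shape):
    """Sequence-number tables: table[v, u] = index i of the q_i that projects
    to pixel (u, v), or -1 where no point lands."""
    lut_l = -np.ones(shape, dtype=int)
    lut_r = -np.ones(shape, dtype=int)
    uv_l = project(K_l, np.eye(3), np.zeros(3), q)      # q is already in the left-camera frame
    uv_r = project(K_r, R_lr, t_lr, q)
    for i, ((ul, vl), (ur, vr)) in enumerate(zip(uv_l, uv_r)):
        if 0 <= vl < shape[0] and 0 <= ul < shape[1]:
            lut_l[vl, ul] = i
        if 0 <= vr < shape[0] and 0 <= ur < shape[1]:
            lut_r[vr, ur] = i
    return lut_l, lut_r

def segment_ids(seg, lut):
    """Sequence numbers collected along one center-line segment of (u, v) points."""
    h, w = lut.shape
    ids = set()
    for u, v in seg:
        ui, vi = int(round(u)), int(round(v))
        if 0 <= vi < h and 0 <= ui < w and lut[vi, ui] >= 0:
            ids.add(int(lut[vi, ui]))
    return ids

def match_segments(left_segments, right_segments, lut_l, lut_r):
    """Pair left/right segments that share the largest number of sequence numbers."""
    right_ids = [segment_ids(seg, lut_r) for seg in right_segments]
    pairs = []
    for li, seg in enumerate(left_segments):
        ids = segment_ids(seg, lut_l)
        votes = [len(ids & rids) for rids in right_ids]
        if votes and max(votes) > 0:
            pairs.append((li, int(np.argmax(votes))))
    return pairs
```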
The three-dimensional reconstruction further comprises the following steps: taking the matched corresponding stripe center-line segments of the left and right images, finding the correspondence of individual points on the corresponding stripe center-line segments using the epipolar geometric constraint of the left and right cameras, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
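A sketch of this reconstruction step is given below: the fundamental matrix derived from the stereo calibration constrains each left center point to an epipolar line in the right image, the closest point of the matched right segment is taken as its correspondence, and each pair is triangulated linearly. The formulas are the standard epipolar-geometry and DLT expressions; taking the nearest point to the epipolar line is an illustrative choice, not wording from the patent.

```python
import numpy as np

def fundamental(K_l, K_r, R, T):
    """F such that x_r^T F x_l = 0, assuming X_right = R @ X_left + T."""
    Tx = np.array([[0, -T[2], T[1]],
                   [T[2], 0, -T[0]],
                   [-T[1], T[0], 0]])
    return np.linalg.inv(K_r).T @ Tx @ R @ np.linalg.inv(K_l)

def epipolar_match(x_l, right_points, F):
    """Pick the point of the matched right segment closest to the epipolar
    line of the left point x_l = (u, v)."""
    a, b, c = F @ np.array([x_l[0], x_l[1], 1.0])
    pts = np.asarray(right_points, dtype=float)
    dist = np.abs(pts[:, 0] * a + pts[:, 1] * b + c) / np.hypot(a, b)
    return pts[np.argmin(dist)]

def triangulate(x_l, x_r, K_l, K_r, R, T):
    """Linear (DLT) triangulation of one correspondence; returns the 3D point
    in the left-camera frame."""
    P1 = K_l @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K_r @ np.hstack([R, np.reshape(T, (3, 1))])
    A = np.vstack([x_l[0] * P1[2] - P1[0],
                   x_l[1] * P1[2] - P1[1],
                   x_r[0] * P2[2] - P2[0],
                   x_r[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```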
Compared with the prior art, the three-dimensional scanner and scanning method of the present invention obtain the three-dimensional point cloud data by using the depth map of the measured object to guide the stripe matching between the left and right images. Compared with a conventional three-dimensional scanner, this scanner has the following advantages: 1. the precision or accuracy of the stripe matching is high, so the scanning efficiency of the three-dimensional scanning system can be improved by increasing the number of matched stripes; 2. when the stripes are numerous and dense, the depth map alone can guide the stripe matching between the left and right images, so no additional marker points need to be attached to the measured object and real-time stitching can be achieved without markers; 3. there is no need to calibrate the stripe light planes, i.e. the matching of the left and right images does not have to be guided by light planes, so the required mounting accuracy of the relative positions of the hardware is lower, which reduces the system cost.
The above description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented according to the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
Embodiments of the present invention are described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic structural diagram of a three-dimensional scanner according to an embodiment of the present invention;

FIG. 2 shows the stripe images acquired by the left and right cameras of the three-dimensional scanner of FIG. 1;

FIG. 3 shows the stripe images obtained by back-projecting the three-dimensional coordinates of the depth map onto the images of the left and right cameras;

FIG. 4 is a schematic diagram of the epipolar geometric constraint.
Detailed Description
Specific embodiments of the present invention are described in further detail below with reference to the drawings. It should be understood that the specific embodiments described here are illustrative only and are not intended to limit the scope of the invention.
Referring to FIG. 1, an embodiment of the present invention provides a three-dimensional scanning system for acquiring or collecting three-dimensional point cloud data of a measured object 106. The three-dimensional scanning system includes a light source 101, a three-dimensional module 102, a left camera 103, a right camera 104 and a data processing unit 105. The type of the three-dimensional scanning system is not limited; preferably, it is a handheld multi-stripe binocular three-dimensional scanning system.
The relative positions of the light source 101, the three-dimensional module 102, the left camera 103 and the right camera 104 are not limited, as long as the measured object 106 can be illuminated and imaged, and during operation the positions of the light source 101, the three-dimensional module 102, the left camera 103 and the right camera 104 remain fixed relative to one another. Preferably, the light source 101 is disposed midway between the left camera 103 and the right camera 104, and the three-dimensional module 102 is disposed between the light source 101 and the left camera 103. The stripe pattern projected by the light source 101 is not limited and is preferably a digital simulated laser stripe pattern. The number of stripes is not limited, but to improve the scanning efficiency it usually needs to be greater than 15; in this embodiment the number of stripes is greater than 80. It can be understood that when the number of stripes is small, additional marker points have to be attached to the measured object 106, whereas when the number of stripes is large, no additional marker points need to be attached. The structure of the light source 101 is not limited, as long as it can project a stripe pattern onto the measured object 106. Preferably, the light source 101 is a laser or a projector. In this embodiment, the light source 101 is a digital projector, and the stripe pattern includes a simulated laser stripe pattern, a laser stripe pattern, and the like.
The left and right cameras 103 and 104 are used to synchronously acquire the 2D left image and the 2D right image of the measured object. The types of the left and right cameras 103, 104 are not limited, as long as two-dimensional images of the measured object 106 can be acquired. It can be understood that the stripe pattern projected by the light source 101 onto the measured object 106 is deformed by the height modulation of the measured object 106, producing a modulated stripe pattern, and the left and right cameras 103, 104 obtain the left and right images by acquiring this modulated stripe pattern.
The three-dimensional module 102 is used to synchronously acquire the depth map of the measured object 106. The type of the three-dimensional module 102 is not limited as long as the depth map can be acquired; a three-dimensional scanning module is a commonly used type of three-dimensional module 102. To reduce cost, this embodiment uses a low-resolution three-dimensional scanning module 102. It can be understood that the three-dimensional module 102 and the left and right cameras 103, 104 need to acquire images synchronously; synchronous acquisition means that the positions of the three-dimensional module 102 and the left and right cameras 103, 104 remain fixed during the acquisition, while the acquisition instants are not restricted. Of course, to avoid mutual interference, when the three-dimensional module 102 is a three-dimensional scanning module and the wavelength of the light it emits is equal or nearly equal to that of the light emitted by the light source 101, the acquisition instants of the three-dimensional module 102 and of the left and right cameras 103, 104 must be staggered. When the three-dimensional module 102 is a three-dimensional scanning module and the wavelength of the light it emits differs from that of the light source 101, the three-dimensional module 102 and the left and right cameras 103, 104 can acquire simultaneously.
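The acquisition timing described above can be summarized by a small scheduling sketch; the projector and depth-module trigger interface shown here is hypothetical and only illustrates the staggered-versus-simultaneous logic.

```python
def acquire_frame(projector, depth_module, stereo_cameras, same_wavelength):
    """One acquisition cycle: stagger the depth module and the stripe projector
    only when their wavelengths would interfere."""
    if same_wavelength:
        projector.on()
        left, right = stereo_cameras.trigger()   # stripe images first
        projector.off()
        depth = depth_module.trigger()           # depth map with the light source off
    else:
        projector.on()
        left, right = stereo_cameras.trigger()   # both sources can run together
        depth = depth_module.trigger()
        projector.off()
    return left, right, depth
```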
It can be understood that the light source 101, the three-dimensional module 102 and the left and right cameras 103, 104 can be combined into a three-dimensional digital imaging sensor for scanning the measured object 106, in which the spatial relationship between the left and right cameras and the three-dimensional module is known and fixed. This three-dimensional digital imaging sensor differs from existing three-dimensional digital imaging sensors in that it includes a three-dimensional module 102 capable of acquiring a depth image, which can greatly improve the imaging accuracy.
The data processing unit 105 is used to guide the matching of the left and right image stripes according to the depth map and to reconstruct the matched corresponding stripes of the left and right images into three-dimensional point cloud data. Specifically, the data processing unit 105 includes a stripe matching module and a three-dimensional reconstruction module.
Specifically, since the depth map in effect contains a set of coordinates of the measured object 106, projecting this coordinate set onto the left and right images through the calibrated intrinsic parameters of the three-dimensional module 102 and of the left and right cameras 103, 104 assigns coordinates from this set to the points of the stripes in the left and right images. Because each coordinate in the set corresponds to one and only one point on the measured object 106, points or stripes with the same coordinate in the left and right images can be matched. In other words, the stripe matching of the left and right images can be achieved under the guidance of the depth map, with very high accuracy and strong robustness to interference.
The three-dimensional reconstruction module is used to take the matched corresponding stripes of the left and right images, find the correspondence of individual points on the corresponding stripe center-line segments using the epipolar geometric constraint of the left and right cameras, and then reconstruct the corresponding points into three-dimensional point cloud data according to the calibration parameters. The way or technique by which the three-dimensional reconstruction module reconstructs the three-dimensional point cloud data is not limited, as long as the matched left and right images can be reconstructed into three-dimensional point cloud data.
A scanning method for acquiring or collecting the three-dimensional point cloud data of the measured object 106 with the above three-dimensional scanning system comprises the following steps:
(1) Device construction: a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source is built, and the relative positions of the three-dimensional module, the two cameras and the light source are fixed.
(2) System calibration: the left and right cameras and the three-dimensional module are calibrated to obtain the calibration parameters. The system calibration further comprises the following step: calibrating the left and right cameras to obtain the intrinsic and extrinsic parameters of the cameras and the rotation-translation matrix Mc corresponding to the relative position between the two cameras, while also calibrating the rotation-translation matrix Ms corresponding to the relative position between the three-dimensional module and the left camera.
(3) Projection and image acquisition: a stripe pattern is generated and projected onto the measured object with the light source; the stripe pattern is deformed by the height modulation of the measured object, producing a modulated stripe pattern; the left and right cameras synchronously acquire the modulated stripe pattern to obtain the left and right images, and the three-dimensional module synchronously acquires the depth map of the measured object. Preferably, the three-dimensional module is a three-dimensional scanning module. When the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further comprise the following steps: the light source projects a stripe pattern onto the measured object and the left and right cameras acquire the left and right images; the light source is then turned off, the three-dimensional module emits light toward the measured object, and the three-dimensional depth image is acquired. When the wavelength of the light emitted by the three-dimensional scanning module differs from that of the light source, the projection and image acquisition further comprise the following step: the light source and the three-dimensional module illuminate the measured object at the same time, and the left and right cameras and the three-dimensional module simultaneously acquire the left and right images and the three-dimensional depth image.
(4) Stripe matching: the matching of the left and right image stripes is guided by the depth map. The stripe matching further comprises back-projecting the depth map onto the left and right images in one pass, thereby achieving accurate matching of the left and right image line segments or stripes. Specifically, it comprises the following steps: a. extracting the center lines of the stripes on the left and right camera images and segmenting each connected center-line domain into a number of independent line segments; b. converting the depth map acquired by the three-dimensional module into three-dimensional point cloud coordinates (pi) in its own coordinate system according to the corresponding calibrated intrinsic parameters; c. converting (pi) into three-dimensional point cloud coordinates (qi) in the left camera coordinate system according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera; d. back-projecting the three-dimensional point cloud coordinates (qi) onto the left and right images according to the respective intrinsic parameters of the left and right cameras, each projected point carrying a sequence number, so as to form a lookup table of corresponding left and right image coordinates; e. traversing the sequence numbers of the points of each stripe line segment on the left image, so that the matching stripe line segment on the right image can be found directly from the lookup table, thereby achieving accurate matching of the left and right image line segments or stripes.
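Step a can be sketched as follows: a gray-level centroid is taken for each bright run in every image row, and the centers of consecutive rows are linked into independent segments. The threshold, the linking tolerance and the assumption of roughly vertical stripes are illustrative choices, not requirements of the patent.

```python
import numpy as np

def stripe_centerlines(img, threshold=40, max_col_jump=1.5):
    """Extract sub-pixel stripe centers row by row (gray-level centroid of each
    bright run) and link them into independent center-line segments, each a
    list of (u, v) points.  Assumes roughly vertical stripes."""
    h, w = img.shape
    open_segs, finished = [], []
    for v in range(h):
        row = img[v, :].astype(float)
        bright = row > threshold
        centers = []
        u = 0
        while u < w:                              # locate each bright run in this row
            if bright[u]:
                start = u
                while u < w and bright[u]:
                    u += 1
                run = row[start:u]
                centers.append(start + float(np.average(np.arange(len(run)), weights=run)))
            else:
                u += 1
        next_open, used = [], [False] * len(open_segs)
        for c in centers:                         # try to continue a segment from the previous row
            for k, seg in enumerate(open_segs):
                if not used[k] and abs(seg[-1][0] - c) <= max_col_jump:
                    seg.append((c, v))
                    used[k] = True
                    next_open.append(seg)
                    break
            else:
                next_open.append([(c, v)])        # no neighbor found: start a new segment
        finished += [seg for k, seg in enumerate(open_segs) if not used[k]]
        open_segs = next_open
    return finished + open_segs
```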
(5) Three-dimensional reconstruction: the matched corresponding stripes of the left and right images are taken, the correspondence of individual points on the corresponding stripe center-line segments is found using the epipolar geometric constraint of the left and right cameras, and the corresponding points are then reconstructed into three-dimensional point cloud data according to the calibration parameters. The three-dimensional reconstruction further comprises the following steps: taking the matched corresponding stripe center-line segments of the left and right images, finding the correspondence of individual points on the corresponding stripe center-line segments using the epipolar geometric constraint of the left and right cameras, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
To further illustrate the three-dimensional scanning system and its scanning method, a specific example is described below.
Referring to FIG. 1, the structure of an actually designed handheld multi-stripe binocular three-dimensional scanning system is shown in FIG. 1: 101 is a digital projector, 102 is a low-resolution three-dimensional scanning module, 103 is the left camera, 104 is the right camera, 105 is a computer, and 106 is the measured object.
The calibrated intrinsic parameters of the left camera are:
K1 = [2271.084,     0,       645.632,
          0,     2265.112,   511.553,
          0,        0,          1]
The intrinsic parameters of the right camera are:
K2 = [2275.181,     0,       644.405,
          0,     2270.321,   510.053,
          0,        0,          1]
The system structure parameters between the left camera and the right camera are:
R = [ 8.749981e-001,  6.547051e-003,  4.840819e-001,
     -2.904034e-003,  9.999615e-001, -8.274993e-003,
     -4.841175e-001,  5.834813e-003,  8.749835e-001]
T = [-1.778995e+002, -4.162821e-001,  5.074737e+001]
The intrinsic parameters of the low-resolution three-dimensional scanning module are:
Ks = [476.927,    0,      312.208,
         0,     475.927,  245.949,
         0,       0,        1]
The system structure parameters between the low-resolution three-dimensional scanning module and the left camera are:
Rs = [ 9.98946971e-001,  4.44611477e-002, -1.13205701e-002,
      -4.54442748e-002,  9.92786812e-001, -1.10946668e-001,
       6.30609650e-003,  1.11344293e-001,  9.93761884e-001]
Ts = [9.13387457e+001,  2.81182536e+001,  1.79046857e+000]
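For orientation, the calibration values listed above can be collected into matrix form as in the short NumPy sketch below. The only added assumption is the common convention that the left camera is the reference frame, so that P_left = K1·[I|0] and P_right = K2·[R|T]; the trailing lines push one arbitrary, purely illustrative module depth sample (not taken from the patent) through the chain to show how a left/right pixel pair, i.e. one lookup-table entry, is produced.

```python
import numpy as np

# Intrinsic parameters of the left camera, right camera and low-resolution 3D module (values above).
K1 = np.array([[2271.084, 0.0, 645.632], [0.0, 2265.112, 511.553], [0.0, 0.0, 1.0]])
K2 = np.array([[2275.181, 0.0, 644.405], [0.0, 2270.321, 510.053], [0.0, 0.0, 1.0]])
Ks = np.array([[476.927, 0.0, 312.208], [0.0, 475.927, 245.949], [0.0, 0.0, 1.0]])

# Rotation/translation between the left and right cameras, and between the module and the left camera.
R = np.array([[ 8.749981e-001, 6.547051e-003,  4.840819e-001],
              [-2.904034e-003, 9.999615e-001, -8.274993e-003],
              [-4.841175e-001, 5.834813e-003,  8.749835e-001]])
T = np.array([-1.778995e+002, -4.162821e-001, 5.074737e+001])
Rs = np.array([[ 9.98946971e-001, 4.44611477e-002, -1.13205701e-002],
               [-4.54442748e-002, 9.92786812e-001, -1.10946668e-001],
               [ 6.30609650e-003, 1.11344293e-001,  9.93761884e-001]])
Ts = np.array([9.13387457e+001, 2.81182536e+001, 1.79046857e+000])

# Assumed convention: the left camera is the reference frame.
P_left = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K2 @ np.hstack([R, T.reshape(3, 1)])
Ms = np.hstack([Rs, Ts.reshape(3, 1)])                 # module-to-left-camera rotation-translation matrix

# Illustrative only: one module pixel (u, v) with depth d (values are not from the patent).
u, v, d = 320.0, 240.0, 400.0
p = d * (np.linalg.inv(Ks) @ np.array([u, v, 1.0]))    # 3D point in the module's own frame
q = Rs @ p + Ts                                        # same point in the left-camera frame
uv_left = (K1 @ q)[:2] / (K1 @ q)[2]                   # where this depth sample lands in the left image
uv_right = (K2 @ (R @ q + T))[:2] / (K2 @ (R @ q + T))[2]   # ... and in the right image
print(uv_left, uv_right)                               # one entry of the serial-number lookup table
```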
Following the steps described above, a digitally simulated laser stripe pattern is projected onto the measured object 106 and is synchronously acquired by the left and right cameras and the low-resolution three-dimensional scanning module, yielding the stripe images together with a low-resolution depth map. As shown in FIG. 2, the depth map is converted into three-dimensional coordinates using the intrinsic parameters of the low-resolution three-dimensional scanning module, and these three-dimensional coordinates are then back-projected in turn onto the images of the left and right cameras according to the calibration parameters, as shown in FIG. 3; serial numbers are assigned to the corresponding points in the left and right images to form a serial-number lookup table. The stripe centers in the left and right camera images are extracted and segmented into connected domains, and the stripe line segments are matched according to the serial-number lookup table. For the matched line-segment pairs, corresponding points are found according to the epipolar geometric constraint of the two cameras, as shown in FIG. 4, and three-dimensional reconstruction is then performed according to the calibration parameters to generate point cloud data.
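The centre-line extraction and connected-domain segmentation mentioned in this walk-through can likewise be sketched. The routine below is a rough illustration only: it assumes roughly vertical stripes so that each image row crosses every stripe once, uses a fixed intensity threshold and a grey-gravity centroid per crossing rather than a sub-pixel line detector (such as Steger's method), and relies on scipy.ndimage for the connected-component labelling; none of these choices is prescribed by the patent.

```python
import numpy as np
from scipy import ndimage

def stripe_centerline_segments(gray, thresh=30.0):
    """Rough sketch of centre-line extraction plus connected-domain segmentation.
    gray is a 2D grayscale image; thresh is an assumed intensity threshold.
    Returns a label image of independent centre-line segments and the segment count."""
    h, w = gray.shape
    center_mask = np.zeros((h, w), dtype=bool)
    for row in range(h):
        line = gray[row].astype(float)
        runs, n_runs = ndimage.label(line > thresh)        # one run of bright pixels per stripe crossing
        for k in range(1, n_runs + 1):
            cols = np.flatnonzero(runs == k)
            weights = line[cols]
            u = int(round(np.sum(cols * weights) / np.sum(weights)))   # grey-gravity centre of the run
            center_mask[row, u] = True
    # connected-domain segmentation of the centre-line mask into independent line segments
    seg_labels, n_segments = ndimage.label(center_mask, structure=np.ones((3, 3), dtype=int))
    return seg_labels, n_segments
```

Each nonzero label in the returned image then stands for one independent centre-line segment, which is what the serial-number lookup table subsequently matches between the left and right views.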
In the three-dimensional scanner and the scanning method of the present invention, three-dimensional point cloud data are obtained by using the depth map of the measured object to guide the stripe matching between the left and right images. Compared with a conventional three-dimensional scanner, this three-dimensional scanner has the following advantages: 1. the precision or accuracy of the stripe matching is high, so that the scanning efficiency of the three-dimensional scanning system can be improved by increasing the number of matched stripes; 2. when the stripes are numerous and dense, the depth map alone suffices to guide the stripe matching between the left and right images, so there is no need to attach additional marker points to the measured object, and real-time stitching can be achieved without marker points; 3. there is no need to calibrate the stripe light planes, i.e. no light plane is required to guide the matching of the left and right images, so the requirements on the installation accuracy of the relative positions of the hardware are low, which reduces the system cost.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (15)

  1. A three-dimensional scanning system for acquiring three-dimensional point cloud data of a measured object, comprising:
    a light source, configured to project a plurality of stripe patterns onto the measured object;
    left and right cameras, configured to synchronously acquire a 2D left image and a 2D right image of the measured object;
    a three-dimensional module, configured to synchronously acquire a depth map of the measured object;
    a stripe matching module, configured to match the left and right image stripes under the guidance of the depth map;
    a three-dimensional reconstruction module, configured to reconstruct the matched corresponding stripes of the left and right images into three-dimensional point cloud data.
  2. The three-dimensional scanning system according to claim 1, wherein the light source comprises a laser or a projector; when the light source is a projector, the projector is a digital projector, and the stripe patterns comprise a simulated laser stripe pattern, a laser stripe pattern, or the like.
  3. The three-dimensional scanning system according to claim 2, wherein the number of stripes in each stripe pattern is greater than 15.
  4. The three-dimensional scanning system according to claim 1, wherein the three-dimensional module is a low-resolution three-dimensional scanning module.
  5. The three-dimensional scanning system according to claim 1, wherein the three-dimensional scanning system is a handheld three-dimensional scanning system.
  6. A three-dimensional digital imaging sensor for scanning a measured object, comprising:
    a light source, configured to project a plurality of stripe patterns onto the measured object;
    left and right cameras, configured to acquire a 2D left image and a 2D right image of the measured object;
    a three-dimensional module, configured to acquire a depth image of the measured object;
    wherein the spatial relationship between the left and right cameras and the three-dimensional module is known and fixed;
    and the stripe patterns and the depth image appear in the acquired images.
  7. A three-dimensional scanning system for acquiring three-dimensional point cloud data of a measured object, comprising:
    a light source, configured to project a plurality of stripe patterns onto the measured object;
    left and right cameras, configured to synchronously acquire a 2D left image and a 2D right image of the measured object;
    a three-dimensional module, configured to synchronously acquire a depth map of the measured object;
    a data processing unit, configured to match the left and right image stripes under the guidance of the depth map, and to reconstruct the matched corresponding stripes of the left and right images into three-dimensional point cloud data.
  8. A three-dimensional scanning method, comprising the following steps:
    (1) device construction: constructing a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source, the relative positions of the three-dimensional module, the two cameras and the light source being fixed;
    (2) system calibration: calibrating the left and right cameras and the three-dimensional module to obtain calibration parameters;
    (3) projection and image acquisition: generating a stripe pattern and projecting it onto the measured object with the light source, the stripe pattern being deformed by the height modulation of the measured object to produce a modulated stripe pattern; synchronously acquiring the modulated stripe pattern with the left and right cameras to obtain left and right images, and synchronously acquiring a depth map of the measured object with the three-dimensional module;
    (4) stripe matching: matching the left and right image stripes under the guidance of the depth map;
    (5) three-dimensional reconstruction: for the corresponding stripes matched between the left and right images, using the epipolar geometric constraint of the left and right cameras to find the correspondence of individual points on the corresponding stripe center-line segments, and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters.
  9. The three-dimensional scanning method according to claim 8, wherein the system calibration further comprises the following step: calibrating the left and right cameras to obtain the intrinsic and extrinsic parameters of the cameras and the rotation-translation matrix Mc corresponding to the relative position between the cameras, and simultaneously calibrating the rotation-translation matrix Ms corresponding to the relative positional relationship between the three-dimensional module and the left camera.
  10. The three-dimensional scanning method according to claim 8, wherein the three-dimensional module is a three-dimensional scanning module.
  11. The three-dimensional scanning method according to claim 10, wherein, when the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further comprises the following steps: the light source projects a stripe pattern onto the measured object, and the left and right cameras respectively acquire the left and right images; the light source is then turned off, and the three-dimensional module emits light toward the measured object and acquires the three-dimensional depth image.
  12. The three-dimensional scanning method according to claim 10, wherein, when the wavelength of the light emitted by the three-dimensional scanning module is not equal to the wavelength emitted by the light source, the projection and image acquisition further comprises the following step: the light source and the three-dimensional module simultaneously project onto the measured object, and the left and right cameras and the three-dimensional module simultaneously acquire the left and right images and the three-dimensional depth image.
  13. The three-dimensional scanning method according to claim 9, wherein the stripe matching further comprises the following step: back-projecting the depth map into the left and right images in a single pass, thereby achieving accurate matching of the left and right image line segments or stripes.
  14. The three-dimensional scanning method according to claim 9, wherein the stripe matching further comprises the following steps:
    extracting the center lines of the stripes in the left and right camera images, and then segmenting the connected domain of each center line into a plurality of independent line segments;
    converting the depth map acquired by the three-dimensional module into three-dimensional point cloud coordinates (pi) in its own coordinate system according to the corresponding calibrated intrinsic parameters;
    converting (pi) into three-dimensional point cloud coordinates (qi) in the left camera coordinate system according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera;
    back-projecting the three-dimensional point cloud coordinates (qi) in turn onto the left and right images according to the respective intrinsic parameters of the left and right cameras, each corresponding point being assigned a serial number, thereby forming a lookup table of corresponding left and right image coordinates;
    traversing the serial number corresponding to each point of each stripe line segment in the left image, and directly finding the matching stripe line segment in the right image from the lookup table, thereby achieving accurate matching of the left and right image line segments or stripes.
  15. The three-dimensional scanning method according to claim 14, wherein the three-dimensional reconstruction further comprises the following step: for the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the left and right cameras to find the correspondence of individual points on the corresponding stripe center-line segments, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
PCT/CN2016/112118 2016-12-05 2016-12-26 Three-dimensional digital imaging sensor, and three-dimensional scanning system and scanning method thereof WO2018103152A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611102629.2 2016-12-05
CN201611102629.2A CN108151671B (en) 2016-12-05 2016-12-05 A kind of 3 D digital imaging sensor, 3 D scanning system and its scan method

Publications (1)

Publication Number Publication Date
WO2018103152A1 true WO2018103152A1 (en) 2018-06-14

Family

ID=62470859

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/112118 WO2018103152A1 (en) 2016-12-05 2016-12-26 Three-dimensional digital imaging sensor, and three-dimensional scanning system and scanning method thereof

Country Status (2)

Country Link
CN (1) CN108151671B (en)
WO (1) WO2018103152A1 (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242956A (en) * 2018-08-19 2019-01-18 雅派朗迪(北京)科技发展股份有限公司 3D body scans digital modeling vehicle
CN109903382A (en) * 2019-03-20 2019-06-18 中煤航测遥感集团有限公司 The fusion method and device of point cloud data
CN110244302A (en) * 2019-07-05 2019-09-17 苏州科技大学 Ground Synthetic Aperture Radar images cell coordinate three-dimension varying method
CN110243307A (en) * 2019-04-15 2019-09-17 深圳市易尚展示股份有限公司 A kind of automatized three-dimensional colour imaging and measuring system
CN110517323A (en) * 2019-08-16 2019-11-29 中铁第一勘察设计院集团有限公司 3 D positioning system and method based on manipulator one camera multi-vision visual
CN111008602A (en) * 2019-12-06 2020-04-14 青岛海之晨工业装备有限公司 Two-dimensional and three-dimensional visual combined lineation feature extraction method for small-curvature thin-wall part
CN111127625A (en) * 2019-10-08 2020-05-08 新拓三维技术(深圳)有限公司 Foot scanning method, system and device
CN111750805A (en) * 2020-07-06 2020-10-09 山东大学 Three-dimensional measuring device and method based on binocular camera imaging and structured light technology
CN111932565A (en) * 2019-05-13 2020-11-13 中国科学院沈阳自动化研究所 Multi-target identification tracking resolving method
CN112200911A (en) * 2020-11-06 2021-01-08 北京易达恩能科技有限公司 Region overlapping type three-dimensional map construction method and device combined with markers
CN112465912A (en) * 2020-11-18 2021-03-09 新拓三维技术(深圳)有限公司 Three-dimensional camera calibration method and device
CN112509057A (en) * 2020-11-30 2021-03-16 北京百度网讯科技有限公司 Camera external parameter calibration method and device, electronic equipment and computer readable medium
CN112530008A (en) * 2020-12-25 2021-03-19 中国科学院苏州纳米技术与纳米仿生研究所 Method, device and equipment for determining parameters of stripe structured light and storage medium
CN113008164A (en) * 2021-03-23 2021-06-22 南京理工大学 Rapid high-precision three-dimensional surface shape reconstruction method
CN113034603A (en) * 2019-12-09 2021-06-25 百度在线网络技术(北京)有限公司 Method and device for determining calibration parameters
CN113313746A (en) * 2020-12-01 2021-08-27 湖南长天自控工程有限公司 Method and system for stockpile warehouse
CN113483668A (en) * 2021-08-19 2021-10-08 广东亚太新材料科技有限公司 Method and system for detecting size of carbon fiber composite product
CN113655064A (en) * 2021-08-11 2021-11-16 合肥工业大学 Appearance defect multi-mode visual detection sensor
CN114119747A (en) * 2021-11-23 2022-03-01 四川大学 Three-dimensional flow field flow display method based on PMD wavefront detection
CN114332349A (en) * 2021-11-17 2022-04-12 浙江智慧视频安防创新中心有限公司 Binocular structured light edge reconstruction method and system and storage medium
CN114681089A (en) * 2020-12-31 2022-07-01 先临三维科技股份有限公司 Three-dimensional scanning device and method
CN114739312A (en) * 2022-04-26 2022-07-12 黄晓明 Hand-held type road surface structure degree of depth laser survey device
CN114909993A (en) * 2022-04-26 2022-08-16 泰州市创新电子有限公司 High-precision laser projection visual three-dimensional measurement system
CN114998499A (en) * 2022-06-08 2022-09-02 深圳大学 Binocular three-dimensional reconstruction method and system based on line laser galvanometer scanning
CN115068833A (en) * 2021-03-15 2022-09-20 湖南华创医疗科技有限公司 Positioning device for beam blocker and radiotherapy system
WO2023019833A1 (en) * 2021-08-18 2023-02-23 梅卡曼德(北京)机器人科技有限公司 Laser line scanning-based point cloud processing method and apparatus
WO2023207756A1 (en) * 2022-04-28 2023-11-02 杭州海康机器人股份有限公司 Image reconstruction method and apparatus, and device

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109299662B (en) * 2018-08-24 2022-04-12 上海图漾信息科技有限公司 Depth data calculation device and method, and face recognition device
CN111508012B (en) * 2019-01-31 2024-04-19 先临三维科技股份有限公司 Method and device for line stripe mismatching detection and three-dimensional reconstruction
CN110047147A (en) * 2019-04-09 2019-07-23 易视智瞳科技(深圳)有限公司 A kind of 3D point cloud processing method, device, system and computer storage medium
CN109900221A (en) * 2019-04-12 2019-06-18 杭州思看科技有限公司 A kind of handheld three-dimensional scanning system
CN110702025B (en) * 2019-05-30 2021-03-19 北京航空航天大学 Grating type binocular stereoscopic vision three-dimensional measurement system and method
WO2020259625A1 (en) * 2019-06-28 2020-12-30 先临三维科技股份有限公司 Three-dimensional scanning method, scanner, three-dimensional scanning system, computer device, and computer-readable storage medium
CN110764841B (en) * 2019-10-10 2024-01-19 珠海格力智能装备有限公司 3D visual application development platform and development method
CN111047692A (en) * 2019-12-23 2020-04-21 武汉华工激光工程有限责任公司 Three-dimensional modeling method, device and equipment and readable storage medium
CN111462331B (en) * 2020-03-31 2023-06-27 四川大学 Lookup table method for expanding epipolar geometry and calculating three-dimensional point cloud in real time
CN112330732A (en) * 2020-09-29 2021-02-05 先临三维科技股份有限公司 Three-dimensional data splicing method, three-dimensional scanning system and handheld scanner
CN112747671B (en) * 2020-12-10 2022-12-09 杭州先临天远三维检测技术有限公司 Three-dimensional detection system and three-dimensional detection method
CN113219489B (en) * 2021-05-13 2024-04-16 深圳数马电子技术有限公司 Point-to-point determination method, device, computer equipment and storage medium for multi-line laser
CN113963115A (en) * 2021-10-28 2022-01-21 山东大学 High dynamic range laser three-dimensional scanning method based on single frame image
WO2023179782A1 (en) * 2022-03-25 2023-09-28 先临三维科技股份有限公司 Three-dimensional scanning system, method and apparatus, and mobile computing module
CN117522940A (en) * 2022-07-27 2024-02-06 梅卡曼德(北京)机器人科技有限公司 Three-dimensional laser camera, calibration method and method for acquiring color point cloud image
CN115345995A (en) * 2022-08-10 2022-11-15 先临三维科技股份有限公司 Three-dimensional reconstruction method, device and system
CN115984371A (en) * 2022-11-25 2023-04-18 杭州天远三维检测技术有限公司 Scanning head posture detection method, device, equipment and medium
CN116664796B (en) * 2023-04-25 2024-04-02 北京天翔睿翼科技有限公司 Lightweight head modeling system and method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070057946A1 (en) * 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
JP4577126B2 (en) * 2005-07-08 2010-11-10 オムロン株式会社 Projection pattern generation apparatus and generation method for stereo correspondence
CN101853528B (en) * 2010-05-10 2011-12-07 沈阳雅克科技有限公司 Hand-held three-dimensional surface information extraction method and extractor thereof
CN103900494B (en) * 2014-03-31 2016-06-08 中国科学院上海光学精密机械研究所 For the homologous points fast matching method of binocular vision 3 D measurement
CN103954239A (en) * 2014-05-08 2014-07-30 青岛三友智控科技有限公司 Three-dimensional measurement system and method
CN204043632U (en) * 2014-08-29 2014-12-24 西安新拓三维光测科技有限公司 A kind of structure based light multiple plate three-dimensional measurement instrument apparatus
CN105115445A (en) * 2015-09-14 2015-12-02 杭州光珀智能科技有限公司 Three-dimensional imaging system and imaging method based on combination of depth camera and binocular vision
CN106091987A (en) * 2016-06-14 2016-11-09 中国科学院上海光学精密机械研究所 Based on the large scale optical blank method for three-dimensional measurement that speckle time domain is relevant

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101245998A (en) * 2008-02-01 2008-08-20 黑龙江科技学院 Imaging method of three-dimensional measuring system
CN101608908A (en) * 2009-07-20 2009-12-23 杭州先临三维科技股份有限公司 The three-dimension digital imaging method that digital speckle projection and phase measuring profilometer combine
CN101739717A (en) * 2009-11-12 2010-06-16 天津汇信软件有限公司 Non-contact scanning method for three-dimensional colour point clouds
US20140253929A1 (en) * 2011-10-18 2014-09-11 Nanyang Technological University Apparatus and method for 3d surface measurement
CN105869167A (en) * 2016-03-30 2016-08-17 天津大学 High-resolution depth map acquisition method based on active and passive fusion

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242956B (en) * 2018-08-19 2023-05-23 雅派朗迪(北京)科技发展股份有限公司 3D human body scanning digital modeling vehicle
CN109242956A (en) * 2018-08-19 2019-01-18 雅派朗迪(北京)科技发展股份有限公司 3D body scans digital modeling vehicle
CN109903382A (en) * 2019-03-20 2019-06-18 中煤航测遥感集团有限公司 The fusion method and device of point cloud data
CN109903382B (en) * 2019-03-20 2023-05-23 中煤航测遥感集团有限公司 Point cloud data fusion method and device
CN110243307A (en) * 2019-04-15 2019-09-17 深圳市易尚展示股份有限公司 A kind of automatized three-dimensional colour imaging and measuring system
CN111932565A (en) * 2019-05-13 2020-11-13 中国科学院沈阳自动化研究所 Multi-target identification tracking resolving method
CN111932565B (en) * 2019-05-13 2023-09-19 中国科学院沈阳自动化研究所 Multi-target recognition tracking calculation method
CN110244302A (en) * 2019-07-05 2019-09-17 苏州科技大学 Ground Synthetic Aperture Radar images cell coordinate three-dimension varying method
CN110244302B (en) * 2019-07-05 2023-02-17 苏州科技大学 Three-dimensional transformation method for image pixel coordinates of foundation synthetic aperture radar
CN110517323A (en) * 2019-08-16 2019-11-29 中铁第一勘察设计院集团有限公司 3 D positioning system and method based on manipulator one camera multi-vision visual
CN111127625B (en) * 2019-10-08 2024-01-12 新拓三维技术(深圳)有限公司 Foot scanning method, system and device
CN111127625A (en) * 2019-10-08 2020-05-08 新拓三维技术(深圳)有限公司 Foot scanning method, system and device
CN111008602A (en) * 2019-12-06 2020-04-14 青岛海之晨工业装备有限公司 Two-dimensional and three-dimensional visual combined lineation feature extraction method for small-curvature thin-wall part
CN111008602B (en) * 2019-12-06 2023-07-25 青岛海之晨工业装备有限公司 Scribing feature extraction method combining two-dimensional vision and three-dimensional vision for small-curvature thin-wall part
CN113034603A (en) * 2019-12-09 2021-06-25 百度在线网络技术(北京)有限公司 Method and device for determining calibration parameters
CN113034603B (en) * 2019-12-09 2023-07-14 百度在线网络技术(北京)有限公司 Method and device for determining calibration parameters
CN111750805A (en) * 2020-07-06 2020-10-09 山东大学 Three-dimensional measuring device and method based on binocular camera imaging and structured light technology
CN112200911B (en) * 2020-11-06 2024-05-28 北京易达恩能科技有限公司 Method and device for constructing regional overlapping three-dimensional map by combining markers
CN112200911A (en) * 2020-11-06 2021-01-08 北京易达恩能科技有限公司 Region overlapping type three-dimensional map construction method and device combined with markers
CN112465912B (en) * 2020-11-18 2024-03-29 新拓三维技术(深圳)有限公司 Stereo camera calibration method and device
CN112465912A (en) * 2020-11-18 2021-03-09 新拓三维技术(深圳)有限公司 Three-dimensional camera calibration method and device
CN112509057A (en) * 2020-11-30 2021-03-16 北京百度网讯科技有限公司 Camera external parameter calibration method and device, electronic equipment and computer readable medium
CN112509057B (en) * 2020-11-30 2024-04-12 北京百度网讯科技有限公司 Camera external parameter calibration method, device, electronic equipment and computer readable medium
US11875535B2 (en) 2020-11-30 2024-01-16 Beijing Baidu Netcom Science And Technology Co., Ltd. Method, apparatus, electronic device and computer readable medium for calibrating external parameter of camera
CN113313746A (en) * 2020-12-01 2021-08-27 湖南长天自控工程有限公司 Method and system for stockpile warehouse
CN112530008A (en) * 2020-12-25 2021-03-19 中国科学院苏州纳米技术与纳米仿生研究所 Method, device and equipment for determining parameters of stripe structured light and storage medium
CN114681089A (en) * 2020-12-31 2022-07-01 先临三维科技股份有限公司 Three-dimensional scanning device and method
CN114681089B (en) * 2020-12-31 2023-06-06 先临三维科技股份有限公司 Three-dimensional scanning device and method
CN115068833A (en) * 2021-03-15 2022-09-20 湖南华创医疗科技有限公司 Positioning device for beam blocker and radiotherapy system
CN115068833B (en) * 2021-03-15 2024-02-06 湖南华创医疗科技有限公司 Positioning device for beam stopper and radiation therapy system
CN113008164A (en) * 2021-03-23 2021-06-22 南京理工大学 Rapid high-precision three-dimensional surface shape reconstruction method
CN113655064A (en) * 2021-08-11 2021-11-16 合肥工业大学 Appearance defect multi-mode visual detection sensor
CN113655064B (en) * 2021-08-11 2023-08-08 合肥工业大学 Appearance defect multimode visual detection sensor
WO2023019833A1 (en) * 2021-08-18 2023-02-23 梅卡曼德(北京)机器人科技有限公司 Laser line scanning-based point cloud processing method and apparatus
CN113483668A (en) * 2021-08-19 2021-10-08 广东亚太新材料科技有限公司 Method and system for detecting size of carbon fiber composite product
CN114332349B (en) * 2021-11-17 2023-11-03 浙江视觉智能创新中心有限公司 Binocular structured light edge reconstruction method, system and storage medium
CN114332349A (en) * 2021-11-17 2022-04-12 浙江智慧视频安防创新中心有限公司 Binocular structured light edge reconstruction method and system and storage medium
CN114119747A (en) * 2021-11-23 2022-03-01 四川大学 Three-dimensional flow field flow display method based on PMD wavefront detection
CN114909993A (en) * 2022-04-26 2022-08-16 泰州市创新电子有限公司 High-precision laser projection visual three-dimensional measurement system
CN114739312A (en) * 2022-04-26 2022-07-12 黄晓明 Hand-held type road surface structure degree of depth laser survey device
CN114739312B (en) * 2022-04-26 2024-04-23 黄晓明 Hand-held type road surface structure degree of depth laser survey device
WO2023207756A1 (en) * 2022-04-28 2023-11-02 杭州海康机器人股份有限公司 Image reconstruction method and apparatus, and device
CN114998499B (en) * 2022-06-08 2024-03-26 深圳大学 Binocular three-dimensional reconstruction method and system based on line laser galvanometer scanning
CN114998499A (en) * 2022-06-08 2022-09-02 深圳大学 Binocular three-dimensional reconstruction method and system based on line laser galvanometer scanning

Also Published As

Publication number Publication date
CN108151671B (en) 2019-10-25
CN108151671A (en) 2018-06-12

Similar Documents

Publication Publication Date Title
WO2018103152A1 (en) Three-dimensional digital imaging sensor, and three-dimensional scanning system and scanning method thereof
WO2018152929A1 (en) Three-dimensional scanning system and scanning method thereof
JP6564537B1 (en) 3D reconstruction method and apparatus using monocular 3D scanning system
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN107202554B (en) It is provided simultaneously with photogrammetric and 3-D scanning function hand-held large scale three-dimensional measurement beam scanner system
CN108267097B (en) Three-dimensional reconstruction method and device based on binocular three-dimensional scanning system
US8265376B2 (en) Method and system for providing a digital model of an object
US6781618B2 (en) Hand-held 3D vision system
CN106500628B (en) A kind of 3-D scanning method and scanner containing multiple and different long wavelength lasers
CA3022442C (en) Three-dimensional reconstruction method and device based on monocular three-dimensional scanning system
JP6429772B2 (en) 3D scanning and positioning system
WO2016037486A1 (en) Three-dimensional imaging method and system for human body
CN109727277B (en) Body surface positioning tracking method for multi-eye stereo vision
CN107860337B (en) Structured light three-dimensional reconstruction method and device based on array camera
CN108665535A (en) A kind of three-dimensional structure method for reconstructing and system based on coding grating structured light
CN105303572B (en) Based on the main depth information acquisition method passively combined
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN104680534B (en) Object depth information acquisition methods based on single frames composite shuttering
CN111780678A (en) Method for measuring diameter of track slab embedded sleeve
Xu et al. Three degrees of freedom global calibration method for measurement systems with binocular vision
CN113421286B (en) Motion capturing system and method
Ettl Introductory review on ‘Flying Triangulation’: a motion-robust optical 3D measurement principle
Zhang et al. Fusion of time-of-flight and phase shifting for high-resolution and low-latency depth sensing
Agrawal et al. RWU3D: Real World ToF and Stereo Dataset with High Quality Ground Truth
Kawasaki et al. Registration and entire shape acquisition for grid based active one-shot scanning techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16923462

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16923462

Country of ref document: EP

Kind code of ref document: A1