WO2018103152A1 - Three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method thereof - Google Patents

Three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method thereof Download PDF

Info

Publication number
WO2018103152A1
WO2018103152A1 (PCT/CN2016/112118, CN2016112118W)
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
module
stripe
image
measured
Prior art date
Application number
PCT/CN2016/112118
Other languages
English (en)
French (fr)
Inventor
赵晓波
王文斌
刘增艺
Original Assignee
杭州先临三维科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州先临三维科技股份有限公司 filed Critical 杭州先临三维科技股份有限公司
Publication of WO2018103152A1 publication Critical patent/WO2018103152A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • the invention relates to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method, in particular to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method for a handheld multi-striped three-dimensional scanning system.
  • Three-dimensional digital technology is an emerging interdisciplinary field active in international research in recent years, and is widely used in many fields such as reverse engineering, cultural relics protection, industrial inspection and virtual reality.
  • Handheld portable 3D scanners are widely used in 3D scanning for their convenience and flexibility.
  • The principle of existing handheld 3D scanners is mainly active stereo vision based on structured light.
  • The structured light can take many forms, such as infrared laser speckle, DLP (Digital Light Processing) projected speckle, DLP-projected analog laser stripes, and laser stripes.
  • Among these modes, handheld 3D scanners that use DLP-projected analog laser stripes or laser stripes as the structured light achieve the highest precision and the best scanning detail.
  • A three-dimensional reconstruction algorithm is then used to reconstruct the matched corresponding stripes and the corresponding marker-point centers according to the calibration parameters of the scanning system.
  • However, during scanning the matching of corresponding stripes on the left and right camera images is mainly guided by the light-plane or stripe-plane equations.
  • With that approach, the mismatch rate of corresponding stripes on the camera images rises significantly once the number of stripes exceeds 15, which increases noise and reduces the accuracy of the scanned data.
  • When the number of stripes is below 15, the scanning efficiency cannot be improved effectively. Therefore, under the inherent limit on the scanning frame rate, an effective way to improve scanning efficiency is to increase the number of stripes while also improving the accuracy of stripe matching.
  • The present invention provides a three-dimensional scanning system for acquiring three-dimensional point cloud data of an object to be measured, comprising: a light source for projecting a multi-stripe pattern onto the object to be measured; left and right cameras for synchronously acquiring a 2D left image and a 2D right image of the measured object; a three-dimensional module for synchronously acquiring a depth map of the measured object; a stripe matching module configured to guide the matching of the left- and right-image stripes according to the depth map; and a three-dimensional reconstruction module configured to take the matched corresponding stripes of the left and right images, use the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstruct the corresponding points into 3D point cloud data according to the calibration parameters of the 3D scanning system.
  • The light source may be a laser or a projector; when the light source is a projector, the projector is a digital projector, and the stripe pattern includes an analog laser stripe pattern, a laser stripe pattern, and the like.
  • the number of stripes in the stripe pattern is greater than 15.
  • the three-dimensional module is a low-resolution three-dimensional scanning module.
  • A three-dimensional digital imaging sensor for scanning an object to be measured comprises: a light source for projecting a multi-stripe pattern onto the object to be measured; left and right cameras for acquiring a 2D left image and a 2D right image of the measured object; and a three-dimensional module for acquiring a depth image of the measured object. The spatial relationship between the left and right cameras and the three-dimensional module is known and fixed, and the stripe pattern and the depth image appear in the acquired images.
  • A three-dimensional scanning method includes the following steps: (1) device construction: building a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source, with the relative positions between the three-dimensional module, the two cameras and the light source fixed; (2) system calibration: calibrating the left and right cameras and the three-dimensional module to obtain calibration parameters; (3) projection and image acquisition: generating a stripe pattern and projecting it onto the object to be measured with the light source, the pattern being deformed by the height modulation of the measured object to produce a modulated stripe pattern, the left and right cameras synchronously capturing the modulated pattern to obtain left and right images, and the three-dimensional module synchronously capturing the depth map of the measured object; (4) stripe matching: guiding the matching of the left- and right-image stripes according to the depth map; (5) three-dimensional reconstruction: taking the matched corresponding stripes of the left and right images, using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstructing the corresponding points into 3D point cloud data according to the calibration parameters.
  • The system calibration further includes: calibrating the left and right cameras to obtain their intrinsic and extrinsic parameters and the rotation-translation matrix Mc describing the relative position between the two cameras, and simultaneously calibrating the rotation-translation matrix Ms describing the relative position between the three-dimensional module and the left camera.
  • the three-dimensional module is a three-dimensional scanning module.
  • When the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further includes: the light source projects a stripe pattern onto the object to be measured and the left and right cameras capture the left and right images; the light source is then turned off, the 3D module emits light towards the measured object, and the 3D depth image is acquired.
  • When the wavelength of the light emitted by the three-dimensional scanning module differs from that of the light source, the projection and image acquisition further includes: the light source and the three-dimensional module project onto the object to be measured at the same time, and the left and right cameras and the 3D module simultaneously capture the left and right images and the 3D depth image.
  • The stripe matching further includes: extracting the center lines of the stripes on the left and right camera images, and splitting each connected center-line region into a number of independent line segments; converting the depth map acquired by the three-dimensional module, according to its calibrated intrinsic parameters, into 3D point cloud coordinates (pi) in the module's own coordinate system; and, according to the calibrated rotation-translation matrix Ms between the 3D module and the left camera, converting (pi) into 3D point cloud coordinates (qi) in the left camera coordinate system.
  • The three-dimensional point cloud coordinates (qi) are then back-projected onto the left and right images according to the respective intrinsic parameters of the left and right cameras; each corresponding point receives a serial number, forming a lookup table indexed by the left- and right-image coordinates. By traversing the serial number of every point of every stripe segment on the left image and consulting the lookup table, the matching stripe segment on the right image can be found directly, achieving accurate matching of the left and right image line segments or stripes.
  • The three-dimensional reconstruction further comprises: taking the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding center-line segments, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
  • Compared with the prior art, in this three-dimensional scanner and its scanning method the three-dimensional point cloud data are obtained by letting the depth map of the measured object guide the stripe matching between the left and right images.
  • Compared with conventional three-dimensional scanners, the scanner has the following advantages: 1. The accuracy of the stripe matching is high, so the scanning efficiency of the three-dimensional scanning system can be improved by increasing the number of matched stripes; 2. When the stripes are numerous and dense, the depth map alone can guide the stripe matching between the left and right images; 3. There is no need to calibrate the stripe light planes, i.e. no light plane is needed to guide the matching of the left and right images, so the required installation accuracy of the relative hardware positions is low, which reduces the system cost.
  • FIG. 1 is a schematic structural diagram of a three-dimensional scanner according to an embodiment of the present invention.
  • Figure 2 shows the fringe images captured by the left and right cameras of the three-dimensional scanner of Figure 1;
  • Figure 3 shows the fringe images obtained by sequentially back-projecting the three-dimensional coordinates from the depth map onto the images of the left and right cameras;
  • Figure 4 is a schematic diagram of the epipolar geometric constraint.
  • Referring to Figure 1, an embodiment of the present invention provides a three-dimensional scanning system for obtaining or collecting three-dimensional point cloud data of an object 106 to be measured.
  • the three-dimensional scanning system includes a light source 101, a three-dimensional module 102, a left camera 103, a right camera 104, and a data processing unit 105.
  • the type of the three-dimensional scanning system is not limited.
  • the three-dimensional scanning system is a handheld multi-striped binocular three-dimensional scanning system.
  • The relative positions of the light source 101, the three-dimensional module 102, the left camera 103 and the right camera 104 are not limited, as long as the object 106 to be measured can be illuminated and captured; during operation the positions of the light source 101, the three-dimensional module 102, the left camera 103 and the right camera 104 remain fixed relative to one another.
  • Preferably, the light source 101 is disposed midway between the left camera 103 and the right camera 104, and the three-dimensional module 102 is disposed between the light source 101 and the left camera 103.
  • the stripe pattern projected by the light source 101 is not limited, and is preferably a digital analog laser stripe pattern.
  • The number of stripes is not limited, but to improve scanning efficiency more than 15 stripes are usually required.
  • In this embodiment the number of stripes is greater than 80. It can be understood that when the number of stripes is small, extra marker points have to be attached to the measured object 106, whereas when the number of stripes is large there is no need to attach extra marker points to the measured object 106.
  • The structure of the light source 101 is not limited, as long as a stripe pattern can be projected onto the object 106 to be measured.
  • Preferably, the light source 101 is a laser or a projector.
  • In this embodiment, the light source 101 is a digital projector, and the stripe pattern includes an analog laser stripe pattern, a laser stripe pattern, and the like.
  • the left and right cameras 103 and 104 are configured to synchronously acquire a 2D left image and a 2D right image of the measured object.
  • the types of the left and right cameras 103, 104 are not limited as long as a two-dimensional image of the object 106 to be measured can be acquired. It can be understood that the stripe pattern projected by the light source 101 to the object to be measured 106 is deformed by the height modulation of the object 106 to be measured, and a modulated stripe pattern is generated.
  • the left and right cameras 103 and 104 obtain left and right images by acquiring the modulated stripe pattern.
  • the three-dimensional module 102 is configured to synchronously acquire a depth map of the measured object 106.
  • The type of the three-dimensional module 102 is not limited as long as it can acquire the depth map; a three-dimensional scanning module is a commonly used type of three-dimensional module 102.
  • To reduce cost, this embodiment adopts a low-resolution three-dimensional scanning module 102. It can be understood that the three-dimensional module 102 and the left and right cameras 103, 104 need to acquire images synchronously; synchronous acquisition means that the positions of the three-dimensional module 102 and of the left and right cameras 103, 104 remain fixed during acquisition, while the acquisition instants themselves are not restricted.
  • Of course, to avoid mutual interference, when the three-dimensional module 102 is a three-dimensional scanning module and the light it emits has the same or substantially the same wavelength as the light emitted by the light source 101, the acquisition times of the three-dimensional module 102 and of the left and right cameras 103, 104 must be staggered.
  • When the three-dimensional module 102 is a three-dimensional scanning module and the wavelength of the light it emits is not equal to that of the light source 101, the three-dimensional module 102 and the left and right cameras 103, 104 can acquire simultaneously.
  • The light source 101, the three-dimensional module 102 and the left and right cameras 103, 104 can be combined into a three-dimensional digital imaging sensor for scanning the object 106 to be measured, in which the spatial relationship between the left and right cameras and the three-dimensional module is known and fixed.
  • This three-dimensional digital imaging sensor differs from existing three-dimensional digital imaging sensors in that it includes a three-dimensional module 102 capable of acquiring a depth image, which can greatly improve imaging accuracy.
  • The data processing unit 105 is configured to guide the matching of the left- and right-image stripes according to the depth map, and to reconstruct the matched corresponding stripes of the left and right images into three-dimensional point cloud data.
  • the data processing unit 105 includes a stripe matching module and a three-dimensional reconstruction module.
  • The depth map in effect contains a coordinate set of the measured object 106.
  • By projecting this coordinate set onto the left and right images through the calibrated intrinsic parameters of the three-dimensional module 102 and of the left and right cameras 103, 104, the points, including the points within the stripes, in the left and right images are each assigned a coordinate from this set. Since each coordinate in the set corresponds to exactly one point on the measured object 106, points or stripes with the same coordinate in the left and right images can be matched. In other words, the stripe matching between the left and right images can be achieved under the guidance of the depth map, with very high accuracy and strong resistance to interference.
  • The three-dimensional reconstruction module is configured to take the matched corresponding stripes of the left and right images, use the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstruct the corresponding points into 3D point cloud data according to the calibration parameters.
  • the method or the technology for reconstructing the three-dimensional point cloud data by the three-dimensional reconstruction module is not limited, as long as the matched left and right images can be reconstructed into three-dimensional point cloud data.
  • A scanning method that uses the above three-dimensional scanning system to obtain or collect three-dimensional point cloud data of the object 106 to be measured comprises the following steps:
  • a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source is constructed, and the relative position between the three-dimensional module, the two cameras, and the light source is fixed.
  • System calibration: the left and right cameras and the three-dimensional module are calibrated to obtain the calibration parameters. The system calibration further includes: calibrating the left and right cameras to obtain their intrinsic and extrinsic parameters and the rotation-translation matrix Mc describing the relative position between the two cameras, and simultaneously calibrating the rotation-translation matrix Ms describing the relative position between the three-dimensional module and the left camera.
  • Projection and image acquisition: a stripe pattern is generated and projected by the light source onto the object to be measured.
  • The stripe pattern is deformed by the height modulation of the object to be measured, producing a modulated stripe pattern.
  • The left and right cameras synchronously capture the modulated stripe pattern to obtain the left and right images, and the three-dimensional module synchronously captures the depth map of the measured object.
  • the three-dimensional module is a three-dimensional scanning module.
  • When the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further includes: the light source projects a stripe pattern onto the object to be measured and the left and right cameras capture the left and right images; the light source is then turned off, the 3D module emits light towards the measured object, and the 3D depth image is acquired.
  • When the wavelength of the light emitted by the three-dimensional scanning module differs from that of the light source, the projection and image acquisition further includes: the light source and the three-dimensional module project onto the object to be measured at the same time, and the left and right cameras and the 3D module simultaneously capture the left and right images and the 3D depth image.
  • Stripe matching: the left- and right-image stripes are matched under the guidance of the depth map.
  • The stripe matching further includes back-projecting the depth map onto the left and right images in one pass, thereby achieving accurate matching of the left and right image line segments or stripes.
  • Specifically, the method includes the following steps: a. the center lines of the stripes on the left and right camera images are extracted, and each connected center-line region is split into a number of independent line segments; b. the depth map acquired by the three-dimensional module is converted, according to its calibrated intrinsic parameters, into three-dimensional point cloud coordinates (pi) in the module's own coordinate system; c. according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera, (pi) is converted into three-dimensional point cloud coordinates (qi) in the left camera coordinate system; d. the coordinates (qi) are back-projected onto the left and right images according to the respective intrinsic parameters of the left and right cameras, each corresponding point receiving a serial number, so that a lookup table indexed by the left- and right-image coordinates is formed; e. the serial number of every point of every stripe segment on the left image is traversed, and the matching stripe segment on the right image is found directly from the lookup table, achieving accurate matching of the left and right image line segments or stripes.
  • Three-dimensional reconstruction: the matched corresponding stripes of the left and right images are processed using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and the corresponding points are then reconstructed into 3D point cloud data according to the calibration parameters.
  • The three-dimensional reconstruction further comprises: taking the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding center-line segments, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
  • The structure of an actually designed handheld multi-stripe binocular three-dimensional scanning system is shown in FIG. 1.
  • 101 is a digital projector, 102 is a low-resolution three-dimensional scanning module, 103 is a left camera, 104 is a right camera, 105 is a computer, and 106 is the object to be measured.
  • The intrinsic parameters of the left camera after calibration are: K1 = [2271.084, 0, 645.632; 0, 2265.112, 511.553; 0, 0, 1].
  • The intrinsic parameters of the right camera are: K2 = [2275.181, 0, 644.405; 0, 2270.321, 510.053; 0, 0, 1].
  • The system structure parameters between the left camera and the right camera are: R = [8.749981e-001, 6.547051e-003, 4.840819e-001; -2.904034e-003, 9.999615e-001, -8.274993e-003; -4.841175e-001, 5.834813e-003, 8.749835e-001], T = [-1.778995e+002, -4.162821e-001, 5.074737e+001].
  • The intrinsic parameters of the low-resolution three-dimensional scanning module are: Ks = [476.927, 0, 312.208; 0, 475.927, 245.949; 0, 0, 1].
  • The system structure parameters between the low-resolution three-dimensional scanning module and the left camera are: Rs = [9.98946971e-001, 4.44611477e-002, -1.13205701e-002; -4.54442748e-002, 9.92786812e-001, -1.10946668e-001; 6.30609650e-003, 1.11344293e-001, 9.93761884e-001], Ts = [9.13387457e+001, 2.81182536e+001, 1.79046857e+000].
  • Following the steps described above, a digitally simulated laser stripe pattern is projected onto the measured object 106 and is synchronously captured by the left and right cameras and by the low-resolution three-dimensional scanning module; along with the captured fringe images, a low-resolution depth map is obtained.
  • The intrinsic parameters of the low-resolution three-dimensional scanning module are used to convert the depth map into three-dimensional coordinates, and these coordinates are back-projected in turn onto the images of the left and right cameras according to the calibration parameters, as shown in FIG. 3.
  • Serial numbers are assigned to the corresponding points on the left and right images, forming a serial-number lookup table.
  • The stripe centers on the left and right camera images are extracted and segmented into connected regions, and the corresponding stripe segments are matched according to the serial-number lookup table.
  • For the matched segment pairs, corresponding points are found according to the epipolar geometric constraint of the two cameras, as shown in FIG. 4, and three-dimensional reconstruction is then performed according to the calibration parameters to generate the point cloud data.
  • In the three-dimensional scanner of the present invention and its scanning method, the three-dimensional point cloud data are obtained by letting the depth map of the measured object guide the stripe matching between the left and right images.
  • Compared with conventional three-dimensional scanners, the scanner has the following advantages: 1. The accuracy of the stripe matching is high, so the scanning efficiency of the three-dimensional scanning system can be improved by increasing the number of matched stripes; 2. When the stripes are numerous and dense, the depth map alone can guide the stripe matching between the left and right images; 3. There is no need to calibrate the stripe light planes, i.e. no light plane is needed to guide the matching of the left and right images, so the required installation accuracy of the relative hardware positions is low, which reduces the system cost.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

A three-dimensional scanning system and a scanning method thereof. The three-dimensional scanning system is used to acquire three-dimensional point cloud data of an object (106) to be measured and comprises: a light source (101) for projecting a multi-stripe pattern onto the object to be measured; left and right cameras (103, 104) for acquiring a 2D left image and a 2D right image of the object (106) to be measured; a three-dimensional module (102) for acquiring a depth map of the object (106) to be measured; a stripe matching module for guiding the matching of the left- and right-image stripes according to the depth map; and a three-dimensional reconstruction module for taking the matched corresponding stripes of the left and right images, using the epipolar geometric constraint of the two cameras (103, 104) to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters of the three-dimensional scanning system.

Description

Three-dimensional digital imaging sensor, three-dimensional scanning system and scanning method thereof
Technical Field
The present invention relates to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method, and in particular to a three-dimensional digital imaging sensor, a three-dimensional scanning system and a scanning method for a handheld multi-stripe three-dimensional scanning system.
Background Art
Three-dimensional digitizing technology is an emerging interdisciplinary field that has been actively researched internationally in recent years, and it is widely applied in many areas such as reverse engineering, cultural relics protection, industrial inspection and virtual reality. Handheld portable three-dimensional scanners are widely used in the field of three-dimensional scanning because of their convenience and flexibility. Existing handheld three-dimensional scanners are mainly based on active stereo vision with structured light. The structured light can take many forms, such as infrared laser speckle, DLP (Digital Light Processing) projected speckle, DLP-projected analog laser stripes, and laser stripes. Among these modes, handheld three-dimensional scanners that use DLP-projected analog laser stripes or laser stripes as the structured light achieve the highest precision and the best scanning detail.
Taking DLP-projected analog laser stripes or laser stripes as the structured light as an example, the basic workflow is:
(1) fit planes to the projected stripes;
(2) extract the marker points and the stripe centers from the captured fringe images;
(3) segment the stripe centers into connected regions and match corresponding points of the stripes on the left and right camera images according to the plane equations;
(4) use the epipolar constraint between the two cameras to find the corresponding marker-point centers on the left and right camera images;
(5) according to the calibration parameters of the scanning system, apply a three-dimensional reconstruction algorithm to reconstruct the matched corresponding stripes and corresponding marker-point centers;
(6) achieve handheld three-dimensional scanning through marker-point stitching and rotation-translation of the three-dimensional stripe points.
However, in this scanning process the matching of corresponding stripes on the left and right camera images is mainly guided by the light-plane or stripe-plane equations. With this method, when the number of stripes exceeds 15, the mismatch rate of the corresponding stripes on the left and right camera images rises significantly, which adds noise and reduces the accuracy of the scanned data. When the number of stripes is below 15, the scanning efficiency cannot be improved effectively. Therefore, under the inherent frame-rate limit, an effective way to improve scanning efficiency is to increase the number of stripes while also improving the accuracy of stripe matching.
Summary of the Invention
In view of this, it is necessary to provide a handheld multi-stripe three-dimensional scanning system and a scanning method thereof, to solve the problem that existing handheld three-dimensional scanning systems cannot achieve both high scanning efficiency and high accuracy of the scanned data.
The present invention provides a three-dimensional scanning system for acquiring three-dimensional point cloud data of an object to be measured, comprising: a light source for projecting a multi-stripe pattern onto the object to be measured; left and right cameras for synchronously acquiring a 2D left image and a 2D right image of the measured object; a three-dimensional module for synchronously acquiring a depth map of the measured object; a stripe matching module for guiding the matching of the left- and right-image stripes according to the depth map; and a three-dimensional reconstruction module for taking the matched corresponding stripes of the left and right images, using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters of the three-dimensional scanning system.
The light source may be a laser or a projector; when the light source is a projector, the projector is a digital projector, and the stripe pattern includes an analog laser stripe pattern, a laser stripe pattern and the like.
The number of stripes in the stripe pattern is greater than 15.
The three-dimensional module is a low-resolution three-dimensional scanning module.
A three-dimensional digital imaging sensor for scanning an object to be measured comprises: a light source for projecting a multi-stripe pattern onto the object to be measured; left and right cameras for acquiring a 2D left image and a 2D right image of the measured object; and a three-dimensional module for acquiring a depth image of the measured object; the spatial relationship between the left and right cameras and the three-dimensional module is known and fixed; and the stripe pattern and the depth image appear in the acquired images.
A three-dimensional scanning method comprises the following steps: (1) device construction: build a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source, with the relative positions between the three-dimensional module, the two cameras and the projector fixed; (2) system calibration: calibrate the left and right cameras and the three-dimensional module to obtain the calibration parameters; (3) projection and image acquisition: generate a stripe pattern and project it onto the object to be measured with the light source; the pattern is deformed by the height modulation of the measured object, producing a modulated stripe pattern; the left and right cameras synchronously capture the modulated pattern to obtain the left and right images, and the three-dimensional module synchronously captures the depth map of the measured object; (4) stripe matching: guide the matching of the left- and right-image stripes according to the depth map; (5) three-dimensional reconstruction: take the matched corresponding stripes of the left and right images, use the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstruct the corresponding points into three-dimensional point cloud data according to the calibration parameters.
The system calibration further comprises: calibrating the left and right cameras to obtain their intrinsic and extrinsic parameters and the rotation-translation matrix Mc describing the relative position between the two cameras, and simultaneously calibrating the rotation-translation matrix Ms describing the relative position between the three-dimensional module and the left camera.
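The patent does not tie this calibration step to any particular toolkit. As a minimal sketch only, and assuming a common calibration target has already been detected by the left camera, the right camera and the three-dimensional module, the two rotation-translation relations Mc and Ms could be estimated with OpenCV's stereo calibration as shown below; the variable names (object_points, left_points, right_points, depth_points, the distortion vectors and the image size) are placeholders rather than names used in this document.

```python
import cv2

def relative_pose(object_points, image_points_a, image_points_b,
                  K_a, dist_a, K_b, dist_b, image_size):
    """Rotation R and translation T taking imager-A coordinates into imager-B coordinates."""
    flags = cv2.CALIB_FIX_INTRINSIC              # intrinsics are assumed to be calibrated beforehand
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        object_points, image_points_a, image_points_b,
        K_a, dist_a, K_b, dist_b, image_size, flags=flags)
    return R, T

# Mc: relation between the left and the right camera (intrinsics K1, K2):
#   R_c, T_c = relative_pose(object_points, left_points, right_points, K1, d1, K2, d2, size)
# Ms: relation taking the depth module's frame into the left-camera frame (intrinsics Ks, K1):
#   R_s, T_s = relative_pose(object_points, depth_points, left_points, Ks, ds, K1, d1, size)
```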
The three-dimensional module is a three-dimensional scanning module.
When the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further comprises: the light source projects a stripe pattern onto the object to be measured and the left and right cameras capture the left and right images; the light source is then turned off, the three-dimensional module emits light towards the measured object, and the three-dimensional depth image is captured.
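As a non-authoritative sketch of the acquisition timing, the function below covers both the staggered scheme just described and the simultaneous scheme of the following paragraph. The device objects projector, depth_module, left_cam and right_cam, together with their on/off/grab methods, are hypothetical stand-ins for whatever hardware interface is actually used.

```python
def acquire_frame(projector, depth_module, left_cam, right_cam, same_wavelength):
    """Grab one left image, one right image and one depth image for a single scan frame."""
    if same_wavelength:
        # equal wavelengths: stagger the exposures so the two light sources cannot interfere
        projector.on()
        left_img, right_img = left_cam.grab(), right_cam.grab()
        projector.off()
        depth_module.on()
        depth_img = depth_module.grab()
        depth_module.off()
    else:
        # different wavelengths do not interfere, so all devices can fire at the same time
        projector.on(); depth_module.on()
        left_img, right_img = left_cam.grab(), right_cam.grab()
        depth_img = depth_module.grab()
        projector.off(); depth_module.off()
    return left_img, right_img, depth_img
```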
When the wavelength of the light emitted by the three-dimensional scanning module is not equal to the wavelength emitted by the light source, the projection and image acquisition further comprises: the light source and the three-dimensional module project onto the object to be measured at the same time, and the left and right cameras and the three-dimensional module simultaneously capture the left and right images and the three-dimensional depth image. The stripe matching further comprises: extracting the center lines of the stripes on the left and right camera images and splitting each connected center-line region into a number of independent line segments; converting the depth map acquired by the three-dimensional module, according to its calibrated intrinsic parameters, into three-dimensional point cloud coordinates (pi) in the module's own coordinate system; converting (pi) into three-dimensional point cloud coordinates (qi) in the left camera coordinate system according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera; back-projecting the coordinates (qi) onto the left and right images according to the respective intrinsic parameters of the left and right cameras, each corresponding point receiving a serial number, so that a lookup table indexed by the left- and right-image coordinates is formed; and traversing the serial number of every point of every stripe segment on the left image, so that the matching stripe segment on the right image can be found directly from the lookup table, thereby achieving accurate matching of the left and right image line segments or stripes.
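A minimal NumPy sketch of this matching chain is given below, assuming the center-line segments have already been extracted as lists of integer (x, y) pixels and that both cameras share the image size passed in as shape. depth_to_points implements the conversion from the depth map to (pi) and then to (qi) using Ks and Ms = (Rs, Ts); build_lookup and match_segments realize the serial-number lookup table. This is an illustrative re-implementation under these assumptions, not code from the patent; Mc = (R_c, T_c) is assumed here to map left-camera coordinates into right-camera coordinates.

```python
import numpy as np

def depth_to_points(depth, Ks, Rs, Ts):
    """Lift the depth map to points pi in the module frame, then to qi in the left-camera frame."""
    v, u = np.nonzero(depth > 0)
    z = depth[v, u].astype(np.float64)
    rays = np.linalg.inv(Ks) @ np.stack([u, v, np.ones_like(u)]).astype(np.float64)
    pi = z[:, None] * rays.T                     # N x 3 points in the module's own coordinate system
    return pi @ Rs.T + Ts                        # qi: the same points in the left-camera frame

def build_lookup(qi, K_left, K_right, R_c, T_c, shape):
    """Back-project every qi into both images and store its serial number per pixel (step d)."""
    def project(K, pts):
        uvw = pts @ K.T
        return np.round(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    lut_l = -np.ones(shape, dtype=np.int64)
    lut_r = -np.ones(shape, dtype=np.int64)
    uv_l = project(K_left, qi)
    uv_r = project(K_right, qi @ R_c.T + T_c)    # assumes Mc = (R_c, T_c): left frame -> right frame
    for idx in range(len(qi)):
        for lut, (x, y) in ((lut_l, uv_l[idx]), (lut_r, uv_r[idx])):
            if 0 <= y < shape[0] and 0 <= x < shape[1]:
                lut[y, x] = idx
    return lut_l, lut_r

def match_segments(left_segments, right_segments, lut_l, lut_r):
    """Pair each left center-line segment with the right segment sharing the most serial numbers (step e)."""
    right_ids = [{int(lut_r[y, x]) for x, y in seg} - {-1} for seg in right_segments]
    pairs = []
    for li, seg in enumerate(left_segments):
        ids = {int(lut_l[y, x]) for x, y in seg} - {-1}
        scores = [len(ids & rid) for rid in right_ids]
        if scores and max(scores) > 0:
            pairs.append((li, int(np.argmax(scores))))
    return pairs
```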
The three-dimensional reconstruction further comprises: taking the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the two cameras to find the point-to-point correspondences within the corresponding center-line segments, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
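One possible sketch of this reconstruction step is shown below: the fundamental matrix is derived from the calibration (again assuming Mc = (R, T) maps left-camera coordinates into right-camera coordinates), each left center-line point is paired with the right center-line point closest to its epipolar line, and the pairs are triangulated with OpenCV. The one-pixel distance threshold is an assumed tolerance, not a value from the patent.

```python
import numpy as np
import cv2

def fundamental_from_calib(K1, K2, R, T):
    """F such that x_right^T F x_left = 0, from the intrinsics and the left-to-right pose (R, T)."""
    Tx = np.array([[0, -T[2], T[1]],
                   [T[2], 0, -T[0]],
                   [-T[1], T[0], 0]], dtype=np.float64)
    return np.linalg.inv(K2).T @ (Tx @ R) @ np.linalg.inv(K1)

def reconstruct_segment_pair(pts_left, pts_right, K1, K2, R, T, max_dist=1.0):
    """Match the points of one matched center-line segment pair along epipolar lines, then triangulate."""
    F = fundamental_from_calib(K1, K2, R, T)
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])          # left-camera projection matrix
    P2 = K2 @ np.hstack([R, np.asarray(T, dtype=np.float64).reshape(3, 1)])
    left, right = [], []
    pts_right = np.asarray(pts_right, dtype=np.float64)
    for p in np.asarray(pts_left, dtype=np.float64):
        a, b, c = F @ np.array([p[0], p[1], 1.0])               # epipolar line of p in the right image
        d = np.abs(pts_right @ np.array([a, b]) + c) / np.hypot(a, b)
        j = int(np.argmin(d))
        if d[j] < max_dist:
            left.append(p)
            right.append(pts_right[j])
    if not left:
        return np.empty((0, 3))
    X = cv2.triangulatePoints(P1, P2, np.float64(left).T, np.float64(right).T)
    return (X[:3] / X[3]).T                                      # 3D points in the left-camera frame
```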
Compared with the prior art, in the three-dimensional scanner and scanning method of the present invention the three-dimensional point cloud data are obtained by letting the depth map of the measured object guide the stripe matching between the left and right images. Compared with conventional three-dimensional scanners, this scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the three-dimensional scanning system can be raised by increasing the number of matched stripes; 2. When the stripes are numerous and dense, the depth map alone can guide the stripe matching between the left and right images, no extra marker points need to be attached to the measured object, and real-time stitching can be achieved without marker points; 3. There is no need to calibrate the stripe light planes, i.e. no light plane is required to guide the matching of the left and right images, so the required installation accuracy of the relative hardware positions is low, which reduces the system cost.
The above description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented according to the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention more apparent, specific embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
Embodiments of the present invention are described below with reference to the accompanying drawings, in which:
FIG. 1 is a schematic structural diagram of a three-dimensional scanner according to an embodiment of the present invention;
FIG. 2 shows the fringe images captured by the left and right cameras of the three-dimensional scanner of FIG. 1;
FIG. 3 shows the fringe images obtained by sequentially back-projecting the three-dimensional coordinates from the depth map onto the images of the left and right cameras;
FIG. 4 is a schematic diagram of the epipolar geometric constraint.
Detailed Description of the Embodiments
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here serve only as examples and are not intended to limit the scope of protection of the present invention.
Referring to FIG. 1, an embodiment of the present invention provides a three-dimensional scanning system for obtaining or collecting three-dimensional point cloud data of an object 106 to be measured. The three-dimensional scanning system comprises a light source 101, a three-dimensional module 102, a left camera 103, a right camera 104 and a data processing unit 105. The type of the three-dimensional scanning system is not limited; preferably, it is a handheld multi-stripe binocular three-dimensional scanning system.
The relative positions of the light source 101, the three-dimensional module 102, the left camera 103 and the right camera 104 are not limited, as long as the object 106 to be measured can be illuminated and captured, and during operation the positions of the light source 101, the three-dimensional module 102, the left camera 103 and the right camera 104 remain fixed relative to one another. Preferably, the light source 101 is disposed midway between the left camera 103 and the right camera 104, and the three-dimensional module 102 is disposed between the light source 101 and the left camera 103. The stripe pattern projected by the light source 101 is not limited, and is preferably a digitally simulated laser stripe pattern. The number of stripes is not limited, but to improve scanning efficiency it usually needs to be greater than 15; in this embodiment the number of stripes is greater than 80. It can be understood that when the number of stripes is small, extra marker points have to be attached to the object 106 to be measured, whereas when the number of stripes is large there is no need to attach extra marker points to the object 106. The structure of the light source 101 is not limited, as long as it can project a stripe pattern onto the object 106 to be measured. Preferably, the light source 101 is a laser or a projector. In this embodiment, the light source 101 is a digital projector, and the stripe pattern includes an analog laser stripe pattern, a laser stripe pattern and the like.
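The generation of the projected pattern itself is not specified in the patent. Purely as an illustration, the sketch below renders a digitally simulated laser stripe pattern with more than 80 vertical lines, each with a Gaussian cross-section, suitable for feeding to a DLP projector; the resolution, stripe count and line width are assumed values.

```python
import numpy as np

def stripe_pattern(width=1280, height=800, n_stripes=87, sigma=1.5):
    """Render n_stripes evenly spaced vertical lines with a Gaussian profile on a dark background."""
    x = np.arange(width, dtype=np.float64)
    centers = (np.arange(n_stripes) + 0.5) * (width / n_stripes)
    # each column takes the intensity of the nearest stripe's Gaussian cross-section
    profile = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * sigma ** 2)).max(axis=1)
    return np.uint8(255 * np.tile(profile, (height, 1)))    # height x width, 8-bit grayscale
```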
The left and right cameras 103, 104 are used to synchronously acquire the 2D left image and the 2D right image of the measured object. The types of the left and right cameras 103, 104 are not limited, as long as they can capture two-dimensional images of the object 106 to be measured. It can be understood that the stripe pattern projected by the light source 101 onto the object 106 is deformed by the height modulation of the object 106, producing a modulated stripe pattern, and the left and right cameras 103, 104 obtain the left and right images by capturing this modulated stripe pattern.
The three-dimensional module 102 is used to synchronously acquire the depth map of the measured object 106. The type of the three-dimensional module 102 is not limited, as long as it can acquire the depth map; a three-dimensional scanning module is a commonly used type of three-dimensional module 102. To reduce cost, this embodiment adopts a low-resolution three-dimensional scanning module 102. It can be understood that the three-dimensional module 102 and the left and right cameras 103, 104 need to acquire images synchronously; synchronous acquisition means that during acquisition the positions of the three-dimensional module 102 and of the left and right cameras 103, 104 remain fixed, while the acquisition instants are not restricted. Of course, to avoid mutual interference, when the three-dimensional module 102 is a three-dimensional scanning module and the wavelength of the light it emits is equal or essentially identical to that of the light emitted by the light source 101, the acquisition times of the three-dimensional module 102 and of the left and right cameras 103, 104 must be staggered. When the three-dimensional module 102 is a three-dimensional scanning module and the wavelength of the light it emits differs from that of the light source 101, the three-dimensional module 102 and the left and right cameras 103, 104 can acquire simultaneously.
It can be understood that the light source 101, the three-dimensional module 102 and the left and right cameras 103, 104 can be combined into a three-dimensional digital imaging sensor for scanning the object 106 to be measured, in which the spatial relationship between the left and right cameras and the three-dimensional module is known and fixed. This three-dimensional digital imaging sensor differs from existing three-dimensional digital imaging sensors in that it includes a three-dimensional module 102 capable of acquiring a depth image, which can greatly improve imaging accuracy.
The data processing unit 105 is used to guide the matching of the left- and right-image stripes according to the depth map and to reconstruct the matched corresponding stripes of the left and right images into three-dimensional point cloud data. Specifically, the data processing unit 105 comprises a stripe matching module and a three-dimensional reconstruction module.
Specifically, since the depth map in effect contains a coordinate set of the measured object 106, projecting this coordinate set onto the left and right images through the calibrated intrinsic parameters of the three-dimensional module 102 and of the left and right cameras 103, 104 assigns a coordinate from this set to the points, including the points within the stripes, in the left and right images. Since each coordinate in the set corresponds to exactly one point on the measured object 106, points or stripes with the same coordinate in the left and right images can be matched. In other words, the stripe matching between the left and right images can be achieved under the guidance of the depth map, with very high accuracy and strong resistance to interference.
The three-dimensional reconstruction module is used to take the matched corresponding stripes of the left and right images, use the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstruct the corresponding points into three-dimensional point cloud data according to the calibration parameters. Of course, the way or technique by which the three-dimensional reconstruction module reconstructs the three-dimensional point cloud data is not limited, as long as the matched left and right images can be reconstructed into three-dimensional point cloud data.
A scanning method that uses the above three-dimensional scanning system to obtain or collect three-dimensional point cloud data of the object 106 to be measured comprises the following steps:
(1) Device construction: a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source is built, with the relative positions between the three-dimensional module, the two cameras and the light source fixed.
(2) System calibration: the left and right cameras and the three-dimensional module are calibrated to obtain the calibration parameters. The system calibration further comprises: calibrating the left and right cameras to obtain their intrinsic and extrinsic parameters and the rotation-translation matrix Mc describing the relative position between the two cameras, and simultaneously calibrating the rotation-translation matrix Ms describing the relative position between the three-dimensional module and the left camera.
(3) Projection and image acquisition: a stripe pattern is generated and projected onto the object to be measured with the light source; the pattern is deformed by the height modulation of the object, producing a modulated stripe pattern; the left and right cameras synchronously capture the modulated pattern to obtain the left and right images, and the three-dimensional module synchronously captures the depth map of the measured object. Preferably, the three-dimensional module is a three-dimensional scanning module. When the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further comprises: the light source projects a stripe pattern onto the object and the left and right cameras capture the left and right images; the light source is then turned off, the three-dimensional module emits light towards the object, and the three-dimensional depth image is captured. When the wavelength of the light emitted by the three-dimensional scanning module is not equal to that emitted by the light source, the projection and image acquisition further comprises: the light source and the three-dimensional module project onto the object at the same time, and the left and right cameras and the three-dimensional module simultaneously capture the left and right images and the three-dimensional depth image.
(4) Stripe matching: the left- and right-image stripes are matched under the guidance of the depth map. The stripe matching further comprises back-projecting the depth map onto the left and right images in one pass, thereby achieving accurate matching of the left and right image line segments or stripes. Specifically, it comprises the following steps: a. the center lines of the stripes on the left and right camera images are extracted, and each connected center-line region is split into a number of independent line segments; b. the depth map acquired by the three-dimensional module is converted, according to its calibrated intrinsic parameters, into three-dimensional point cloud coordinates (pi) in the module's own coordinate system; c. according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera, (pi) is converted into three-dimensional point cloud coordinates (qi) in the left camera coordinate system; d. the coordinates (qi) are back-projected onto the left and right images according to the respective intrinsic parameters of the left and right cameras, each corresponding point receiving a serial number, so that a lookup table indexed by the left- and right-image coordinates is formed; e. the serial number of every point of every stripe segment on the left image is traversed, and the matching stripe segment on the right image is found directly from the lookup table, thereby achieving accurate matching of the left and right image line segments or stripes.
(5) Three-dimensional reconstruction: the matched corresponding stripes of the left and right images are processed using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and the corresponding points are then reconstructed into three-dimensional point cloud data according to the calibration parameters. The three-dimensional reconstruction further comprises: taking the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the two cameras to find the point-to-point correspondences within the corresponding center-line segments, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
To further illustrate the three-dimensional scanning system of the present invention and its scanning method, a specific embodiment is described below.
Referring to FIG. 1, the structure of the actually designed handheld multi-stripe binocular three-dimensional scanning system is shown in FIG. 1: 101 is a digital projector, 102 is a low-resolution three-dimensional scanning module, 103 is a left camera, 104 is a right camera, 105 is a computer, and 106 is the object to be measured.
The intrinsic parameters of the left camera after calibration are:
K1=[2271.084,    0,         645.632,
       0,       2265.112,    511.553,
       0,        0,          1]
The intrinsic parameters of the right camera are:
K2=[2275.181,    0,         644.405,
       0,     2270.321,      510.053,
       0,      0,            1]
The system structure parameters between the left camera and the right camera are:
R=[8.749981e-001,6.547051e-003,4.840819e-001,
   -2.904034e-003,9.999615e-001,-8.274993e-003,
   -4.841175e-001,5.834813e-003,8.749835e-001]
T=[-1.778995e+002,-4.162821e-001,5.074737e+001]
The intrinsic parameters of the low-resolution three-dimensional scanning module are:
Ks=[476.927,    0,       312.208,
        0,    475.927,    245.949,
        0,      0,        1]
The system structure parameters between the low-resolution three-dimensional scanning module and the left camera are:
Rs=[9.98946971e-001,4.44611477e-002,-1.13205701e-002,
   -4.54442748e-002,9.92786812e-001,-1.10946668e-001,
   6.30609650e-003,1.11344293e-001,9.93761884e-001]
Ts=[9.13387457e+001,2.81182536e+001,1.79046857e+000]
Following the steps described above, a digitally simulated laser stripe pattern is projected onto the measured object 106 and is synchronously captured by the left and right cameras and by the low-resolution three-dimensional scanning module; along with the captured fringe images, shown in FIG. 2, a low-resolution depth map is obtained. The intrinsic parameters of the low-resolution three-dimensional scanning module are used to convert the depth map into three-dimensional coordinates, and these coordinates are then back-projected in turn onto the images of the left and right cameras according to the calibration parameters, as shown in FIG. 3; serial numbers are assigned to the corresponding points on the left and right images, forming a serial-number lookup table. The stripe centers on the left and right camera images are extracted and segmented into connected regions, and the corresponding stripe segments are matched according to the serial-number lookup table. For the matched segment pairs, corresponding points are found according to the epipolar geometric constraint of the two cameras, as shown in FIG. 4, and three-dimensional reconstruction is then performed according to the calibration parameters to generate the point cloud data.
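As a purely illustrative check of the numbers quoted above, the short script below lifts one hypothetical depth-map pixel through the chain described earlier: Ks gives the point pi in the module's own frame, the calibrated Rs and Ts move it into the left-camera frame as qi, and K1 projects it back onto the left image. The sample pixel (u, v) and depth Z are invented inputs, and the depth unit is assumed to be millimetres; only the matrices are taken from the text.

```python
import numpy as np

Ks = np.array([[476.927, 0, 312.208], [0, 475.927, 245.949], [0, 0, 1.0]])
K1 = np.array([[2271.084, 0, 645.632], [0, 2265.112, 511.553], [0, 0, 1.0]])
Rs = np.array([[9.98946971e-01, 4.44611477e-02, -1.13205701e-02],
               [-4.54442748e-02, 9.92786812e-01, -1.10946668e-01],
               [6.30609650e-03, 1.11344293e-01, 9.93761884e-01]])
Ts = np.array([9.13387457e+01, 2.81182536e+01, 1.79046857e+00])

u, v, Z = 320.0, 240.0, 500.0                         # invented depth-map sample (pixel and depth)
pi = Z * (np.linalg.inv(Ks) @ np.array([u, v, 1.0]))  # point in the module's own coordinate system
qi = Rs @ pi + Ts                                     # the same point in the left-camera frame
uvw = K1 @ qi
print("projects to left-image pixel:", uvw[:2] / uvw[2])
```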
In the three-dimensional scanner and scanning method of the present invention, the three-dimensional point cloud data are obtained by letting the depth map of the measured object guide the stripe matching between the left and right images. Compared with conventional three-dimensional scanners, this scanner has the following advantages: 1. The accuracy of stripe matching is high, so the scanning efficiency of the three-dimensional scanning system can be raised by increasing the number of matched stripes; 2. When the stripes are numerous and dense, the depth map alone can guide the stripe matching between the left and right images, no extra marker points need to be attached to the measured object, and real-time stitching can be achieved without marker points; 3. There is no need to calibrate the stripe light planes, i.e. no light plane is required to guide the matching of the left and right images, so the required installation accuracy of the relative hardware positions is low, which reduces the system cost.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (15)

  1. A three-dimensional scanning system for acquiring three-dimensional point cloud data of an object to be measured, comprising:
    a light source for projecting a multi-stripe pattern onto the object to be measured;
    left and right cameras for synchronously acquiring a 2D left image and a 2D right image of the object to be measured;
    a three-dimensional module for synchronously acquiring a depth map of the object to be measured;
    a stripe matching module for guiding the matching of the left- and right-image stripes according to the depth map; and
    a three-dimensional reconstruction module for reconstructing the matched corresponding stripes of the left and right images into three-dimensional point cloud data.
  2. The three-dimensional scanning system according to claim 1, wherein the light source comprises a laser or a projector; when the light source is a projector, the projector is a digital projector, and the stripe pattern comprises an analog laser stripe pattern, a laser stripe pattern and the like.
  3. The three-dimensional scanning system according to claim 2, wherein the number of stripes in the stripe pattern is greater than 15.
  4. The three-dimensional scanning system according to claim 1, wherein the three-dimensional module is a low-resolution three-dimensional scanning module.
  5. The three-dimensional scanning system according to claim 1, wherein the three-dimensional scanning system is a handheld three-dimensional scanning system.
  6. A three-dimensional digital imaging sensor for scanning an object to be measured, comprising:
    a light source for projecting a multi-stripe pattern onto the object to be measured;
    left and right cameras for acquiring a 2D left image and a 2D right image of the object to be measured;
    a three-dimensional module for acquiring a depth image of the object to be measured;
    wherein the spatial relationship between the left and right cameras and the three-dimensional module is known and fixed;
    and the stripe pattern and the depth image appear in the acquired images.
  7. A three-dimensional scanning system for acquiring three-dimensional point cloud data of an object to be measured, comprising:
    a light source for projecting a multi-stripe pattern onto the object to be measured;
    left and right cameras for synchronously acquiring a 2D left image and a 2D right image of the object to be measured;
    a three-dimensional module for synchronously acquiring a depth map of the object to be measured;
    a data processing unit for guiding the matching of the left- and right-image stripes according to the depth map, and for reconstructing the matched corresponding stripes of the left and right images into three-dimensional point cloud data.
  8. A three-dimensional scanning method, comprising the following steps:
    (1) device construction: building a three-dimensional digital imaging sensor composed of a three-dimensional module, two cameras and a light source, with the relative positions between the three-dimensional module, the two cameras and the light source fixed;
    (2) system calibration: calibrating the left and right cameras and the three-dimensional module to obtain calibration parameters;
    (3) projection and image acquisition: generating a stripe pattern and projecting it onto the object to be measured with the light source, the pattern being deformed by the height modulation of the object to be measured to produce a modulated stripe pattern, the left and right cameras synchronously capturing the modulated pattern to obtain left and right images, and the three-dimensional module synchronously capturing the depth map of the object to be measured;
    (4) stripe matching: guiding the matching of the left- and right-image stripes according to the depth map;
    (5) three-dimensional reconstruction: taking the matched corresponding stripes of the left and right images, using the epipolar geometric constraint of the left and right cameras to find the point-to-point correspondences within the corresponding stripe center-line segments, and then reconstructing the corresponding points into three-dimensional point cloud data according to the calibration parameters.
  9. The three-dimensional scanning method according to claim 8, wherein the system calibration further comprises: calibrating the left and right cameras to obtain their intrinsic and extrinsic parameters and the rotation-translation matrix Mc describing the relative position between the two cameras, and simultaneously calibrating the rotation-translation matrix Ms describing the relative position between the three-dimensional module and the left camera.
  10. The three-dimensional scanning method according to claim 8, wherein the three-dimensional module is a three-dimensional scanning module.
  11. The three-dimensional scanning method according to claim 10, wherein, when the three-dimensional scanning module emits light of the same wavelength as the light source, the projection and image acquisition further comprises: the light source projects a stripe pattern onto the object to be measured and the left and right cameras capture the left and right images; the light source is turned off, the three-dimensional module emits light towards the object to be measured, and the three-dimensional depth image is then captured.
  12. The three-dimensional scanning method according to claim 10, wherein, when the wavelength of the light emitted by the three-dimensional scanning module is not equal to the wavelength emitted by the light source, the projection and image acquisition further comprises: the light source and the three-dimensional module project onto the object to be measured at the same time, and the left and right cameras and the three-dimensional module simultaneously capture the left and right images and the three-dimensional depth image.
  13. The three-dimensional scanning method according to claim 9, wherein the stripe matching further comprises: back-projecting the depth map onto the left and right images in one pass, thereby achieving accurate matching of the left and right image line segments or stripes.
  14. The three-dimensional scanning method according to claim 9, wherein the stripe matching further comprises the following steps:
    extracting the center lines of the stripes on the left and right camera images, and splitting each connected center-line region into a number of independent line segments;
    converting the depth map acquired by the three-dimensional module, according to its calibrated intrinsic parameters, into three-dimensional point cloud coordinates (pi) in the module's own coordinate system;
    converting (pi) into three-dimensional point cloud coordinates (qi) in the left camera coordinate system according to the calibrated rotation-translation matrix Ms between the three-dimensional module and the left camera;
    back-projecting the coordinates (qi) onto the left and right images according to the respective intrinsic parameters of the left and right cameras, each corresponding point receiving a serial number, so that a lookup table indexed by the left- and right-image coordinates is formed;
    traversing the serial number of every point of every stripe segment on the left image and finding the matching stripe segment on the right image directly from the lookup table, thereby achieving accurate matching of the left and right image segments or stripes.
  15. The three-dimensional scanning method according to claim 14, wherein the three-dimensional reconstruction further comprises: taking the matched corresponding stripe center-line segments of the left and right images, using the epipolar geometric constraint of the two cameras to find the point-to-point correspondences within the corresponding center-line segments, and then reconstructing the corresponding point pairs into three-dimensional point cloud data according to the calibration parameters of the system.
PCT/CN2016/112118 2016-12-05 2016-12-26 一种三维数字成像传感器、三维扫描***及其扫描方法 WO2018103152A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611102629.2 2016-12-05
CN201611102629.2A CN108151671B (zh) 2016-12-05 2016-12-05 一种三维数字成像传感器、三维扫描***及其扫描方法

Publications (1)

Publication Number Publication Date
WO2018103152A1 true WO2018103152A1 (zh) 2018-06-14

Family

ID=62470859

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/112118 WO2018103152A1 (zh) 2016-12-05 2016-12-26 一种三维数字成像传感器、三维扫描***及其扫描方法

Country Status (2)

Country Link
CN (1) CN108151671B (zh)
WO (1) WO2018103152A1 (zh)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242956A (zh) * 2018-08-19 2019-01-18 雅派朗迪(北京)科技发展股份有限公司 3d人体扫描数字建模车
CN109903382A (zh) * 2019-03-20 2019-06-18 中煤航测遥感集团有限公司 点云数据的融合方法及装置
CN110244302A (zh) * 2019-07-05 2019-09-17 苏州科技大学 地基合成孔径雷达影像像元坐标三维变换方法
CN110243307A (zh) * 2019-04-15 2019-09-17 深圳市易尚展示股份有限公司 一种自动化三维彩色成像与测量***
CN110517323A (zh) * 2019-08-16 2019-11-29 中铁第一勘察设计院集团有限公司 基于机械手单相机多目视觉的三维定位***及方法
CN111008602A (zh) * 2019-12-06 2020-04-14 青岛海之晨工业装备有限公司 一种小曲率薄壁零件用二维和三维视觉结合的划线特征提取方法
CN111127625A (zh) * 2019-10-08 2020-05-08 新拓三维技术(深圳)有限公司 一种足部扫描方法、***及装置
CN111750805A (zh) * 2020-07-06 2020-10-09 山东大学 一种基于双目相机成像和结构光技术的三维测量装置及测量方法
CN111932565A (zh) * 2019-05-13 2020-11-13 中国科学院沈阳自动化研究所 一种多靶标识别跟踪解算方法
CN112200911A (zh) * 2020-11-06 2021-01-08 北京易达恩能科技有限公司 结合标志物的区域重叠式三维地图构建方法和装置
CN112465912A (zh) * 2020-11-18 2021-03-09 新拓三维技术(深圳)有限公司 一种立体相机标定方法及装置
CN112509057A (zh) * 2020-11-30 2021-03-16 北京百度网讯科技有限公司 相机外参标定方法、装置、电子设备以及计算机可读介质
CN112530008A (zh) * 2020-12-25 2021-03-19 中国科学院苏州纳米技术与纳米仿生研究所 一种条纹结构光的参数确定方法、装置、设备及存储介质
CN113008164A (zh) * 2021-03-23 2021-06-22 南京理工大学 快速高精度三维面形重构方法
CN113034603A (zh) * 2019-12-09 2021-06-25 百度在线网络技术(北京)有限公司 用于确定标定参数的方法和装置
CN113313746A (zh) * 2020-12-01 2021-08-27 湖南长天自控工程有限公司 一种料堆盘库的方法及***
CN113483668A (zh) * 2021-08-19 2021-10-08 广东亚太新材料科技有限公司 一种碳纤维复合材料产品尺寸检测方法及***
CN113655064A (zh) * 2021-08-11 2021-11-16 合肥工业大学 一种外观缺陷多模式视觉检测传感器
CN114119747A (zh) * 2021-11-23 2022-03-01 四川大学 一种基于pmd波前检测的三维流场流动显示方法
CN114332349A (zh) * 2021-11-17 2022-04-12 浙江智慧视频安防创新中心有限公司 一种双目结构光边缘重建方法、***及存储介质
CN114681089A (zh) * 2020-12-31 2022-07-01 先临三维科技股份有限公司 三维扫描装置和方法
CN114739312A (zh) * 2022-04-26 2022-07-12 黄晓明 一种手持式路面构造深度激光测定装置
CN114909993A (zh) * 2022-04-26 2022-08-16 泰州市创新电子有限公司 一种高精度式激光投影视觉三维测量***
CN114998499A (zh) * 2022-06-08 2022-09-02 深圳大学 一种基于线激光振镜扫描的双目三维重建方法及***
CN115068833A (zh) * 2021-03-15 2022-09-20 湖南华创医疗科技有限公司 用于束流阻挡器的定位装置和放射治疗***
WO2023019833A1 (zh) * 2021-08-18 2023-02-23 梅卡曼德(北京)机器人科技有限公司 基于激光线扫的点云处理方法及装置
WO2023207756A1 (zh) * 2022-04-28 2023-11-02 杭州海康机器人股份有限公司 图像重建方法和装置及设备

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109299662B (zh) * 2018-08-24 2022-04-12 上海图漾信息科技有限公司 深度数据计算设备与方法及人脸识别设备
CN111508012B (zh) 2019-01-31 2024-04-19 先临三维科技股份有限公司 线条纹误配检测和三维重建的方法、装置
CN110047147A (zh) * 2019-04-09 2019-07-23 易视智瞳科技(深圳)有限公司 一种3d点云处理方法、装置、***及计算机存储介质
CN109900221A (zh) * 2019-04-12 2019-06-18 杭州思看科技有限公司 一种手持式三维扫描***
CN110702025B (zh) * 2019-05-30 2021-03-19 北京航空航天大学 一种光栅式双目立体视觉三维测量***及方法
WO2020259625A1 (zh) * 2019-06-28 2020-12-30 先临三维科技股份有限公司 三维扫描方法、扫描仪、三维扫描***、计算机设备和计算机可读存储介质
CN110764841B (zh) * 2019-10-10 2024-01-19 珠海格力智能装备有限公司 3d视觉应用开发平台和开发方法
CN111047692A (zh) * 2019-12-23 2020-04-21 武汉华工激光工程有限责任公司 一种三维建模方法、装置、设备及可读取存储介质
CN111462331B (zh) * 2020-03-31 2023-06-27 四川大学 扩展对极几何并实时计算三维点云的查找表方法
CN112330732A (zh) * 2020-09-29 2021-02-05 先临三维科技股份有限公司 三维数据拼接方法及三维扫描***、手持扫描仪
CN112747671B (zh) * 2020-12-10 2022-12-09 杭州先临天远三维检测技术有限公司 三维检测***和三维检测方法
CN113219489B (zh) * 2021-05-13 2024-04-16 深圳数马电子技术有限公司 多线激光的点对确定方法、装置、计算机设备和存储介质
CN113963115A (zh) * 2021-10-28 2022-01-21 山东大学 基于单帧图像的高动态范围激光三维扫描方法
WO2023179782A1 (zh) * 2022-03-25 2023-09-28 先临三维科技股份有限公司 三维扫描***、方法、装置和移动计算模组
CN117522940A (zh) * 2022-07-27 2024-02-06 梅卡曼德(北京)机器人科技有限公司 三维激光相机及标定方法及用于获取彩色点云图像的方法
CN115345995A (zh) * 2022-08-10 2022-11-15 先临三维科技股份有限公司 三维重建方法及装置、***
CN115984371A (zh) * 2022-11-25 2023-04-18 杭州天远三维检测技术有限公司 一种扫描头位姿检测方法、装置、设备及介质
CN116664796B (zh) * 2023-04-25 2024-04-02 北京天翔睿翼科技有限公司 轻量级头部建模***及方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101245998A (zh) * 2008-02-01 2008-08-20 黑龙江科技学院 一种三维测量***的成像方法
CN101608908A (zh) * 2009-07-20 2009-12-23 杭州先临三维科技股份有限公司 数字散斑投影和相位测量轮廓术相结合的三维数字成像方法
CN101739717A (zh) * 2009-11-12 2010-06-16 天津汇信软件有限公司 三维彩色点云非接触扫描方法
US20140253929A1 (en) * 2011-10-18 2014-09-11 Nanyang Technological University Apparatus and method for 3d surface measurement
CN105869167A (zh) * 2016-03-30 2016-08-17 天津大学 基于主被动融合的高分辨率深度图获取方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE602004015799D1 (de) * 2003-07-24 2008-09-25 Cognitens Ltd Verfahren und system zur dreidimensionalen oberflächenrekonstruktion eines objekts
JP4577126B2 (ja) * 2005-07-08 2010-11-10 オムロン株式会社 ステレオ対応づけのための投光パターンの生成装置及び生成方法
CN101853528B (zh) * 2010-05-10 2011-12-07 沈阳雅克科技有限公司 一种手持式三维型面信息提取方法及其提取仪
CN103900494B (zh) * 2014-03-31 2016-06-08 中国科学院上海光学精密机械研究所 用于双目视觉三维测量的同源点快速匹配方法
CN103954239A (zh) * 2014-05-08 2014-07-30 青岛三友智控科技有限公司 一种三维测量***及方法
CN204043632U (zh) * 2014-08-29 2014-12-24 西安新拓三维光测科技有限公司 一种基于结构光多幅面三维测量仪装置
CN105115445A (zh) * 2015-09-14 2015-12-02 杭州光珀智能科技有限公司 基于深度相机与双目视觉复合的三维成像***及成像方法
CN106091987A (zh) * 2016-06-14 2016-11-09 中国科学院上海光学精密机械研究所 基于散斑时域相关的大尺寸光学毛坯三维测量方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101245998A (zh) * 2008-02-01 2008-08-20 黑龙江科技学院 一种三维测量***的成像方法
CN101608908A (zh) * 2009-07-20 2009-12-23 杭州先临三维科技股份有限公司 数字散斑投影和相位测量轮廓术相结合的三维数字成像方法
CN101739717A (zh) * 2009-11-12 2010-06-16 天津汇信软件有限公司 三维彩色点云非接触扫描方法
US20140253929A1 (en) * 2011-10-18 2014-09-11 Nanyang Technological University Apparatus and method for 3d surface measurement
CN105869167A (zh) * 2016-03-30 2016-08-17 天津大学 基于主被动融合的高分辨率深度图获取方法

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109242956B (zh) * 2018-08-19 2023-05-23 雅派朗迪(北京)科技发展股份有限公司 3d人体扫描数字建模车
CN109242956A (zh) * 2018-08-19 2019-01-18 雅派朗迪(北京)科技发展股份有限公司 3d人体扫描数字建模车
CN109903382A (zh) * 2019-03-20 2019-06-18 中煤航测遥感集团有限公司 点云数据的融合方法及装置
CN109903382B (zh) * 2019-03-20 2023-05-23 中煤航测遥感集团有限公司 点云数据的融合方法及装置
CN110243307A (zh) * 2019-04-15 2019-09-17 深圳市易尚展示股份有限公司 一种自动化三维彩色成像与测量***
CN111932565A (zh) * 2019-05-13 2020-11-13 中国科学院沈阳自动化研究所 一种多靶标识别跟踪解算方法
CN111932565B (zh) * 2019-05-13 2023-09-19 中国科学院沈阳自动化研究所 一种多靶标识别跟踪解算方法
CN110244302A (zh) * 2019-07-05 2019-09-17 苏州科技大学 地基合成孔径雷达影像像元坐标三维变换方法
CN110244302B (zh) * 2019-07-05 2023-02-17 苏州科技大学 地基合成孔径雷达影像像元坐标三维变换方法
CN110517323A (zh) * 2019-08-16 2019-11-29 中铁第一勘察设计院集团有限公司 基于机械手单相机多目视觉的三维定位***及方法
CN111127625B (zh) * 2019-10-08 2024-01-12 新拓三维技术(深圳)有限公司 一种足部扫描方法、***及装置
CN111127625A (zh) * 2019-10-08 2020-05-08 新拓三维技术(深圳)有限公司 一种足部扫描方法、***及装置
CN111008602A (zh) * 2019-12-06 2020-04-14 青岛海之晨工业装备有限公司 一种小曲率薄壁零件用二维和三维视觉结合的划线特征提取方法
CN111008602B (zh) * 2019-12-06 2023-07-25 青岛海之晨工业装备有限公司 一种小曲率薄壁零件用二维和三维视觉结合的划线特征提取方法
CN113034603A (zh) * 2019-12-09 2021-06-25 百度在线网络技术(北京)有限公司 用于确定标定参数的方法和装置
CN113034603B (zh) * 2019-12-09 2023-07-14 百度在线网络技术(北京)有限公司 用于确定标定参数的方法和装置
CN111750805A (zh) * 2020-07-06 2020-10-09 山东大学 一种基于双目相机成像和结构光技术的三维测量装置及测量方法
CN112200911B (zh) * 2020-11-06 2024-05-28 北京易达恩能科技有限公司 结合标志物的区域重叠式三维地图构建方法和装置
CN112200911A (zh) * 2020-11-06 2021-01-08 北京易达恩能科技有限公司 结合标志物的区域重叠式三维地图构建方法和装置
CN112465912B (zh) * 2020-11-18 2024-03-29 新拓三维技术(深圳)有限公司 一种立体相机标定方法及装置
CN112465912A (zh) * 2020-11-18 2021-03-09 新拓三维技术(深圳)有限公司 一种立体相机标定方法及装置
CN112509057A (zh) * 2020-11-30 2021-03-16 北京百度网讯科技有限公司 相机外参标定方法、装置、电子设备以及计算机可读介质
CN112509057B (zh) * 2020-11-30 2024-04-12 北京百度网讯科技有限公司 相机外参标定方法、装置、电子设备以及计算机可读介质
US11875535B2 (en) 2020-11-30 2024-01-16 Beijing Baidu Netcom Science And Technology Co., Ltd. Method, apparatus, electronic device and computer readable medium for calibrating external parameter of camera
CN113313746A (zh) * 2020-12-01 2021-08-27 湖南长天自控工程有限公司 一种料堆盘库的方法及***
CN112530008A (zh) * 2020-12-25 2021-03-19 中国科学院苏州纳米技术与纳米仿生研究所 一种条纹结构光的参数确定方法、装置、设备及存储介质
CN114681089A (zh) * 2020-12-31 2022-07-01 先临三维科技股份有限公司 三维扫描装置和方法
CN114681089B (zh) * 2020-12-31 2023-06-06 先临三维科技股份有限公司 三维扫描装置和方法
CN115068833A (zh) * 2021-03-15 2022-09-20 湖南华创医疗科技有限公司 用于束流阻挡器的定位装置和放射治疗***
CN115068833B (zh) * 2021-03-15 2024-02-06 湖南华创医疗科技有限公司 用于束流阻挡器的定位装置和放射治疗***
CN113008164A (zh) * 2021-03-23 2021-06-22 南京理工大学 快速高精度三维面形重构方法
CN113655064A (zh) * 2021-08-11 2021-11-16 合肥工业大学 一种外观缺陷多模式视觉检测传感器
CN113655064B (zh) * 2021-08-11 2023-08-08 合肥工业大学 一种外观缺陷多模式视觉检测传感器
WO2023019833A1 (zh) * 2021-08-18 2023-02-23 梅卡曼德(北京)机器人科技有限公司 基于激光线扫的点云处理方法及装置
CN113483668A (zh) * 2021-08-19 2021-10-08 广东亚太新材料科技有限公司 一种碳纤维复合材料产品尺寸检测方法及***
CN114332349B (zh) * 2021-11-17 2023-11-03 浙江视觉智能创新中心有限公司 一种双目结构光边缘重建方法、***及存储介质
CN114332349A (zh) * 2021-11-17 2022-04-12 浙江智慧视频安防创新中心有限公司 一种双目结构光边缘重建方法、***及存储介质
CN114119747A (zh) * 2021-11-23 2022-03-01 四川大学 一种基于pmd波前检测的三维流场流动显示方法
CN114909993A (zh) * 2022-04-26 2022-08-16 泰州市创新电子有限公司 一种高精度式激光投影视觉三维测量***
CN114739312A (zh) * 2022-04-26 2022-07-12 黄晓明 一种手持式路面构造深度激光测定装置
CN114739312B (zh) * 2022-04-26 2024-04-23 黄晓明 一种手持式路面构造深度激光测定装置
WO2023207756A1 (zh) * 2022-04-28 2023-11-02 杭州海康机器人股份有限公司 图像重建方法和装置及设备
CN114998499B (zh) * 2022-06-08 2024-03-26 深圳大学 一种基于线激光振镜扫描的双目三维重建方法及***
CN114998499A (zh) * 2022-06-08 2022-09-02 深圳大学 一种基于线激光振镜扫描的双目三维重建方法及***

Also Published As

Publication number Publication date
CN108151671B (zh) 2019-10-25
CN108151671A (zh) 2018-06-12

Similar Documents

Publication Publication Date Title
WO2018103152A1 (zh) 一种三维数字成像传感器、三维扫描***及其扫描方法
WO2018152929A1 (zh) 一种三维扫描***及其扫描方法
JP6564537B1 (ja) 単眼3次元走査システムによる3次元再構成法および装置
CN110288642B (zh) 基于相机阵列的三维物体快速重建方法
CN107202554B (zh) 同时具备摄影测量和三维扫描功能的手持式大尺度三维测量扫描仪***
CN108267097B (zh) 基于双目三维扫描***的三维重构方法和装置
US8265376B2 (en) Method and system for providing a digital model of an object
US6781618B2 (en) Hand-held 3D vision system
CN106500628B (zh) 一种含有多个不同波长激光器的三维扫描方法及扫描仪
CA3022442C (en) Three-dimensional reconstruction method and device based on monocular three-dimensional scanning system
JP6429772B2 (ja) 3d走査および位置決めシステム
WO2016037486A1 (zh) 人体三维成像方法及***
CN110487216A (zh) 一种基于卷积神经网络的条纹投影三维扫描方法
CN109727277B (zh) 多目立体视觉的体表摆位跟踪方法
CN107860337B (zh) 基于阵列相机的结构光三维重建方法与装置
CN108665535A (zh) 一种基于编码光栅结构光的三维结构重建方法与***
CN105303572B (zh) 基于主被动结合的深度信息获取方法
CN103292699A (zh) 一种三维扫描***及方法
WO2020199439A1 (zh) 基于单双目混合测量的三维点云计算方法
CN104680534B (zh) 基于单帧复合模板的物体深度信息获取方法
CN111780678A (zh) 一种轨道板预埋套管直径的测量方法
Xu et al. Three degrees of freedom global calibration method for measurement systems with binocular vision
CN113421286B (zh) 一种动作捕捉***及方法
Ettl Introductory review on ‘Flying Triangulation’: a motion-robust optical 3D measurement principle
Zhang et al. Fusion of time-of-flight and phase shifting for high-resolution and low-latency depth sensing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16923462

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16923462

Country of ref document: EP

Kind code of ref document: A1