CN106225676B - Three-dimensional measurement method, apparatus and system - Google Patents

Three-dimensional measurement method, apparatus and system

Info

Publication number
CN106225676B
CN106225676B (application CN201610804538.7A)
Authority
CN
China
Prior art keywords
coordinate
matching cost
image point
vision system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610804538.7A
Other languages
Chinese (zh)
Other versions
CN106225676A (en)
Inventor
王振杰
杨艺
张勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Lingyunguang Technology Group Co ltd
Luster LightTech Co Ltd
Original Assignee
Luster LightTech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luster LightTech Co Ltd filed Critical Luster LightTech Co Ltd
Priority: CN201610804538.7A
Publication of CN106225676A
Application granted
Publication of CN106225676B


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a three-dimensional measurement method, apparatus, and system. A structured light vision system first determines the coordinates of a measured object point P in the structured light vision system's coordinate system. These coordinates are converted, using a pre-calibrated coordinate transformation relation between the two vision systems, into reference coordinates of P in the first coordinate system of a binocular stereo vision system. Based on the reference coordinates and the pre-calibrated measurement accuracy, the first image point position and second image point position corresponding to the minimum matching cost of P in the binocular stereo vision system are determined using the three-dimensional measurement principle of the binocular stereo vision system and a preset stereo matching algorithm. The actual disparity of P is then obtained from the first and second image point positions, and the actual coordinates of P in the first coordinate system are computed from that disparity. The application thus fuses the two vision systems, overcoming the shortcomings of both and meeting high-accuracy measurement requirements in high-speed motion scenarios.

Description

Three-dimensional measurement method, apparatus and system
Technical field
This application relates to the technical field of three-dimensional measurement, and in particular to a three-dimensional measurement method, apparatus, and system.
Background technology
Three-dimensional measurement, as the term suggests, is the measurement that determines the three-dimensional coordinate data of a measured object. Three-dimensional measurement technology is currently widely used in many fields such as railways, automobiles, aviation, and the defense industry. The most common three-dimensional measurement systems are the structured light vision system and the binocular stereo vision system.
A structured light vision system is based on the optical triangulation principle: a dedicated light projector (typically a laser) projects structured light onto the surface of the measured object, and an image collector (e.g., a camera) at another position captures a two-dimensional distorted image of the object. Once the relative position of the light projector and the image collector is fixed, the three-dimensional coordinates of the corresponding surface point can be computed from the two-dimensional coordinates of any point in the distorted image. Structured light vision systems offer a simple principle, high measurement accuracy, and strong interference immunity, but they are limited by the laser scanning rate and cannot be used in scenarios where the measured object, such as a high-speed railway, moves at high speed.
A binocular stereo vision system is based on the parallax principle: two image collectors simulate human eyes, and the three-dimensional coordinates of a point on the measured object are computed from the offset between the positions of that point in the images of the two collectors. Binocular stereo vision systems have a simple principle, a simple structure, and a wide measurement range; they usually use two line-scan cameras as image collectors, so the scanning speed is high and they are suitable for high-speed motion scenarios such as high-speed railways, but their measurement accuracy is unstable.
Therefore, it is necessary to provide a three-dimensional measurement method and related apparatus with both high measurement accuracy and high scanning speed, to meet the high-accuracy measurement requirements of high-speed motion scenarios such as high-speed railways.
Summary of the invention
The embodiments of the present application provide a three-dimensional measurement method, apparatus, and system, to solve the problem in the related art that measurement accuracy and scanning speed are difficult to achieve at the same time.
To solve the above technical problem, the embodiments of the invention disclose the following technical solutions.
According to a first aspect of the embodiments of the present application, a three-dimensional measurement method is provided, including:
obtaining calibration information of a structured light vision system and a binocular stereo vision system, the calibration information including at least: the coordinate transformation relation between the structured light vision system coordinate system Os-XsYsZs and the first coordinate system OL-XLYLZL corresponding to the first image collector of the binocular stereo vision system, the measurement accuracy δ, the baseline b and the focal length f of the binocular stereo vision system, and the principal point position u of the first image collector;
using the structured light vision system, determining the coordinates (xs, zs) of an object point P on the measured object in the coordinate system Os-XsYsZs, and converting the coordinates (xs, zs) according to the coordinate transformation relation to obtain the reference coordinates (xL, zL) of the object point P in the first coordinate system OL-XLYLZL;
according to the baseline b, the focal length f, the measurement accuracy δ, and the reference coordinate value zL of the object point P, determining the minimum disparity dmin and the maximum disparity dmax of the object point P between the first image collector and the second image collector of the binocular stereo vision system, wherein dmin = bf/(zL + δ) and dmax = bf/(zL − δ);
according to the reference coordinates (xL, zL) and the principal point position u, determining the first pixel position x1 corresponding to the object point P in the first image captured by the first image collector;
according to the minimum disparity dmin, the maximum disparity dmax, and the first pixel position x1, determining the value interval T = [x1 − dmax, x1 − dmin] of the second pixel position x2 of the object point P in the second image captured by the second image collector;
according to a preset stereo matching algorithm, separately calculating the matching cost between the first image point corresponding to the first pixel position x1 and the candidate second image point corresponding to each value x2 in the interval T;
determining the minimum matching cost among the calculated matching costs, and determining the optimal second pixel position x2' corresponding to the minimum matching cost;
calculating the actual disparity dP of the object point P between the first image collector and the second image collector according to the first pixel position x1 and the optimal second pixel position x2';
calculating the actual coordinates of the object point P in the first coordinate system OL-XLYLZL according to the actual disparity dP.
Optionally, determining the first pixel position x1 corresponding to the object point P in the first image captured by the first image collector according to the reference coordinates (xL, zL) and the principal point position u includes:
calculating the first pixel position x1 using the formula x1 = f·xL/(k·zL) + u, where k is the conversion coefficient between coordinate values and pixel positions.
Optionally, separately calculating, according to the preset stereo matching algorithm, the matching cost between the first image point corresponding to the first pixel position x1 and the second image point corresponding to each value x2 in the interval T includes:
obtaining each value x2 in the interval T in turn, and taking the image point corresponding to it as the candidate second image point;
calculating the first matching cost CAD(P, d) between the first image point and the candidate second image point according to the absolute difference (AD) algorithm, wherein CAD(P, d) = |I1(x1) − I2(x2)|, I1(x1) is the gray value of the first image point, and I2(x2) is the gray value of the candidate second image point;
calculating the second matching cost Ccensus(P, d) between the first image point and the candidate second image point according to the census transform algorithm;
normalizing the first matching cost and the second matching cost respectively with the normalization function ρ(C, λ) = 1 − exp(−C/λ), and summing the normalized first matching cost and second matching cost to obtain the matching cost corresponding to the value x2: C(P, d) = ρ(CAD(P, d), λAD) + ρ(Ccensus(P, d), λcensus);
wherein λAD is the preset weight of the first matching cost and λcensus is the preset weight of the second matching cost.
Optionally, determining the minimum matching cost among the calculated matching costs includes:
aggregating the calculated matching costs according to a cross-based aggregation algorithm;
for the aggregated matching costs, determining the minimum matching cost according to the winner-takes-all (WTA) algorithm.
Optionally, calculating the actual coordinates of the object point P in the first coordinate system OL-XLYLZL according to the actual disparity dP includes:
calculating the ZL-axis actual coordinate value z of the object point P in the first coordinate system OL-XLYLZL according to the formula z = bf/dP;
calculating the XL-axis actual coordinate value x of the object point P in the first coordinate system OL-XLYLZL according to the formula x = z·xL/zL.
According to a second aspect of the embodiments of the present application, a three-dimensional measurement apparatus is provided, including:
a calibration information acquiring unit, configured to obtain calibration information of the structured light vision system and the binocular stereo vision system, the calibration information including at least: the coordinate transformation relation between the structured light vision system coordinate system Os-XsYsZs and the first coordinate system OL-XLYLZL corresponding to the first image collector of the binocular stereo vision system, the measurement accuracy δ, the baseline b and the focal length f of the binocular stereo vision system, and the principal point position u of the first image collector;
a reference coordinate determination unit, configured to determine, using the structured light vision system, the coordinates (xs, zs) of an object point P on the measured object in the coordinate system Os-XsYsZs, and to convert the coordinates (xs, zs) according to the coordinate transformation relation to obtain the reference coordinates (xL, zL) of the object point P in the first coordinate system OL-XLYLZL;
a disparity range determination unit, configured to determine, according to the baseline b, the focal length f, the measurement accuracy δ, and the reference coordinate value zL of the object point P, the minimum disparity dmin and the maximum disparity dmax of the object point P between the first image collector and the second image collector of the binocular stereo vision system, wherein dmin = bf/(zL + δ) and dmax = bf/(zL − δ);
a first image point determination unit, configured to determine, according to the reference coordinates (xL, zL) and the principal point position u, the first pixel position x1 corresponding to the object point P in the first image captured by the first image collector;
a second image point determination unit, configured to determine, according to the minimum disparity dmin, the maximum disparity dmax, and the first pixel position x1, the value interval T = [x1 − dmax, x1 − dmin] of the second pixel position x2 of the object point P in the second image captured by the second image collector;
a matching cost computing unit, configured to separately calculate, according to a preset stereo matching algorithm, the matching cost between the first image point corresponding to the first pixel position x1 and the candidate second image point corresponding to each value x2 in the interval T;
a best match determination unit, configured to determine the minimum matching cost among the calculated matching costs and to determine the optimal second pixel position x2' corresponding to the minimum matching cost;
an actual disparity computing unit, configured to calculate the actual disparity dP of the object point P between the first image collector and the second image collector according to the first pixel position x1 and the optimal second pixel position x2';
an actual coordinate computing unit, configured to calculate the actual coordinates of the object point P in the first coordinate system OL-XLYLZL according to the actual disparity dP.
Optionally, the first image point determination unit is specifically configured to:
calculate the first pixel position x1 using the formula x1 = f·xL/(k·zL) + u, where k is the conversion coefficient between coordinate values and pixel positions.
Optionally, the matching cost computing unit is specifically configured to:
obtain each value x2 in the interval T in turn, and take the image point corresponding to it as the candidate second image point;
calculate the first matching cost CAD(P, d) between the first image point and the candidate second image point according to the absolute difference (AD) algorithm, wherein CAD(P, d) = |I1(x1) − I2(x2)|, I1(x1) is the gray value of the first image point, and I2(x2) is the gray value of the candidate second image point;
calculate the second matching cost Ccensus(P, d) between the first image point and the candidate second image point according to the census transform algorithm;
normalize the first matching cost and the second matching cost respectively with the normalization function ρ(C, λ) = 1 − exp(−C/λ), and sum the normalized first matching cost and second matching cost to obtain the matching cost corresponding to the value x2: C(P, d) = ρ(CAD(P, d), λAD) + ρ(Ccensus(P, d), λcensus);
wherein λAD is the preset weight of the first matching cost and λcensus is the preset weight of the second matching cost.
Optionally, the best match determination unit is specifically configured to:
aggregate the calculated matching costs according to a cross-based aggregation algorithm;
for the aggregated matching costs, determine the minimum matching cost according to the winner-takes-all (WTA) algorithm.
Optionally, the actual coordinate computing unit is specifically configured to:
calculate the ZL-axis actual coordinate value z of the object point P in the first coordinate system OL-XLYLZL according to the formula z = bf/dP;
calculate the XL-axis actual coordinate value x of the object point P in the first coordinate system OL-XLYLZL according to the formula x = z·xL/zL.
According to a third aspect of the embodiments of the present application, a three-dimensional measurement system is provided, including: a structured light vision system, a binocular stereo vision system, and the three-dimensional measurement apparatus described in any of the above embodiments.
As can be seen from the above technical solutions, to address the low scanning frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, the embodiments of the present application first perform a preliminary three-dimensional measurement with the structured light vision system; then, drawing on the measurement accuracy of the structured light vision system, the preliminary measurement result is used as a reference value, and a preset stereo matching algorithm determines the first pixel position and the optimal second pixel position corresponding to the minimum matching cost of the object point P in the binocular stereo vision system; the actual disparity of the object point P is then determined from the first pixel position and the optimal second pixel position, and finally the actual three-dimensional coordinates of the object point P are calculated from the actual disparity. The embodiments of the present application thus fuse the structured light vision system and the binocular stereo vision system, overcoming both the low scanning frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, so that accurate three-dimensional coordinates of the measured object can still be measured while the measured object moves at high speed.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the disclosure.
Description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a hardware structure diagram of the three-dimensional measurement system provided by the embodiments of the present application;
Fig. 2 is a flowchart of the three-dimensional measurement method provided by the embodiments of the present application;
Fig. 3 is a schematic diagram of the measurement principle based on the binocular stereo vision system in the three-dimensional measurement method provided by the embodiments of the present application;
Fig. 4 is a schematic diagram of pixel matching based on the binocular stereo vision system in the three-dimensional measurement method provided by the embodiments of the present application;
Fig. 5 is a structural block diagram of the three-dimensional measurement apparatus provided by the embodiments of the present application;
Fig. 6 is a structural block diagram of the three-dimensional measurement system provided by the embodiments of the present application.
Detailed description of the embodiments
To enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, and to make the above objects, features, and advantages of the embodiments of the present invention clearer and easier to understand, the technical solutions in the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
The embodiments of the present application provide a three-dimensional measurement method, apparatus, and system, to solve the problem in the related art that measurement accuracy and scanning speed are difficult to achieve at the same time.
During the research of this application, the inventors found that the structured light vision system and the binocular stereo vision system each have their own advantages and disadvantages when performing three-dimensional measurement. The embodiments of the present application therefore measure the same object with both vision systems and fuse the three-dimensional measurement data obtained by the two systems, so that the strengths of each system compensate for the weaknesses of the other, yielding a three-dimensional measurement method that is both highly accurate and suitable for high-speed moving objects. The embodiments of the present application are described in detail below with reference to the structure diagram of the three-dimensional measurement system provided by the embodiments of the present application shown in Fig. 1.
Referring to Fig. 1, the three-dimensional measurement system provided by the embodiments of the present application includes at least one light projector and three image collectors. The light projector 110 and the image collector 120 constitute the structured light vision system; the light projector 110 may specifically be a laser, and its projection forms the structured light plane 130. The first image collector 210 and the second image collector 220 constitute the binocular stereo vision system.
Optionally, the image collector 120 may be an area-scan camera, while the first image collector 210 and the second image collector 220 are both line-scan cameras, so as to capture images with a higher scanning frequency and resolution than an area-scan camera.
To fuse the measurement data of the two vision systems, the relative position of the two vision systems in Fig. 1 is calibrated before measurement. The calibration mainly covers the following three aspects:
1) Calibration of the structured light vision system
That is, calibration of the light projector 110 and the image collector 120 in Fig. 1. Any existing calibration method may be used; the embodiments of the present application place no limitation on this. One calibration parameter directly related to the data fusion of the two vision systems is the measurement accuracy δ of the structured light vision system, which is also the measurement accuracy of the entire three-dimensional measurement system shown in Fig. 1. For example, δ = 1 cm means the object point coordinates finally measured by the three-dimensional measurement system are accurate to 1 cm, and δ = 1 mm means they are accurate to 1 mm.
2) Calibration of the binocular stereo vision system
That is, calibration of the first image collector 210 and the second image collector 220 in Fig. 1, mainly of their intrinsic parameters, such as the focal length f (the first image collector 210 and the second image collector 220 have the same focal length, namely f), the baseline b, the distortion coefficients, and the principal point position u of the first image collector 210 (i.e., the center of the imaging plane of the first image collector 210, namely the intersection of the optical axis of the first image collector 210 with its imaging plane). The field-of-view plane of the first image collector 210 coincides with the field-of-view plane of the second image collector 220, and the baseline b denotes the distance between the optical center of the first image collector 210 and the optical center of the second image collector 220.
3) Calibration of the relative position of the two vision systems
As shown in Fig. 1, the embodiments of the present application calibrate the relative position of the structured light vision system and the binocular stereo vision system using a sawtooth target: the image collector 120 and the first image collector 210 photograph the sawtooth target simultaneously, and the optical path of the system is adjusted by analyzing the captured images (e.g., adjusting the optical axis direction of each image collector and the projection direction of the light projector) so that the field-of-view plane of the first image collector 210 (which is also the field-of-view plane of the second image collector 220) coincides with the structured light plane 130; the coordinate transformation relation M between the coordinate system Os-XsYsZs of the structured light vision system and the first coordinate system OL-XLYLZL of the first image collector 210 is then obtained by analysis. Of course, in practice the coordinate transformation relation M' between the coordinate system Os-XsYsZs of the structured light vision system and the second coordinate system OR-XRYRZR of the second image collector 220 could also be calibrated on the same principle; the following embodiments describe the specific three-dimensional measurement method only in terms of the coordinate transformation relation M, and those skilled in the art can derive the corresponding three-dimensional measurement method based on M' accordingly.
Specifically, the three coordinate systems mentioned above may be defined as follows.
First, as shown in Fig. 1, the first coordinate system OL-XLYLZL of the first image collector 210 is defined as follows: the origin OL may be set at the optical center of the first image collector 210; the ZL axis is parallel to (coincides with) the optical axis of the first image collector 210; the XL axis is perpendicular to the ZL axis and parallel to the field-of-view plane of the first image collector 210; and the YL axis is perpendicular to both the ZL axis and the XL axis. In practice, to reduce the data processing difficulty, the moving direction of the measured object can be taken as the YL-axis direction, and the optical path of the first image collector 210 adjusted accordingly so that the first coordinate system satisfies the above definition.
Optionally, in other embodiments of the present application the origin OL may instead be set at the principal point of the first image collector 210, i.e., the intersection of the optical axis of the first image collector 210 with its imaging plane.
Second, the second coordinate system OR-XRYRZR of the second image collector 220 is established with reference to the first coordinate system OL-XLYLZL, with the XL axis parallel to the XR axis, the YL axis parallel to the YR axis, and the ZL axis parallel to the ZR axis (to avoid overlapping lines or labels, only the ZR axis of the second coordinate system OR-XRYRZR is shown in Fig. 1).
Third, for the coordinate system Os-XsYsZs of the structured light vision system, the origin Os may be set at the optical center of the image collector 120; the Zs axis is parallel to the optical axis of the image collector 120; the Xs axis is perpendicular to the Zs axis and parallel to the structured light plane 130 (from which it can be deduced that the Xs axis is also parallel to the XL axis); and the Ys axis is perpendicular to both the Zs axis and the Xs axis.
Once the three coordinate systems above are defined, the coordinate transformation relation M can be obtained by analyzing images captured at the same moment by the image collector 120 and the first image collector 210.
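For illustration only (not part of the patent text), the following minimal Python sketch assumes that the calibrated transformation relation M between the Os-XsZs plane and the OL-XLZL plane is a planar rigid transform, i.e., a 2×2 rotation R plus a translation t; the function name and the numeric values are assumptions.

```python
import numpy as np

def apply_transform_M(R: np.ndarray, t: np.ndarray, xs: float, zs: float):
    """Map a point (xs, zs) of the structured light system to (xL, zL).

    Assumes M is the planar rigid transform [xL, zL]^T = R @ [xs, zs]^T + t.
    """
    p = R @ np.array([xs, zs]) + t
    return float(p[0]), float(p[1])

# Hypothetical calibration result: planes aligned, only a translation offset
R = np.eye(2)                  # 2x2 identity rotation
t = np.array([0.05, 0.20])     # assumed offsets in metres
xL, zL = apply_transform_M(R, t, xs=0.10, zs=1.30)   # reference coordinates of P
```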
Based on the above calibration results, the three-dimensional measurement method provided by the embodiments of the present application is introduced below.
Referring to Fig. 2, the three-dimensional measurement method provided by the embodiments of the present application includes the following steps:
S1. Obtain the calibration information of the structured light vision system and the binocular stereo vision system.
The calibration information includes at least the measurement accuracy δ described above, the baseline b and focal length f of the binocular stereo vision system, the principal point position u of the first image collector 210, and the coordinate transformation relation M between the structured light vision system coordinate system Os-XsYsZs and the first coordinate system OL-XLYLZL.
S2. Using the structured light vision system, determine the coordinates (xs, zs) of an object point P on the measured object in the coordinate system Os-XsYsZs.
Optionally, any pixel may be selected in the two-dimensional image captured by the image collector 120 of the structured light vision system, and the coordinates (xs, zs) of the object point P on the measured object corresponding to that image point are then calculated in the coordinate system Os-XsYsZs.
S3. Convert the coordinates (xs, zs) according to the coordinate transformation relation M to obtain the reference coordinates (xL, zL) of the object point P in the first coordinate system OL-XLYLZL.
Since the scanning frequency and resolution of the image collector 120 are both lower than those of the first image collector 210, the image captured by the first image collector 210 contains more pixels; coordinate conversion alone therefore cannot yield the coordinates of every pixel in the image captured by the first image collector 210, that is, it cannot yield the coordinates of every object point in that image. Moreover, the coordinates of the object point P in the first coordinate system OL-XLYLZL obtained by coordinate conversion contain a relatively large error and cannot serve as the actual coordinates. In view of this, the embodiments of the present application take the coordinates of the object point P in the first coordinate system OL-XLYLZL obtained by coordinate conversion as the reference coordinates of P, and further determine the more accurate actual coordinates of P in combination with the measurement principle of the binocular vision system.
S4. According to the baseline b, the focal length f, the measurement accuracy δ, and the reference coordinate value zL of the object point P, determine the minimum disparity dmin and the maximum disparity dmax of the object point P between the first image collector and the second image collector of the binocular stereo vision system.
The minimum disparity is computed as dmin = bf/(zL + δ), and the maximum disparity as dmax = bf/(zL − δ).
In the embodiments of the present application, the above formulas for the minimum disparity dmin and the maximum disparity dmax follow from the measurement principle of the binocular stereo vision system, which is explained below with reference to the schematic diagram in Fig. 3.
The plane shown in Fig. 3 is the plane spanned by the ZL axis and the XL axis of the first coordinate system OL-XLYLZL (which is also the plane spanned by the ZR axis and the XR axis of the second coordinate system, i.e., the field-of-view plane of the first image collector 210 and the second image collector 220), and it satisfies the coordinate system definitions given above. S1 and S2 are the imaging planes of the first image collector 210 and the second image collector 220, respectively, and OL and OR are the coordinate origins (and the optical centers of the first image collector 210 and the second image collector 220). The intersection P1 of the line between the object point P and the optical center OL with the imaging plane S1 is the first image point of P; the intersection P2 of the line between the object point P and the optical center OR with the imaging plane S2 is the second image point of P.
Suppose the coordinates of P in the first coordinate system OL-XLYLZL are (xL, yL, zL). Then in the OL-XLZL plane shown in Fig. 3 the coordinates of P are (xL, zL) and the coordinates of the intersection point P1 are (xP1, zP1), and in the second coordinate system OR-XRZR shown in Fig. 3 the coordinates of the intersection point P2 are (xP2, zP2), with:
the disparity of the object point P in the binocular stereo vision system expressed as d = xP1 − xP2;
zP1 = zP2 = f, and zL = Z (where Z is called the depth value of the object point P).
From the similarity of triangle P-OL-OR and triangle P-P1-P2 it is known that (b − d)/b = (Z − f)/Z.
Rearranging gives d = bf/Z, that is, d = bf/zL.
Since the measurement accuracy of the structured light vision system is δ, the ZL-axis actual coordinate value of the object point P lies in the range [zL − δ, zL + δ], so the minimum of the disparity d is dmin = bf/(zL + δ) and the maximum is dmax = bf/(zL − δ).
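The disparity bounds above translate directly into code; a minimal sketch follows (illustrative values only, with the focal length expressed in pixel units so that disparities come out in pixels):

```python
def disparity_bounds(b: float, f: float, z_L: float, delta: float):
    """dmin = b*f/(zL + delta), dmax = b*f/(zL - delta) for object point P.

    d = b*f/Z and Z lies in [zL - delta, zL + delta], so the largest
    possible depth gives the smallest disparity and vice versa.
    """
    return b * f / (z_L + delta), b * f / (z_L - delta)

# Assumed values: baseline 0.3 m, focal length 2000 px, reference depth 1.5 m,
# structured light measurement accuracy 0.01 m
d_min, d_max = disparity_bounds(b=0.3, f=2000.0, z_L=1.5, delta=0.01)
# d_min ~ 397.4 px, d_max ~ 402.7 px
```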
S5. According to the reference coordinates (xL, zL) and the principal point position u, determine the first pixel position x1 corresponding to the object point P in the first image captured by the first image collector.
The first pixel position is the position, in the first image, of the intersection point P1 shown in Fig. 3. Since the first image and the second image captured by the second image collector actually consist of a pixel array of N columns and M rows, the pixel position corresponding to any image point, including the principal point position, can be expressed as (n, m), i.e., by the column index and row index of the corresponding pixel, where 1 ≤ n ≤ N and 1 ≤ m ≤ M, or alternatively 0 ≤ n ≤ N − 1 and 0 ≤ m ≤ M − 1.
In a feasible implementation of the embodiments of the present application, to improve the resolution of the binocular stereo vision system, the first image collector 210 and the second image collector 220 are line-scan cameras containing only a single row of photosensitive elements in the height direction, i.e., the total number of pixel rows in the first image and the second image is M = 1, so the principal point position and the pixel position of any image point need only specify the column index of the corresponding pixel. For example, if the column index corresponding to the principal point is n0, the principal point position u is written as u = n0.
From the coordinate system definition it is known that the coordinate value of the principal point on the XL axis is 0, so the following relation holds among xP1, x1, and u: xP1 − 0 = k(x1 − u), i.e., x1 = xP1/k + u.
Here k is the conversion coefficient between coordinate values and pixel positions, i.e., the ratio of the XL-axis unit distance to the width of a single pixel. For example, if each pixel is 0.5 mm wide and the XL-axis unit distance is 1 mm, then k = 1/0.5 = 2; conversely, if each pixel is 1 mm wide and the XL-axis unit distance is 0.5 mm, then k = 0.5.
Also, referring to the schematic diagram in Fig. 3, from zP1 = f and the geometric relation xP1/f = xL/zL it follows that xP1 = f·xL/zL.
Combining the two formulas xP1 = k(x1 − u) and xP1 = f·xL/zL gives x1 = f·xL/(k·zL) + u.
In particular, when k = 1, the formula for the first pixel position is x1 = f·xL/zL + u.
S6. According to the minimum disparity dmin, the maximum disparity dmax, and the first pixel position x1, determine the value interval T of the second pixel position x2 of the object point P in the second image captured by the second image collector.
Specifically, from d = x1 − x2 and d ∈ [dmin, dmax] it follows that x2 ∈ T = [x1 − dmax, x1 − dmin].
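A sketch of steps S5 and S6 under the formulas above (function names and numeric values are assumptions; the focal length is again taken in pixel units, so that k = 1):

```python
def first_pixel_position(x_L: float, z_L: float, f: float, u: float, k: float = 1.0) -> float:
    """First pixel position x1 = f*xL/(k*zL) + u of the first image point of P."""
    return f * x_L / (k * z_L) + u

def second_pixel_interval(x1: float, d_min: float, d_max: float):
    """Value interval T = [x1 - d_max, x1 - d_min] of the second pixel position."""
    return (x1 - d_max, x1 - d_min)

# Assumed values consistent with the disparity-bound sketch above
x1 = first_pixel_position(x_L=0.15, z_L=1.5, f=2000.0, u=512.0)   # = 712.0
T = second_pixel_interval(x1, d_min=397.4, d_max=402.7)           # ~ (309.3, 314.6)
```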
S7. According to a preset stereo matching algorithm, separately calculate the matching cost between the first image point corresponding to the first pixel position x1 and the candidate second image point corresponding to each value x2 in the interval T.
S8. Determine the minimum matching cost among the calculated matching costs, and determine the optimal second pixel position x2' corresponding to the minimum matching cost.
S9. According to the first pixel position x1 and the optimal second pixel position x2', calculate the actual disparity dP of the object point P between the first image collector and the second image collector.
Specifically, the actual disparity is computed as dP = x1 − x2'.
S10. According to the actual disparity dP, calculate the actual coordinates of the object point P in the first coordinate system OL-XLYLZL.
In summary, the embodiments of the present application first determine, with the structured light vision system, the coordinates (xs, zs) of the measured object point P in the structured light vision system coordinate system; then perform coordinate conversion according to the pre-calibrated coordinate transformation relation between the two vision systems to obtain the reference coordinates (xL, zL) of the object point P in the first coordinate system corresponding to the first image collector of the binocular stereo vision system; then, according to the reference coordinates (xL, zL) and the pre-calibrated measurement accuracy δ of the structured light vision system, use the three-dimensional measurement principle of the binocular stereo vision system to determine the first pixel position x1 of the first image point of P obtained by the binocular stereo vision system and the value interval T of the second pixel position of the corresponding second image point; then, for each value in the interval T, separately calculate the matching cost between the corresponding candidate second image point and the first image point corresponding to x1, and take the value x2' in T corresponding to the minimum matching cost as the optimal second pixel position; the actual disparity dP = x1 − x2' of the object point P between the first image collector and the second image collector can then be determined, and finally the actual coordinates of the object point P in the first coordinate system are calculated from the actual disparity dP.
As can be seen from the above technical solutions, to address the low scanning frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, the embodiments of the present application first perform a preliminary three-dimensional measurement with the structured light vision system; then, drawing on the measurement accuracy of the structured light vision system, the preliminary measurement result is used as a reference value, and a preset stereo matching algorithm determines the first pixel position and the optimal second pixel position corresponding to the minimum matching cost of the object point P in the binocular stereo vision system; the actual disparity of the object point P is then determined from the first pixel position and the optimal second pixel position, and finally the actual three-dimensional coordinates of the object point P are calculated from the actual disparity. The embodiments of the present application thus fuse the structured light vision system and the binocular stereo vision system, overcoming both the low scanning frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, so that accurate three-dimensional coordinates of the measured object can still be measured while the measured object moves at high speed.
It should be noted that, since different object points generally have different actual disparities, the actual disparity of the object point corresponding to each pixel in the two-dimensional image captured by the image collector 120 may be determined separately according to the embodiments of the present application, and the actual coordinates of that object point in the first coordinate system OL-XLYLZL determined accordingly (i.e., steps S2 to S10 are executed in a loop).
Optionally, to distinguish them from the reference coordinates (xL, zL) of the object point P obtained by coordinate conversion, the actual coordinates of the object point P finally measured in this embodiment, i.e., the actual coordinates of P in the first coordinate system OL-XLYLZL, are written as (x, y, z). From the relations d = bf/Z and xP1 = f·xL/zL obtained above with reference to Fig. 3 (the actual point and its reference coordinates lie on the same viewing ray through the first image point, so x/z = xL/zL), it is known that:
the ZL-axis actual coordinate value z of the object point P is computed as z = bf/dP;
the XL-axis actual coordinate value x of the object point P is computed as x = z·xL/zL.
As for the YL-axis actual coordinate value y of the object point P, since the YL-axis direction adopted in the embodiments of the present application is the moving direction of the object point P, y can be calculated from the moving speed and moving time of the object point P.
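A sketch of step S10 and the YL-axis calculation (illustrative only; the values are assumptions, and units follow the earlier sketches, with f in pixel units and dP in pixels):

```python
def actual_coordinates(b: float, f: float, d_P: float,
                       x_L: float, z_L: float,
                       speed: float, t: float):
    """Actual coordinates (x, y, z) of P in OL-XLYLZL.

    z = b*f/dP from the binocular relation, x = z*xL/zL by scaling the
    reference coordinates along the same viewing ray, and y from the
    object's motion along the YL axis (speed * time).
    """
    z = b * f / d_P
    x = z * x_L / z_L
    y = speed * t
    return x, y, z

# Assumed actual disparity of 400 px and a measured object moving at 80 m/s
x, y, z = actual_coordinates(b=0.3, f=2000.0, d_P=400.0,
                             x_L=0.15, z_L=1.5, speed=80.0, t=0.02)
# z = 1.5 m, x = 0.15 m, y = 1.6 m
```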
The principle by which the embodiments of the present application determine the optimal second pixel position through the preset stereo matching algorithm is illustrated below.
For example, suppose dmin = 0.1 and dmax = 0.3, and x1 = 3.5 is obtained in step S5, so that x2 ∈ T = [x1 − dmax, x1 − dmin] = [3.2, 3.4]. The following are then calculated separately: the matching cost C1 between x1 = 3.5 and x2 = 3.2, the matching cost C2 between x1 = 3.5 and x2 = 3.3, and the matching cost C3 between x1 = 3.5 and x2 = 3.4. If comparing C1, C2, and C3 shows that the minimum matching cost is C2, then the first image point corresponding to x1 = 3.5 and the second image point corresponding to x2 = 3.3 are the best image point match, i.e., the optimal second pixel position is x2' = 3.3, and the actual disparity of the object point P in the binocular stereo vision system is dP = 3.5 − 3.3 = 0.2.
As shown in the pixel matching schematic of the binocular stereo vision system in Fig. 4, after the first pixel position of the first image point in the first image is determined and the value range of the corresponding second pixel position in the second image is determined from the disparity range, every image point within that value range can be called a candidate second image point and may be the second image point that best matches the first image point, while image points outside the value range cannot be the best-matching second image point. Therefore, only the candidate second image points within this range need to be traversed and their matching costs with the first image point calculated separately; the candidate second image point with the minimum matching cost is the second image point that best matches the first image point, and its pixel position is the optimal second pixel position.
It can be seen that the embodiments of the present application predict the disparity range of the object point P in the binocular stereo vision system from the measurement accuracy δ of the structured light vision system, determine the value range of the second pixel position of the corresponding second image point from the disparity range and the first pixel position, traverse only the image block corresponding to that value range, and determine the second image point that best matches the first image point according to the matching cost. Obviously, compared with the prior art, in which the disparity range cannot be predicted and the entire image captured by the second image collector must be traversed to determine the best second image point, the embodiments of the present application reduce the search range of the second image point, which improves the matching speed and also improves the matching accuracy by avoiding mismatches caused by searching in irrelevant positions.
In a feasible implementation of the present application, the preset stereo matching algorithm used in step S7 may be a region matching algorithm based on the AD-census transform: that is, each candidate second image point within the disparity range corresponding to the first image point is traversed in the image captured by the second image collector, and the matching cost between the first image point and each candidate second image point is calculated using the AD-census transform formulas.
Specifically, the steps of calculating the matching cost using the AD-census transform formulas are as follows:
S71. Calculate the first matching cost CAD(P, d) between the first image point and the candidate second image point according to the AD (absolute difference) algorithm.
The first matching cost CAD(P, d) is the absolute difference of the gray values of the two image points: CAD(P, d) = |I1(x1) − I2(x2)|, where I1(x1) is the gray value of the first image point and I2(x2) is the gray value of the currently traversed candidate second image point.
S72. Calculate the second matching cost Ccensus(P, d) between the first image point and the candidate second image point according to the census transform algorithm.
S73. Normalize the first matching cost CAD(P, d) and the second matching cost Ccensus(P, d) with a normalization function.
In the embodiments of the present application the normalization function is ρ(C, λ) = 1 − exp(−C/λ); normalizing the first matching cost gives ρ(CAD(P, d), λAD) = 1 − exp(−CAD(P, d)/λAD), and normalizing the second matching cost gives ρ(Ccensus(P, d), λcensus) = 1 − exp(−Ccensus(P, d)/λcensus).
Here λAD is the preset weight of the first matching cost and λcensus is the preset weight of the second matching cost.
S74. Sum the two normalized matching costs ρ(CAD(P, d), λAD) and ρ(Ccensus(P, d), λcensus) to obtain the matching cost between P1 and the candidate second image point P2: C(P, d) = ρ(CAD(P, d), λAD) + ρ(Ccensus(P, d), λcensus).
Taking the calculation of the matching cost C1 described in the example above, i.e., the matching cost between the first image point corresponding to x1 = 3.5 in the image captured by the first image collector and the candidate second image point corresponding to x2 = 3.2 in the image captured by the second image collector, the calculation proceeds as follows: 1) use the CAD(P, d) formula above to obtain the first matching cost CAD = |I1(3.5) − I2(3.2)|, where I1(3.5) is the gray value of the first image point corresponding to x1 = 3.5 and I2(3.2) is the gray value of the candidate second image point corresponding to x2 = 3.2; 2) use the census transform to calculate the second matching cost Ccensus between the same two image points; 3) apply the normalization function to obtain ρ(CAD, λAD) and ρ(Ccensus, λcensus); 4) sum them to obtain the matching cost C1 = ρ(CAD, λAD) + ρ(Ccensus, λcensus) between P1 and that candidate second image point.
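For illustration only, a minimal sketch of the AD-census cost of steps S71 to S74; it assumes single-row gray-value images and a small 1-D census window, and the window size and the λ values are assumptions rather than values from the patent:

```python
import numpy as np

def census_code(img: np.ndarray, col: int, half: int = 2) -> int:
    """1-D census transform: bit i is 1 if the i-th neighbour is darker than the centre pixel."""
    code = 0
    for off in range(-half, half + 1):
        if off == 0:
            continue
        j = min(max(col + off, 0), len(img) - 1)    # clamp at the image borders
        code = (code << 1) | int(img[j] < img[col])
    return code

def ad_census_cost(img1: np.ndarray, img2: np.ndarray, x1: int, x2: int,
                   lam_ad: float = 10.0, lam_census: float = 30.0) -> float:
    """C(P,d) = rho(C_AD, lam_AD) + rho(C_census, lam_census), with rho(C, lam) = 1 - exp(-C/lam)."""
    c_ad = abs(float(img1[x1]) - float(img2[x2]))                              # gray-value absolute difference
    c_census = bin(census_code(img1, x1) ^ census_code(img2, x2)).count("1")   # Hamming distance of census codes
    rho = lambda c, lam: 1.0 - np.exp(-c / lam)
    return float(rho(c_ad, lam_ad) + rho(c_census, lam_census))

# Hypothetical single-row images from the two line-scan cameras
img1 = np.array([10, 20, 35, 50, 48, 30], dtype=float)
img2 = np.array([12, 22, 34, 52, 47, 31], dtype=float)
cost = ad_census_cost(img1, img2, x1=3, x2=2)
```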
In a feasible implementation of the present application, determining the minimum matching cost among the calculated matching costs described in step S8 specifically includes:
S81. Aggregate the calculated matching costs according to a cross-based aggregation algorithm.
Because images captured by the image collectors are affected by factors such as illumination and noise, the gray values of the same spatial object point can differ considerably between the images captured by the two image collectors of the binocular stereo vision system. Therefore, the embodiments of the present application aggregate, i.e., weight and sum together, the matching costs of the surrounding pixels centered on the point to be matched (i.e., the first image point above), to reduce the error caused by a single image point.
S82. For the aggregated matching costs, determine the minimum matching cost according to the winner-takes-all (WTA) algorithm.
It can be seen that, by aggregating the matching costs, the embodiments of the present application eliminate the influence, on the matching cost, of the gray-value difference of the same object point between the images captured by the two image collectors of the binocular stereo vision system, which improves the accuracy of the matching cost calculation and hence the matching accuracy, and in turn improves the three-dimensional measurement accuracy of the whole system.
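Finally, a sketch combining the candidate traversal of step S7 with the winner-takes-all selection of step S8; it reuses ad_census_cost, img1, and img2 from the sketch above and, for brevity, omits the cross-based cost aggregation, so it illustrates the selection logic rather than the full patented procedure:

```python
def best_second_pixel(img1, img2, x1: int, d_min: int, d_max: int):
    """Winner-takes-all over candidate positions x2 in T = [x1 - d_max, x1 - d_min]."""
    best_x2, best_cost = None, float("inf")
    for x2 in range(x1 - d_max, x1 - d_min + 1):
        if not 0 <= x2 < len(img2):
            continue                                   # skip candidates outside the second image
        c = ad_census_cost(img1, img2, x1, x2)
        if c < best_cost:
            best_x2, best_cost = x2, c
    return best_x2, best_cost

x2_opt, _ = best_second_pixel(img1, img2, x1=3, d_min=0, d_max=2)
d_P = 3 - x2_opt                                       # actual disparity dP = x1 - x2'
```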
Finally, it should be noted that those of ordinary skill in the art will understand that all or part of the flow of the methods in the above embodiments may be implemented by instructing the relevant hardware with a computer program; the program may be stored in a non-transitory computer-readable storage medium and, when executed, may include the flows of the embodiments of the above methods. The non-transitory computer-readable storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Correspondingly, the embodiments of the present application disclose a three-dimensional measurement apparatus, which is connected to the data output interfaces of the structured light vision system and the binocular stereo vision system shown in Fig. 1, respectively, and fuses the measurement data of the two vision systems to obtain three-dimensional measurement results that are more accurate and suitable for high-speed moving objects. Referring to the structural schematic diagram shown in Fig. 5, the three-dimensional measurement apparatus 500 includes:
a calibration information acquiring unit 501, configured to obtain calibration information of the structured light vision system and the binocular stereo vision system, the calibration information including at least: the coordinate transformation relation between the structured light vision system coordinate system Os-XsYsZs and the first coordinate system OL-XLYLZL corresponding to the first image collector of the binocular stereo vision system, the measurement accuracy δ, the baseline b and the focal length f of the binocular stereo vision system, and the principal point position u of the first image collector;
a reference coordinate determination unit 502, configured to determine, using the structured light vision system, the coordinates (xs, zs) of an object point P on the measured object in the coordinate system Os-XsYsZs, and to convert the coordinates (xs, zs) according to the coordinate transformation relation to obtain the reference coordinates (xL, zL) of the object point P in the first coordinate system OL-XLYLZL;
a disparity range determination unit 503, configured to determine, according to the baseline b, the focal length f, the measurement accuracy δ, and the reference coordinate value zL of the object point P, the minimum disparity dmin and the maximum disparity dmax of the object point P between the first image collector and the second image collector of the binocular stereo vision system, wherein dmin = bf/(zL + δ) and dmax = bf/(zL − δ);
a first image point determination unit 504, configured to determine, according to the reference coordinates (xL, zL) and the principal point position u, the first pixel position x1 corresponding to the object point P in the first image captured by the first image collector;
a second image point determination unit 505, configured to determine, according to the minimum disparity dmin, the maximum disparity dmax, and the first pixel position x1, the value interval T = [x1 − dmax, x1 − dmin] of the second pixel position x2 of the object point P in the second image captured by the second image collector;
a matching cost computing unit 506, configured to separately calculate, according to a preset stereo matching algorithm, the matching cost between the first image point corresponding to the first pixel position x1 and the candidate second image point corresponding to each value x2 in the interval T;
a best match determination unit 507, configured to determine the minimum matching cost among the calculated matching costs and to determine the optimal second pixel position x2' corresponding to the minimum matching cost;
an actual disparity computing unit 508, configured to calculate the actual disparity dP of the object point P between the first image collector and the second image collector according to the first pixel position x1 and the optimal second pixel position x2';
an actual coordinate computing unit 509, configured to calculate the actual coordinates of the object point P in the first coordinate system OL-XLYLZL according to the actual disparity dP.
For the specific working principle of each unit in the above three-dimensional measurement apparatus, reference may be made to the method embodiments above.
Optionally, the first image point determination unit 504 may be specifically configured to:
calculate the first pixel position x1 using the formula x1 = f·xL/(k·zL) + u, where k is the conversion coefficient between coordinate values and pixel positions.
Optionally, the matching cost computing unit 506 may be specifically configured to:
obtain each value x2 in the interval T in turn, and take the image point corresponding to it as the candidate second image point;
calculate the first matching cost CAD(P, d) between the first image point and the candidate second image point according to the absolute difference (AD) algorithm, wherein CAD(P, d) = |I1(x1) − I2(x2)|, I1(x1) is the gray value of the first image point, and I2(x2) is the gray value of the candidate second image point;
calculate the second matching cost Ccensus(P, d) between the first image point and the candidate second image point according to the census transform algorithm;
normalize the first matching cost and the second matching cost respectively with the normalization function ρ(C, λ) = 1 − exp(−C/λ), and sum the normalized first matching cost and second matching cost to obtain the matching cost corresponding to the value x2: C(P, d) = ρ(CAD(P, d), λAD) + ρ(Ccensus(P, d), λcensus);
wherein λAD is the preset weight of the first matching cost and λcensus is the preset weight of the second matching cost.
Optionally, the best match determination unit 507 may be specifically configured to:
aggregate the calculated matching costs according to a cross-based aggregation algorithm;
for the aggregated matching costs, determine the minimum matching cost according to the winner-takes-all (WTA) algorithm.
Optionally, the actual coordinate computing unit 509 may be specifically configured to:
calculate the ZL-axis actual coordinate value z of the object point P in the first coordinate system OL-XLYLZL according to the formula z = bf/dP;
calculate the XL-axis actual coordinate value x of the object point P in the first coordinate system OL-XLYLZL according to the formula x = z·xL/zL;
and calculate the YL-axis actual coordinate value y of the object point P in the first coordinate system OL-XLYLZL according to the moving speed and moving time of the object point P.
As can be seen from the above technical solutions, to address the low scanning frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, the embodiments of the present application first perform a preliminary three-dimensional measurement with the structured light vision system; then, drawing on the measurement accuracy of the structured light vision system, the preliminary measurement result is used as a reference value, and a preset stereo matching algorithm determines the first pixel position and the optimal second pixel position corresponding to the minimum matching cost of the object point P in the binocular stereo vision system; the actual disparity of the object point P is then determined from the first pixel position and the optimal second pixel position, and finally the actual three-dimensional coordinates of the object point P are calculated from the actual disparity. The embodiments of the present application thus fuse the structured light vision system and the binocular stereo vision system, overcoming both the low scanning frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, so that accurate three-dimensional coordinates of the measured object can still be measured while the measured object moves at high speed.
An embodiment of the present application further provides a three-dimensional measurement system. As shown in the block diagram of Fig. 6, the system includes: a structured light vision system 100, a binocular stereo vision system 200, and the three-dimensional measuring apparatus 500 described in any one of the embodiments above.
The structured light vision system 100 includes an optical projection device 110 and an image acquisition device 120; the binocular stereo vision system 200 includes a first image acquisition device 210 and a second image acquisition device 220. The hardware configuration may refer to that shown in Fig. 1.
The three-dimensional measuring apparatus 500 is connected to the measurement data output interfaces of the structured light vision system 100 and of the binocular stereo vision system 200 respectively (for example, the output interfaces of the image acquisition device 120, the first image acquisition device 210 and the second image acquisition device 220). By fusing the measurement data of the two vision systems, a three-dimensional measurement result is obtained that is more accurate and suitable for objects moving at high speed.
It can be seen that the three-dimensional measurement system provided by the embodiments of the present application performs three-dimensional measurement with the structured light vision system and the binocular stereo vision system at the same time and fuses the three-dimensional measurement data obtained by the two, thereby overcoming both the low scan frequency and low resolution of the structured light vision system and the low measurement accuracy of the binocular stereo vision system, and ensuring that accurate three-dimensional coordinates of the measured object can be measured even when the measured object moves at high speed.
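To make the data flow of the fused system easier to follow, the Python sketch below strings the steps together for a single object point. The planar transform (R, t), the simple absolute-difference cost used in place of the full AD-census cost, the omission of the Y_L computation and all parameter names are assumptions introduced for illustration, not part of the claimed embodiments; the disparity bounds are derived from the relation z = b·f/d and the accuracy δ.

import numpy as np

def measure_point(xs, zs, R, t, b, f, delta, u, img_l, img_r, y_row):
    # 1. Reference coordinate of P in the first coordinate system via the calibrated transform.
    x_l, z_l = R @ np.array([xs, zs]) + t
    # 2. Disparity search range implied by the reference depth z_L and the accuracy delta.
    d_min, d_max = b * f / (z_l + delta), b * f / (z_l - delta)
    # 3. First pixel position predicted from the reference ray, and the value interval T.
    x1 = int(round(f * x_l / z_l + u))
    candidates = range(int(x1 - d_max), int(x1 - d_min) + 1)
    # 4. Candidate with the minimum matching cost (winner-takes-all over T);
    #    a plain absolute gray difference stands in for the AD-census cost of the earlier sketch.
    x2 = min(candidates, key=lambda c: abs(float(img_l[y_row, x1]) - float(img_r[y_row, c])))
    # 5. Actual parallax and actual coordinates in O_L-X_L Y_L Z_L.
    d_p = x1 - x2
    z = b * f / d_p
    return z * x_l / z_l, z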
The embodiments in this specification are described in a progressive manner; for identical or similar parts between the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the system embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the corresponding parts of the method embodiments.
It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (11)

1. A three-dimensional measurement method, characterized by comprising:
obtaining calibration information of a structured light vision system and a binocular stereo vision system, the calibration information at least including: a coordinate transformation relation between a structured light vision system coordinate system O_s-X_sY_sZ_s and a first coordinate system O_L-X_LY_LZ_L corresponding to a first image acquisition device of the binocular stereo vision system, a measurement accuracy δ, a baseline distance b and a focal length f of the binocular stereo vision system, and a principal point position u of the first image acquisition device;
determining, by using the structured light vision system, a coordinate (x_s, z_s) of an object point P on a measured object in the coordinate system O_s-X_sY_sZ_s;
performing coordinate conversion on the coordinate (x_s, z_s) according to the coordinate transformation relation to obtain a reference coordinate (x_L, z_L) of the object point P in the first coordinate system O_L-X_LY_LZ_L;
determining, according to the baseline distance b, the focal length f, the measurement accuracy δ and the reference coordinate value z_L of the object point P, a parallax minimum value d_min and a parallax maximum value d_max of the object point P between the first image acquisition device and a second image acquisition device of the binocular stereo vision system, wherein d_min = b·f/(z_L + δ) and d_max = b·f/(z_L - δ);
determining, according to the reference coordinate (x_L, z_L) and the principal point position u, a first pixel position x_1 corresponding to the object point P in a first image captured by the first image acquisition device;
determining, according to the parallax minimum value d_min, the parallax maximum value d_max and the first pixel position x_1, a value interval T = [x_1 - d_max, x_1 - d_min] of a second pixel position x_2 of the object point P in a second image captured by the second image acquisition device;
calculating, according to a preset stereo matching algorithm, a matching cost between a first image point corresponding to the first pixel position x_1 and a candidate second image point corresponding to each value x_2 in the value interval T;
determining a matching cost minimum value from the calculated matching costs, and determining an optimal second pixel position x'_2 corresponding to the matching cost minimum value;
calculating, according to the first pixel position x_1 and the optimal second pixel position x'_2, an actual parallax d_P of the object point P between the first image acquisition device and the second image acquisition device;
calculating, according to the actual parallax d_P, an actual coordinate of the object point P in the first coordinate system O_L-X_LY_LZ_L.
2. The method according to claim 1, characterized in that determining, according to the reference coordinate (x_L, z_L) and the principal point position u, the first pixel position x_1 of the object point P in the first image captured by the first image acquisition device comprises:
calculating the first pixel position x_1 by using the formula x_1 = f·x_L/z_L + u.
3. The method according to claim 1, characterized in that calculating, according to the preset stereo matching algorithm, the matching cost between the first image point corresponding to the first pixel position x_1 and the candidate second image point corresponding to each value x_2 in the value interval T comprises:
obtaining each value x_2 in the value interval T in turn, and taking the image point corresponding to that value as a candidate second image point;
calculating, according to the absolute difference (AD) algorithm, the first matching cost C_AD(P, d) between the first image point and the candidate second image point as the absolute difference between the gray value of the first image point and the gray value of the candidate second image point;
calculating, according to the census transform algorithm, the second matching cost C_census(P, d) between the first image point and the candidate second image point;
normalizing the first matching cost and the second matching cost respectively with the normalization function ρ(C, λ) = 1 - exp(-C/λ), and summing the normalized first matching cost and second matching cost to obtain the matching cost corresponding to the value x_2: C(P, d) = ρ(C_AD(P, d), λ_AD) + ρ(C_census(P, d), λ_census);
where λ_AD is the preset weight of the first matching cost and λ_census is the preset weight of the second matching cost.
4. The method according to claim 1, characterized in that determining the matching cost minimum value from the calculated matching costs comprises:
aggregating each calculated matching cost according to the cross-based aggregation algorithm;
determining, among the aggregated matching costs, the matching cost minimum value according to the winner-takes-all (WTA) algorithm.
5. The method according to claim 1, characterized in that calculating, according to the actual parallax d_P, the actual coordinate of the object point P in the first coordinate system O_L-X_LY_LZ_L comprises:
calculating, according to the formula z = b·f/d_P, the Z_L-axis actual coordinate value z of the object point P in the first coordinate system O_L-X_LY_LZ_L;
calculating, according to the formula x = z·x_L/z_L, the X_L-axis actual coordinate value x of the object point P in the first coordinate system O_L-X_LY_LZ_L.
6. A three-dimensional measuring apparatus, characterized by comprising:
a calibration information acquiring unit, configured to obtain calibration information of a structured light vision system and a binocular stereo vision system, the calibration information at least including: a coordinate transformation relation between a structured light vision system coordinate system O_s-X_sY_sZ_s and a first coordinate system O_L-X_LY_LZ_L corresponding to a first image acquisition device of the binocular stereo vision system, a measurement accuracy δ, a baseline distance b and a focal length f of the binocular stereo vision system, and a principal point position u of the first image acquisition device;
a reference coordinate determination unit, configured to determine, by using the structured light vision system, a coordinate (x_s, z_s) of an object point P on a measured object in the coordinate system O_s-X_sY_sZ_s, and to perform coordinate conversion on the coordinate (x_s, z_s) according to the coordinate transformation relation to obtain a reference coordinate (x_L, z_L) of the object point P in the first coordinate system O_L-X_LY_LZ_L;
a disparity range determination unit, configured to determine, according to the baseline distance b, the focal length f, the measurement accuracy δ and the reference coordinate value z_L of the object point P, a parallax minimum value d_min and a parallax maximum value d_max of the object point P between the first image acquisition device and a second image acquisition device of the binocular stereo vision system, wherein d_min = b·f/(z_L + δ) and d_max = b·f/(z_L - δ);
a first image point determination unit, configured to determine, according to the reference coordinate (x_L, z_L) and the principal point position u, a first pixel position x_1 corresponding to the object point P in a first image captured by the first image acquisition device;
a second image point determination unit, configured to determine, according to the parallax minimum value d_min, the parallax maximum value d_max and the first pixel position x_1, a value interval T = [x_1 - d_max, x_1 - d_min] of a second pixel position x_2 of the object point P in a second image captured by the second image acquisition device;
a matching cost computing unit, configured to calculate, according to a preset stereo matching algorithm, a matching cost between a first image point corresponding to the first pixel position x_1 and a candidate second image point corresponding to each value x_2 in the value interval T;
a best match determination unit, configured to determine a matching cost minimum value from the calculated matching costs, and to determine an optimal second pixel position x'_2 corresponding to the matching cost minimum value;
an actual disparity computation unit, configured to calculate, according to the formula d_P = x_1 - x'_2, an actual parallax d_P of the object point P between the first image acquisition device and the second image acquisition device;
an actual coordinate computing unit, configured to calculate, according to the actual parallax d_P, an actual coordinate of the object point P in the first coordinate system O_L-X_LY_LZ_L.
7. The apparatus according to claim 6, characterized in that the first image point determination unit is specifically configured to: calculate the first pixel position x_1 by using the formula x_1 = f·x_L/z_L + u.
8. The apparatus according to claim 6, characterized in that the matching cost computing unit is specifically configured to:
obtain each value x_2 in the value interval T in turn, and take the image point corresponding to that value as a candidate second image point;
calculate, according to the absolute difference (AD) algorithm, the first matching cost C_AD(P, d) between the first image point and the candidate second image point as the absolute difference between the gray value of the first image point and the gray value of the candidate second image point;
calculate, according to the census transform algorithm, the second matching cost C_census(P, d) between the first image point and the candidate second image point;
normalize the first matching cost and the second matching cost respectively with the normalization function ρ(C, λ) = 1 - exp(-C/λ), and sum the normalized first matching cost and second matching cost to obtain the matching cost corresponding to the value x_2: C(P, d) = ρ(C_AD(P, d), λ_AD) + ρ(C_census(P, d), λ_census);
where λ_AD is the preset weight of the first matching cost and λ_census is the preset weight of the second matching cost.
9. The apparatus according to claim 6, characterized in that the best match determination unit is specifically configured to:
aggregate each calculated matching cost according to the cross-based aggregation algorithm;
determine, among the aggregated matching costs, the matching cost minimum value according to the winner-takes-all (WTA) algorithm.
10. The apparatus according to claim 6, characterized in that the actual coordinate computing unit is specifically configured to:
calculate, according to the formula z = b·f/d_P, the Z_L-axis actual coordinate value z of the object point P in the first coordinate system O_L-X_LY_LZ_L;
calculate, according to the formula x = z·x_L/z_L, the X_L-axis actual coordinate value x of the object point P in the first coordinate system O_L-X_LY_LZ_L.
11. A three-dimensional measurement system, characterized by comprising: a structured light vision system, a binocular stereo vision system, and the three-dimensional measuring apparatus according to any one of claims 6 to 10.
CN201610804538.7A 2016-09-05 2016-09-05 Method for three-dimensional measurement, apparatus and system Active CN106225676B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610804538.7A CN106225676B (en) 2016-09-05 2016-09-05 Method for three-dimensional measurement, apparatus and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610804538.7A CN106225676B (en) 2016-09-05 2016-09-05 Method for three-dimensional measurement, apparatus and system

Publications (2)

Publication Number Publication Date
CN106225676A CN106225676A (en) 2016-12-14
CN106225676B true CN106225676B (en) 2018-10-23

Family

ID=58074425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610804538.7A Active CN106225676B (en) 2016-09-05 2016-09-05 Method for three-dimensional measurement, apparatus and system

Country Status (1)

Country Link
CN (1) CN106225676B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110855961A (en) * 2018-08-20 2020-02-28 奇景光电股份有限公司 Depth sensing device and operation method thereof
CN109373904A (en) * 2018-12-17 2019-02-22 石家庄爱赛科技有限公司 3D vision detection device and 3D vision detection method
CN109520480B (en) * 2019-01-22 2021-04-30 合刃科技(深圳)有限公司 Distance measurement method and distance measurement system based on binocular stereo vision
CN109724537B (en) * 2019-02-11 2020-05-12 吉林大学 Binocular three-dimensional imaging method and system
CN111829472A (en) * 2019-04-17 2020-10-27 初速度(苏州)科技有限公司 Method and device for determining relative position between sensors by using total station
CN110926371A (en) * 2019-11-19 2020-03-27 宁波舜宇仪器有限公司 Three-dimensional surface detection method and device
CN113252309A (en) * 2021-04-19 2021-08-13 苏州市计量测试院 Testing method and testing device for near-to-eye display equipment and storage medium

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008082870A (en) * 2006-09-27 2008-04-10 Setsunan Univ Image processing program, and road surface state measuring system using this
CN101387501A (en) * 2008-10-06 2009-03-18 天津大学 Ultra-large workpiece circular section shape and azimuthal measurement apparatus and method
CN102062588A (en) * 2009-11-11 2011-05-18 中国科学院沈阳自动化研究所 Computer binocular vision denture scanning device and three-dimensional reconstruction method thereof
CN102012217A (en) * 2010-10-19 2011-04-13 南京大学 Method for measuring three-dimensional geometrical outline of large-size appearance object based on binocular vision
CN102867304A (en) * 2012-09-04 2013-01-09 南京航空航天大学 Method for establishing relation between scene stereoscopic depth and vision difference in binocular stereoscopic vision system
CN102878925A (en) * 2012-09-18 2013-01-16 天津工业大学 Synchronous calibration method for binocular video cameras and single projection light source
CN102937811A (en) * 2012-10-22 2013-02-20 西北工业大学 Monocular vision and binocular vision switching device for small robot
CN103903277A (en) * 2012-12-28 2014-07-02 重庆凯泽科技有限公司 Multi-ocular vision based data fusion algorithm
CN103278139A (en) * 2013-05-06 2013-09-04 北京航空航天大学 Variable-focus monocular and binocular vision sensing device
CN104183010A (en) * 2013-05-22 2014-12-03 上海迪谱工业检测技术有限公司 Multi-view three-dimensional online reconstruction method
US9333208B2 (en) * 2013-07-16 2016-05-10 Movses H. Karakossian HCN inhibitors affecting ganglion cell function and visual function
CN103438834A (en) * 2013-09-17 2013-12-11 清华大学深圳研究生院 Hierarchy-type rapid three-dimensional measuring device and method based on structured light projection
CN104217439A (en) * 2014-09-26 2014-12-17 南京工程学院 Indoor visual positioning system and method
CN204818380U (en) * 2015-07-15 2015-12-02 广东工业大学 Near -infrared and structured light dual wavelength binocular vision soldering joint tracking system

Also Published As

Publication number Publication date
CN106225676A (en) 2016-12-14

Similar Documents

Publication Publication Date Title
CN106225676B (en) Method for three-dimensional measurement, apparatus and system
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
WO2019100933A1 (en) Method, device and system for three-dimensional measurement
CN110230998B (en) Rapid and precise three-dimensional measurement method and device based on line laser and binocular camera
US8718326B2 (en) System and method for extracting three-dimensional coordinates
US10582188B2 (en) System and method for adjusting a baseline of an imaging system with microlens array
EP2568253B1 (en) Structured-light measuring method and system
RU2668404C2 (en) Device for recording images in three-dimensional scale, method for formation of 3d-image and method for producing device for recording images in three dimensional scale
CN103075960B (en) Multi-visual-angle great-depth micro stereo visual-features fusion-measuring method
US20120176380A1 (en) Forming 3d models using periodic illumination patterns
US10271039B2 (en) Structured light matching of a set of curves from two cameras
US10643343B2 (en) Structured light matching of a set of curves from three cameras
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
CN102997891B (en) Device and method for measuring scene depth
WO2012096747A1 (en) Forming range maps using periodic illumination patterns
CN103299343A (en) Range image pixel matching method
CN102831601A (en) Three-dimensional matching method based on union similarity measure and self-adaptive support weighting
Martel et al. An active approach to solving the stereo matching problem using event-based sensors
CN113643436B (en) Depth data splicing and fusion method and device
Patel et al. Distance measurement system using binocular stereo vision approach
JP7184203B2 (en) Image processing device, three-dimensional measurement system, image processing method
CN113160416B (en) Speckle imaging device and method for coal flow detection
Harvent et al. Multi-view dense 3D modelling of untextured objects from a moving projector-cameras system
JP2008275366A (en) Stereoscopic 3-d measurement system
US20170150123A1 (en) High-Speed Depth Sensing With A Hybrid Camera Setup

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100094 701, 7 floor, 7 building, 13 Cui Hunan Ring Road, Haidian District, Beijing.

Patentee after: Lingyunguang Technology Co.,Ltd.

Address before: 100094 701, 7 floor, 7 building, 13 Cui Hunan Ring Road, Haidian District, Beijing.

Patentee before: Beijing lingyunguang Technology Group Co.,Ltd.

Address after: 100094 701, 7 floor, 7 building, 13 Cui Hunan Ring Road, Haidian District, Beijing.

Patentee after: Beijing lingyunguang Technology Group Co.,Ltd.

Address before: 100094 701, 7 floor, 7 building, 13 Cui Hunan Ring Road, Haidian District, Beijing.

Patentee before: LUSTER LIGHTTECH GROUP Co.,Ltd.

CP01 Change in the name or title of a patent holder