Invention content
The embodiment of the present application provides a kind of method for three-dimensional measurement, apparatus and system, simultaneous to solve to be difficult in the related technology
The problem of caring for measurement accuracy and sweep speed.
In order to solve the above-mentioned technical problem, the embodiment of the invention discloses following technical solutions:
According to the embodiment of the present application in a first aspect, provide a kind of method for three-dimensional measurement, including:
Obtain the calibration information of Constructed Lighting Vision System and Binocular Stereo Vision System;The calibration information includes at least,
Constructed Lighting Vision System coordinate system Os-XsYsZsThe first coordinate corresponding with the first image acquisition device of Binocular Stereo Vision System
It is OL-XLYLZLBetween coordinate transformation relation, measurement accuracy δ, the stand-off b of the Binocular Stereo Vision System, focal length f, with
And the principle point location u of described first image collector;
Using Constructed Lighting Vision System, determine that an object point P is in coordinate system O on measured objects-XsYsZsUnder coordinate (xs,zs),
And according to the coordinate transformation relation to the coordinate (xs,zs) coordinate conversion is carried out, object point P is obtained in the first coordinate system OL-
XLYLZLUnder reference coordinate (xL,zL);
According to the reference coordinate value z of the stand-off b, focal length f, measurement accuracy δ and the object point PL, determine that object point P exists
Parallax minimum value d between the first image acquisition device and the second image acquisition device of the Binocular Stereo Vision SystemminAnd parallax
Maximum value dmax;Wherein,
According to the reference coordinate (xL,zL) and principle point location u, determine of object point P captured by the first image acquisition device
Corresponding first location of pixels x in one image1;
According to the parallax minimum value dmin, parallax maximum value dmaxWith the first image point position x1, determine object point P in the second figure
The second location of pixels x in the second image as captured by collector2Interval T=[x1-dmax,x1-dmin];
According to default Stereo Matching Algorithm, the first location of pixels x is calculated separately1Corresponding first picture point takes with described
It is worth each value x in the T of section2Matching power flow between corresponding alternative second picture point;
Matching power flow minimum value is determined from each Matching power flow being calculated, and determines the Matching power flow minimum value
Corresponding optimal second location of pixels x'2;
According to the first location of pixels x1With optimal second location of pixels x2', object point P is calculated in the first image acquisition device
With the practical parallax d in the second image acquisition deviceP;
According to the practical parallax dPObject point P is calculated in the first coordinate system OL-XLYLZLUnder actual coordinate.
Optionally, according to the reference coordinate (xL,zL) and principle point location u, determine that object point P is acquired in described first image
The first location of pixels x in the first image captured by device1, including:
Utilize formulaThe first location of pixels x is calculated1。
Optionally, according to default Stereo Matching Algorithm, the first location of pixels x is calculated separately1Corresponding first picture point
With each value x in the interval T2Matching power flow between corresponding second picture point, including:
Each value x in the interval T is obtained successively2, by its corresponding picture point alternately the second picture point;
The first Matching power flow C between first picture point and alternative second picture point is calculated according to absolute difference AD algorithmsAD(P,
d);Wherein, For the gray value of first picture point,It is described alternative
The gray value of second picture point;
Become scaling method according to census and calculates the second Matching power flow between first picture point and alternative second picture point
Ccensus(P,d);
According to normalized function ρ (C, λ)=1-exp (C/ λ) respectively to first Matching power flow and the second Matching power flow
Be normalized, and to after normalized the first Matching power flow and the second Matching power flow sum, obtain the value
x2Corresponding Matching power flow C (P, d)=ρ (CAD(P,d),λAD)+ρ(Ccensus(P,d),λcensus);
Wherein, λADFor the default weight of first Matching power flow, λcensusFor the default power of second Matching power flow
Weight.
Optionally, Matching power flow minimum value is determined from each Matching power flow being calculated, including:
The each Matching power flow being calculated is polymerize according to right-angled intersection algorithm;
For each Matching power flow by polymerization, according to the victor is a king, WTA algorithms determine that Matching power flow therein is minimum
Value.
Optionally, according to the practical parallax dPObject point P is calculated in the first coordinate system OL-XLYLZLUnder practical seat
Mark, including:
According to formula z=bf/dpObject point P is calculated in the first coordinate system OL-XLYLZLUnder ZLAxis actual coordinate value
z;
According to formula x=zxL/zLObject point P is calculated in the first coordinate system OL-XLYLZLUnder XLAxis actual coordinate value
x。
According to the second aspect of the embodiment of the present application, a kind of three-dimensional measuring apparatus is provided, including:
Calibration information acquiring unit, the calibration information for obtaining Constructed Lighting Vision System and Binocular Stereo Vision System;
The calibration information includes at least, Constructed Lighting Vision System coordinate system Os-XsYsZsWith the first image of Binocular Stereo Vision System
The corresponding first coordinate system O of collectorL-XLYLZLBetween coordinate transformation relation, measurement accuracy δ, the binocular stereo vision system
Stand-off b, the focal length f of system and the principle point location u of described first image collector;
Reference coordinate determination unit determines that an object point P is in coordinate system on measured object for utilizing Constructed Lighting Vision System
Os-XsYsZsUnder coordinate (xs,zs), and according to the coordinate transformation relation to the coordinate (xs,zs) coordinate conversion is carried out, it obtains
To object point P in the first coordinate system OL-XLYLZLUnder reference coordinate (xL,zL);
Disparity range determination unit, for the reference according to the stand-off b, focal length f, measurement accuracy δ and the object point P
Coordinate value zL, determine object point P between the first image acquisition device and the second image acquisition device of the Binocular Stereo Vision System
Parallax minimum value dminWith parallax maximum value dmax;Wherein,
First picture point determination unit, for according to the reference coordinate (xL,zL) and principle point location u, determine object point P
Corresponding first location of pixels x in the first image captured by one image acquisition device1;
Second picture point determination unit, for according to the parallax minimum value dmin, parallax maximum value dmaxWith the first picture point position
Set x1, determine the second location of pixels x in the second images of the object point P captured by the second image acquisition device2Interval T=
[x1-dmax,x1-dmin];
Matching power flow computing unit, for according to Stereo Matching Algorithm is preset, calculating separately the first location of pixels x1
Corresponding first picture point and each value x in the interval T2Matching power flow between corresponding alternative second picture point;
Best match determination unit, for determining Matching power flow minimum value from each Matching power flow being calculated, and
Determine the corresponding optimal second location of pixels x of the Matching power flow minimum value2';
Practical disparity computation unit, for according to the first location of pixels x1With optimal second location of pixels x2', calculate
Practical parallax ds of the object point P in the first image acquisition device and the second image acquisition deviceP;
Actual coordinate computing unit, for according to the practical parallax dPObject point P is calculated in the first coordinate system OL-
XLYLZLUnder actual coordinate.
Optionally, the first picture point determination unit is specifically configured to:
Utilize formulaThe first location of pixels x is calculated1。
Optionally, the Matching power flow computing unit is specifically configured to:
Each value x in the interval T is obtained successively2, by its corresponding picture point alternately the second picture point;
The first Matching power flow C between first picture point and alternative second picture point is calculated according to absolute difference AD algorithmsAD(P,
d);Wherein, For the gray value of first picture point,It is described alternative
The gray value of second picture point;
Become scaling method according to census and calculates the second Matching power flow between first picture point and alternative second picture point
Ccensus(P,d);
According to normalized function ρ (C, λ)=1-exp (C/ λ) respectively to first Matching power flow and the second Matching power flow
Be normalized, and to after normalized the first Matching power flow and the second Matching power flow sum, obtain the value
x2Corresponding Matching power flow C (P, d)=ρ (CAD(P,d),λAD)+ρ(Ccensus(P,d),λcensus);
Wherein, λADFor the default weight of first Matching power flow, λcensusFor the default power of second Matching power flow
Weight.
Optionally, the best match determination unit is specifically configured to:
The each Matching power flow being calculated is polymerize according to right-angled intersection algorithm;
For each Matching power flow by polymerization, according to the victor is a king, WTA algorithms determine that Matching power flow therein is minimum
Value.
Optionally, the actual coordinate computing unit is specifically configured to:
According to formula z=bf/dpObject point P is calculated in the first coordinate system OL-XLYLZLUnder ZLAxis actual coordinate value
z;
According to formula x=zxL/zLObject point P is calculated in the first coordinate system OL-XLYLZLUnder XLAxis actual coordinate value
x。
According to the third aspect of the embodiment of the present application, a kind of three-dimension measuring system is provided, including:Constructed Lighting Vision System,
Binocular Stereo Vision System and three-dimensional measuring apparatus described in any one of the above embodiments.
By above technical scheme it is found that for Constructed Lighting Vision System scan frequency is low, resolution ratio is low and binocular solid
The low disadvantage of vision system measurement accuracy, the embodiment of the present application carry out preliminary three-dimensional measurement by Constructed Lighting Vision System first,
And then the measurement accuracy of integrated structure light vision system, using preliminary three-dimensional measuring result as reference value, by presetting Stereo matching
Algorithm determines corresponding first location of pixels of smallest match costs of the object point P in Binocular Stereo Vision System and optimal second picture
Plain position determines the corresponding practical parallaxes of object point P, last root further according to first location of pixels and optimal second location of pixels
The practical three-dimensional coordinate of object point P can be calculated according to the practical parallax.As it can be seen that the embodiment of the present application realizes structure light vision
The fusion of system and Binocular Stereo Vision System can both overcome Constructed Lighting Vision System scan frequency is low, resolution ratio is low to ask
Topic, and can overcome the problems, such as that Binocular Stereo Vision System measurement accuracy is low, to ensure the scene in measured object high-speed motion
Under can also measure the three-dimensional coordinate exact value of measured object.
It should be understood that above general description and following detailed description is only exemplary and explanatory, not
The disclosure can be limited.
Specific implementation mode
In order to make those skilled in the art more fully understand the technical solution in the embodiment of the present invention, and make of the invention real
The above objects, features, and advantages for applying example can be more obvious and easy to understand, below in conjunction with the accompanying drawings to technical side in the embodiment of the present invention
Case is described in further detail.
The embodiment of the present application provides a kind of method for three-dimensional measurement, apparatus and system, simultaneous to solve to be difficult in the related technology
The problem of caring for measurement accuracy and sweep speed.
Inventor has found that Constructed Lighting Vision System and Binocular Stereo Vision System are carrying out in the research process of the application
When three-dimensional measurement, advantage and disadvantage are individually present, therefore the embodiment of the present application carries out same measured object using above two vision system
Three-dimensional measurement, the 3 d measurement data obtained to two kinds of vision systems merge, and make to maximize favourable factors and minimize unfavourable ones between two kinds of vision systems,
Obtain accuracy height and the method for three-dimensional measurement suitable for high-speed moving object.With reference to the embodiment of the present application shown in FIG. 1
The embodiment of the present application is described in detail in the structure chart of the three-dimension measuring system of offer.
Referring to Fig.1, three-dimension measuring system provided by the embodiments of the present application includes at least an optical projection device and three figures
As collector;Wherein, optical projection device 110 and image acquisition device 120 constitute Constructed Lighting Vision System, and optical projection device 110 has
Body can be laser, and projection forms structure optical plane 130;220 structure of first image acquisition device 210 and the second image acquisition device
At Binocular Stereo Vision System.
Optionally, area array cameras, the first image acquisition device 210 and second specifically may be used in above-mentioned image acquisition device 120
Image acquisition device 220 is all made of line-scan digital camera, to obtain image more higher than area array cameras scan frequency higher, resolution ratio.
The fusion of measurement data between two kinds of vision systems of realization, before measuring, first to two kinds of visions in Fig. 1
The relative position relation of system is demarcated, specific to demarcate content mainly including following three aspects:
1) calibration of Constructed Lighting Vision System
I.e. to the calibration of optical projection device 110 and image acquisition device 120 in Fig. 1, any existing calibration specifically may be used
Method, the embodiment of the present application are defined not to this.Wherein, one directly related with the data fusion of two kinds of vision systems mark
Determining parameter is, the measurement accuracy of entire three-dimension measuring system shown in the measurement accuracy δ namely Fig. 1 of Constructed Lighting Vision System.Example
Such as, it as δ=1cm, indicates that the object point coordinate that the three-dimension measuring system finally measures is accurate to 1cm, as δ=1mm, indicates
The object point coordinate that the three-dimension measuring system finally measures is accurate to 1mm.
2) calibration of Binocular Stereo Vision System
Include mainly in the two i.e. to the calibration of the first image acquisition device 210 and the second image acquisition device 220 in Fig. 1
The calibration of portion's parameter, as focal length f (the first image acquisition device 210 is identical with the focal length of the second image acquisition device 220, be f), base
Principle point location u (the i.e. center of first image acquisition device 210 imaging plane of the standard away from b, distortion factor, the first image acquisition device 210
The intersection position of the optical axis and its imaging plane of point position namely the first image acquisition device 210) etc..Wherein, the first Image Acquisition
The visual field plane of device 210 and the visual field plane of the second image acquisition device 220 overlap, and stand-off b indicates the first image acquisition device 210
Optical center and the second image acquisition device 220 the distance between optical center.
3) calibration of two kinds of vision system relative position relations
As shown in Figure 1, the embodiment of the present application utilizes sawtooth target to Constructed Lighting Vision System and Binocular Stereo Vision System
Relative position relation demarcated;That is, using image acquisition device 120 and the first image acquisition device 210 simultaneously to sawtooth target
It is shot, (optical axis of each image acquisition device is such as adjusted system light path to be adjusted by analyzing the image taken
Direction, the light regulating projector light projecting direction) so that the visual field plane of the first image acquisition device 210 (namely the second figure
As the visual field plane of collector 220) it is overlapped with structure optical plane 130, coordinate system O where analysis obtains Constructed Lighting Vision Systems-
XsYsZsWith the first coordinate system O where the first image acquisition device 210L-XLYLZLBetween coordinate transformation relation M.Certainly, practical
In, coordinate system O where obtaining Constructed Lighting Vision System can also be demarcated based on above-mentioned principles-XsYsZsIt is adopted with the second image
The second coordinate system O where storage 220R-XRYRZRBetween coordinate transformation relation M ';Examples below, which is only converted with coordinate, to close
It is to illustrate that specific method for three-dimensional measurement, those skilled in the art can be accordingly derived by based on coordinate transformation relation for M
The method for three-dimensional measurement of M '.
Specifically, three coordinate systems mentioned above can define as follows:
First, as shown in Figure 1, for the first coordinate system O where the first image acquisition device 210L-XLYLZLIt is defined as follows:
Origin OLIt can be set as the optical center of the first image acquisition device 210, ZLAxis is parallel to and (coincides with) the first image acquisition device 210
Optical axis, XLAxis is perpendicular to ZLAxis, and XLAxis is parallel with the visual field plane of the first image acquisition device 210, YLAxis is respectively perpendicular to described
ZLAxis and XLAxis.In practical application, to reduce data processing difficulty, it is Y that can take the moving direction of measured objectLAxis direction, and phase
The light path that the first image acquisition device 210 should be adjusted, the first coordinate system where making it meet above-mentioned definition.
Optionally, the origin O in the application other embodimentLIt can also be set as the principal point of the first image acquisition device 210,
That is the intersection point of the optical axis and imaging plane of the first image acquisition device 210.
Secondly, the second coordinate system O where the second image acquisition device 220R-XRYRZRIt is referred to above-mentioned first coordinate system OL-
XLYLZLIt establishes, and meets XLAxis and XRAxis is parallel, YLAxis and YRAxis is parallel, ZLAxis and ZRAxis is parallel (to avoid lines or label weight
It closes, the second coordinate system O is illustrated only in Fig. 1R-XRYRZRZRAxis).
Again, for coordinate system O where above structure light vision systems-XsYsZsIn, origin OsImage is can be set as to adopt
The optical center of storage 120, ZsAxis is parallel to the optical axis of image acquisition device 120, XsAxis is perpendicular to ZsAxis, and XsAxis and structure optical plane
130 is parallel (it is possible thereby to deduce, XsAxis is also parallel with XLAxis), YsAxis respectively with the ZsAxis and XsAxis is vertical.
After the completion of the definition of above three coordinate system, you can according to synchronization image acquisition device 120 and the first Image Acquisition
The image analysis that device 210 takes obtains above-mentioned coordinate transformation relation M.
Below based on the above calibration result, start to introduce method for three-dimensional measurement provided by the embodiments of the present application.
With reference to Fig. 2, method for three-dimensional measurement provided by the embodiments of the present application includes the following steps:
S1, the calibration information for obtaining Constructed Lighting Vision System and Binocular Stereo Vision System.
Wherein, the calibration information includes at least the stand-off of measurement accuracy δ described above, Binocular Stereo Vision System
B, focal length f, the principle point location u of the first image acquisition device 210 and Constructed Lighting Vision System coordinate system Os-XsYsZsIt is sat with first
Mark system OL-XLYLZLBetween coordinate transformation relation M.
S2, using Constructed Lighting Vision System, determine that an object point P is in coordinate system O on measured objects-XsYsZsUnder coordinate (xs,
zs)。
It optionally, can be optional one in the two dimensional image captured by the image acquisition device 120 of Constructed Lighting Vision System
Pixel, and then utilize coordinate system Os-XsYsZsCoordinate (the x of the object point P on the corresponding measured object of the picture point is calculateds,zs)。
S3, according to the coordinate transformation relation M to the coordinate (xs,zs) coordinate conversion is carried out, object point P is obtained first
Coordinate system OL-XLYLZLUnder reference coordinate (xL,zL)。
Since relative to the first image acquisition device 210, the scan frequency and resolution ratio of image acquisition device 120 are all relatively low, therefore
The image that first image acquisition device 210 takes includes more pixels, only can not obtain the first figure by coordinate conversion
The coordinate of each pixel, can not obtain the coordinate of each object point in the image taken as collector 210;Moreover, above-mentioned
The object point P being converted to by coordinate is in the first coordinate system OL-XLYLZLUnder coordinate there are large errors, cannot function as reality
Coordinate.In view of this, the embodiment of the present application by the object point P being converted to by coordinate in the first coordinate system OL-XLYLZLUnder coordinate
As the reference coordinate of object point P, the reality of the higher object point P of accuracy is further determined that in conjunction with the measuring principle of binocular vision system
Border coordinate.
S4, according to the reference coordinate value z of the stand-off b, focal length f, measurement accuracy δ and the object point PL, determine object point P
Parallax minimum value d between the first image acquisition device and the second image acquisition device of the Binocular Stereo Vision SystemminWith regarding
Poor maximum value dmax。
Wherein, the calculation formula of parallax minimum value isThe calculation formula of parallax maximum value is
In the embodiment of the present application, above-mentioned parallax minimum value dminWith parallax maximum value dmaxCalculation formula according to binocular solid
The measuring principle of vision system obtains, and is introduced with reference to schematic diagram shown in Fig. 3.
Plane shown in Fig. 3 is the first coordinate system OL-XLYLZLZLAxis and XLPlane that axis is constituted (namely the second coordinate system
ZRAxis and XRThe visual field plane of plane namely the first image acquisition device 210 and the second image acquisition device 220 that axis is constituted),
And meet coordinate system definition described previously;Wherein, S1 and S2 is respectively the first image acquisition device 210 and the second image acquisition device
220 imaging plane, OLAnd ORRespectively coordinate origin (and the first image acquisition device 210 and the second image acquisition device 220
Optical center);Object point P and optical center OLBetween line and imaging plane S1 intersection point P1, the first picture point of corresponding object point P;Object point P and
Optical center ORBetween line and imaging plane S2 intersection point P2, the second picture point of corresponding object point P.
Assuming that P is in the first coordinate system OL-XLYLZLUnder coordinate be (xL,yL,zL), then two coordinate systems shown in Fig. 3
OL-XLZLUnder, the coordinate of P is (xL,zL), intersection point P1The coordinate at place is (xP1,zP1), the second coordinate system O shown in Fig. 3R-XRZR
Under, intersection point P2The coordinate at place is (xP2,zP2), and have:
Parallaxes of the object point P in Binocular Stereo Vision System can be expressed as d=xP1-xP2;
zP1=zP2=f, zL=Z (wherein, Z is known as the depth value of object point P).
By triangle POLORWith triangle PP1P2Known to this similar geometrical relationship:
Deformation obtains:That is,
Again since the measurement accuracy of Constructed Lighting Vision System is δ, therefore object point P is in the first coordinate system OL-XLYLZLZLAxis is real
Border coordinate value value range is [zL-δ,zL+δ], the minimum value so as to obtain parallax d isMaximum value is
S5, according to the reference coordinate (xL,zL) and principle point location u, determine object point P captured by the first image acquisition device
The first image in corresponding first location of pixels x1。
First location of pixels namely intersection point P shown in Fig. 31Position in the first image.Due to first figure
The second image captured by picture and the second imaging sensor is really made of the pel array of N row M rows, therefore arbitrary picture point corresponds to
Location of pixels, (n, m) including principle point location can indicate, i.e., with the corresponding row number of respective pixel and line number;Wherein, 1
≤ n≤N and 1≤m≤M, alternatively, 0≤n≤N-1 and 0≤m≤M-1.
In one feasible embodiment of the embodiment of the present application, for improve Binocular Stereo Vision System resolution ratio, first
Image acquisition device 210 and the second image acquisition device 220 are using the line-scan digital camera for containing only a sensitivity speck in short transverse, i.e., described
The total line number M=1 of pixel in first image and the second image, principle point location and the corresponding location of pixels of arbitrary picture point only need to consider
Respective pixel row number.For example, the corresponding row number of principal point is n0, then principle point location u be denoted as u=n0。
It is defined by coordinate system it is found that principal point is in XLCoordinate value on axis is 0, then for xP1、x1And u, there are following relationships:
xP1- 0=k (x1- u), i.e.,
Wherein, conversion coefficients of the k between coordinate value and location of pixels indicates XLAxis unit distance and single pixel width
Ratio.For example, each pixel wide is 0.5mm, XLAxis unit distance is 1mm, then k=1/0.5=2;Conversely, each pixel
Width is 1mm, XLAxis unit distance is 0.5mm, then k=0.5.
Also, the schematic diagram with reference to shown in Fig. 3, by zP1=f and geometrical relationshipIt can obtain:
By above-mentioned two formulaWithIt can obtain:
Particularly, as k=1, the calculation formula of the first location of pixels is:
S6, according to the parallax minimum value dmin, parallax maximum value dmaxWith the first image point position x1, determine object point P
The second location of pixels x in the second image captured by two image acquisition devices2Interval T.
Specifically, according to d=x1-x2And d ∈ [dmin,dmax], can obtain:x2∈ T=[x1-dmax,x1-dmin]。
S7, basis preset Stereo Matching Algorithm, calculate separately the first location of pixels x1Corresponding first picture point and institute
State each value x in interval T2Matching power flow between corresponding alternative second picture point.
S8, Matching power flow minimum value is determined from each Matching power flow being calculated, and determine the Matching power flow most
It is small to be worth corresponding optimal second location of pixels x2'。
S9, according to the first location of pixels x1With optimal second location of pixels x2', calculate object point P and adopted in the first image
Practical parallax d in storage and the second image acquisition deviceP。
Specifically, the calculation formula of practical parallax is:dP=x1-x 2'。
S10, according to the practical parallax dPObject point P is calculated in the first coordinate system OL-XLYLZLUnder actual coordinate.
As it can be seen that the embodiment of the present application determines tested object point P in structure light vision system by Constructed Lighting Vision System first
Coordinate (the x to unite under coordinate systems,zs);Then it is sat according to the coordinate transformation relation between the two kinds of vision systems demarcated in advance
Mark conversion obtains references of the object point P under corresponding first coordinate system of the first image acquisition device of Binocular Stereo Vision System and sits
Mark (xL,zL);Then according to the reference coordinate (xL,zL) and the measurement accuracy δ of Constructed Lighting Vision System that demarcates in advance, profit
With the three-dimensional measurement principle of Binocular Stereo Vision System, the first picture point that object point P is obtained by Binocular Stereo Vision System is determined
The first location of pixels x at place1And the interval T of the second location of pixels where corresponding second picture point;Then to value
Each value in the T of section calculates separately its corresponding alternative second picture point and x1Corresponding first picture point matches institute's band
The Matching power flow come, and by the value x in the corresponding T of Matching power flow minimum value2'As optimal second location of pixels;Then may be used
To determine practical parallax ds of the object point P in the first image acquisition device and the second image acquisition deviceP=x1-x'2, and then according to described
Practical parallax dPCalculate actual coordinates of the object point P under first coordinate system.
By above technical scheme it is found that for Constructed Lighting Vision System scan frequency is low, resolution ratio is low and binocular solid
The low disadvantage of vision system measurement accuracy, the embodiment of the present application carry out preliminary three-dimensional measurement by Constructed Lighting Vision System first,
And then the measurement accuracy of integrated structure light vision system, using preliminary three-dimensional measuring result as reference value, by presetting Stereo matching
Algorithm determines corresponding first location of pixels of smallest match costs of the object point P in Binocular Stereo Vision System and optimal second picture
Plain position determines the corresponding practical parallaxes of object point P, last root further according to first location of pixels and optimal second location of pixels
The practical three-dimensional coordinate of object point P can be calculated according to the practical parallax.As it can be seen that the embodiment of the present application realizes structure light vision
The fusion of system and Binocular Stereo Vision System can both overcome Constructed Lighting Vision System scan frequency is low, resolution ratio is low to ask
Topic, and can overcome the problems, such as that Binocular Stereo Vision System measurement accuracy is low, to ensure the scene in measured object high-speed motion
Under can also measure the three-dimensional coordinate exact value of measured object.
It should be noted that since the corresponding practical parallax of different object points is not quite similar, therefore can implement according to the application
Example determines the practical parallax of the corresponding object point of each pixel in the two dimensional image captured by image acquisition device 120 respectively, into
And determine the object point in the first coordinate system OL-XLYLZLUnder actual coordinate (i.e. cycle execute step S2 to S10).
Optionally, to be different from the reference coordinate (x for the object point P being converted to by coordinateL,zL), the final institute of the present embodiment
The actual coordinate for the object point P to be measured, i.e. object point P are in the first coordinate system OL-XLYLZLUnder actual coordinate, be denoted as (x, y, z),
The then conclusion by being obtained above according to Fig. 3AndKnown to:
Object point P is in ZLThe calculation formula of axis actual coordinate value z is z=bf/dp;
Object point P is in XLThe calculation formula of axis actual coordinate value x is x=zxL/zL。
For object point P in YLAxis actual coordinate value y, the Y taken by the embodiment of the present applicationLAxis direction is the movement of object point P
Direction, therefore can be calculated to obtain actual coordinate value y according to the movement velocity and run duration of object point P.
The embodiment of the present application illustrated below determines the original of optimal second location of pixels by presetting Stereo Matching Algorithm
Reason.
For example, it is assumed that dmin=0.1, dmax=0.3, and the first location of pixels x is calculated in step S51=3.5, so as to
To determine x2∈ T=[x1-dmax,x1-dmin]=[3.2,3.4], then calculate separately:x1=3.5 HesCorresponding
With cost C1, x1=3.5 HesCorresponding Matching power flow C2 and x1=3.5 HesCorresponding Matching power flow
C3;The minimum value that Matching power flow is obtained by comparing C1, C2 and C3 is C2, then illustrates x1=3.5 corresponding first picture points andCorresponding second picture point is matched for best picture point, i.e., optimal second location of pixels x'2=3.3;And then it can calculate
To practical parallax ds of the object point P in Binocular Stereo Vision SystemP=3.5-3.3=0.2.
Pixel matching schematic diagram in Binocular Stereo Vision System as shown in Figure 4, the first picture point in determining the first image
Corresponding first location of pixels, and determine according to disparity range the value range of corresponding second location of pixels in the second image
Afterwards, each picture point in the value range can be described as alternative second picture point, be likely to be most matched with the first picture point
Second picture point, and the picture point except the value range be impossible to for most matched second picture point of the first picture point;Therefore it only need to be all over
Alternative second picture point each of is gone through within the scope of this, and calculates separately its Matching power flow between the first picture point, Matching power flow is most
It is small to be worth corresponding alternative second picture point as with most matched second picture point of the first picture point, it should most be matched with the first picture point accordingly
The location of pixels of the second picture point be optimal second location of pixels.
As it can be seen that measurement accuracy δ of the embodiment of the present application according to Constructed Lighting Vision System, object point P is in binocular tri-dimensional for prediction
Disparity range in feel system, and then corresponding second picture point corresponding is determined according to the disparity range and the first location of pixels
The value range of two location of pixels, to only being traversed in the corresponding image block of the value range, according to Matching power flow determine with
The second best picture point of first Pixel matching degree;Obviously, compared with the existing technology in the case where disparity range can not be predicted,
It is traversed in entire image captured by second image acquisition device to determine that the second best picture point, the embodiment of the present application can reduce
The search range of second picture point improves matching speed, can also improve matching precision, avoid because being searched in incoherent position
Second picture point and cause error hiding.
In one feasible embodiment of the application, default Stereo Matching Algorithm employed in step S7 can be based on
The Region Matching Algorithm of AD-census transformation, that is, it is corresponding to traverse the first picture point in the image captured by the second image acquisition device
Each alternative second picture point in disparity range calculates the first picture point using AD-census transformation for mula and each is alternative
Matching power flow between second picture point.
Specifically, the step of calculating Matching power flow using AD-census transformation for mula is as follows:
S71, first picture point and alternative second are calculated according to AD (absolute difference, absolute difference) algorithm
The first Matching power flow C between picture pointAD(P,d);
Wherein, the first Matching power flow CAD(P, d) indicates that the gray value absolute difference between two picture points, formula are:
Wherein,For the gray value of first picture point,It is current
The gray value of alternative second picture point traversed.
S72, according to census become scaling method calculate between first picture point and alternative second picture point second matching generation
Valence Ccensus(P,d);
S73, by normalized function by above-mentioned first Matching power flow CAD(P, d) and the second Matching power flow Ccensus(P, d) into
Row normalized;
In the embodiment of the present application, normalized function formula is ρ (C, λ)=1-exp (C/ λ), to the first Matching power flow CAD(P,
D) it is normalized to obtain:ρ(CAD(P,d),λAD)=1-exp (CAD(P,d)/λAD), to the second Matching power flow Ccensus(P,
D) it is normalized to obtain:ρ(Ccensus(P,d),λcensus)=1-exp (Ccensus(P,d)/λcensus)。
Wherein, λADFor the default weight of first Matching power flow, λcensusFor the default power of second Matching power flow
Weight.
S74, to treated two kinds of Matching power flow ρ (CAD(P,d),λAD) and ρ (Ccensus(P,d),λcensus) summation, it obtains
To P1And P2 jBetween Matching power flow C (P, d)=ρ (CAD(P,d),λAD)+ρ(Ccensus(P,d),λcensus)。
By taking the calculating Matching power flow C1 described in embodiment above as an example, that is, calculate the image captured by the first image acquisition device
Middle x1In image captured by=3.5 corresponding first picture points and the second image acquisition deviceCorresponding alternative second picture
Matching power flow between point, calculating process are:1) above-mentioned C is utilizedADIts first Matching power flow is calculated in (P, d) calculation formulaWherein,For x1The gray value of=3.5 corresponding first picture points,ForCorresponding alternative
The gray value of two picture points;2) become scaling method using census and x is calculated1=3.5 corresponding first picture points andIt is corresponding
Alternative second picture point between the second Matching power flow3) it utilizes normalized function to execute normalized, obtainsWith4) read group total is executed, P is obtained1And P2 1Between Matching power flow
In one feasible embodiment of the application, being determined from each Matching power flow being calculated described in step S8
Matching power flow minimum value, specifically includes:
S81, each Matching power flow being calculated is polymerize according to right-angled intersection algorithm;
Since the image captured by image acquisition device can be influenced by factors such as illumination, noises, cause same in space
Gray value difference in image of the object point captured by two image acquisition devices of Binocular Stereo Vision System is larger, therefore, this
The Matching power flow of surrounding pixel point centered on point to be matched (i.e. above-mentioned first picture point) polymerize by application embodiment, adds
Power together, come error caused by reducing single picture point.
S82, for each Matching power flow by polymerization, according to the victor is a king (winner takes all, WTA) algorithm
Determine Matching power flow minimum value therein.
As it can be seen that the embodiment of the present application is polymerize by Matching power flow, to eliminate same object point in Binocular Stereo Vision System
Influence of the gray value difference in image to Matching power flow captured by two image acquisition devices improves the calculating essence of Matching power flow
Exactness to improve matching precision, and then improves the three-dimensional measurement precision of whole system.
Finally, it should be noted that one of ordinary skill in the art will appreciate that realizing the whole in above-described embodiment method
Or part flow, it is that relevant hardware can be instructed to complete by computer program, it is non-that the program can be stored in one
In transitory computer readable storage medium, the program is when being executed, it may include such as the flow of the embodiment of above-mentioned each method.Its
In, the non-transient computer readable storage medium can be magnetic disc, CD, read-only memory (Read-Only
Memory, ROM) or random access memory (RandomAccessMemory, RAM) etc..
Correspondingly, the embodiment of the present application discloses a kind of three-dimensional measuring apparatus, which regards with structure light shown in FIG. 1 respectively
Feel system is connected with the data output interface of Binocular Stereo Vision System, is melted by the measurement data to two kinds of vision systems
It closes, obtains more accurately, suitable for the three-dimensional measuring result of high-speed moving object.Structural schematic diagram shown in Figure 5, it is described
Three-dimensional measuring apparatus 500 includes:
Calibration information acquiring unit 501, the calibration for obtaining Constructed Lighting Vision System and Binocular Stereo Vision System are believed
Breath;The calibration information includes at least, Constructed Lighting Vision System coordinate system Os-XsYsZsWith the first of Binocular Stereo Vision System
The corresponding first coordinate system O of image acquisition deviceL-XLYLZLBetween coordinate transformation relation, measurement accuracy δ, the binocular tri-dimensional
Stand-off b, the focal length f of feel system and the principle point location u of described first image collector;
Reference coordinate determination unit 502 determines that an object point P is in coordinate on measured object for utilizing Constructed Lighting Vision System
It is Os-XsYsZsUnder coordinate (xs,zs), and according to the coordinate transformation relation to the coordinate (xs,zs) coordinate conversion is carried out,
Object point P is obtained in the first coordinate system OL-XLYLZLUnder reference coordinate (xL,zL);
Disparity range determination unit 503, for according to the stand-off b, focal length f, measurement accuracy δ and the object point P
Reference coordinate value zL, determine object point P the Binocular Stereo Vision System the first image acquisition device and the second image acquisition device
Between parallax minimum value dminWith parallax maximum value dmax;Wherein,
First picture point determination unit 504, for according to the reference coordinate (xL,zL) and principle point location u, determine that object point P exists
Corresponding first location of pixels x in the first image captured by first image acquisition device1;
Second picture point determination unit 505, for according to the parallax minimum value dmin, parallax maximum value dmaxWith the first picture point
Position x1, determine the second location of pixels x in the second images of the object point P captured by the second image acquisition device2Interval T
=[x1-dmax,x1-dmin];
Matching power flow computing unit 506, for according to Stereo Matching Algorithm is preset, calculating separately first location of pixels
x1Corresponding first picture point and each value x in the interval T2Matching power flow between corresponding alternative second picture point;
Best match determination unit 507, for determining Matching power flow minimum value from each Matching power flow being calculated,
And determine the corresponding optimal second location of pixels x of the Matching power flow minimum value2';
Practical disparity computation unit 508, for according to the first location of pixels x1With optimal second location of pixels x2',
Calculate practical parallax ds of the object point P in the first image acquisition device and the second image acquisition deviceP;
Actual coordinate computing unit 509, for according to the practical parallax dPObject point P is calculated in the first coordinate system OL-
XLYLZLUnder actual coordinate.
The concrete operating principle of each unit can refer to embodiment of the method above in above-mentioned three-dimensional measuring apparatus.
Optionally, above-mentioned first picture point determination unit 504 specifically can be configured as:
Utilize formulaThe first location of pixels x is calculated1。
Optionally, above-mentioned Matching power flow computing unit 506 specifically can be configured as:
Each value x in the interval T is obtained successively2, by its corresponding picture point alternately the second picture point;
The first Matching power flow C between first picture point and alternative second picture point is calculated according to absolute difference AD algorithmsAD(P,
d);Wherein, For the gray value of first picture point,It is described alternative
The gray value of second picture point;
Become scaling method according to census and calculates the second Matching power flow between first picture point and alternative second picture point
Ccensus(P,d);
According to normalized function ρ (C, λ)=1-exp (C/ λ) respectively to first Matching power flow and the second Matching power flow
Be normalized, and to after normalized the first Matching power flow and the second Matching power flow sum, obtain the value
x2Corresponding Matching power flow C (P, d)=ρ (CAD(P,d),λAD)+ρ(Ccensus(P,d),λcensus);
Wherein, λADFor the default weight of first Matching power flow, λcensusFor the default power of second Matching power flow
Weight.
Optionally, above-mentioned best match determination unit 507 specifically can be configured as:
The each Matching power flow being calculated is polymerize according to right-angled intersection algorithm;
For each Matching power flow by polymerization, according to the victor is a king, WTA algorithms determine that Matching power flow therein is minimum
Value.
Optionally, above-mentioned actual coordinate computing unit 509 specifically can be configured as:
According to formula z=bf/dpObject point P is calculated in the first coordinate system OL-XLYLZLUnder ZLAxis actual coordinate value
z;
According to formula x=zxL/zLObject point P is calculated in the first coordinate system OL-XLYLZLUnder XLAxis actual coordinate value
x;
And object point P is calculated in the first coordinate system O according to the movement velocity and run duration of object point PL-XLYLZLUnder
YLAxis actual coordinate value y.
By above technical scheme it is found that for Constructed Lighting Vision System scan frequency is low, resolution ratio is low and binocular solid
The low disadvantage of vision system measurement accuracy, the embodiment of the present application carry out preliminary three-dimensional measurement by Constructed Lighting Vision System first,
And then the measurement accuracy of integrated structure light vision system, using preliminary three-dimensional measuring result as reference value, by presetting Stereo matching
Algorithm determines corresponding first location of pixels of smallest match costs of the object point P in Binocular Stereo Vision System and optimal second picture
Plain position determines the corresponding practical parallaxes of object point P, last root further according to first location of pixels and optimal second location of pixels
The practical three-dimensional coordinate of object point P can be calculated according to the practical parallax.As it can be seen that the embodiment of the present application realizes structure light vision
The fusion of system and Binocular Stereo Vision System can both overcome Constructed Lighting Vision System scan frequency is low, resolution ratio is low to ask
Topic, and can overcome the problems, such as that Binocular Stereo Vision System measurement accuracy is low, to ensure the scene in measured object high-speed motion
Under can also measure the three-dimensional coordinate exact value of measured object.
The embodiment of the present application also provides a kind of three-dimension measuring systems;Frame diagram as shown in Figure 6, the system include:Structure
Three-dimensional measuring apparatus 500 described in light vision system 100, Binocular Stereo Vision System 200 and any one embodiment above.
Wherein, Constructed Lighting Vision System 100 includes optical projection device 110 and image acquisition device 120, binocular stereo vision system
System 200 includes the first image acquisition device 210 and the second image acquisition device 220, and hardware configuration can refer to shown in Fig. 1.
The measurement data with Constructed Lighting Vision System 100 and Binocular Stereo Vision System 200 respectively of three-dimensional measuring apparatus 500
Output interface (such as output interface of image acquisition device 120, the first image acquisition device 210 and the second image acquisition device 220) connects,
It is merged by the measurement data to two kinds of vision systems, obtains three-dimensional survey more accurate, suitable for high-speed moving object
Measure result.
As it can be seen that three-dimension measuring system provided by the embodiments of the present application, while utilizing Constructed Lighting Vision System and binocular solid
Vision system carries out three-dimensional measurement, and the 3 d measurement data that the two obtains is merged, to can both overcome structure light
The problem that vision system scan frequency is low, resolution ratio is low, and can overcome the problems, such as that Binocular Stereo Vision System measurement accuracy is low,
Ensure also to measure the three-dimensional coordinate exact value of measured object under the scene of measured object high-speed motion.
Each embodiment in this specification is described in a progressive manner, identical similar portion between each embodiment
Point just to refer each other, and the highlights of each of the examples are the differences with other embodiments.Especially for system reality
For applying example, since it is substantially similar to the method embodiment, so description is fairly simple, related place is referring to embodiment of the method
Part explanation.
It should be understood that the invention is not limited in the precision architectures for being described above and being shown in the accompanying drawings, and
And various modifications and changes may be made without departing from the scope thereof.The scope of the present invention is limited only by the attached claims.