CN102042807B - Flexible stereoscopic vision measuring unit for target space coordinate - Google Patents

Flexible stereoscopic vision measuring unit for target space coordinate

Info

Publication number
CN102042807B
CN102042807B
Authority
CN
China
Prior art keywords
vision measurement
numerical control
measurement component
measured object
rotary table
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010105286134A
Other languages
Chinese (zh)
Other versions
CN102042807A (en)
Inventor
李为民
李晓峰
金兢
张瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN2010105286134A priority Critical patent/CN102042807B/en
Publication of CN102042807A publication Critical patent/CN102042807A/en
Application granted granted Critical
Publication of CN102042807B publication Critical patent/CN102042807B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a flexible stereoscopic vision measuring device for the space coordinates of a target. The device comprises two vision measurement components, two numerical-control rotary tables, a rack, a network data line and a computer; each vision measurement component consists of an industrial prime lens and an area-array CCD (charge-coupled device) camera, the two numerical-control rotary tables are mounted on the rack at an interval, and each vision measurement component is fixed on one of the numerical-control rotary tables. In actual measurement, the device photographs the measured object from a suitable station; the two-dimensional image information of the measured object and the attitude parameters of the numerical-control rotary tables at that station are sent to the computer over the network data line; the computer processes the collected image information with digital image processing algorithms according to the acquired image information and rotary-table attitude parameters, and reconstructs the space coordinates of the measured object. The device realizes non-contact space coordinate measurement, and the numerical-control rotary tables extend the effective fields of view of the vision measurement components so that large-range coordinate measurement can be achieved.

Description

Flexible stereoscopic vision measuring device for object space coordinates
Technical field
The invention belongs to the field of machine vision metrology and relates to a large-scale stereoscopic vision measuring system. The device can be widely used in industrial workpiece inspection, depth perception of scenes, three-dimensional object scanning, reverse engineering and related fields.
Background art
Computer vision realizes human visual capability with computers, that is, the perception, recognition and understanding of three-dimensional scenes of the objective world. Machine vision is built on the theory of computer vision and emphasizes the engineering application of computer vision techniques. With the development of electronics, computers and signal processing, machine vision has advanced rapidly and, owing to its non-contact nature, good real-time performance, visualization, automation and high degree of intelligence, has been widely applied in economic construction, scientific research, national defense and other important fields.
At present, stereoscopic vision measurement mainly takes three forms: fixed binocular vision measurement, floating measurement, and rotary flexible binocular measurement. Binocular vision measurement uses two CCD cameras with known intrinsic parameters to photograph an object simultaneously from different viewpoints, and then reconstructs the measured object from the two images.
Fixed binocular measurement first calibrates the extrinsic parameters of the CCD cameras using control points with known spatial coordinates in the scene, and then reconstructs the measured object from its image information. To change the field of view, the whole fixed binocular measuring system must be moved, and it is difficult to keep the relative position of the left and right CCD cameras unchanged during the move, which degrades the measuring accuracy of the system. This measuring method is therefore cumbersome and places high demands on the measurement environment.
Floating measurement can be divided into floating monocular measurement, floating binocular measurement and floating multi-view measurement. Floating measurement can also calibrate the extrinsic parameters of the CCD cameras using control points with known spatial coordinates in the scene. However, whenever the field of view is extended or changed, the extrinsic parameters of the CCD cameras must be re-calibrated against the control points in the scene, which adds extra steps to the measurement and lowers the measuring efficiency.
Summary of the invention
To solve the problems of the prior art and to meet the demand for large-range indoor coordinate measurement, the present invention proposes a flexible stereoscopic vision measuring device and method for object space coordinates.
The technical solution that achieves the above purpose is as follows:
The flexible stereoscopic vision measuring device for object space coordinates comprises two vision measurement components, numerical-control rotary tables 4, a rack 5, a network data line 6 and a computer 7.
The vision measurement components and the numerical-control rotary tables 4 are each connected to the computer 7 through the network data line 6; the vision measurement components output two-dimensional image information of the measured object, and the numerical-control rotary tables 4 output spatial attitude information.
Each vision measurement component comprises an area-array CCD camera 3 and an industrial prime lens 2, and the area-array CCD camera 3 is mounted on a numerical-control rotary table 4.
The numerical-control rotary table 4 supports the vision measurement component and controls its motion, rotating the vision measurement component to different stations according to the spatial attitude of the measured object 1 and providing the spatial attitude information of the numerical-control rotary table 4.
The computer 7 receives and stores the image information of the measured object 1 acquired by the vision measurement components and the spatial attitude information of the numerical-control rotary tables 4, fuses the image information of the measured object 1 with the spatial attitude information of the numerical-control rotary tables 4, and reconstructs the space coordinates of the measured object 1.
The numerical-control rotary table is a three-dimensional rotary worktable with 6 degrees of freedom; it supports the vision measurement component, controls its motion, rotates it to different stations according to the spatial attitude of the measured object, and supplies the spatial attitude information of the rotary table. The distance between the two numerical-control rotary tables is 0.8 m to 500 m, and the distance from a vision measurement component to the measured object is 0.9 m to 200 m.
The effective pixel count of the area-array CCD camera 3 ranges from 1000 × 1000 pixels to 4096 × 4096 pixels, and the focal length of the industrial prime lens 2 ranges from 16 mm to 1000 mm.
A measuring method using the flexible stereoscopic vision measuring device for object space coordinates comprises two parts: a calibration procedure and an actual measurement procedure for the whole measuring system.
The calibration procedure of the whole measuring system is as follows:
Calibration step 1: start the equipment and initialize it so that it enters a stable operating state;
Calibration step 2: place nine or more control points with known spatial coordinates in the space of the measured object 1, distributed evenly within the fields of view of the two vision measurement components; aim the fields of view of the two vision measurement components at the same region of the measured object 1;
Calibration step 3: adjust the industrial prime lenses 2 of the two vision measurement components, and control the vision measurement components through the computer 7 to photograph the control points on the measured object 1; the two-dimensional image information of the control points acquired by the vision measurement components is transmitted over the network data line 6 to the computer 7 for image processing;
Calibration step 4: calibrate the intrinsic parameters of the vision measurement components with computer vision methods; that is, the intrinsic parameters of the vision measurement components are calibrated from the two-dimensional image information of the control points in the space of the measured object 1 acquired by the vision measurement components;
Calibration step 5: the computer 7 drives the numerical-control rotary tables 4 so that the vision measurement components move with them; the vision measurement components photograph the control points in the space of the measured object 1 from three different attitudes, and the spatial attitude of the numerical-control rotary table 4 is recorded at each station;
Calibration step 6: the two-dimensional image information acquired by the vision measurement components and the spatial attitude information of the numerical-control rotary tables 4 are sent together to the computer 7 over the network data line 6;
Calibration step 7: the spatial transformation matrix RT_x between the coordinate system of the numerical-control rotary table 4 and the coordinate system of the vision measurement component is computed with a robot hand-eye calibration method;
The spatial transformation matrix RT_x is obtained as follows: the computer 7 drives the numerical-control rotary table 4 so that the vision measurement component moves with it; the vision measurement component photographs the control points in the space of the measured object 1 from three different attitudes, and the spatial attitude of the rotary table 4 is recorded at each station; the relation between the coordinate system of the rotary table 4 and the coordinate system of the vision measurement component is then computed, yielding the spatial transformation matrix RT_x.
The actual measurement procedure is as follows:
Measurement step 1: acquire the two-dimensional image information of the measured object with the two vision measurement components;
that is, the numerical-control rotary tables 4 are used to aim the two vision measurement components at the measured object 1, the measured object 1 is photographed, and its two-dimensional image information is obtained from each component;
Measurement step 2: obtain the spatial attitude matrices RT_p of the two numerical-control rotary tables 4 at the current measurement station;
that is, the numerical-control rotary tables 4 turn the vision measurement components to suitable measurement stations and aim them at the measured object 1; the measured object 1 is photographed, and the coordinate parameters of the two rotary tables 4 are recorded, giving the spatial attitude matrix RT_p of each rotary table 4 at this station;
Measurement step 3: compute the extrinsic parameters of the vision measurement components, RT_c = RT_x · RT_p;
that is, using the spatial attitude matrix RT_p obtained in measurement step 2 and the matrix RT_x obtained in calibration step 7, the extrinsic parameters RT_c of the vision measurement components in the object coordinate system are computed from the coordinate transformation relation RT_c = RT_x · RT_p;
Measurement step 4: call the two-dimensional image information of the measured object 1 obtained in measurement step 1, the intrinsic parameters of the vision measurement components obtained in calibration step 4, and the extrinsic parameters of the vision measurement components obtained in measurement step 3;
Measurement step 5: compute the three-dimensional space coordinates of the measured object 1 from the spatial point reconstruction equations;
Measurement step 6: output the three-dimensional space coordinates of the measured object 1.
Advantageous technical effects of the invention: in flexible stereoscopic vision measurement, the extrinsic parameters of the vision measurement components are computed from the known motion of the numerical-control rotary tables. Each time the vision measurement components are rotated, the extrinsic parameters therefore need not be re-calibrated; it suffices to record the spatial attitude of the rotary tables and then compute the extrinsic parameters of the vision measurement components. Flexible stereoscopic vision measurement not only provides high measuring accuracy but also overcomes the limitation on measurement range.
Description of drawings
Fig. 1 is a structural diagram of the device of the present invention.
Reference numerals: 1 is the measured object, 2 is the industrial prime lens, 3 is the area-array CCD camera, 4 is the numerical-control rotary table, 5 is the rack, 6 is the network data line, and 7 is the computer.
Fig. 2 shows the relation between the coordinate system of a vision measurement component and the world coordinate system in the present invention.
Fig. 3 shows the coordinate system transformations of the flexible stereoscopic vision measurement based on the numerical-control rotary tables according to the present invention.
Fig. 4 is a flow chart of the calibration method of the invention.
Fig. 5 is a flow chart of the measuring method of the invention.
Embodiment
The detailed technical issues involved in the technical solution of the present invention are described below with reference to the accompanying drawings.
Referring to Fig. 1, the flexible stereoscopic vision measuring device for object space coordinates of the present invention comprises two vision measurement components, numerical-control rotary tables 4, a rack 5, a network data line 6 and a computer 7. The vision measurement components and the numerical-control rotary tables 4 are each connected to the computer 7 through the network data line 6; the vision measurement components output two-dimensional image information of the measured object, and the numerical-control rotary tables 4 output spatial attitude information.
Each vision measurement component comprises an area-array CCD camera 3 and an industrial prime lens 2; the area-array CCD camera 3 is mounted on a numerical-control rotary table 4, and the industrial prime lens 2 is mounted on the area-array CCD camera 3. The two numerical-control rotary tables 4 are mounted on the rack 5 at a distance from each other. The vision measurement components, the numerical-control rotary tables and the computer are all powered from a 220 V supply. Depending on the size and distance of the measured object, the distance between the two numerical-control rotary tables can vary from 0.8 m to 500 m and the distance from a vision measurement component to the measured object from 0.9 m to 200 m; the optical axes of the two vision measurement components intersect at a point on the measured object, and the fields of view of the two components overlap on the measured object. Depending on the distance between the vision measurement components and the measured object and on the size of the measured region, the effective pixel count of the vision measurement components can range from 1000 × 1000 pixels to 4096 × 4096 pixels, the focal length of the industrial prime lens from 16 mm to 1000 mm, the field of view of a vision measurement component from 0.2 m × 0.2 m to 40 m × 40 m, and the size of the measured region from 0.4 m × 0.2 m to 500 m × 500 m. The computer is connected to the vision measurement components and the numerical-control rotary tables through the network data line, acquires image information in real time, and controls the attitude of the rotary tables.
The area-array CCD camera 3 is at present the most commonly used machine vision image sensor. It consists of timing and synchronization generators, vertical drivers and analog/digital signal processing circuits, and integrates photoelectric conversion, charge storage, charge transfer and signal readout in one device; it is a typical solid-state imaging device. The distinguishing feature of the area-array CCD camera 3 is that it uses charge as the signal, unlike devices that use current or voltage as the signal. This type of imaging device forms charge packets through photoelectric conversion and then, under the action of driving pulses, transfers, amplifies and outputs the image signal. The industrial prime lens 2 is attached to the area-array CCD camera 3 by a threaded mount; the optical axis of the industrial prime lens 2 is perpendicular to the chip of the area-array CCD camera 3. By adjusting the focus of the industrial prime lens 2, the scene within the field of view can be imaged sharply on the chip of the area-array CCD camera 3, thereby forming a vision measurement component.
The numerical-control rotary table 4 is equipped with control motors, drive controllers and grating encoder disks; it has 6 degrees of freedom, namely translation along and rotation about the X, Y and Z axes, and can move arbitrarily in three-dimensional space. The rotary table can be controlled by the computer and can output its different spatial position and attitude information.
The two numerical-control rotary tables 4 are placed 0.9 m to 500 m apart according to the size and distance of the measured object and are connected to the computer through the network data line; the computer acquires the image data and pre-processes it. The numerical-control rotary tables are likewise connected to the computer through the network data line; the computer acquires the spatial attitude information of the rotary table motion, fuses the processed image information of the measured object with the spatial attitude information of the rotary tables, and reconstructs the space coordinates of the measured object.
The implementation of the calibration procedure and the measurement procedure of the flexible stereoscopic vision measurement, together with the relevant algorithmic formulas, is elaborated below:
1. Calibration procedure
Intrinsic parameters of the vision measurement components: as shown in Fig. 2, the points near the image center exhibit little distortion, so these points are used to calibrate the intrinsic parameters of the vision measurement component under the pinhole model. The model of the vision measurement component is expressed as follows:
$$
s_i \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix}
= \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix}
\begin{bmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{bmatrix}
= M \begin{bmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{bmatrix} \qquad (1)
$$
where [X_i, Y_i, Z_i, 1]^T are the homogeneous coordinates of the i-th control point with known spatial coordinates; [u_i, v_i, 1]^T are the homogeneous image coordinates of the i-th control point; M is called the projection matrix, M = KRT, and s_i is a scale factor. K depends only on the intrinsic parameters f_u, f_v, u_0, v_0 of the vision measurement component and is called the intrinsic parameter matrix of the vision measurement component, as shown in formula (2); RT is determined by the orientation of the vision measurement component with respect to the object coordinate system and is called the extrinsic parameter matrix of the vision measurement component, as shown in formula (3):
$$
K = \begin{bmatrix} f_u & 0 & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (2)
$$
$$
RT = \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \end{bmatrix} \qquad (3)
$$
Formula (1) comprises three equations; after eliminating s_i, the following two linear equations in the m_{ij} are obtained:
$$
\begin{cases}
X_i m_{11} + Y_i m_{12} + Z_i m_{13} + m_{14} - u_i X_i m_{31} - u_i Y_i m_{32} - u_i Z_i m_{33} - u_i m_{34} = 0 \\
X_i m_{21} + Y_i m_{22} + Z_i m_{23} + m_{24} - v_i X_i m_{31} - v_i Y_i m_{32} - v_i Z_i m_{33} - v_i m_{34} = 0
\end{cases} \qquad (4)
$$
For n control points in the measured scene there are 2n such linear equations in the m_{ij}, which can be written in matrix form as
$$
\begin{bmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -u_1 X_1 & -u_1 Y_1 & -u_1 Z_1 & -u_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -v_1 X_1 & -v_1 Y_1 & -v_1 Z_1 & -v_1 \\
\vdots & & & & & & & & & & & \vdots \\
X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_n X_n & -u_n Y_n & -u_n Z_n & -u_n \\
0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_n X_n & -v_n Y_n & -v_n Z_n & -v_n
\end{bmatrix}
\begin{bmatrix}
m_{11} \\ m_{12} \\ m_{13} \\ m_{14} \\ m_{21} \\ m_{22} \\ m_{23} \\ m_{24} \\ m_{31} \\ m_{32} \\ m_{33} \\ m_{34}
\end{bmatrix}
= \mathbf{0} \qquad (5)
$$
When 2n > 12, the 12 parameters of the matrix M can be obtained by the singular value decomposition method of matrix analysis.
After the matrix M is obtained, the intrinsic parameters of the vision measurement component still have to be computed. They are obtained from the following formulas:
$$
\begin{cases}
f_u = m_{34}^2 \, | \mathbf{m}_1 \times \mathbf{m}_3 | \\
f_v = m_{34}^2 \, | \mathbf{m}_2 \times \mathbf{m}_3 | \\
u_0 = m_{34}^2 \, \mathbf{m}_1^{T} \mathbf{m}_3 \\
v_0 = m_{34}^2 \, \mathbf{m}_2^{T} \mathbf{m}_3
\end{cases} \qquad (6)
$$
where m_i (i = 1, 2, 3) is the row vector formed by the first three elements of the i-th row of the matrix M, and m_{i4} (i = 1, 2, 3) is the element in the i-th row and fourth column of M.
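The projection-matrix calibration above is the standard direct linear transform (DLT). The following Python sketch illustrates it under stated assumptions: it builds the coefficient matrix of formula (5), solves for M by singular value decomposition, and then recovers f_u, f_v, u_0 and v_0 in the spirit of formula (6); the normalization of M by |m_3| used here is the common Faugeras-style convention and is an assumption, since the patent leaves the overall scale of M implicit. All function and variable names are illustrative only.

```python
import numpy as np

def calibrate_projection_matrix(world_pts, image_pts):
    """Formula (5): stack 2n linear equations in m11..m34 and take the right
    singular vector of the smallest singular value as the projection matrix M."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)          # M is defined up to an overall scale

def intrinsics_from_projection(M):
    """Recover fu, fv, u0, v0 from M (cf. formula (6)); the scale of M is
    fixed here by normalizing with |m3|, an assumed convention."""
    m1, m2, m3 = M[0, :3], M[1, :3], M[2, :3]
    rho2 = 1.0 / np.dot(m3, m3)          # squared normalization factor
    u0 = rho2 * np.dot(m1, m3)
    v0 = rho2 * np.dot(m2, m3)
    fu = rho2 * np.linalg.norm(np.cross(m1, m3))
    fv = rho2 * np.linalg.norm(np.cross(m2, m3))
    return fu, fv, u0, v0

# Usage with the nine or more control points of calibration step 2 (2n > 12):
# M = calibrate_projection_matrix(world_pts, image_pts)
# fu, fv, u0, v0 = intrinsics_from_projection(M)
```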
Extrinsic parameters of the vision measurement components: Fig. 3 shows the transformation relations between the coordinate systems, where C_obj is the coordinate system of the measured object, C_cl and C_cr are the coordinate systems of the two vision measurement components, and C_pl and C_pr are the coordinate systems of the two numerical-control rotary tables.
Let ^A P = [X, Y, Z, 1]^T denote the homogeneous coordinates of a space point P in coordinate system A, and let ^B RT_A be the 4 × 4 rotation-translation matrix that transforms coordinates in system A into coordinates in system B. Taking one vision measurement component as an example, the transformation between the coordinate system C_cl of the vision measurement component and the coordinate system C_obj of the measured object is:
$$
{}^{C_{cl}}P = {}^{C_{cl}}RT_{C_{obj}} \cdot {}^{C_{obj}}P \qquad (7)
$$
The transformation between the coordinate system C_pl of the numerical-control rotary table and the coordinate system C_obj of the measured object is:
$$
{}^{C_{pl}}P = {}^{C_{pl}}RT_{C_{obj}} \cdot {}^{C_{obj}}P \qquad (8)
$$
The transformation between the coordinate system C_cl of the vision measurement component and the coordinate system C_pl of the numerical-control rotary table is:
$$
{}^{C_{cl}}P = {}^{C_{cl}}RT_{C_{pl}} \cdot {}^{C_{pl}}P \qquad (9)
$$
From formulas (8) and (9) it follows that:
$$
{}^{C_{cl}}P = {}^{C_{cl}}RT_{C_{pl}} \cdot {}^{C_{pl}}RT_{C_{obj}} \cdot {}^{C_{obj}}P \qquad (10)
$$
Comparing formula (7) with formula (10) gives:
$$
{}^{C_{cl}}RT_{C_{obj}} = {}^{C_{cl}}RT_{C_{pl}} \cdot {}^{C_{pl}}RT_{C_{obj}} \qquad (11)
$$
From formula (11), the following coordinate system transformation formulas are obtained:
$$
\begin{cases}
RT_{cl} = RT_{xl} \cdot RT_{pl} \\
RT_{cr} = RT_{xr} \cdot RT_{pr}
\end{cases} \qquad (12)
$$
where RT_pl and RT_pr are the spatial transformation matrices between the rotary table coordinate systems and the coordinate system of the measured object, i.e. the spatial attitude matrices of the numerical-control rotary tables, which can be read from the rotary tables; RT_xl and RT_xr are the spatial transformation matrices between the coordinate systems of the vision measurement components and the coordinate systems of the rotary tables; since each vision measurement component is fixed on its rotary table, RT_xl and RT_xr are constant, and their values can be solved with a robot hand-eye calibration method; RT_cl and RT_cr are the spatial transformation matrices between the coordinate systems of the vision measurement components and the object coordinate system, which are exactly the sought extrinsic parameters of the vision measurement components.
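Calibration step 7 obtains RT_xl and RT_xr by robot hand-eye calibration. The patent does not spell out a specific algorithm, so the Python sketch below uses one common approach as an assumption: relative motions between stations give the classical AX = XB problem, whose rotation part is solved by aligning rotation axes (a Kabsch fit) and whose translation part follows by least squares. It relies on scipy's Rotation class for the rotation logarithm; all names are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def hand_eye_calibration(RT_c_list, RT_p_list):
    """Estimate the fixed transform X = RT_x such that RT_c = X * RT_p at every
    station. Relative motions between consecutive stations i and i+1 satisfy
    A X = X B with A = RT_c_i * inv(RT_c_{i+1}), B = RT_p_i * inv(RT_p_{i+1})."""
    A_list, B_list, a_vecs, b_vecs = [], [], [], []
    for i in range(len(RT_c_list) - 1):
        A = RT_c_list[i] @ np.linalg.inv(RT_c_list[i + 1])
        B = RT_p_list[i] @ np.linalg.inv(RT_p_list[i + 1])
        A_list.append(A)
        B_list.append(B)
        a_vecs.append(Rotation.from_matrix(A[:3, :3]).as_rotvec())
        b_vecs.append(Rotation.from_matrix(B[:3, :3]).as_rotvec())
    # Rotation part: R_x maps the rotation axes of B onto those of A (Kabsch fit).
    H = sum(np.outer(b, a) for a, b in zip(a_vecs, b_vecs))
    U, _, Vt = np.linalg.svd(H)
    R_x = Vt.T @ U.T
    if np.linalg.det(R_x) < 0:               # enforce a proper rotation
        Vt[-1, :] *= -1
        R_x = Vt.T @ U.T
    # Translation part: (R_A - I) t_x = R_x t_B - t_A, stacked and solved in least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.concatenate([R_x @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_x, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_x, t_x
    return X
```

With the three attitudes of calibration step 5 this gives two relative motions, the minimum for the rotation axes to determine R_x uniquely, provided the two axes are not parallel.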
Fig. 4 shows the flow chart of the calibration method: the computer collects the two-dimensional image information and the spatial attitude information of the numerical-control rotary tables over the network data line, and calibrates the intrinsic parameters of the vision measurement components and the spatial transformation matrix RT_x between the rotary table coordinate system and the vision measurement component coordinate system. The corresponding algorithm flow consists of the following parts:
(1) Start and initialize:
The device is started and initialized so that it enters a stable operating state.
(2) Calibrate the intrinsic parameters of the vision measurement components:
The two-dimensional image information of the control points acquired by the vision measurement components is transmitted over the network data line to the computer for image processing, and the intrinsic parameters of the vision measurement components are calibrated with computer vision methods.
(3) Compute the spatial transformation matrix:
The vision measurement components photograph the control points in the space of the measured object from three different attitudes; the spatial attitude of the numerical-control rotary table is recorded at each station; the relation between the rotary table coordinate system and the vision measurement component coordinate system is computed, yielding the spatial transformation matrix RT_x.
(4) End.
2. Measurement procedure
During measurement, the numerical-control rotary tables turn to suitable stations and the vision measurement components are aimed at the measured object; the vision measurement components photograph the measured object and obtain its two-dimensional image information, while the spatial attitudes RT_pl and RT_pr of the rotary tables at the moment of shooting are recorded. The spatial transformation matrices RT_xl and RT_xr calibrated in the calibration procedure are then called and substituted into formula (12) to compute the extrinsic parameters RT_cl and RT_cr of the vision measurement components at the current station. Finally, the intrinsic parameters of the vision measurement components calibrated in the calibration procedure, the two-dimensional image information of the measured object acquired by the vision measurement components, and the extrinsic parameters RT_cl and RT_cr of the current station computed from formula (12) are used to reconstruct the three-dimensional spatial information of the measured object. The concrete reconstruction method is as follows:
Three-dimensional reconstruction of the spatial target: starting from the linear model (1) of the vision measurement component, eliminating the scale factor s_i of formula (1) gives two equations in X_i, Y_i, Z_i:
$$
\begin{cases}
(u_i m_{31} - m_{11}) X_i + (u_i m_{32} - m_{12}) Y_i + (u_i m_{33} - m_{13}) Z_i + (u_i m_{34} - m_{14}) = 0 \\
(v_i m_{31} - m_{21}) X_i + (v_i m_{32} - m_{22}) Y_i + (v_i m_{33} - m_{23}) Z_i + (v_i m_{34} - m_{24}) = 0
\end{cases} \qquad (13)
$$
Each of the two vision measurement components yields a system of equations of the form (13); combining them gives formula (14), where the superscripts l and r of the elements m_{ij} denote the parameters of the two vision measurement components, respectively:
$$
\begin{cases}
(u_i m_{31}^{l} - m_{11}^{l}) X_i + (u_i m_{32}^{l} - m_{12}^{l}) Y_i + (u_i m_{33}^{l} - m_{13}^{l}) Z_i + (u_i m_{34}^{l} - m_{14}^{l}) = 0 \\
(v_i m_{31}^{l} - m_{21}^{l}) X_i + (v_i m_{32}^{l} - m_{22}^{l}) Y_i + (v_i m_{33}^{l} - m_{23}^{l}) Z_i + (v_i m_{34}^{l} - m_{24}^{l}) = 0 \\
(u_i m_{31}^{r} - m_{11}^{r}) X_i + (u_i m_{32}^{r} - m_{12}^{r}) Y_i + (u_i m_{33}^{r} - m_{13}^{r}) Z_i + (u_i m_{34}^{r} - m_{14}^{r}) = 0 \\
(v_i m_{31}^{r} - m_{21}^{r}) X_i + (v_i m_{32}^{r} - m_{22}^{r}) Y_i + (v_i m_{33}^{r} - m_{23}^{r}) Z_i + (v_i m_{34}^{r} - m_{24}^{r}) = 0
\end{cases} \qquad (14)
$$
When both vision measurement components have been calibrated, the matrices M are known. The three-dimensional space coordinates (X_i, Y_i, Z_i) of a point on the measured object can then be solved from the image coordinates (u_i, v_i) of the corresponding points in the two images, i.e. the spatial target is reconstructed.
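Formula (14) is a linear triangulation that can be solved point by point in least squares. Below is a minimal Python sketch, assuming the two 3 × 4 projection matrices M_l and M_r have already been formed from the calibrated intrinsic and extrinsic parameters (M = KRT as in formula (1)); names are illustrative.

```python
import numpy as np

def triangulate_point(M_l, M_r, uv_l, uv_r):
    """Solve formula (14) in least squares: four linear equations in (X, Y, Z)
    from one pair of corresponding image points."""
    A, b = [], []
    for M, (u, v) in ((M_l, uv_l), (M_r, uv_r)):
        A.append(u * M[2, :3] - M[0, :3]); b.append(M[0, 3] - u * M[2, 3])
        A.append(v * M[2, :3] - M[1, :3]); b.append(M[1, 3] - v * M[2, 3])
    xyz, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return xyz                               # (X, Y, Z) of the object point
```

Applying this to every pair of matched image points yields the three-dimensional point cloud of the measured object.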
Fig. 5 shows the flow chart of the measuring method: the computer collects the two-dimensional image information and the attitude information of the numerical-control rotary tables over the network data line, calls the intrinsic parameters of the vision measurement components calibrated in the calibration procedure and the spatial transformation matrix RT_x between the rotary table coordinate system and the vision measurement component coordinate system, and computes the three-dimensional space coordinates of the measured object from the spatial point reconstruction equations. The corresponding algorithm flow consists of the following parts:
(1) Start and initialize:
The device is started and initialized so that it enters a stable operating state.
(2) Acquire image information:
The numerical-control rotary tables aim the two vision measurement components at the measured object; the measured object is photographed, and its two-dimensional image information is obtained from each component and transmitted over the network data line to the computer for image processing.
(3) Compute the extrinsic parameters of the vision measurement components:
The spatial attitude matrices RT_pl and RT_pr of the two numerical-control rotary tables at the current measurement station are obtained, the spatial transformation matrices RT_xl and RT_xr obtained in the calibration procedure are called, and the extrinsic parameters RT_cl and RT_cr of the vision measurement components in the object coordinate system are computed with formula (12).
(4) Reconstruct the measured object:
The two-dimensional image information of the measured object, the intrinsic parameters of the vision measurement components and the extrinsic parameters of the vision measurement components are called, and the three-dimensional space coordinates of the measured object are computed from the spatial point reconstruction equations (14). (An end-to-end sketch of steps (2) to (4) is given after this list.)
(5) Obtain the measurement results:
The three-dimensional space coordinates and the three-dimensional point cloud of the measured object are output. A new measured region can then be chosen and steps (2), (3) and (4) repeated to perform multiple measurements.
(6) End.
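A compact sketch of steps (2) to (4) above, combining formula (12) with the triangulation of formula (14): it assumes the intrinsic matrices K_l and K_r from the calibration procedure, the hand-eye matrices RT_xl and RT_xr, the rotary-table attitude matrices RT_pl and RT_pr read at the current station, and the helper triangulate_point from the earlier sketch; all names are illustrative.

```python
import numpy as np

def measure_station(K_l, K_r, RT_xl, RT_xr, RT_pl, RT_pr, matches):
    """matches: list of ((u_l, v_l), (u_r, v_r)) corresponding image points
    extracted from the two two-dimensional images of the measured object."""
    # Step (3): extrinsic parameters at the current station, formula (12).
    RT_cl = RT_xl @ RT_pl
    RT_cr = RT_xr @ RT_pr
    # Projection matrices M = K * RT (first three rows of the 4x4 extrinsics).
    M_l = K_l @ RT_cl[:3, :]
    M_r = K_r @ RT_cr[:3, :]
    # Step (4): reconstruct every matched point pair, formula (14).
    return np.array([triangulate_point(M_l, M_r, uv_l, uv_r)
                     for uv_l, uv_r in matches])
```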
Application example:
The area-array CCD cameras used in the measuring device are IPX-1M48-G area-array CCD cameras from the American company Imperx, with 1000 × 1000 effective pixels; the industrial prime lenses are LM35JC5M prime lenses from the Japanese company KOWA, with a focal length of 35 mm; the field of view of each vision measurement component is 0.2 m × 0.2 m; the measured object is 0.4 m × 0.2 m in size; the two numerical-control rotary tables are three-dimensional rotary worktables with 6 degrees of freedom, located about 0.95 m from the measured object and about 0.8 m apart. First, the vision measurement components are calibrated with the calibration method of the present invention; the concrete operations are as follows:
1. The device is started and initialized; from the two-dimensional image information of the control points acquired by the vision measurement components, the intrinsic parameters of the vision measurement components are computed according to formula (6). The results are listed in Table 1:
Table 1  Intrinsic parameters of the vision measurement components

              Left vision measurement component   Right vision measurement component
u_0 (pixel)   462.30                              477.96
v_0 (pixel)   488.10                              520.63
f_u (mm)      36.3464                             36.6054
f_v (mm)      36.3510                             36.5986
2. The vision measurement components photograph the control points in the space of the measured object from three different attitudes, and the spatial attitude of the numerical-control rotary table is recorded at each station. The spatial transformation matrices between the rotary table coordinate systems and the vision measurement component coordinate systems are obtained as follows:
$$
RT_{xl} = \begin{bmatrix}
0.999833581870 & -0.014392328076 & 0.006517921156 & -1.381531282246 \\
0.015168422941 & 0.997455791240 & -0.070235525145 & 100.611358229844 \\
-0.005498385652 & 0.070435224596 & 0.997502325253 & 6.679168395122 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
$$
RT_{xr} = \begin{bmatrix}
0.99964867771450 & -0.00607655301864 & 0.00003866460353 & 2.03943999017582 \\
0.00598947106267 & 0.99748977926836 & -0.07553284200056 & 99.57030770582905 \\
0.00062501806321 & 0.07555957898731 & 0.99711045358643 & 7.40971052761368 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
Then the measured object is measured with the measuring method of the present invention; the concrete measuring operations are as follows:
1. The device is started and initialized.
2. The numerical-control rotary tables turn to suitable stations and the vision measurement components are aimed at the measured object; the computer obtains the two-dimensional image information of the measured object and the spatial attitudes of the numerical-control rotary tables. The rotary table spatial attitude matrices obtained by the computer at this measurement station are:
$$
RT_{pl} = \begin{bmatrix}
0.95364432384 & 0.01210639004 & -0.30059596756 & -232.28916295373 \\
-0.00412941156 & 0.99963299026 & 0.02786798081 & -78.46958967192 \\
0.30081127120 & -0.02540385512 & 0.95334431935 & 1197.43340097863 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
$$
RT_{pr} = \begin{bmatrix}
0.95476425962 & -0.00025642373 & 0.29812595663 & -240.57961042855 \\
-0.00772726216 & 0.99942684740 & 0.02438294760 & -70.30334525638 \\
-0.29810009187 & -0.02545812478 & 0.95421505255 & 1366.87849057891 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
3. The spatial transformation matrices RT_xl and RT_xr obtained in the calibration procedure are called, and the extrinsic parameters of the vision measurement components at this measurement station are computed according to formula (12) (a numerical evaluation of this product is sketched after Table 2):
$$
RT_{cl} = RT_{xl} \cdot RT_{pl} = \begin{bmatrix}
0.95550571613 & -0.00244825096 & -0.29473320496 & -224.74107751671 \\
-0.01078126263 & 0.99905760320 & -0.04372112683 & -62.29204506269 \\
0.29452558219 & 0.04500240403 & 0.95457885536 & 1197.08326810855 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
$$
RT_{cr} = RT_{xr} \cdot RT_{pr} = \begin{bmatrix}
0.95446425885 & -0.00633038820 & 0.29790994841 & -237.97559764004 \\
0.02052701502 & 0.99883945408 & -0.04596721699 & -75.24172231357 \\
-0.29722584159 & 0.05013154920 & 0.95348650323 & 1364.87608449516 \\
0 & 0 & 0 & 1
\end{bmatrix}
$$
4. The two-dimensional image information of the measured object, the intrinsic parameters of the vision measurement components and the extrinsic parameters of the vision measurement components are called, and the three-dimensional space coordinates of the measured object are computed from the spatial point reconstruction equations (14). The measurement error results are given in Table 2:
Table 2  Measurement results
[Table 2 is provided as an image in the original document.]
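For reference, the matrix product of step 3 can be evaluated directly from the RT_xl and RT_pl listed above; the following sketch performs the multiplication of formula (12) so that the result can be compared with the RT_cl printed in step 3.

```python
import numpy as np

# RT_xl from the calibration procedure and RT_pl read at the measurement
# station, as listed in this application example.
RT_xl = np.array([
    [ 0.999833581870, -0.014392328076,  0.006517921156,  -1.381531282246],
    [ 0.015168422941,  0.997455791240, -0.070235525145, 100.611358229844],
    [-0.005498385652,  0.070435224596,  0.997502325253,   6.679168395122],
    [ 0.0,             0.0,             0.0,               1.0]])

RT_pl = np.array([
    [ 0.95364432384,  0.01210639004, -0.30059596756, -232.28916295373],
    [-0.00412941156,  0.99963299026,  0.02786798081,  -78.46958967192],
    [ 0.30081127120, -0.02540385512,  0.95334431935, 1197.43340097863],
    [ 0.0,            0.0,            0.0,               1.0]])

# Formula (12): extrinsic parameters of the left vision measurement component.
RT_cl = RT_xl @ RT_pl
print(RT_cl)   # compare with the RT_cl listed in step 3
```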
The above is merely an embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any modification or replacement that can readily be conceived by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the protection scope of the claims.

Claims (1)

1. A measuring method using a flexible stereoscopic vision measuring device for object space coordinates, the flexible stereoscopic vision measuring device for object space coordinates comprising two vision measurement components, numerical-control rotary tables (4), a rack (5), a network data line (6) and a computer (7);
the vision measurement components and the numerical-control rotary tables (4) being each connected to the computer (7) through the network data line (6), the vision measurement components outputting two-dimensional image information of the measured object, and the numerical-control rotary tables (4) outputting spatial attitude information;
each vision measurement component comprising an area-array CCD camera (3) on which an industrial prime lens (2) is mounted, the area-array CCD camera (3) being mounted on a numerical-control rotary table (4);
the numerical-control rotary table (4) supporting the vision measurement component and controlling its motion, rotating the vision measurement component to different stations according to the spatial attitude of the measured object (1), and providing the spatial attitude information of the numerical-control rotary table (4);
the computer (7) receiving and storing the image information of the measured object (1) acquired by the vision measurement components and the spatial attitude information of the numerical-control rotary tables (4), fusing the image information of the measured object (1) with the spatial attitude information of the numerical-control rotary tables (4), and reconstructing the space coordinates of the measured object (1);
characterized in that the calibration procedure of the device is as follows:
calibration step 1: starting the equipment and initializing it so that it enters a stable operating state;
calibration step 2: placing nine or more control points with known spatial coordinates in the space of the measured object (1), the control points being distributed evenly within the fields of view of the two vision measurement components, and aiming the fields of view of the two vision measurement components at the same region of the measured object (1);
calibration step 3: adjusting the industrial prime lenses (2) of the two vision measurement components, and controlling the vision measurement components through the computer (7) to photograph the control points on the measured object (1); the two-dimensional image information of the control points acquired by the vision measurement components being transmitted over the network data line (6) to the computer (7) for image processing;
calibration step 4: calibrating the intrinsic parameters of the vision measurement components with computer vision methods; that is, the intrinsic parameters of the vision measurement components are calibrated from the two-dimensional image information of the control points in the space of the measured object (1) acquired by the vision measurement components;
calibration step 5: the computer (7) driving the numerical-control rotary tables (4) so that the vision measurement components move with the numerical-control rotary tables (4); the vision measurement components photographing the control points in the space of the measured object (1) from three different attitudes, and the spatial attitude of the numerical-control rotary table (4) being recorded at each station;
calibration step 6: sending the two-dimensional image information acquired by the vision measurement components and the spatial attitude information of the numerical-control rotary tables (4) together to the computer (7) over the network data line (6);
calibration step 7: computing the spatial transformation matrix RT_x between the coordinate system of the numerical-control rotary table (4) and the coordinate system of the vision measurement component with a robot hand-eye calibration method;
the spatial transformation matrix RT_x being obtained as follows: the computer (7) drives the numerical-control rotary table (4) so that the vision measurement component moves with the numerical-control rotary table (4); the vision measurement component photographs the control points in the space of the measured object (1) from three different attitudes, and the spatial attitude of the numerical-control rotary table (4) is recorded at each station; the relation between the coordinate system of the numerical-control rotary table (4) and the coordinate system of the vision measurement component is computed, yielding the spatial transformation matrix RT_x;
and in that the actual measurement procedure is as follows:
measurement step 1: acquiring the two-dimensional image information of the measured object with the two vision measurement components; that is, the numerical-control rotary tables (4) are used to aim the two vision measurement components at the measured object (1), the measured object (1) is photographed, and its two-dimensional image information is obtained from each component;
measurement step 2: obtaining the spatial attitude matrices RT_p of the two numerical-control rotary tables (4) at the current measurement station; that is, the numerical-control rotary tables (4) turn the vision measurement components to suitable measurement stations, the components are aimed at the measured object (1), the measured object (1) is photographed, and the coordinate parameters of the two numerical-control rotary tables (4) are recorded, giving the spatial attitude matrix RT_p of the numerical-control rotary table (4) at this station;
measurement step 3: computing the extrinsic parameters of the vision measurement components, RT_c = RT_x · RT_p; that is, using the spatial attitude matrix RT_p obtained in measurement step 2 and the matrix RT_x obtained in calibration step 7, the extrinsic parameters RT_c of the vision measurement components in the object coordinate system are computed from the coordinate transformation relation RT_c = RT_x · RT_p;
measurement step 4: calling the two-dimensional image information of the measured object (1) obtained in measurement step 1, the intrinsic parameters of the vision measurement components obtained in calibration step 4, and the extrinsic parameters of the vision measurement components obtained in measurement step 3;
measurement step 5: computing the three-dimensional space coordinates of the measured object (1) from the spatial point reconstruction equations;
measurement step 6: outputting the three-dimensional space coordinates of the measured object (1).
CN2010105286134A 2010-10-29 2010-10-29 Flexible stereoscopic vision measuring unit for target space coordinate Expired - Fee Related CN102042807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105286134A CN102042807B (en) 2010-10-29 2010-10-29 Flexible stereoscopic vision measuring unit for target space coordinate

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010105286134A CN102042807B (en) 2010-10-29 2010-10-29 Flexible stereoscopic vision measuring unit for target space coordinate

Publications (2)

Publication Number Publication Date
CN102042807A CN102042807A (en) 2011-05-04
CN102042807B true CN102042807B (en) 2012-06-20

Family

ID=43909167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105286134A Expired - Fee Related CN102042807B (en) 2010-10-29 2010-10-29 Flexible stereoscopic vision measuring unit for target space coordinate

Country Status (1)

Country Link
CN (1) CN102042807B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103175512B (en) * 2013-03-08 2014-12-31 中国人民解放军国防科学技术大学 Shooting measurement method of attitude of tail end of boom of concrete pump truck
KR101611135B1 (en) * 2015-07-01 2016-04-08 기아자동차주식회사 Vehicle wheel alignment measuring apparatus and measuring method
CN105067011A (en) * 2015-09-15 2015-11-18 沈阳飞机工业(集团)有限公司 Overall measurement system calibration method based on vision calibration and coordinate transformation
CN105423912A (en) * 2015-11-05 2016-03-23 天津大学 Light pen tracking system
CN107941166A (en) * 2017-11-14 2018-04-20 中国矿业大学 A kind of adjustable composite three-dimensional scanning means of visual field and method
CN108393887B (en) * 2018-01-31 2019-03-19 湖北工业大学 One kind being used for workpiece hand and eye calibrating method
CN109978948A (en) * 2019-03-25 2019-07-05 国网上海市电力公司工程建设咨询分公司 A kind of distance measurement method of view-based access control model
CN112033284B (en) * 2020-08-28 2022-05-17 北京睿呈时代信息科技有限公司 Memory, interactive measurement method, system and equipment based on monitoring video
CN112344855B (en) * 2020-10-27 2022-08-26 阿波罗智联(北京)科技有限公司 Obstacle detection method and device, storage medium and drive test equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0829120A (en) * 1994-07-12 1996-02-02 Sumitomo Heavy Ind Ltd Position measuring method of object having curved surface and positioning controller for two objects having curved surface
EP1640688A1 (en) * 2004-09-24 2006-03-29 Konrad Maierhofer Method and Apparatus for Measuring the Surface on an Object in three Dimensions
CN101458072A (en) * 2009-01-08 2009-06-17 西安交通大学 Three-dimensional contour outline measuring set based on multi sensors and measuring method thereof
CN101539397B (en) * 2009-04-17 2010-08-04 中国人民解放军国防科学技术大学 Method for measuring three-dimensional attitude of object on precision-optical basis
CN201522266U (en) * 2009-11-11 2010-07-07 中国科学院沈阳自动化研究所 Computer-based binocular vision false-tooth scanning device

Also Published As

Publication number Publication date
CN102042807A (en) 2011-05-04

Similar Documents

Publication Publication Date Title
CN102042807B (en) Flexible stereoscopic vision measuring unit for target space coordinate
CN109990701B (en) Mobile measurement system and method for large-scale complex curved surface three-dimensional shape robot
CN106981083B (en) The substep scaling method of Binocular Stereo Vision System camera parameters
Kanade et al. A stereo machine for video-rate dense depth mapping and its new applications
CN102692214B (en) Narrow space binocular vision measuring and positioning device and method
CN102519434B (en) Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN108198224B (en) Linear array camera calibration device and calibration method for stereoscopic vision measurement
CN201522266U (en) Computer-based binocular vision false-tooth scanning device
CN104616292A (en) Monocular vision measurement method based on global homography matrix
CN102155923A (en) Splicing measuring method and system based on three-dimensional target
CN103337069B (en) High-quality three-dimensional color image acquisition methods and device based on multiple camera
CN1971206A (en) Calibration method for binocular vision sensor based on one-dimension target
CN102062588A (en) Computer binocular vision denture scanning device and three-dimensional reconstruction method thereof
CN104268876A (en) Camera calibration method based on partitioning
Liu et al. External parameter calibration of widely distributed vision sensors with non-overlapping fields of view
CN102679959A (en) Omnibearing 3D (Three-Dimensional) modeling system based on initiative omnidirectional vision sensor
Kim et al. Extrinsic calibration of a camera-LIDAR multi sensor system using a planar chessboard
CN103093460A (en) Moving camera virtual array calibration method based on parallel parallax
CN103679693A (en) Multi-camera single-view calibration device and calibration method thereof
CN103162623A (en) Stereoscopic measuring system for double vertically mounted cameras and calibration method
CN103606181A (en) Microscopic three-dimensional reconstruction method
CN103634588A (en) Image composition method and electronic apparatus
CN102914295A (en) Computer vision cube calibration based three-dimensional measurement method
CN111640156A (en) Three-dimensional reconstruction method, equipment and storage equipment for outdoor weak texture target
CN201945293U (en) Flexibility stereoscopic vision measurement device of target space coordinate

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120620

Termination date: 20171029

CF01 Termination of patent right due to non-payment of annual fee