CN104933717A - Camera intrinsic and extrinsic parameter automatic calibration method based on directional calibration target - Google Patents

Camera intrinsic and extrinsic parameter automatic calibration method based on directional calibration target

Info

Publication number
CN104933717A
Authority
CN
China
Prior art keywords
target
feature angle
target image
point
corner point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510338308.1A
Other languages
Chinese (zh)
Other versions
CN104933717B (en)
Inventor
卢荣胜 (Lu Rongsheng)
殷玉龙 (Yin Yulong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN201510338308.1A priority Critical patent/CN104933717B/en
Publication of CN104933717A publication Critical patent/CN104933717A/en
Application granted granted Critical
Publication of CN104933717B publication Critical patent/CN104933717B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an automatic calibration method for camera intrinsic and extrinsic parameters based on a directional calibration target. The sub-pixel coordinates of all feature corner points in a target image are extracted with an improved Harris corner detection algorithm; a target coordinate system is established from the directional marker pattern on the calibration target, so that the target coordinates corresponding to each feature corner point in the target image can be determined automatically; the rotation and translation of the calibration target in three-dimensional space are estimated with a planar-target pose estimation algorithm, which makes the rotation direction of the directional calibration target observable; and once G calibration images satisfying the selection conditions have been acquired, the camera intrinsic and extrinsic parameters are solved with Zhengyou Zhang's calibration algorithm. The method realizes fully automatic calibration of camera intrinsic and extrinsic parameters and thereby improves the flexibility and practicality of camera calibration.

Description

Camera intrinsic and extrinsic parameter automatic calibration method based on a directional calibration target
Technical field
The present invention relates to the field of camera calibration in computer vision, and specifically to an automatic calibration method for camera intrinsic and extrinsic parameters based on a directional calibration target.
Background technology
Computer vision technology is widely applied in fields such as industrial control and surveying and mapping. It uses camera imaging to recover the three-dimensional information of measured objects in space from image information, and thereby reconstructs and recognizes objects. Camera calibration is the fundamental problem of computer vision: it establishes the mapping between three-dimensional space coordinates and two-dimensional image coordinates. The task of camera calibration is to solve for the intrinsic and extrinsic parameters of the camera, and as a research focus of computer vision measurement it is receiving increasing attention and development.
In 1986, Roger Tsai proposed a camera calibration algorithm based on a radial alignment constraint, but it requires a 3D stereo target, which makes the calibration procedure inflexible. Around 1999, Zhengyou Zhang (Z.Y. Zhang) of Microsoft Research proposed a camera calibration algorithm based on a planar target. That algorithm, however, uses a planar target without directional information, so the rotation direction of the planar target cannot be judged during calibration; it also requires the camera to photograph the complete planar target. In practical applications the rotation direction of the planar target often must be judged, and in actual calibration the camera often captures only part of the target. For example, when calibrating a binocular camera system, the two cameras must photograph the same planar target in space simultaneously; the rotation direction of the target must then be judged so that a target coordinate system whose position is fixed relative to the planar target can be established. The left and right cameras of the binocular system share this common target coordinate system, and using it as an intermediary for coordinate transformation, the spatial relation between the left and right cameras can be computed. When the common field of view of the left and right cameras is small, each camera may capture only part of the planar target.
Summary of the invention
To overcome the shortcomings of the prior art, the present invention provides an automatic calibration method for camera intrinsic and extrinsic parameters based on a directional calibration target, so that the calibration of camera intrinsic and extrinsic parameters can be performed automatically, improving its flexibility and practicality.
To achieve the above object, the present invention adopts the following technical solution:
The camera intrinsic and extrinsic parameter automatic calibration method based on a directional calibration target of the present invention is characterized in that the directional calibration target comprises a checkerboard formed by alternating black squares and white squares, and a directional pattern arranged near the center of the checkerboard; the intersection point of any two diagonally adjacent black squares, or of any two diagonally adjacent white squares, serves as a feature corner point of the directional calibration target;
The checkerboard contains M rows × N columns of the feature corner points; M and N are positive integers; the side length of the black squares and white squares is W; W > 0;
The directional pattern consists of 3 small marking patterns, denoted the first marking pattern, the second marking pattern and the third marking pattern; the first and second marking patterns are black and the third marking pattern is white; the first and second marking patterns lie inside white squares, and the third marking pattern lies inside a black square;
The center of the first marking pattern is denoted the first center point o_1, the center of the second marking pattern the second center point o_2, and the center of the third marking pattern the third center point o_3; o_1 and o_2 lie at the centers of their white squares, and o_3 lies at the center of its black square; within the checkerboard, o_1 serves as the right-angle vertex, and o_1, o_2 and o_3 form the right triangle Δo_2o_1o_3; either o_1 and o_2 lie in one row of squares while o_1 and o_3 lie in one column of squares, or o_1 and o_2 lie in one column of squares while o_1 and o_3 lie in one row of squares; the right-angle side formed by o_1 and o_2 and the right-angle side formed by o_1 and o_3 satisfy $\overline{o_1 o_2} > 0$, $\overline{o_1 o_3} > 0$ and $\overline{o_1 o_2} \neq \overline{o_1 o_3}$.
The automatic calibration method proceeds as follows:
Step 1: define the total number of calibration images G; define the rotation threshold κ_1 and the translation threshold κ_2; define variables α and β, and initialize α = 1, β = 1;
Step 2: use a camera with fixed position to photograph, in real time, the directional calibration target moving in space, obtaining a target image;
Step 3: take the upper-left corner of the target image as the origin o of the feature corner pixel coordinate system, the left-to-right direction as its x-axis direction, and the top-to-bottom direction as its y-axis direction, thereby establishing the feature corner pixel coordinate system o-xy;
Step 4: take the optical center of the camera as the origin O_c of the camera coordinate system; take the x-axis direction of the feature corner pixel coordinate system as the X_c-axis direction of the camera coordinate system, and the y-axis direction as the Y_c-axis direction; the X_c, Y_c and Z_c axes satisfy the right-hand rule, thereby establishing the camera coordinate system O_c-X_cY_cZ_c;
Step 5: suppose the current target image is the α-th target image, obtained by imaging the directional calibration target at its α-th moving position through the camera;
Step 6: denote the first, second and third center points of the directional calibration target at the α-th moving position as the α-th first center point o_1^{(α)}, the α-th second center point o_2^{(α)} and the α-th third center point o_3^{(α)};
Choose the feature corner point whose sum of spatial distances to o_1^{(α)}, o_2^{(α)} and o_3^{(α)} is minimal as the origin O_t^{(α)} of the α-th target coordinate system; take the direction of the spatial vector formed from o_1^{(α)} to o_3^{(α)} as the X_t^{(α)}-axis direction of the α-th target coordinate system, and the direction of the spatial vector formed from o_1^{(α)} to o_2^{(α)} as the Y_t^{(α)}-axis direction; the X_t^{(α)}, Y_t^{(α)} and Z_t^{(α)} axes satisfy the right-hand rule, thereby establishing the α-th target coordinate system O_t^{(α)}-X_t^{(α)}Y_t^{(α)}Z_t^{(α)};
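For orientation, the following is a minimal NumPy sketch of this frame construction, assuming the three center points are already available as 3-D points in a common frame (all names are illustrative, not part of the patent):

```python
import numpy as np

def build_target_frame(o1, o2, o3):
    """Axes of the target coordinate system from the three marker centers.

    o1, o2, o3: (3,) arrays holding the first/second/third center points.
    Returns a 3x3 matrix whose columns are the X_t, Y_t, Z_t axes.
    """
    x_axis = (o3 - o1) / np.linalg.norm(o3 - o1)  # X_t along o1 -> o3
    y_axis = (o2 - o1) / np.linalg.norm(o2 - o1)  # Y_t along o1 -> o2
    z_axis = np.cross(x_axis, y_axis)             # Z_t by the right-hand rule
    z_axis /= np.linalg.norm(z_axis)
    return np.column_stack([x_axis, y_axis, z_axis])
```

Because o_2, o_1, o_3 form a right angle on the target, the first two axes are orthogonal by construction.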
Step 7: extract, with the improved Harris corner detection algorithm, the sub-pixel coordinates under the feature corner pixel coordinate system o-xy of each feature corner point on the directional calibration target in the α-th target image, obtaining the set of sub-pixel coordinates under o-xy of all feature corner points in the α-th target image;
Step 8: from the directional pattern, determine and obtain the one-to-one matching relationship between the sub-pixel coordinates under o-xy of all feature corner points in the α-th target image and their target coordinates under the α-th target coordinate system, and save this matching relationship as the matched group of all feature-corner sub-pixel coordinates and target coordinates of the α-th target image;
Step 9: from the matched group of all feature-corner sub-pixel coordinates and target coordinates of the α-th target image, use a planar-target pose estimation algorithm to compute the α-th target rotation matrix R_α and the α-th target translation matrix T_α that transform the α-th target coordinate system into the camera coordinate system O_c-X_cY_cZ_c, thereby realizing the judgment of the rotation direction of the directional calibration target;
Step 10: judge whether α equals 1. If α equals 1, take the α-th target image as the β-th calibration image; save the matched group of all feature-corner sub-pixel coordinates and target coordinates of the α-th target image as the matched group of the β-th calibration image; at the same time assign the α-th target rotation matrix R_α and the α-th target translation matrix T_α respectively to the β-th calibration rotation matrix R'_β and the β-th calibration translation matrix T'_β, then execute step 11; otherwise execute step 12;
Step 11: assign α+1 to α and return to step 5;
Step 12: use formulas (1) and (2) to obtain the rotation matrix R_{α,β} and the translation matrix T_{α,β} that transform the α-th target coordinate system into the β-th target coordinate system:
R α , β = ( R β ′ ) - 1 · R α - - - ( 1 )
T α , β = ( R β ′ ) - 1 · ( T α - T β ′ ) - - - ( 2 )
Step 13: judge whether the norm of the rotation matrix R_{α,β} is greater than the rotation threshold κ_1 and the norm of the translation matrix T_{α,β} is greater than the translation threshold κ_2. If both are greater, assign β+1 to β, take the α-th target image as the β-th calibration image, save the matched group of all feature-corner sub-pixel coordinates and target coordinates of the α-th target image as the matched group of the β-th calibration image, assign R_α and T_α respectively to R'_β and T'_β, and then execute step 14; otherwise execute step 14;
Step 14: judge whether β equals the total number of calibration images G. If they are equal, the G calibration images and the matched groups of all feature-corner sub-pixel coordinates and target coordinates of the G calibration images have been obtained; execute step 15. Otherwise return to step 11;
Step 15: from the matched groups of all feature-corner sub-pixel coordinates and target coordinates of the G calibration images, compute the intrinsic and extrinsic parameters of the camera with Zhengyou Zhang's camera calibration algorithm.
The camera intrinsic and extrinsic parameter automatic calibration method based on a directional calibration target of the present invention is further characterized in that step 7 proceeds as follows:
Step 7.1: suppose the directional calibration target in the α-th target image carries M_0 rows × N_0 columns of feature corner points, with 0 < M_0 ≤ M, 0 < N_0 ≤ N, and M_0 and N_0 integers; set the iteration termination threshold ψ; define the iteration count ξ;
Step 7.2: extract with the Harris corner detection algorithm the pixel-level coordinates under the feature corner pixel coordinate system o-xy of each feature corner point on the directional calibration target in the α-th target image, obtaining the set of pixel-level coordinates under o-xy of all M_0 rows × N_0 columns of feature corner points on the target;
Step 7.3: let any feature corner point on the directional calibration target in the α-th target image be the τ-th feature corner point on the target, with 1 ≤ τ ≤ (M_0 × N_0) and τ an integer; initialize τ = 1;
Step 7.4: initialize ξ = 0;
Step 7.5: record the pixel-level coordinate under o-xy of the τ-th feature corner point on the target as the coordinate C_{α,τ}^{(ξ)} of the τ-th feature corner point at the ξ-th iteration in the α-th target image;
Step 7.6: take the μ × μ neighborhood centered on the coordinate C_{α,τ}^{(ξ)}; this neighborhood must not intersect the directional pattern. Denote the coordinate of any pixel in the μ × μ neighborhood other than C_{α,τ}^{(ξ)} itself as D_{α,τ,η}^{(ξ)}, with 1 ≤ η ≤ (μ × μ − 1) and μ odd;
Step 7.7: establish the sub-pixel coordinate optimization objective function S_{α,τ}^{(ξ)} of the τ-th feature corner point at the ξ-th iteration in the α-th target image, as shown in formula (3):

$S_{\alpha,\tau}^{(\xi)} = \sum_{\eta=1}^{\mu\times\mu-1} \left( \nabla H_{\alpha,\tau,\eta}^{(\xi)} \cdot \overrightarrow{C_{\alpha,\tau}^{(\xi)} D_{\alpha,\tau,\eta}^{(\xi)}} \right)^{2}$  (3)

In formula (3), $\nabla H_{\alpha,\tau,\eta}^{(\xi)}$ denotes the gray-level gradient at the pixel coordinate D_{α,τ,η}^{(ξ)} of the μ × μ neighborhood, and $\overrightarrow{C_{\alpha,\tau}^{(\xi)} D_{\alpha,\tau,\eta}^{(\xi)}}$ denotes the vector from the coordinate C_{α,τ}^{(ξ)} to the pixel coordinate D_{α,τ,η}^{(ξ)};
Step 7.8: compute the value of the objective function S_{α,τ}^{(ξ)};
Step 7.9: judge whether S_{α,τ}^{(ξ)} is less than ψ. If it is, save the coordinate C_{α,τ}^{(ξ)} as the sub-pixel coordinate under o-xy of the τ-th feature corner point on the directional calibration target in the α-th target image, and execute step 7.10; otherwise execute step 7.11;
Step 7.10: judge whether τ equals (M_0 × N_0). If it does, the set of sub-pixel coordinates under o-xy of all feature corner points in the α-th target image has been obtained; otherwise assign τ+1 to τ and return to step 7.4;
Step 7.11: from the objective function S_{α,τ}^{(ξ)}, use the Levenberg-Marquardt optimization algorithm to solve iteratively for the coordinate C_{α,τ}^{(ξ+1)} of the τ-th feature corner point at the (ξ+1)-th iteration, assign ξ+1 to ξ, and return to step 7.6.
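The objective in formula (3) is the classical gradient-orthogonality criterion: at the true corner, the image gradient at every neighborhood pixel is orthogonal to the ray from the corner to that pixel, so every dot product vanishes. OpenCV's cv2.cornerSubPix minimizes essentially the same criterion, so a practical stand-in for the refinement loop of steps 7.4-7.11 (an assumption of this sketch, not the patent's own Levenberg-Marquardt loop) is:

```python
import cv2
import numpy as np

def refine_corners(gray, pixel_corners, half_win=2):
    """Refine pixel-level corner coordinates to sub-pixel accuracy.

    gray: uint8 grayscale target image; pixel_corners: (K, 1, 2) float32
    initial corner locations. half_win=2 yields the 5x5 neighborhood
    (mu = 5) used in the embodiment below.
    """
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-3)
    return cv2.cornerSubPix(gray, pixel_corners, (half_win, half_win),
                            (-1, -1), criteria)
```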
Compared with the prior art, the beneficial effects of the present invention are as follows:
1. When the proposed automatic calibration method calibrates a camera, the camera photographs the directional calibration target in three-dimensional space in real time, so target images are obtained in real time. The improved Harris corner detection algorithm automatically extracts the sub-pixel coordinates of the feature corner points on the target, and from the directional pattern on the target the computer automatically obtains the matching relationship between each corner's sub-pixel coordinates and the corresponding target coordinates. A planar-target pose estimation algorithm quickly computes the rotation and translation of the target relative to the camera coordinate system; whenever both exceed the set thresholds, the target image is saved as a calibration image. After G calibration images have been obtained, the camera intrinsic and extrinsic parameters are computed with Zhengyou Zhang's calibration algorithm. Throughout the whole calibration process the computer automatically selects the best target images, with no manual intervention, which improves the flexibility and practicality of camera calibration;
2. Because the invention calibrates the camera with a directional calibration target, the rotation direction of the target can be judged during calibration. Consequently, calibration remains possible even when the camera photographs only part of the target, which markedly improves the calibration performance of the directional calibration target.
3. From the directional pattern on the target, the planar-target pose estimation algorithm quickly computes the rotation matrix R and translation matrix T of the target relative to the camera coordinate system, so the rotation direction of the target in three-dimensional space relative to the camera coordinate system can be judged. The invention is therefore applicable to the calibration of binocular stereo vision systems and has good practical value;
4. After the improved Harris corner detection algorithm extracts the sub-pixel coordinates of the feature corner points in the target image, the detected sub-pixel coordinates are sorted, so camera calibration can proceed even when the camera photographs only part of the target. This markedly improves the accuracy of camera intrinsic and extrinsic parameter calibration, the calibration performance of the directional calibration target, and the operability of the calibration process;
5. The proposed automatic calibration method is based on a directional calibration target whose pattern is simple, easy to manufacture, and of high fabrication accuracy, giving it good practical value.
Brief description of the drawings
Fig. 1 is a schematic diagram of a directional calibration target whose directional pattern consists of 3 marker rings;
Fig. 2 is a schematic diagram of the first relation between the 3 marker rings and the vector groups in the α-th target image;
Fig. 3 is a schematic diagram of the second relation between the 3 marker rings and the vector groups in the α-th target image;
Fig. 4 is a schematic diagram of the third relation between the 3 marker rings and the vector groups in the α-th target image;
Fig. 5 is a schematic diagram of the fourth relation between the 3 marker rings and the vector groups in the α-th target image.
Embodiment
In the present embodiment, as shown in Fig. 1, the directional calibration target consists of a checkerboard formed by alternating black squares and white squares, and a directional pattern arranged near the center of the checkerboard. The intersection point of any two diagonally adjacent black squares, or of any two diagonally adjacent white squares, serves as a feature corner point of the directional calibration target. The checkerboard contains M rows × N columns of feature corner points, where M and N are positive integers; the side length of the black and white squares is W, with W > 0. In this embodiment, the directional calibration target shown in Fig. 1 contains 9 rows × 12 columns of feature corner points, and the side length of its black and white squares is 18 millimeters;
The directional pattern consists of 3 small marking patterns, denoted the first marking pattern, the second marking pattern and the third marking pattern. As shown in Fig. 1, the directional pattern on the target consists of 3 marker rings (it could also consist of 3 marker discs). The first and second marking patterns are black and the third marking pattern is white; the first and second marking patterns lie inside white squares, and the third lies inside a black square. In this embodiment, as shown in Fig. 1, the white marker ring is the third marking pattern, the black marker ring nearer to the white marker ring is the first marking pattern, and the black marker ring farther from the white marker ring is the second marking pattern. The inner and outer diameters of the first and second marking patterns in Fig. 1 are 4 millimeters and 16 millimeters respectively, and the inner and outer diameters of the third marking pattern are 7 millimeters and 9 millimeters;
The centers of the first, second and third marking patterns are denoted the first center point o_1, the second center point o_2 and the third center point o_3. o_1 and o_2 lie at the centers of white squares and o_3 lies at the center of a black square. Within the checkerboard, o_1 serves as the right-angle vertex, and o_1, o_2 and o_3 form the right triangle Δo_2o_1o_3. As shown in Fig. 1, o_1 and o_2 lie in one row of squares while o_1 and o_3 lie in one column of squares; in Fig. 1 the distance from o_1 to o_2 is 36 millimeters and the distance from o_1 to o_3 is 18 millimeters. The two right-angle sides satisfy $\overline{o_1 o_2} > 0$, $\overline{o_1 o_3} > 0$ and $\overline{o_1 o_2} \neq \overline{o_1 o_3}$.
In the present embodiment, taking the directional calibration target shown in Fig. 1 as an example, the camera intrinsic and extrinsic parameter automatic calibration method based on a directional calibration target of the present invention is described in detail:
The automatic calibration method based on a directional calibration target proceeds from step 1 to step 15:
Step 1: define the total number of calibration images G; define the rotation threshold κ_1 and the translation threshold κ_2; define variables α and β, and initialize α = 1, β = 1;
Step 2: Fig. 1 shows the directional calibration target. The camera, whose position is fixed, photographs in real time the target moving in space, so target images containing the image information of the directional calibration target are obtained in real time;
Step 3: establish the feature corner pixel coordinate system o-xy: take the upper-left corner of the target image as the origin o, the left-to-right direction as the x-axis direction, and the top-to-bottom direction as the y-axis direction;
Step 4: establish the camera coordinate system O_c-X_cY_cZ_c: take the optical center of the camera as the origin O_c, the x-axis direction of the feature corner pixel coordinate system as the X_c-axis direction, and the y-axis direction as the Y_c-axis direction; the X_c, Y_c and Z_c axes satisfy the right-hand rule;
Step 5: the image obtained by imaging the directional calibration target at the α-th moving position through the camera is denoted the α-th target image;
Step 6: in the present embodiment, the first, second and third center points of the directional calibration target at the α-th moving position are denoted the α-th first center point o_1^{(α)}, the α-th second center point o_2^{(α)} and the α-th third center point o_3^{(α)};
In the present embodiment, the α-th target coordinate system is established from the directional pattern (the 3 marker rings) of the target at the α-th position: the feature corner point whose sum of spatial distances to o_1^{(α)}, o_2^{(α)} and o_3^{(α)} is minimal is chosen as the origin O_t^{(α)} of the α-th target coordinate system; the direction of the spatial vector formed from o_1^{(α)} to o_3^{(α)} is taken as the X_t^{(α)}-axis direction, and the direction of the spatial vector formed from o_1^{(α)} to o_2^{(α)} as the Y_t^{(α)}-axis direction; the X_t^{(α)}, Y_t^{(α)} and Z_t^{(α)} axes satisfy the right-hand rule, thereby establishing the α-th target coordinate system O_t^{(α)}-X_t^{(α)}Y_t^{(α)}Z_t^{(α)};
Step 7: extract, with the improved Harris corner detection algorithm, the sub-pixel coordinates under the feature corner pixel coordinate system o-xy of each feature corner point on the directional calibration target in the α-th target image, obtaining the set of sub-pixel coordinates under o-xy of all feature corner points in the α-th target image;
In the present embodiment, extracting the sub-pixel coordinates of each feature corner point with the improved Harris corner detection algorithm proceeds from step 7.1 to step 7.11:
Step 7.1: suppose the directional calibration target in the α-th target image carries M_0^{(α)} × N_0^{(α)} feature corner points, where M_0^{(α)} is the number of rows and N_0^{(α)} the number of columns of feature corner points on the target in the α-th target image; M_0^{(α)} and N_0^{(α)} are integers; set the iteration termination threshold ψ; define the iteration count ξ;
Step 7.2: extract with the Harris corner detection algorithm the pixel-level coordinates under o-xy of each feature corner point on the directional calibration target in the α-th target image. The Harris corner detection algorithm was proposed in 1988 by Chris Harris et al.; its principle can be found in the paper "A Combined Corner and Edge Detector". The cvCornerHarris function of the open-source library OpenCV can rapidly extract these pixel-level coordinates, yielding the set of pixel-level coordinates under o-xy of all M_0^{(α)} × N_0^{(α)} feature corner points on the target;
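A minimal Python sketch of this detection stage, using the modern cv2.cornerHarris binding of the cited cvCornerHarris function (the block size, aperture, k and threshold values are illustrative assumptions):

```python
import cv2
import numpy as np

def harris_pixel_corners(gray, thresh_ratio=0.01):
    """Pixel-level Harris corner detection on a grayscale target image."""
    response = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
    ys, xs = np.where(response > thresh_ratio * response.max())
    # Non-maximum suppression of neighboring responses is omitted here;
    # in practice each corner yields a small cluster of response pixels.
    return np.stack([xs, ys], axis=1).astype(np.float32)  # (x, y) coordinates
```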
Step 7.3: in the present embodiment, let any feature corner point on the directional calibration target in the α-th target image be the τ-th feature corner point on the target, with τ an integer and 1 ≤ τ ≤ (M_0^{(α)} × N_0^{(α)}); initialize τ = 1;
Step 7.4: initialize the iteration count ξ = 0;
Step 7.5: record the pixel-level coordinate under o-xy of the τ-th feature corner point on the target as the coordinate C_{α,τ}^{(ξ)} of the τ-th feature corner point at the ξ-th iteration in the α-th target image;
Step 7.6: take the 5 × 5 neighborhood centered on the coordinate C_{α,τ}^{(ξ)}; this neighborhood must not intersect any of the 3 marker rings. Denote the coordinate of any pixel in the 5 × 5 neighborhood other than C_{α,τ}^{(ξ)} itself as D_{α,τ,η}^{(ξ)}, with 1 ≤ η ≤ (μ × μ − 1) and μ odd; in the present embodiment μ = 5, so 1 ≤ η ≤ (5 × 5 − 1);
Step 7.7: establish the sub-pixel coordinate optimization objective function S_{α,τ}^{(ξ)} of the τ-th feature corner point at the ξ-th iteration in the α-th target image, as shown in formula (1):

$S_{\alpha,\tau}^{(\xi)} = \sum_{\eta=1}^{5\times 5-1} \left( \nabla H_{\alpha,\tau,\eta}^{(\xi)} \cdot \overrightarrow{C_{\alpha,\tau}^{(\xi)} D_{\alpha,\tau,\eta}^{(\xi)}} \right)^{2}$  (1)

In formula (1), $\nabla H_{\alpha,\tau,\eta}^{(\xi)}$ denotes the gray-level gradient at the pixel coordinate D_{α,τ,η}^{(ξ)} of the 5 × 5 neighborhood, and $\overrightarrow{C_{\alpha,\tau}^{(\xi)} D_{\alpha,\tau,\eta}^{(\xi)}}$ denotes the vector from the coordinate C_{α,τ}^{(ξ)} to the pixel coordinate D_{α,τ,η}^{(ξ)};
Step 7.8: compute the value of the objective function S_{α,τ}^{(ξ)};
Step 7.9: judge whether S_{α,τ}^{(ξ)} is less than ψ. If it is, save the coordinate C_{α,τ}^{(ξ)} as the sub-pixel coordinate under o-xy of the τ-th feature corner point on the directional calibration target in the α-th target image, and execute step 7.10; otherwise execute step 7.11;
Step 7.10: judge whether τ equals (M_0^{(α)} × N_0^{(α)}). If it does, the set of sub-pixel coordinates under o-xy of all feature corner points in the α-th target image has been obtained; otherwise assign τ+1 to τ and return to step 7.4;
Step 7.11: from the objective function S_{α,τ}^{(ξ)}, use the Levenberg-Marquardt optimization algorithm to solve iteratively for the coordinate C_{α,τ}^{(ξ+1)} of the τ-th feature corner point at the (ξ+1)-th iteration, assign ξ+1 to ξ, and return to step 7.6;
In the present embodiment, the principle of the Levenberg-Marquardt optimization algorithm used in step 7.11 can be found in "Machine Vision" (Science Press, 2005; author: academician Zhang Guangjun), which describes the algorithm on pages 74-75; the lmdif0 function of the open-source library Cminpack can be used to perform the Levenberg-Marquardt nonlinear optimization.
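A hedged Python sketch of this inner loop, with scipy.optimize.least_squares(method='lm') standing in for Cminpack's lmdif routine (function and variable names are illustrative, and the 5 × 5 neighborhood is assumed to lie fully inside the image):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(c, grad_x, grad_y, mu=5):
    """Residuals of formula (1): at the true corner C, the gray-level
    gradient at each neighborhood pixel D is orthogonal to C -> D."""
    cx, cy = c
    r = mu // 2
    x0, y0 = int(round(cx)), int(round(cy))
    res = []
    for y in range(y0 - r, y0 + r + 1):
        for x in range(x0 - r, x0 + r + 1):
            if (x, y) == (x0, y0):
                continue  # skip the center pixel itself
            res.append(grad_x[y, x] * (x - cx) + grad_y[y, x] * (y - cy))
    return np.asarray(res)

def refine_one_corner(gray, c0):
    """Sub-pixel refinement of one corner from its pixel-level estimate c0."""
    img = gray.astype(float)
    grad_x = np.gradient(img, axis=1)   # d/dx
    grad_y = np.gradient(img, axis=0)   # d/dy
    sol = least_squares(residuals, np.asarray(c0, float),
                        args=(grad_x, grad_y), method='lm')
    return sol.x  # sub-pixel (x, y)
```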
Step 8: from the 3 marker rings on the directional calibration target, determine and obtain the one-to-one matching relationship between the sub-pixel coordinates under the feature corner pixel coordinate system o-xy of all feature corner points in the α-th target image and their target coordinates under the α-th target coordinate system. The specific implementation proceeds from step 8.1 to step 8.18:
Step 8.1: in the present embodiment, the set of sub-pixel coordinates under o-xy of all feature corner points in the α-th target image is denoted the α-th feature-corner sub-pixel coordinate set. From this set, choose the sub-pixel coordinate with the largest y component as the row-0 column-0 sub-pixel coordinate (x_{00}^{(α)}, y_{00}^{(α)}) of the α-th target image; the feature corner point corresponding to it is denoted the row-0 column-0 feature corner point of the α-th target image;
Step 8.2: from the α-th feature-corner sub-pixel coordinate set, choose the three sub-pixel coordinates nearest to the row-0 column-0 sub-pixel coordinate, denoted respectively the first sub-pixel coordinate (x_1^{(α)}, y_1^{(α)}), the second sub-pixel coordinate (x_2^{(α)}, y_2^{(α)}) and the third sub-pixel coordinate (x_3^{(α)}, y_3^{(α)}) of the α-th target image; the corresponding feature corner points are denoted the first, second and third feature corner points of the α-th target image. The row-0 column-0 feature corner point and the first, second and third feature corner points are 4 mutually distinct feature corner points;
Step 8.3: the slope between the row-0 column-0 feature corner point and the first feature corner point of the α-th target image is denoted the first slope k_1^{(α)}; likewise, the slopes to the second and third feature corner points are denoted the second slope k_2^{(α)} and the third slope k_3^{(α)}. They are computed with formulas (2), (3) and (4) respectively:
$k_{1}^{(\alpha)} = (x_{1}^{(\alpha)} - x_{00}^{(\alpha)}) / (y_{1}^{(\alpha)} - y_{00}^{(\alpha)})$  (2)

$k_{2}^{(\alpha)} = (x_{2}^{(\alpha)} - x_{00}^{(\alpha)}) / (y_{2}^{(\alpha)} - y_{00}^{(\alpha)})$  (3)

$k_{3}^{(\alpha)} = (x_{3}^{(\alpha)} - x_{00}^{(\alpha)}) / (y_{3}^{(\alpha)} - y_{00}^{(\alpha)})$  (4)
Step 8.4: among the first slope k_1^{(α)}, the second slope k_2^{(α)} and the third slope k_3^{(α)}, the feature corner point corresponding to the minimum value is denoted the row-0 column-1 feature corner point of the α-th target image, and the feature corner point corresponding to the maximum value is denoted the row-1 column-0 feature corner point;
Step 8.5: use formulas (5) and (6) to compute the row-direction locating vector u^{(α)} and the column-direction locating vector v^{(α)} of the α-th target image:

$\vec{u}^{(\alpha)} = (x_{01}^{(\alpha)} - x_{00}^{(\alpha)},\; y_{01}^{(\alpha)} - y_{00}^{(\alpha)})$  (5)

$\vec{v}^{(\alpha)} = (x_{10}^{(\alpha)} - x_{00}^{(\alpha)},\; y_{10}^{(\alpha)} - y_{00}^{(\alpha)})$  (6)

where (x_{01}^{(α)}, y_{01}^{(α)}) and (x_{10}^{(α)}, y_{10}^{(α)}) denote the sub-pixel coordinates of the row-0 column-1 and row-1 column-0 feature corner points;
Step 8.6: define variables i and j, both integers; set up the feature-corner sub-pixel coordinate three-dimensional array cor^{(α)}[9][12][2] of the α-th target image, and the target-coordinate three-dimensional array wor^{(α)}[9][12][3] corresponding to the feature corner points of the α-th target image;
Step 8.7: initialize i = 0;
Step 8.8: initialize j = 0;
Step 8.9: store the x component and the y component of the sub-pixel coordinate of the row-i column-j feature corner point of the α-th target image into the elements cor^{(α)}[i][j][0] and cor^{(α)}[i][j][1] of the array cor^{(α)}[9][12][2];
Step 8.10: with the row-i column-j feature corner point of the α-th target image as the search starting point, search along the direction of the row-direction locating vector u^{(α)} for the row-i column-(j+1) pairing feature corner point nearest to the row-i column-j feature corner point (a code sketch of this search follows step 8.14);
Step 8.11: if the row-i column-(j+1) pairing feature corner point is found, denote it the row-i column-(j+1) feature corner point of the α-th target image, store the x and y components of its sub-pixel coordinate into the elements cor^{(α)}[i][j+1][0] and cor^{(α)}[i][j+1][1], assign j+1 to j, and return to step 8.10; otherwise assign j+1 to the number of columns N_0^{(α)} of feature corner points on the target in the α-th target image, and execute step 8.12;
Step 8.12: initialize j = 0;
Step 8.13: with the row-i column-j feature corner point of the α-th target image as the search starting point, search along the direction of the column-direction locating vector v^{(α)} for the row-(i+1) column-j pairing feature corner point nearest to the row-i column-j feature corner point;
Step 8.14: if the row-(i+1) column-j pairing feature corner point is found, denote it the row-(i+1) column-j feature corner point of the α-th target image, store the x and y components of its sub-pixel coordinate into the elements cor^{(α)}[i+1][j][0] and cor^{(α)}[i+1][j][1], assign i+1 to i, and return to step 8.10; otherwise assign i+1 to the number of rows M_0^{(α)} of feature corner points on the target in the α-th target image, and execute step 8.15;
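The nearest-neighbor search of steps 8.10 and 8.13 can be sketched as follows (a hedged illustration; the angular tolerance is an assumption, and `candidates` holds the remaining corner coordinates as 2-D NumPy arrays):

```python
import numpy as np

def search_next_corner(current, direction, candidates, angle_tol=0.5):
    """Nearest corner lying roughly along `direction` from `current`;
    returns None when the row or column has ended (steps 8.11 / 8.14)."""
    d = direction / np.linalg.norm(direction)
    best, best_dist = None, np.inf
    for c in candidates:
        v = c - current
        dist = np.linalg.norm(v)
        if dist < 1e-6:
            continue  # skip the starting corner itself
        # accept only corners whose bearing is close to the search direction
        if np.dot(v / dist, d) > np.cos(angle_tol) and dist < best_dist:
            best, best_dist = c, dist
    return best
```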
Step 8.15: locate and distinguish the 3 marker rings on the directional calibration target in the α-th target image. The specific implementation proceeds through items ①, ② and ③ below (a condensed code sketch follows item ③):
① From the α-th feature-corner sub-pixel coordinate set, choose the sub-pixel coordinates with the largest y component, the largest x component, the smallest y component and the smallest x component respectively as the first, second, third and fourth outermost sub-pixel coordinates of the α-th target image; the corresponding feature corner points are denoted the first, second, third and fourth outermost feature corner points. The quadrilateral they form is denoted the α-th quadrilateral. Because the 3 marker rings in the α-th target image lie inside the α-th quadrilateral, the present embodiment uses the quadrilateral to remove the complex background of the α-th target image: the gray values of the pixels of the α-th target image lying outside the α-th quadrilateral are set to 0, while those inside remain unchanged, yielding the α-th background-free target image;
② Binarize the α-th background-free target image to obtain the α-th background-free binarized target image. In it, the gray value is 255 at all white squares of the target, 0 at all black squares, 0 at the two black marker rings, and 255 at the white marker ring;
③ Dilate the α-th background-free binarized target image. Dilation "grows" the white connected domains and "shrinks" the black ones; after dilation, the intersections of diagonally adjacent black squares on the target are disconnected, but the 3 marker rings remain connected. Compute the centroid coordinates under o-xy of the 3 smallest white connected domains of the dilated image, and read the gray values of the α-th background-free binarized target image at these 3 centroids: the centroid where the gray value is 0 is the pixel coordinate of the white marker ring center; of the two centroids where the gray value is 255, the one nearer the white marker ring center is the pixel coordinate of the center of the black marker ring nearer the white marker ring, and the farther one is that of the black marker ring farther from the white marker ring. This completes the location and discrimination of the 3 marker rings in the α-th target image. The pixel coordinate of the nearer black marker ring center is denoted the first picture point pixel coordinate of the α-th target image, that of the farther black marker ring center the second picture point pixel coordinate, and that of the white marker ring center the third picture point pixel coordinate; the corresponding ring centers are denoted the first, second and third picture points of the α-th target image;
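A condensed Python/OpenCV sketch of items ② and ③ (the threshold value, kernel size and area ranking are illustrative assumptions; deciding which centroid belongs to which ring then follows the gray-value test of item ③):

```python
import cv2
import numpy as np

def ring_centroids(background_free_gray):
    """Centroids of the 3 smallest white connected domains of the
    binarized and dilated background-free target image."""
    _, binary = cv2.threshold(background_free_gray, 128, 255,
                              cv2.THRESH_BINARY)
    dilated = cv2.dilate(binary, np.ones((3, 3), np.uint8))  # grow white areas
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(dilated)
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA]) + 1  # skip background 0
    return centroids[order[:3]]  # (3, 2) array of (x, y) centroids
```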
Step 8.16: from the feature-corner sub-pixel coordinates stored in the array cor^{(α)}[9][12][2], choose the nearest feature-corner sub-pixel coordinate of the α-th target image, i.e. the one whose sum of distances to the first picture point pixel coordinate, the second picture point pixel coordinate and the third picture point pixel coordinate is minimal. This nearest coordinate is the sub-pixel coordinate of the imaging point, in the α-th target image, of the origin O_t^{(α)} of the α-th target coordinate system; its row and column indices in the array are denoted m_0^{(α)} and n_0^{(α)}, both integers;
Step 8.17: use formulas (7) and (8) to obtain the cosine values cos α^{(α)} and cos β^{(α)} of the α-th target image:

$\cos\alpha^{(\alpha)} = \dfrac{\vec{u}^{(\alpha)} \cdot \vec{a}^{(\alpha)}}{|\vec{u}^{(\alpha)}| \, |\vec{a}^{(\alpha)}|}$  (7)

$\cos\beta^{(\alpha)} = \dfrac{\vec{u}^{(\alpha)} \cdot \vec{b}^{(\alpha)}}{|\vec{u}^{(\alpha)}| \, |\vec{b}^{(\alpha)}|}$  (8)

In formulas (7) and (8), a^{(α)} denotes the vector from the first picture point to the third picture point of the α-th target image, and b^{(α)} the vector from the first picture point to the second picture point;
Step 8.18: according to the cosine values cos α^{(α)} and cos β^{(α)}, judge the relation between the vector group (u^{(α)}, v^{(α)}) and the vector group (a^{(α)}, b^{(α)}), thereby obtaining for any feature-corner sub-pixel coordinate (cor^{(α)}[p^{(α)}][q^{(α)}][0], cor^{(α)}[p^{(α)}][q^{(α)}][1]) on the target in the α-th target image its matched target coordinate (wor^{(α)}[p^{(α)}][q^{(α)}][0], wor^{(α)}[p^{(α)}][q^{(α)}][1], wor^{(α)}[p^{(α)}][q^{(α)}][2]) under the α-th target coordinate system, and store the matched target coordinates in turn into the array wor^{(α)}[9][12][3]; p^{(α)} and q^{(α)} are integers. As shown in Fig. 2, Fig. 3, Fig. 4 and Fig. 5, the two vector groups can stand in the following four relations:
The first relation: if |cos α^{(α)}| > |cos β^{(α)}| and the two vector groups are related as shown in Fig. 2, then the target coordinate matched to any feature-corner sub-pixel coordinate (cor^{(α)}[p^{(α)}][q^{(α)}][0], cor^{(α)}[p^{(α)}][q^{(α)}][1]) is wor^{(α)}[p^{(α)}][q^{(α)}][0] = 18 × (q^{(α)} − n_0^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][1] = 18 × (p^{(α)} − m_0^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][2] = 0;
The second relation: if |cos α^{(α)}| > |cos β^{(α)}| and the two vector groups are related as shown in Fig. 3, then the matched target coordinate is wor^{(α)}[p^{(α)}][q^{(α)}][0] = 18 × (n_0^{(α)} − q^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][1] = 18 × (m_0^{(α)} − p^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][2] = 0;
The third relation: if |cos α^{(α)}| < |cos β^{(α)}| and the two vector groups are related as shown in Fig. 4, then the matched target coordinate is wor^{(α)}[p^{(α)}][q^{(α)}][0] = 18 × (p^{(α)} − m_0^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][1] = 18 × (n_0^{(α)} − q^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][2] = 0;
The fourth relation: if |cos α^{(α)}| < |cos β^{(α)}| and the two vector groups are related as shown in Fig. 5, then the matched target coordinate is wor^{(α)}[p^{(α)}][q^{(α)}][0] = 18 × (m_0^{(α)} − p^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][1] = 18 × (q^{(α)} − n_0^{(α)}), wor^{(α)}[p^{(α)}][q^{(α)}][2] = 0;
According to these 4 relations between the vector group (u^{(α)}, v^{(α)}) and the vector group (a^{(α)}, b^{(α)}), the target coordinate (wor^{(α)}[p^{(α)}][q^{(α)}][0], wor^{(α)}[p^{(α)}][q^{(α)}][1], wor^{(α)}[p^{(α)}][q^{(α)}][2]) matched to any feature-corner sub-pixel coordinate (cor^{(α)}[p^{(α)}][q^{(α)}][0], cor^{(α)}[p^{(α)}][q^{(α)}][1]) on the target in the α-th target image is obtained;
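The four relations reduce to a small lookup once the relation index has been decided from the cosines and Figs. 2-5; a sketch for this embodiment (W = 18 mm), with the relation number passed in as an assumption:

```python
W = 18  # side length of the squares in this embodiment, millimetres

def matched_target_coords(p, q, m0, n0, relation):
    """Target coordinates matched to the corner at row p, column q, given
    the origin corner indices (m0, n0) and the relation number 1..4."""
    if relation == 1:
        return (W * (q - n0), W * (p - m0), 0)
    if relation == 2:
        return (W * (n0 - q), W * (m0 - p), 0)
    if relation == 3:
        return (W * (p - m0), W * (n0 - q), 0)
    return (W * (m0 - p), W * (q - n0), 0)
```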
In the present embodiment, the one-to-one matching relationship obtained in step 8 between the sub-pixel coordinates under o-xy of all feature corner points in the α-th target image and their target coordinates under the α-th target coordinate system is saved as the matched group of all feature-corner sub-pixel coordinates and target coordinates of the α-th target image;
Step 9: from the matched group of all feature-corner sub-pixel coordinates and target coordinates of the α-th target image, the planar-target pose estimation algorithm computes the α-th target rotation matrix R_α and the α-th target translation matrix T_α that transform the α-th target coordinate system into the camera coordinate system O_c-X_cY_cZ_c, thereby realizing the judgment of the rotation direction of the directional calibration target. The principle of the planar-target pose estimation algorithm used in the present embodiment can be found in the paper "Robust Pose Estimation from a Planar Target", published in 2005 by Gerald Schweighofer and Axel Pinz; open-source C language code corresponding to the paper can be downloaded at http://nghiaho.com/?page_id=576;
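As a generic stand-in for this step (not the cited Schweighofer-Pinz algorithm itself), OpenCV's planar solver can recover R_α and T_α from a matched group, under the additional assumption that an approximate intrinsic matrix is already available:

```python
import cv2
import numpy as np

def estimate_target_pose(obj_pts, img_pts, K, dist=None):
    """Rotation and translation of the target frame in the camera frame.

    obj_pts: (n, 3) float32 target coordinates (all on the Z_t = 0 plane);
    img_pts: (n, 2) float32 matched sub-pixel corner coordinates;
    K: approximate 3x3 camera intrinsic matrix.
    """
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist,
                                  flags=cv2.SOLVEPNP_IPPE)  # planar solver
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec
```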
Step 10: judge whether α equals 1. If α equals 1, take the α-th target image as the β-th calibration image; save the matched set of feature-angle-point sub-pixel coordinates and target coordinates of the α-th target image as the matched set of the β-th calibration image; assign the α-th target rotation matrix R_α and the α-th target translation matrix T_α to the β-th calibration rotation matrix R_β′ and the β-th calibration translation matrix T_β′ respectively; then execute step 11. Otherwise execute step 12;
Step 11: assign α + 1 to α, and return to step 5 to continue in order;
Step 12: use formula (9) and formula (10) to obtain the rotation matrix R_{α,β} and the translation matrix T_{α,β} that transform the α-th target coordinate system into the β-th target coordinate system:
R_{α,β} = (R_β′)^{−1} · R_α    (9)
T_{α,β} = (R_β′)^{−1} · (T_α − T_β′)    (10)
Step 13: judge whether the norm of the rotation matrix R_{α,β} is greater than the rotation threshold κ_1 and the norm of the translation matrix T_{α,β} is greater than the translation threshold κ_2. If both are greater, assign β + 1 to β, take the α-th target image as the β-th calibration image, save the matched set of feature-angle-point sub-pixel coordinates and target coordinates of the α-th target image as the matched set of the β-th calibration image, assign the α-th target rotation matrix R_α and the α-th target translation matrix T_α to the β-th calibration rotation matrix R_β′ and the β-th calibration translation matrix T_β′ respectively, and then execute step 14; otherwise execute step 14 directly;
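A sketch of the step 12 and step 13 motion test, taking R_β′ and T_β′ as the pose of the most recently accepted calibration image. Note that the Frobenius norm of any pure rotation matrix equals √3, so an angle-based measure of R_{α,β} may be what is intended in practice; the literal test of the text is shown:

```python
import numpy as np

def moved_enough(R_a, T_a, R_b_prime, T_b_prime, kappa1, kappa2):
    """Evaluate formulas (9)/(10) and the step-13 threshold test.
    R_* are 3x3 rotation matrices, T_* are 3x1 translation vectors."""
    # The inverse of a rotation matrix is its transpose.
    R_ab = R_b_prime.T @ R_a                 # formula (9)
    T_ab = R_b_prime.T @ (T_a - T_b_prime)   # formula (10)
    return (np.linalg.norm(R_ab) > kappa1 and
            np.linalg.norm(T_ab) > kappa2)
```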
Step 14: judge whether β equals the total number of calibration images G. If they are equal, the G calibration images and the matched sets of feature-angle-point sub-pixel coordinates and target coordinates of all G calibration images have been obtained; execute step 15. Otherwise return to step 11;
Step 15: from the matched sets of feature-angle-point sub-pixel coordinates and target coordinates of the G calibration images, use Zhang Zhengyou's camera calibration algorithm to compute the intrinsic and extrinsic parameters of the camera. In the present embodiment, the principle of Zhang Zhengyou's calibration algorithm is described in "A flexible new technique for camera calibration", published in 2000 by Zhengyou Zhang; the algorithm can be realized with the cvCalibrateCamera2 function of the open-source OpenCV library.
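A sketch of step 15 in OpenCV's current Python binding, where cv2.calibrateCamera is the successor of the cited legacy C function cvCalibrateCamera2; obj_points and img_points are assumed to hold the G matched sets:

```python
import cv2

def calibrate_camera(obj_points, img_points, image_size):
    """obj_points: list of G float32 arrays of shape (K, 3) with target
    coordinates; img_points: list of G float32 arrays of shape (K, 2) with
    sub-pixel corner coordinates; image_size: (width, height) in pixels.
    Returns the RMS reprojection error, the intrinsic matrix, the distortion
    coefficients, and the per-image extrinsics (rvecs, tvecs)."""
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return rms, K, dist, rvecs, tvecs
```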

Claims (2)

1. A camera intrinsic and extrinsic parameter automatic calibration method based on a directional calibration target, characterized in that: the directional calibration target comprises a checkerboard formed by black squares and white squares alternating with each other, and a directional pattern arranged near the center of the checkerboard; the intersection point of any two diagonally connected black squares, or of any two diagonally connected white squares, serves as a feature angle point of the directional calibration target;
The checkerboard comprises M rows × N columns of the feature angle points, where M and N are positive integers; the side length of the black squares and white squares is W, with W > 0;
The directional pattern is composed of 3 small marking patterns, denoted the first marking pattern, the second marking pattern and the third marking pattern; the first marking pattern and the second marking pattern are black and the third marking pattern is white; the first and second marking patterns each lie inside a white square, and the third marking pattern lies inside a black square;
The center of the first marking pattern is denoted the first center point o_1, the center of the second marking pattern the second center point o_2, and the center of the third marking pattern the third center point o_3; o_1 lies at the center of its white square, o_2 at the center of its white square, and o_3 at the center of its black square; within the checkerboard, o_1, o_2 and o_3 form a right triangle Δo_2o_1o_3 with o_1 as the right-angle vertex; either o_1 and o_2 lie in a row of squares and o_1 and o_3 lie in a column of squares, or o_1 and o_2 lie in a column of squares and o_1 and o_3 lie in a row of squares; the right-angle side formed by o_1 and o_2 and the right-angle side formed by o_1 and o_3 satisfy the prescribed length conditions;
The automatic calibration method is carried out as follows:
Step 1: define the total number of calibration images G; define the rotation threshold κ_1 and the translation threshold κ_2; define variables α and β, and initialize α = 1 and β = 1;
Step 2: use a camera fixed in position to capture, in real time, the directional calibration target moving in space, obtaining target images;
Step 3: take the upper-left corner of the target image as the origin o of the feature-angle-point pixel coordinate system, with the x-axis direction running from left to right and the y-axis direction running from top to bottom, thereby establishing the feature-angle-point pixel coordinate system o-xy;
Step 4: take the optical center of the camera as the origin O_c of the camera coordinate system; take the x-axis direction of the feature-angle-point pixel coordinate system as the X_c-axis direction, and the y-axis direction as the Y_c-axis direction; the X_c, Y_c and Z_c axes satisfy the right-hand rule, thereby establishing the camera coordinate system O_c-X_cY_cZ_c;
Step 5: suppose the target image is the α-th target image, formed by the camera imaging the directional calibration target at the α-th shift position;
Step 6: denote the first center point, second center point and third center point of the directional pattern on the directional calibration target at the α-th shift position as the α-th first center point o_1^(α), the α-th second center point o_2^(α) and the α-th third center point o_3^(α), respectively;
Choose the feature angle point whose sum of spatial distances to the α-th first center point o_1^(α), the α-th second center point o_2^(α) and the α-th third center point o_3^(α) is minimal as the origin of the α-th target coordinate system; take the direction of the α-th spatial vector formed by o_1^(α) and o_3^(α) as one coordinate-axis direction of the α-th target coordinate system, and the direction of the α-th spatial vector formed by o_1^(α) and o_2^(α) as the other coordinate-axis direction; the three axes of the α-th target coordinate system satisfy the right-hand rule, thereby establishing the α-th target coordinate system;
Step 7: use an improved Harris corner detection algorithm to extract the sub-pixel coordinate, under the feature-angle-point pixel coordinate system o-xy, of each feature angle point on the directional calibration target in the α-th target image, thereby obtaining the set of sub-pixel coordinates of all feature angle points in the α-th target image under the feature-angle-point pixel coordinate system o-xy;
Step 8: according to the directional pattern, determine and obtain the one-to-one matching relationship between the sub-pixel coordinates of all feature angle points in the α-th target image under the feature-angle-point pixel coordinate system o-xy and their target coordinates under the α-th target coordinate system, and save the obtained matching relationship as the matched set of feature-angle-point sub-pixel coordinates and target coordinates of the α-th target image;
Step 9: from the matched set of feature-angle-point sub-pixel coordinates and target coordinates of the α-th target image, use a planar-target spatial attitude estimation algorithm to compute the α-th target rotation matrix R_α and the α-th target translation matrix T_α that transform the α-th target coordinate system into the camera coordinate system O_c-X_cY_cZ_c, thereby realizing the rotation-direction judgement of the directional calibration target;
Step 10: judge whether α equals 1. If α equals 1, take the α-th target image as the β-th calibration image; save the matched set of feature-angle-point sub-pixel coordinates and target coordinates of the α-th target image as the matched set of the β-th calibration image; assign the α-th target rotation matrix R_α and the α-th target translation matrix T_α to the β-th calibration rotation matrix R_β′ and the β-th calibration translation matrix T_β′ respectively; then execute step 11. Otherwise execute step 12;
Step 11: assign α + 1 to α, and return to step 5 to continue in order;
Step 12: use formula (1) and formula (2) to obtain the rotation matrix R_{α,β} and the translation matrix T_{α,β} that transform the α-th target coordinate system into the β-th target coordinate system:
R_{α,β} = (R_β′)^{−1} · R_α    (1)
T_{α,β} = (R_β′)^{−1} · (T_α − T_β′)    (2)
Step 13: judge whether the norm of the rotation matrix R_{α,β} is greater than the rotation threshold κ_1 and the norm of the translation matrix T_{α,β} is greater than the translation threshold κ_2. If both are greater, assign β + 1 to β, take the α-th target image as the β-th calibration image, save the matched set of feature-angle-point sub-pixel coordinates and target coordinates of the α-th target image as the matched set of the β-th calibration image, assign the α-th target rotation matrix R_α and the α-th target translation matrix T_α to the β-th calibration rotation matrix R_β′ and the β-th calibration translation matrix T_β′ respectively, and then execute step 14; otherwise execute step 14 directly;
Step 14: judge whether β equals the total number of calibration images G. If they are equal, the G calibration images and the matched sets of feature-angle-point sub-pixel coordinates and target coordinates of all G calibration images have been obtained; execute step 15. Otherwise return to step 11;
Step 15: from the matched sets of feature-angle-point sub-pixel coordinates and target coordinates of the G calibration images, use Zhang Zhengyou's camera calibration algorithm to compute the intrinsic and extrinsic parameters of the camera.
2. The camera intrinsic and extrinsic parameter automatic calibration method based on a directional calibration target according to claim 1, characterized in that step 7 is carried out according to the following procedure:
Step 7.1: suppose the directional calibration target in the α-th target image carries M_0 rows × N_0 columns of feature angle points, with 0 < M_0 ≤ M, 0 < N_0 ≤ N, and M_0 and N_0 integers; set the iteration termination threshold ψ; define the iteration count ξ;
Step 7.2: use the Harris corner detection algorithm to extract the pixel-level coordinate, under the feature-angle-point pixel coordinate system o-xy, of each feature angle point on the directional calibration target in the α-th target image, thereby obtaining the set of pixel-level coordinates, under the feature-angle-point pixel coordinate system o-xy, of all M_0 rows × N_0 columns of feature angle points on the directional calibration target in the α-th target image;
Step 7.3: suppose any feature angle point on the directional calibration target in the α-th target image is the τ-th feature angle point, with 1 ≤ τ ≤ (M_0 × N_0) and τ an integer; initialize τ = 1;
Step 7.4: initialize ξ = 0;
Step 7.5: record the pixel-level coordinate, under the feature-angle-point pixel coordinate system o-xy, of the τ-th feature angle point on the directional calibration target in the α-th target image as the coordinate of the τ-th feature angle point at the ξ-th iteration;
Step 7.6: obtain the μ × μ neighborhood centered on the coordinate of the τ-th feature angle point at the ξ-th iteration in the α-th target image, this neighborhood not intersecting the directional pattern; within this μ × μ neighborhood, denote the coordinate of any pixel other than the coordinate of the τ-th feature angle point at the ξ-th iteration as the η-th pixel coordinate, with 1 ≤ η ≤ (μ × μ − 1) and μ odd;
Step 7.7: establish the sub-pixel coordinate optimization objective function ε of the τ-th feature angle point at the ξ-th iteration in the α-th target image, as in formula (3):
ε = Σ_{η=1}^{μ×μ−1} [ g_η · d_η ]²    (3)
In formula (3), g_η denotes the gray-level gradient at the η-th pixel coordinate within the μ × μ neighborhood centered on the coordinate of the τ-th feature angle point at the ξ-th iteration, and d_η denotes the vector from the coordinate of the τ-th feature angle point at the ξ-th iteration to the η-th pixel coordinate;
Step 7.8: compute the value of the sub-pixel coordinate optimization objective function ε of the τ-th feature angle point at the ξ-th iteration in the α-th target image;
Step 7.9: judge whether ε is less than ψ. If it is, save the coordinate of the τ-th feature angle point at the ξ-th iteration in the α-th target image as the sub-pixel coordinate, under the feature-angle-point pixel coordinate system o-xy, of the τ-th feature angle point on the directional calibration target in the α-th target image, and execute step 7.10; otherwise execute step 7.11;
Step 7.10: judge whether τ equals (M_0 × N_0). If it does, the set of sub-pixel coordinates of all feature angle points in the α-th target image under the feature-angle-point pixel coordinate system o-xy has been obtained; otherwise assign τ + 1 to τ and return to step 7.4 to continue in order;
Step 7.11: according to the sub-pixel coordinate optimization objective function of the τ-th feature angle point at the ξ-th iteration in the α-th target image, use the Levenberg-Marquardt optimization algorithm to iteratively solve for the coordinate of the τ-th feature angle point at the (ξ + 1)-th iteration; assign ξ + 1 to ξ and return to step 7.6 to continue in order.
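For reference, the objective of steps 7.5 to 7.11 is the gradient-orthogonality criterion that OpenCV's cv2.cornerSubPix minimizes (OpenCV solves each iteration in closed form rather than with Levenberg-Marquardt). A minimal sketch, with μ playing the role of the window size and ψ that of the termination threshold:

```python
import cv2

def refine_corners(gray, pixel_corners, mu=11, psi=1e-3, max_iter=100):
    """Refine pixel-level corners to sub-pixel accuracy.
    gray: 8-bit grayscale target image; pixel_corners: float32 array of
    shape (K, 1, 2) from the Harris detection of step 7.2. The window size
    should be small enough that no window overlaps the directional pattern."""
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER,
                max_iter, psi)
    half_win = (mu // 2, mu // 2)   # cornerSubPix takes the half window size
    return cv2.cornerSubPix(gray, pixel_corners, half_win, (-1, -1), criteria)
```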