CN108428250B - X-corner detection method applied to visual positioning and calibration - Google Patents

X-corner detection method applied to visual positioning and calibration

Info

Publication number
CN108428250B
Authority
CN
China
Prior art keywords
point
pixel
corner
value
sample data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810077053.1A
Other languages
Chinese (zh)
Other versions
CN108428250A (en)
Inventor
赵子健
王芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201810077053.1A priority Critical patent/CN108428250B/en
Publication of CN108428250A publication Critical patent/CN108428250A/en
Application granted granted Critical
Publication of CN108428250B publication Critical patent/CN108428250B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to an X-corner detection method applied to visual positioning and calibration, which comprises the following steps: S1: collect an image and sample it with an annular square window; S2: preliminarily judge whether the sample data contains an X corner according to the image characteristics of the X corner; S3: further judge whether the sample data contains an X corner, and eliminate X corners that are judged repeatedly; S4: re-acquire sample data with the X corner as the window center, judge whether the data satisfies the symmetry condition of an X corner, and if so, calculate the sub-pixel position of the X corner by curve fitting and set the X-corner repeated-detection flag; S5: repeat steps S2 to S4 until all X corners are detected. When the invention samples the image, the window is moved by half of its side length at a time, which improves the detection speed without missing any X corner. The invention judges whether the sampling window contains an X corner based on the image characteristics of the X corner, which enhances the robustness of the algorithm.

Description

X-corner detection method applied to visual positioning and calibration
Technical Field
The invention relates to an X-corner detection method applied to visual positioning and calibration, and belongs to the technical field of computer vision applications.
Background
Visual positioning and calibration are important components of three-dimensional computer vision. One of the basic tasks of computer vision is to compute the geometric information of objects in three-dimensional space from the image information acquired by a camera, and to reconstruct and recognize objects accordingly. The correspondence between the three-dimensional position of a point on an object surface and its corresponding point in the image is determined by the geometric model of camera imaging, and the parameters of this model are the camera parameters. Under most conditions these parameters must be obtained through experiment and computation, a process known as visual calibration. Calibration determines the geometric and optical parameters of the camera and the pose of the camera relative to the world coordinate system; visual positioning then computes the three-dimensional information of an object from two-dimensional image information using the calibrated parameters. The calibration accuracy directly affects the positioning accuracy of computer vision.
The plane calibration method is a common visual calibration method for cameras. With a known checkerboard calibration board, i.e., a board of known size and pattern, a mathematical model is established from the correspondence between the X corners on the board and the corresponding points in the image obtained by photographing the board, and the intrinsic and extrinsic parameters of the camera are calibrated with this model. The checkerboard calibration board is widely used for camera calibration because it is economical and simple to manufacture; in addition, optical positioning systems using X-corner visual markers are widely applied.
For X-corner detection, several methods have been proposed, commonly based on Harris corner detection, on the Hessian matrix, or on improved SUSAN corner detection. These methods mainly judge the strength of an X corner through various kinds of feature calculation; they involve a large amount of computation and are not suitable for parallel batch processing.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides an X-corner detection method applied to visual positioning and calibration.
The invention improves the operation speed, interference resistance and accuracy of the corner detection algorithm.
Interpretation of terms:
X corner: the checkerboard used for visual calibration is formed by combining regions of abrupt black-and-white change; the critical point on the boundary of adjacent black and white squares is the X corner.
The technical scheme of the invention is as follows:
An X-corner detection method applied to visual positioning and calibration comprises the following steps:
S1: collecting an image and sampling the image with an annular square window; the side length of the annular square window is set to 2r pixels, so one window sample contains 8r-4 pixels, where r is less than half of the side length of the smallest X corner in the image; all pixels of the annular square window are stored in a circular data queue and constitute the sample data, the i-th pixel being denoted P_i with gray value f_i, i = 1, 2, ..., (8r-4); a sketch of this sampling step is given after the step listing below;
S2: preliminarily judging whether the sample data contains an X corner according to the image characteristics of the X corner; if so, calculating the sub-pixel position of the X corner, otherwise entering step S5;
S3: according to the sub-pixel position of the X corner obtained in step S2, further judging whether the sample data contains an X corner, and eliminating X corners that are judged repeatedly;
S4: with the X corner as the center of the annular square window, re-acquiring sample data and judging whether the data satisfies the symmetry condition of an X corner; if so, calculating the sub-pixel position of the X corner by curve fitting and setting the X-corner repeated-detection flag;
S5: moving the annular square window across the image by n pixels at a time, n ∈ (1, 2r), to acquire new sample data, and repeating steps S2 to S4 until all X corners are detected.
Preferably, according to the invention, n = r.
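As an illustration of the sampling in step S1, the Python sketch below gathers the 8r-4 border pixels of a (2r) × (2r) window into a circular queue. The starting corner (top-left) and the traversal order (top row, right column, bottom row, left column) are assumptions made for illustration only; the description fixes the window shape and pixel count but not the traversal.

```python
import numpy as np

def ring_sample(gray, cx, cy, r):
    """Collect the 8r-4 border pixels of the (2r)x(2r) annular square window
    whose top-left corner is (cx - r, cy - r). Traversal order is an
    illustrative assumption, not specified by the description."""
    x0, y0 = cx - r, cy - r
    ring = []
    for j in range(2 * r):              # top row, left to right
        ring.append(gray[y0, x0 + j])
    for i in range(1, 2 * r):           # right column, top to bottom
        ring.append(gray[y0 + i, x0 + 2 * r - 1])
    for j in range(2 * r - 2, -1, -1):  # bottom row, right to left
        ring.append(gray[y0 + 2 * r - 1, x0 + j])
    for i in range(2 * r - 2, 0, -1):   # left column, bottom to top
        ring.append(gray[y0 + i, x0])
    assert len(ring) == 8 * r - 4       # 2r + (2r-1) + (2r-1) + (2r-2)
    return np.array(ring, dtype=np.float32)
```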
Preferably, step S2 includes:
S21: carrying out graying on the sample data in sequence; the threshold value may be chosen adaptively.
S22: binarizing the gray values of the sample data twice, and counting the number of gray-level steps N_s in the sample data processed in step S21; if N_s indicates that the expected steps are present, going to step S23, otherwise going to step S5;
S23: taking the mean of the sample-data gray values as the threshold and binarizing the gray values; denoting the pixels at which the gray values computed in step S22 exhibit a step as step A, step B, step C and step D, and computing the distances L_AB, L_BC, L_CD, L_DA between the index values of these four pixels; if L_AB, L_BC, L_CD and L_DA are all smaller than max_T and all larger than min_T, where max_T ∈ (10,15) and min_T ∈ (5,10), preliminarily judging that the sample data contains an X corner and continuing with step S24, otherwise executing step S5 (see the sketch below);
S24: calculating the sub-pixel position L of the X corner, namely the intersection of lines AC and BD, according to projective geometry and symmetry; the calculation formula is L = AC × BD.
The X-corner position calculated in step S2 is at the sub-pixel level (one decimal place), so the position accuracy is relatively high.
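The step-count and spacing check of steps S22-S23 can be sketched as follows. For brevity the sketch uses a single binarization at the mean gray value instead of the double mean ± Δ binarization of step S22, and it assumes an X-corner candidate must produce exactly four gray-level steps; the defaults min_t = 6 and max_t = 14 are illustrative values picked from the ranges min_T ∈ (5,10) and max_T ∈ (10,15) stated above.

```python
import numpy as np

def count_steps(ring_values, threshold):
    """Binarize the circular ring sample and return the indices at which the
    binary signal changes (0->1 or 1->0), treating the queue as circular."""
    binary = (ring_values >= threshold).astype(np.int8)
    return [i for i in range(len(binary)) if binary[i] != binary[i - 1]]

def preliminary_x_corner_check(ring_values, min_t=6, max_t=14, n_expected=4):
    """Sketch of steps S22-S23: require n_expected steps (assumed to be 4)
    whose circular spacings all lie strictly between min_t and max_t.
    Returns the step indices (steps A, B, C, D) or None."""
    steps = count_steps(ring_values, ring_values.mean())
    if len(steps) != n_expected:
        return None
    n = len(ring_values)
    spacings = [(steps[(k + 1) % n_expected] - steps[k]) % n
                for k in range(n_expected)]
    if all(min_t < s < max_t for s in spacings):
        return steps
    return None
```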
Preferably, in step S22, the binarization thresholds are mean ± Δ, where mean is the mean of the sample-data gray values and Δ is a threshold adjustment value with a value range of 20-160. The value of Δ is related to the overall brightness of the image; using Δ as the threshold adjustment avoids wrong judgments caused by image noise and enhances the robustness of the algorithm.
Preferably, step S24 includes:
taking the pixel coordinates of step A, step B, step C and step D as the coordinate values of points A, B, C, D and obtaining the three-dimensional homogeneous coordinates of step A, step B, step C and step D; cross-multiplying the homogeneous coordinates of points A and C to obtain the vector form of the homogeneous equation of line AC; cross-multiplying the homogeneous coordinates of points B and D to obtain the vector form of the homogeneous equation of line BD; cross-multiplying the vector of the homogeneous equation of line AC with the vector of the homogeneous equation of line BD to obtain the homogeneous coordinate L1 of the intersection of lines AC and BD; letting L1 = (x1, x2, x3), the point (x1/x3, x2/x3) is the two-dimensional coordinate of the intersection, which after rearrangement gives the pixel coordinate L (the sub-pixel position L) of the X corner.
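The homogeneous-coordinate construction of step S24 is compact enough to write out directly. The sketch below assumes the four step pixels have already been mapped to (x, y) coordinates in the image plane, which the description does not detail, and returns None in the degenerate case of parallel lines.

```python
import numpy as np

def intersect_ac_bd(A, B, C, D):
    """Sketch of step S24: intersect lines AC and BD with homogeneous
    coordinates. A, B, C, D are (x, y) coordinates of the four step pixels."""
    def homog(p):
        return np.array([p[0], p[1], 1.0])
    line_ac = np.cross(homog(A), homog(C))   # homogeneous line through A and C
    line_bd = np.cross(homog(B), homog(D))   # homogeneous line through B and D
    x1, x2, x3 = np.cross(line_ac, line_bd)  # homogeneous intersection point L1
    if abs(x3) < 1e-12:
        return None                          # lines are parallel, no finite intersection
    return (x1 / x3, x2 / x3)                # two-dimensional coordinate of L
```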
Preferably, step S3 includes:
S31: checking the X-corner repeated-detection flag; if the pixel coordinate L of the X corner obtained in step S24 lies in an inactive region, judging that this X corner has already been detected, exiting the loop and executing step S5; otherwise, going to step S32;
S32: acquiring the gray values of the pixels in the neighborhood of the X-corner pixel coordinate L, the neighborhood being the region centered on L with a radius of r pixels; binarizing the neighborhood with the mean of the neighborhood gray values as the threshold, and counting the number of gray-level steps ΔV_C; if ΔV_C > min_V, continuing with step S4, otherwise executing step S5; min_V = 4.
Preferably, step S4 specifically includes:
S41: with the X-corner pixel coordinate L as the center of the annular square window, re-acquiring the sample sequence P';
S42: binarizing the gray values of the sample sequence P' with their mean as the threshold; denoting the pixels at which the binarized gray values step as step A1, step B1, step C1 and step D1, and computing the distances L'_A1B1, L'_B1C1, L'_C1D1, L'_D1A1 between the index values of these four pixels; if L'_A1B1 = L'_C1D1 and L'_B1C1 = L'_D1A1, continuing with step S43, otherwise executing step S5;
S43: determining the one-dimensional sub-pixel positions A', B', C' and D' of step A1, step B1, step C1 and step D1 by curve fitting;
S44: calculating the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 from the one-dimensional sub-pixel positions A', B', C' and D' obtained in step S43 and the X-corner pixel coordinate L obtained in step S24;
assuming that the one-dimensional sub-pixel position of a given step is m and that the pixel corresponding to the X-corner center is (x, y), the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 are obtained as follows: the two-dimensional sub-pixel position of step A1 is (x + A' - r + 1, y - r + 0.5), that of step B1 is (x + r + 0.5, y + B' - 3r + 1), that of step C1 is (x - C' + 5r - 1, y + r + 0.5), and that of step D1 is (x - r + 0.5, y - D' + 7r - 1);
S45: calculating the coordinates of the intersection of lines A'C' and B'D', namely the sub-pixel position of the X corner L, according to the method of step S24;
S46: calculating the direction information of the X corner: going counter-clockwise, two boundary lines are obtained according to the black-to-white change order, namely a BW (black-to-white) line and a WB (white-to-black) line, the BW line being the boundary line at which the gray level jumps from black to white and the WB line being the boundary line at which it jumps from white to black; calculating the angles θ1, θ2 between the BW line, the WB line and the horizontal direction, which constitute the direction information of the X corner;
S47: setting the neighborhood of the X-corner pixel coordinate L as an inactive region, indicating that this X corner has been detected and avoiding repeated detection of the same X corner.
Preferably, in step S43, the one-dimensional sub-pixel positions A', B', C', D' of step A1, step B1, step C1 and step D1 are determined by curve fitting as follows: take five pixels near step A1 of the sample sequence P' (three before step A1 and two after it), use their index values in the sample sequence P' as x coordinates and the gradient of the gray values as y coordinates, and perform a quadratic curve fit; the fitted curve is approximately a parabola, and the extreme point of the parabola is the location of maximum gray-level change along the gradient direction, i.e., the one-dimensional sub-pixel position A' of step A1. The one-dimensional sub-pixel positions B', C', D' of step B1, step C1 and step D1 are determined in the same way.
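A minimal sketch of this parabola fit is given below. It assumes the step index lies far enough from the ends of the sequence that all five neighboring samples exist, and the exact offsets of the five samples relative to the step index are an illustrative reading of "three in front and two behind".

```python
import numpy as np

def subpixel_step_1d(values, step_index):
    """Sketch of step S43: fit a parabola to the gray-level gradient of five
    samples around a step and return the abscissa of its extremum as the
    one-dimensional sub-pixel step position."""
    idx = np.arange(step_index - 3, step_index + 2)  # five samples around the step
    grad = np.gradient(np.asarray(values, dtype=float))[idx]
    a, b, c = np.polyfit(idx, grad, 2)               # gradient ~ a*x^2 + b*x + c
    return -b / (2.0 * a)                            # extremum of the fitted parabola
```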
The invention has the beneficial effects that:
1. When the method samples the image, the window is moved by half of its side length at a time, which increases the detection speed without missing any X corner. Assuming the radius of the annular square window is r = 10, the detection speed of the method of the present invention is 10 times that of the pixel-by-pixel scanning methods of the prior art; taking a 640 × 480 image as an example, if the detection speed of pixel-by-pixel scanning is 3 frames per second, the detection speed of the present invention reaches 30 frames per second, achieving real-time detection.
2. The method judges whether the sampling window contains an X corner based on the image characteristics of the X corner, which enhances the robustness of the algorithm.
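To tie the steps together, the sketch below shows the overall scan loop of steps S1 and S5 with the inactive-region bookkeeping of steps S31 and S47. ring_sample is the sampling sketch given earlier; is_x_corner_candidate and refine_subpixel are assumed helper names standing in for the checks of steps S2-S3 and S4, not routines defined by the description.

```python
import numpy as np

def detect_x_corners(gray, r, n=None):
    """Scan loop sketched from steps S1 and S5: slide the annular square window
    in steps of n pixels (n = r preferred) and keep an inactive-region mask so
    a corner is not reported twice."""
    n = r if n is None else n
    h, w = gray.shape
    inactive = np.zeros_like(gray, dtype=bool)
    corners = []
    for cy in range(r, h - r, n):
        for cx in range(r, w - r, n):
            ring = ring_sample(gray, cx, cy, r)        # step S1 (sketch above)
            pos = is_x_corner_candidate(ring, cx, cy)  # steps S2-S3 (assumed helper)
            if pos is None:
                continue
            px, py = int(round(pos[0])), int(round(pos[1]))
            if inactive[py, px]:                       # step S31: already detected
                continue
            refined = refine_subpixel(gray, pos, r)    # step S4 (assumed helper)
            if refined is not None:
                corners.append(refined)
                inactive[max(py - r, 0):py + r + 1,
                         max(px - r, 0):px + r + 1] = True  # step S47: mark inactive region
    return corners
```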
Drawings
Fig. 1 is a schematic diagram of an X corner and the annular square window used for detection.
Fig. 2 is a schematic flow chart of the X-corner detection method applied to visual positioning and calibration according to the present invention.
Detailed Description
The invention is further described below with reference to the figures and embodiments of the description, but is not limited thereto.
Example 1
An X-corner detection method applied to visual positioning and calibration, as shown in fig. 2, includes:
S1: acquiring an image and sampling the image with an annular square window (as shown in fig. 1); the side length of the annular square window is set to 2r pixels, so one window sample contains 8r-4 pixels, where r is less than half of the side length of the smallest X corner in the image; all pixels of the annular square window are stored in a circular data queue and constitute the sample data P, the i-th pixel being denoted P_i with gray value f_i, i = 1, 2, ..., (8r-4);
S2: preliminarily judging whether the sample data P contains an X corner according to the image characteristics of the X corner; if so, calculating the sub-pixel position of the X corner, otherwise entering step S5;
S3: according to the sub-pixel position of the X corner obtained in step S2, further judging whether the sample data contains an X corner, and eliminating X corners that are judged repeatedly;
S4: with the X corner as the center of the annular square window, re-acquiring sample data and judging whether the data satisfies the symmetry condition of an X corner; if so, calculating the sub-pixel position of the X corner by curve fitting and setting the X-corner repeated-detection flag;
S5: moving the annular square window across the image by n pixels at a time, n ∈ (1, 2r), to acquire new sample data, and repeating steps S2 to S4 until all X corners are detected; here n = r.
Example 2
The X-corner detection method applied to visual positioning and calibration of this embodiment differs from embodiment 1 in that step S2 includes:
S21: carrying out graying on the sample data in sequence; the threshold value may be chosen adaptively.
S22: binarizing the gray values of the sample data twice, the binarization thresholds being mean ± Δ, where mean is the mean of the sample-data gray values and Δ is a threshold adjustment value with a value range of 20-160; the value of Δ is related to the overall brightness of the image, and using Δ as the threshold adjustment avoids wrong judgments caused by image noise and enhances the robustness of the algorithm. Counting the number of gray-level steps N_s in the sample data processed in step S21; if N_s indicates that the expected steps are present, going to step S23, otherwise going to step S5;
S23: taking the mean of the sample-data gray values as the threshold and binarizing the gray values; denoting the pixels at which the gray values computed in step S22 exhibit a step as step A, step B, step C and step D, and computing the distances L_AB, L_BC, L_CD, L_DA between the index values of these four pixels; if L_AB, L_BC, L_CD and L_DA are all smaller than max_T and all larger than min_T, where max_T ∈ (10,15) and min_T ∈ (5,10), preliminarily judging that the sample data contains an X corner and continuing with step S24, otherwise executing step S5;
S24: calculating the sub-pixel position L of the X corner, namely the intersection of lines AC and BD, according to projective geometry and symmetry, the calculation formula being L = AC × BD. Specifically: taking the pixel coordinates of step A, step B, step C and step D as the coordinate values of points A, B, C, D and obtaining their three-dimensional homogeneous coordinates; cross-multiplying the homogeneous coordinates of points A and C to obtain the vector form of the homogeneous equation of line AC; cross-multiplying the homogeneous coordinates of points B and D to obtain the vector form of the homogeneous equation of line BD; cross-multiplying the vector of the homogeneous equation of line AC with the vector of the homogeneous equation of line BD to obtain the homogeneous coordinate L1 of the intersection of lines AC and BD; letting L1 = (x1, x2, x3), the point (x1/x3, x2/x3) is the two-dimensional coordinate of the intersection, which after rearrangement gives the pixel coordinate L of the X corner. The X-corner position calculated in step S2 is at the sub-pixel level (one decimal place), so the position accuracy is relatively high.
Example 3
The X-corner detection method applied to visual positioning and calibration of this embodiment differs from embodiment 1 in that step S3 includes:
S31: checking the X-corner repeated-detection flag; if the pixel coordinate L of the X corner obtained in step S24 lies in an inactive region, judging that this X corner has already been detected, exiting the loop and executing step S5; otherwise, going to step S32;
S32: acquiring the gray values of the pixels in the neighborhood of the X-corner pixel coordinate L, the neighborhood being the region centered on L with a radius of r pixels; binarizing the neighborhood with the mean of the neighborhood gray values as the threshold, and counting the number of gray-level steps ΔV_C; if ΔV_C > min_V, continuing with step S4, otherwise executing step S5; min_V = 4.
Example 4
The X-corner detection method applied to visual positioning and calibration of this embodiment differs from embodiment 1 in that step S4 includes:
S41: with the X-corner pixel coordinate L as the center of the annular square window, re-acquiring the sample sequence P';
S42: binarizing the gray values of the sample sequence P' with their mean as the threshold; denoting the pixels at which the binarized gray values step as step A1, step B1, step C1 and step D1, and computing the distances L'_A1B1, L'_B1C1, L'_C1D1, L'_D1A1 between the index values of these four pixels; if L'_A1B1 = L'_C1D1 and L'_B1C1 = L'_D1A1, continuing with step S43, otherwise executing step S5;
S43: determining the one-dimensional sub-pixel positions A', B', C' and D' of step A1, step B1, step C1 and step D1 by curve fitting, as follows: take five pixels near step A1 of the sample sequence P' (three before step A1 and two after it), use their index values in the sample sequence P' as x coordinates and the gradient of the gray values as y coordinates, and perform a quadratic curve fit; the fitted curve is approximately a parabola, and the extreme point of the parabola is the location of maximum gray-level change along the gradient direction, i.e., the one-dimensional sub-pixel position A' of step A1; the one-dimensional sub-pixel positions B', C', D' of step B1, step C1 and step D1 are determined in the same way.
S44: calculating the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 from the one-dimensional sub-pixel positions A', B', C' and D' obtained in step S43 and the X-corner pixel coordinate L obtained in step S24;
assuming that the one-dimensional sub-pixel position of a given step is m and that the pixel corresponding to the X-corner center is (x, y), the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 are obtained as follows: the two-dimensional sub-pixel position of step A1 is (x + A' - r + 1, y - r + 0.5), that of step B1 is (x + r + 0.5, y + B' - 3r + 1), that of step C1 is (x - C' + 5r - 1, y + r + 0.5), and that of step D1 is (x - r + 0.5, y - D' + 7r - 1);
S45: calculating the coordinates of the intersection of lines A'C' and B'D', namely the sub-pixel position of the X corner L, according to the method of step S24;
S46: calculating the direction information of the X corner: going counter-clockwise, two boundary lines are obtained according to the black-to-white change order, namely a BW (black-to-white) line and a WB (white-to-black) line, the BW line being the boundary line at which the gray level jumps from black to white and the WB line being the boundary line at which it jumps from white to black; calculating the angles θ1, θ2 between the BW line, the WB line and the horizontal direction, which constitute the direction information of the X corner;
S47: setting the neighborhood of the X-corner pixel coordinate L as an inactive region, indicating that this X corner has been detected and avoiding repeated detection of the same X corner.

Claims (5)

1. An X-corner detection method applied to visual positioning and calibration, characterized by comprising the following steps:
S1: collecting an image and sampling the image with an annular square window; the side length of the annular square window is set to 2r pixels, so one window sample contains 8r-4 pixels, where r is less than half of the side length of the smallest X corner in the image; all pixels of the annular square window are stored in a circular data queue and constitute the sample data, the i-th pixel being denoted P_i with gray value f_i, i = 1, 2, ..., (8r-4);
S2: preliminarily judging whether the sample data contains an X corner according to the image characteristics of the X corner; if so, calculating the sub-pixel position of the X corner, otherwise entering step S5;
step S2, including:
S21: carrying out graying on the sample data in sequence;
S22: binarizing the gray values of the sample data twice, and counting the number of gray-level steps N_s in the sample data processed in step S21; if N_s indicates that the expected steps are present, going to step S23, otherwise going to step S5;
S23: taking the mean of the sample-data gray values as the threshold and binarizing the gray values; denoting the pixels at which the gray values computed in step S22 exhibit a step as step A, step B, step C and step D, and computing the distances L_AB, L_BC, L_CD, L_DA between the index values of these four pixels; if L_AB, L_BC, L_CD and L_DA are all smaller than max_T and all larger than min_T, where max_T ∈ (10,15) and min_T ∈ (5,10), preliminarily judging that the sample data contains an X corner and continuing with step S24, otherwise executing step S5;
S24: calculating the sub-pixel position L of the X corner, namely the intersection of lines AC and BD, according to projective geometry and symmetry, the calculation formula being L = AC × BD;
S3: according to the sub-pixel position of the X corner obtained in step S2, further judging whether the sample data contains an X corner, and eliminating X corners that are judged repeatedly;
step S3, including:
S31: checking the X-corner repeated-detection flag; if the pixel coordinate L of the X corner obtained in step S24 lies in an inactive region, judging that this X corner has already been detected, exiting the loop and executing step S5; otherwise, going to step S32;
S32: acquiring the gray values of the pixels in the neighborhood of the X-corner pixel coordinate L, the neighborhood being the region centered on L with a radius of r pixels; binarizing the neighborhood with the mean of the neighborhood gray values as the threshold, and counting the number of gray-level steps ΔV_C; if ΔV_C > min_V, continuing with step S4, otherwise executing step S5; min_V = 4;
S4: with the X corner as the center of the annular square window, re-acquiring sample data and judging whether the data satisfies the symmetry condition of an X corner; if so, calculating the sub-pixel position of the X corner by curve fitting and setting the X-corner repeated-detection flag;
step S4, specifically including:
S41: with the X-corner pixel coordinate L as the center of the annular square window, re-acquiring the sample sequence P';
S42: binarizing the gray values of the sample sequence P' with their mean as the threshold; denoting the pixels at which the binarized gray values step as step A1, step B1, step C1 and step D1, and computing the distances L'_A1B1, L'_B1C1, L'_C1D1, L'_D1A1 between the index values of these four pixels; if L'_A1B1 = L'_C1D1 and L'_B1C1 = L'_D1A1, continuing with step S43, otherwise executing step S5;
S43: determining the one-dimensional sub-pixel positions A', B', C' and D' of step A1, step B1, step C1 and step D1 by curve fitting;
S44: calculating the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 from the one-dimensional sub-pixel positions A', B', C' and D' obtained in step S43 and the X-corner pixel coordinate L obtained in step S24; namely: assuming that the one-dimensional sub-pixel position of a given step is m and that the pixel corresponding to the X-corner center is (x, y), the two-dimensional sub-pixel positions of step A1, step B1, step C1 and step D1 are obtained as follows: the two-dimensional sub-pixel position of step A1 is (x + A' - r + 1, y - r + 0.5), that of step B1 is (x + r + 0.5, y + B' - 3r + 1), that of step C1 is (x - C' + 5r - 1, y + r + 0.5), and that of step D1 is (x - r + 0.5, y - D' + 7r - 1);
S45: calculating the coordinates of the intersection of lines A'C' and B'D', namely the sub-pixel position of the X corner L, according to the method of step S24;
S46: calculating the direction information of the X corner: going counter-clockwise, two boundary lines are obtained according to the black-to-white change order, namely a BW line and a WB line, the BW line being the boundary line at which the gray level jumps from black to white and the WB line being the boundary line at which it jumps from white to black; calculating the angles θ1, θ2 between the BW line, the WB line and the horizontal direction, which constitute the direction information of the X corner;
S47: setting the neighborhood of the X-corner pixel coordinate L as an inactive region to indicate that the X corner has been detected;
S5: moving the annular square window across the image by n pixels at a time, n ∈ (1, 2r), to acquire new sample data, and repeating steps S2 to S4 until all X corners are detected.
2. The method as claimed in claim 1, wherein in step S22, the binarization threshold is mean ± Δ, mean is the mean of the gray values of the sample data, Δ is the threshold adjustment value, and Δ has a value range of 20-160 pixels.
3. The X-corner detection method applied to visual positioning and calibration according to claim 1, wherein step S24 includes: taking the pixel coordinates of step A, step B, step C and step D as the coordinate values of points A, B, C, D and obtaining the three-dimensional homogeneous coordinates of step A, step B, step C and step D; cross-multiplying the homogeneous coordinates of points A and C to obtain the vector form of the homogeneous equation of line AC; cross-multiplying the homogeneous coordinates of points B and D to obtain the vector form of the homogeneous equation of line BD; cross-multiplying the vector representing the homogeneous equation of line AC with the vector representing the homogeneous equation of line BD to obtain the homogeneous coordinate L1 of the intersection of lines AC and BD; and letting L1 = (x1, x2, x3), the point (x1/x3, x2/x3) is the two-dimensional coordinate of the intersection, which after rearrangement gives the pixel coordinate L of the X corner, i.e., the sub-pixel position L of the X corner.
4. The method of claim 3, wherein determining the one-dimensional sub-pixel positions A', B', C', D' of step A1, step B1, step C1 and step D1 by curve fitting in step S43 comprises: taking five pixels near step A1 of the sample sequence P', using their index values in the sample sequence P' as x coordinates and the gradient of the gray values as y coordinates, and performing a quadratic curve fit; the fitted curve is approximately a parabola, and the extreme point of the parabola is the location of maximum gray-level change along the gradient direction, i.e., the one-dimensional sub-pixel position A' of step A1; the one-dimensional sub-pixel positions B', C', D' of step B1, step C1 and step D1 are determined in the same way.
5. The X-corner detection method applied to visual positioning and calibration according to any one of claims 1-4, wherein n = r.
CN201810077053.1A 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration Active CN108428250B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810077053.1A CN108428250B (en) 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810077053.1A CN108428250B (en) 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration

Publications (2)

Publication Number Publication Date
CN108428250A CN108428250A (en) 2018-08-21
CN108428250B true CN108428250B (en) 2021-09-21

Family

ID=63156290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810077053.1A Active CN108428250B (en) 2018-01-26 2018-01-26 X-corner detection method applied to visual positioning and calibration

Country Status (1)

Country Link
CN (1) CN108428250B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111047614B (en) * 2019-10-10 2023-09-29 南昌市微轲联信息技术有限公司 Feature extraction-based method for extracting target corner of complex scene image
CN111428720B (en) * 2020-04-14 2023-09-26 北京神工科技有限公司 Sub-pixel level visual feature point positioning method and device based on step response matching

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1896682A (en) * 2005-07-12 2007-01-17 北京航空航天大学 X-shaped angular-point sub-pixel extraction
CN102095370A (en) * 2010-11-22 2011-06-15 北京航空航天大学 Detection identification method for three-X combined mark
CN103093451A (en) * 2011-11-03 2013-05-08 北京理工大学 Checkerboard intersection recognition algorithm
CN103345755A (en) * 2013-07-11 2013-10-09 北京理工大学 Chessboard angular point sub-pixel extraction method based on Harris operator
CN103927750A (en) * 2014-04-18 2014-07-16 上海理工大学 Detection method of checkboard grid image angular point sub pixel
CN104036516A (en) * 2014-06-30 2014-09-10 山东科技大学 Camera calibration checkerboard image corner detection method based on symmetry analysis
CN104331900A (en) * 2014-11-25 2015-02-04 湖南科技大学 Corner sub-pixel positioning method in CCD (charge coupled device) camera calibration
CN105787912A (en) * 2014-12-18 2016-07-20 南京大目信息科技有限公司 Classification-based step type edge sub pixel localization method
CN105740818A (en) * 2016-01-29 2016-07-06 山东大学 Artificial mark detection method applied to augmented reality
CN106846412A (en) * 2017-01-23 2017-06-13 上海兴芯微电子科技有限公司 A kind of checkerboard angle point detection process and device

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"A new X-Corner Detection for Camera Calibration Using Saddle Points";Abdulrahman S. Alturki 等;《ResearchGate》;20160430;第1-6页 *
"An Automated X-corner Detection Algorithm(AXDA)";Fuqing Zhao 等;《JOURNAL OF SOFTWARE》;20110531;第6卷(第5期);第791-797页 *
"An X-corner Detection Algorithm Based on Checkerboard Features";Wang Yan 等;《International Conference on Logistics Engineering, Management and Computer Science (LEMCS 2014)》;20141231;第190-193页 *
"具有方向特性的X角点的亚像素检测定位";孟偲 等;《北京航空航天大学学报》;20150430;第41卷(第4期);第580-588页 *
"基于人工标记的手术导航仪";马帅依凡 等;《山东大学学报》;20170630;第47卷(第3期);第63-68页 *

Also Published As

Publication number Publication date
CN108428250A (en) 2018-08-21

Similar Documents

Publication Publication Date Title
CN108921176B (en) Pointer instrument positioning and identifying method based on machine vision
CN107014294B (en) Contact net geometric parameter detection method and system based on infrared image
CN107369159B (en) Threshold segmentation method based on multi-factor two-dimensional gray level histogram
CN107203973B (en) Sub-pixel positioning method for center line laser of three-dimensional laser scanning system
CN108596878B (en) Image definition evaluation method
CN106204524B (en) A kind of method and device for evaluating picture quality
CN107784669A (en) A kind of method that hot spot extraction and its barycenter determine
CN115170669B (en) Identification and positioning method and system based on edge feature point set registration and storage medium
KR20130030220A (en) Fast obstacle detection
CN109813727A (en) A kind of pcb board weld defects detection method based on depth information
KR101589167B1 (en) System and Method for Correcting Perspective Distortion Image Using Depth Information
CN107358628B (en) Linear array image processing method based on target
CN111080661A (en) Image-based line detection method and device and electronic equipment
CN111462066A (en) Thread parameter detection method based on machine vision
CN110111387A (en) A kind of pointer gauge positioning and reading algorithm based on dial plate feature
CN108428250B (en) X-corner detection method applied to visual positioning and calibration
JP5812705B2 (en) Crack detection method
CN104574312A (en) Method and device of calculating center of circle for target image
CN115096206A (en) Part size high-precision measurement method based on machine vision
CN111354047A (en) Camera module positioning method and system based on computer vision
CN115345821A (en) Steel coil binding belt loosening abnormity detection and quantification method based on active visual imaging
CN116563298B (en) Cross line center sub-pixel detection method based on Gaussian fitting
KR101733028B1 (en) Method For Estimating Edge Displacement Againt Brightness
CN114998571B (en) Image processing and color detection method based on fixed-size markers
CN113554688B (en) O-shaped sealing ring size measurement method based on monocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant