CN107101584A - Object displacement measuring method, apparatus and system based on image recognition - Google Patents
Object displacement measuring method, apparatus and system based on image recognition
- Publication number
- CN107101584A CN107101584A CN201710285370.8A CN201710285370A CN107101584A CN 107101584 A CN107101584 A CN 107101584A CN 201710285370 A CN201710285370 A CN 201710285370A CN 107101584 A CN107101584 A CN 107101584A
- Authority
- CN
- China
- Prior art keywords
- detection image
- image
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/022—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The present invention relates to an object displacement measuring method based on image recognition. The measuring method comprises the following steps: based on a preset target image, acquiring first detection image data of an object at a first position and second detection image data of the object at a second position; performing edge detection on the first detection image data and the second detection image data to obtain a first edge map corresponding to the first detection image and a second edge map corresponding to the second detection image; locating feature points according to the first edge map, the second edge map and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image; obtaining the best projection relation matrix between the first detection image and the second detection image according to the two feature point sets; and obtaining the relative distance between the first position and the second position of the object according to a preset focal length, a preset object distance and the best projection relation matrix.
Description
Technical field
The present invention relates to the field of measuring technology, and more particularly to an object displacement measuring method, measuring apparatus and measuring system based on image recognition.
Background technology
With the development of technology, the relative position of an object needs to be measured when the object undergoes a micro-displacement. One existing relative position measuring device has a large measurement error and cannot meet the precision requirement; another existing relative position measuring device must contact the object in order to measure its micro-displacement, but contact measurement easily cuts or scratches the surface of the object and likewise cannot meet the measurement requirement.
Summary of the invention
The present invention seeks to solve the technical problem in the prior art that the relative position of an object is difficult to measure, and provides an object displacement measuring method, measuring apparatus and measuring system based on image recognition that meet the measurement accuracy requirement without damaging the surface of the object.
The present invention provides an object displacement measuring method based on image recognition, the measuring method comprising the following steps:
based on a preset target image, acquiring first detection image data of an object at a first position and second detection image data of the object at a second position;
performing edge detection on the first detection image data and the second detection image data to obtain a first edge map corresponding to the first detection image and a second edge map corresponding to the second detection image;
locating feature points according to the first edge map, the second edge map and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
obtaining the best projection relation matrix between the first detection image and the second detection image according to the two feature point sets;
obtaining the relative distance between the first position and the second position of the object according to a preset focal length, a preset object distance and the best projection relation matrix.
The present invention provides an object displacement measuring apparatus based on image recognition, the measuring apparatus comprising:
an acquisition module, configured to acquire, based on a preset target image, first detection image data of an object at a first position and second detection image data of the object at a second position;
an edge detection module, configured to perform edge detection on the first detection image data and the second detection image data to obtain a first edge map corresponding to the first detection image and a second edge map corresponding to the second detection image;
a locating module, configured to obtain the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image according to the two edge maps and a sub-pixel centroid method;
a first processing module, configured to obtain the best projection relation matrix between the first detection image and the second detection image according to the two feature point sets;
a second processing module, configured to obtain the relative distance between the first position and the second position of the object according to a preset camera focal length, a preset camera object distance and the best projection relation matrix.
The present invention provides an object displacement measuring system based on image recognition, comprising a camera device, a light source, a beam splitter and the above object displacement measuring apparatus based on image recognition. The beam splitter is arranged along the optical axis of the light source; the preset target image is located between the light source and the beam splitter so that the target image is projected onto the measured object; the camera device is connected to the object displacement measuring apparatus, and the optical axis of the camera device is perpendicular to the object plane of the measured object.
Compared with the prior art, the technical scheme has the following beneficial effects: the preset target image is projected onto the measured object and photographed twice by the camera, and the relative position of the measured object is then converted, through the imaging relation between the two images, into a displacement relation between the images; edge detection, centroid extraction and sub-pixel registration of the two images allow the spatial relation matrix between them to be computed precisely, from which the displacement of the measured object is finally obtained. Because image recognition requires no contact measurement and achieves high measurement accuracy, the measuring scheme is simple and effective and the displacement information can be obtained quickly.
Brief description of the drawings
Fig. 1 is a flow chart of an embodiment of the object displacement measuring method based on image recognition of the present invention;
Fig. 2 is a schematic diagram of an embodiment of the preset target image of the present invention;
Fig. 3 is a schematic diagram of an embodiment of a photographed image of the present invention;
Fig. 4 is a schematic diagram of edge detection and centroid location performed on a photographed image according to the present invention;
Fig. 5 is a structural diagram of an embodiment of the object displacement measuring apparatus based on image recognition of the present invention;
Fig. 6 is a structural diagram of an embodiment of the object displacement measuring system based on image recognition of the present invention.
In the figures: 1, camera device; 2, light source; 3, beam splitter; 4, preset target image; 50, acquisition module; 51, edge detection module; 52, locating module; 53, first processing module; 54, second processing module.
Detailed description of the embodiments
The embodiments of the present invention are described further below with reference to the accompanying drawings.
The present invention provides an embodiment of an object displacement measuring method based on image recognition. As shown in Fig. 1, the measuring method comprises:
Step S11: based on a preset target image, acquiring first detection image data of an object at a first position and second detection image data of the object at a second position;
Step S12: performing edge detection on the first detection image data and the second detection image data to obtain a first edge map corresponding to the first detection image and a second edge map corresponding to the second detection image;
Step S13: locating feature points according to the first edge map, the second edge map and a sub-pixel centroid method to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
Step S14: obtaining the best projection relation matrix between the first detection image and the second detection image according to the two feature point sets;
Step S15: obtaining the relative distance between the first position and the second position of the object according to a preset focal length, a preset object distance and the best projection relation matrix.
In a specific implementation, as shown in Fig. 2, the preset target image 4 is projected onto the measured object Object, and the camera device 1, with its optical axis perpendicular to the object plane of the measured object Object, takes a photograph to obtain the data of the first detection image Img1. After the measured object Object undergoes a minute movement, the preset target image 4 is still projected onto it, and the camera device 1, again with its optical axis perpendicular to the object plane, takes another photograph to obtain the data of the second detection image Img2. Fig. 3 shows a schematic diagram of a photographed image.
Specifically, the edge detection is Canny edge detection; that is, in step S12, Canny edge detection is performed on the data of the first detection image Img1 and the second detection image Img2 to obtain the first edge map Img1_b corresponding to the first detection image and the second edge map Img2_b corresponding to the second detection image. The concrete steps of the Canny edge detection algorithm are as follows: convert the colour image to a grey-scale image; apply Gaussian blur to the grey-scale image; compute the image gradient, and from the gradient compute the edge amplitude and angle; apply non-maximum suppression (edge thinning); apply double-threshold edge linking; output the binary edge image. In a specific implementation, the centroids of the target image can then be obtained by centroid location, as shown in Fig. 4. Since each centroid is a feature point of the corresponding image, centroid location applied to the first edge map Img1_b yields the n centroids of the first detection image, i.e. the feature point set Pt1 {pt11, pt12, ..., pt1n} corresponding to the first detection image, and centroid location applied to the second edge map Img2_b yields the n centroids of the second detection image, i.e. the feature point set Pt2 {pt21, pt22, ..., pt2n} corresponding to the second detection image.
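As a rough illustration of the edge detection stage, the following pure-NumPy sketch implements a simplified version of the Canny pipeline listed above (Gaussian blur, gradient magnitude, double-threshold edge linking); the non-maximum-suppression thinning step is omitted for brevity, and in practice a library implementation such as OpenCV's cv2.Canny would be used. All function names and threshold values here are illustrative, not from the patent.

```python
import numpy as np

def gaussian_blur(img, sigma=1.0, radius=2):
    # separable Gaussian blur (the smoothing step of the Canny pipeline)
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img.astype(float))
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

def dilate8(mask):
    # 8-neighbourhood dilation, used here for the double-threshold edge linking
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out = out | np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

def canny_like(img, lo=0.05, hi=0.15):
    # gradient magnitude plus double-threshold linking (NMS thinning omitted)
    gy, gx = np.gradient(gaussian_blur(img))
    mag = np.hypot(gx, gy)
    strong = mag >= hi
    weak = (mag >= lo) & ~strong
    # a weak pixel survives only if a strong pixel lies in its 8-neighbourhood
    return strong | (weak & dilate8(strong))
```

Applied to an image of projected target spots, the resulting binary map plays the role of the edge maps Img1_b and Img2_b in the text.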
In a specific implementation, the row coordinate of each feature point, i.e. of each centroid, in the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image is represented by the following formula:
The column coordinate of each feature point, i.e. of each centroid, in the feature point sets Pt1 and Pt2 is represented by the following formula:
where T is the set threshold, g(ui, vj) is the pixel grey value of image g at row ui and column vj, i denotes the image row and ranges from i1 to i2, j denotes the image column and ranges from j1 to j2, and image g is the image to be detected.
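The formulas themselves are not legible in this text, but a grey-weighted centroid consistent with these symbol definitions can be sketched as follows; this is a reconstruction of the standard grey-weighted centroid over a window, counting only pixels above the threshold T, and the function and variable names are illustrative:

```python
import numpy as np

def weighted_centroid(g, T, i1, i2, j1, j2):
    # grey-weighted centroid over the window [i1, i2) x [j1, j2),
    # counting only pixels whose grey value g(u_i, v_j) exceeds the threshold T
    win = g[i1:i2, j1:j2].astype(float)
    win = np.where(win > T, win, 0.0)
    ii, jj = np.mgrid[i1:i2, j1:j2]
    total = win.sum()
    u = (ii * win).sum() / total   # row coordinate of the centroid
    v = (jj * win).sum() / total   # column coordinate of the centroid
    return u, v
```

Because the weights are grey values rather than a binary mask, the centroid lands between pixel centres, which is what gives the method its sub-pixel character.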
Because the viewing angle and illumination conditions of the feature points to be registered are not identical, translation, rotation and scaling may exist between the first detection image and the second detection image. To improve the measurement precision, a sub-pixel extended correlation registration algorithm is used to solve for the spatial relation matrix between the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image.
In a specific implementation, step S14 specifically comprises:
obtaining an initial projection relation matrix from the row and column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image, using the phase correlation method;
and, starting from the initial projection relation matrix, solving with the sub-pixel extended correlation registration algorithm to obtain the best projection relation matrix between the first detection image and the second detection image.
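The phase correlation step that yields the initial estimate can be sketched in pure NumPy as follows: the translation between the two detection images appears as the peak of the inverse transform of the normalised cross-power spectrum. Names are illustrative, and a practical implementation such as OpenCV's cv2.phaseCorrelate would additionally interpolate the peak to sub-pixel precision:

```python
import numpy as np

def phase_correlation_shift(img1, img2):
    # normalised cross-power spectrum between the two detection images
    F1 = np.fft.fft2(img1)
    F2 = np.fft.fft2(img2)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12
    corr = np.real(np.fft.ifft2(cross))
    # the correlation peak gives the integer-pixel translation (dy, dx)
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # map wrap-around peaks to negative shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The recovered (dy, dx) seeds the initial projection relation matrix, which the sub-pixel registration then refines.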
Specifically, when the measured object Object undergoes only a micro-displacement between the two photographs, corresponding feature points of the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image satisfy, under the phase correlation method, the following correlation relation:
where (x1, y1) and (x2, y2) are corresponding points in the two images; p is the initial projection relation matrix; dx is the horizontal distance between the first detection image and the second detection image; dy is the vertical distance between the first detection image and the second detection image; and θ is the rotation angle between the first detection image and the second detection image.
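The correlation relation itself is not legible in this text; the following is a reconstruction of the usual homogeneous rigid-motion form, consistent with the symbol definitions above (translation dx, dy and rotation θ), offered as a plausible reading rather than the patent's exact matrix:

```latex
\begin{pmatrix} x_2 \\ y_2 \\ 1 \end{pmatrix}
= p \begin{pmatrix} x_1 \\ y_1 \\ 1 \end{pmatrix},
\qquad
p = \begin{pmatrix}
\cos\theta & -\sin\theta & dx \\
\sin\theta & \cos\theta  & dy \\
0          & 0           & 1
\end{pmatrix}
```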
In a specific implementation, starting from the initial projection relation matrix, the sub-pixel extended correlation registration algorithm solves for the best projection relation matrix between the first detection image and the second detection image according to the following formula:
where ||·|| denotes the Euclidean distance; ir = (pt11, pt12, ..., pt1n) is the vector formed by the feature point set corresponding to the first detection image, from which the arithmetic mean of that set is subtracted; iw(p) = (pt21, pt22, ..., pt2n) is the vector formed by the feature point set corresponding to the second detection image warped by p, from which the arithmetic mean of that set is subtracted; and p is the initial projection relation matrix.
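Formula (4) is not legible in this text; given the symbol definitions (mean-subtracted point vectors compared under the Euclidean distance), a plausible reconstruction of the similarity value t is:

```latex
t(p) = \left\lVert \left( i_r - \bar{i}_r \right) - \left( i_w(p) - \bar{i}_w(p) \right) \right\rVert
```

where $\bar{i}_r$ and $\bar{i}_w(p)$ denote the arithmetic means of the respective point vectors; smaller t means the warped second feature point set matches the first more closely.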
Specifically, the arithmetic mean of a vector is the mean of its elements; for example, (a1 + a2 + ... + an)/n is the arithmetic mean of a1, a2, ..., an. The determination of the best projection relation matrix is an iterative process: first the initial projection relation matrix p is solved with the phase correlation method; then the feature point set corresponding to the first detection image, the feature point set corresponding to the second detection image and the initial projection relation matrix p are substituted into formula (4) to obtain a value t measuring the similarity of the two images. If t is less than the given threshold T, the current projection relation matrix p is the best projection matrix and the iteration stops; otherwise the projection relation matrix is updated as pk+1 = pk + Δp, where k is the iteration count, p is the matrix being solved for and Δp is the iteration step, and the next iteration is carried out until the similarity of the two images satisfies the given threshold T.
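The iterative scheme described here can be sketched as follows. For brevity the sketch refines only the rotation component θ of the projection relation, using a similarity value in the spirit of formula (4) (Euclidean distance between mean-subtracted point sets) and a simple step-and-shrink update pk+1 = pk + Δp; function names, the step schedule and the tolerance are illustrative assumptions, not the patent's exact solver:

```python
import numpy as np

def t_value(pt_ref, pt_warp):
    # formula (4)-style similarity: Euclidean distance between the
    # mean-subtracted reference points and the mean-subtracted warped points
    a = pt_ref - pt_ref.mean(axis=0)
    b = pt_warp - pt_warp.mean(axis=0)
    return np.linalg.norm(a - b)

def refine_theta(pt1, pt2, theta0=0.0, step=0.01, tol=1e-6, max_iter=10000):
    # iteratively update the rotation parameter until t falls below tol
    def warp(th):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        return pt2 @ R.T
    theta = theta0
    for _ in range(max_iter):
        t = t_value(pt1, warp(theta))
        if t < tol:
            break
        # try a step Delta-p in each direction and keep the one that lowers t
        if t_value(pt1, warp(theta + step)) < t:
            theta += step
        elif t_value(pt1, warp(theta - step)) < t:
            theta -= step
        else:
            step *= 0.5  # shrink the iteration step when neither direction helps
    return theta
```

A full implementation would refine dx, dy and θ jointly; the stopping test against the threshold T and the pk+1 = pk + Δp update follow the text directly.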
In a specific implementation, with the focal length f and the object distance H known, the relative distance between the first position and the second position of the object is obtained from the best projection relation matrix of the first detection image and the second detection image: horizontal distance Dx = dx/f*H, vertical distance Dy = dy/f*H, and the rotation angle is θ.
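This conversion from an image-plane shift to an object displacement is a one-liner once the pixel shift is expressed in sensor units. A minimal sketch, assuming the pixel shift is first multiplied by the pixel size (the helper name is illustrative):

```python
def pixel_shift_to_displacement(dx_px, dy_px, pixel_size, focal_length, object_distance):
    # convert an image-plane shift in pixels to object displacement:
    # Dx = dx / f * H, with dx expressed in metres on the sensor
    dx = dx_px * pixel_size
    dy = dy_px * pixel_size
    Dx = dx / focal_length * object_distance
    Dy = dy / focal_length * object_distance
    return Dx, Dy
```

With the worked numbers from the embodiment below (10.25 px and 12.35 px shifts, 7 μm pixels, f = 70 mm, H = 2 m) this gives 2.05 mm and 2.47 mm.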
In the object displacement measuring method based on image recognition of the present invention, the preset target image is projected onto the measured object and photographed twice by the camera; through the imaging relation between the two images, the relative position of the measured object is converted into a displacement relation between the images; edge detection, centroid extraction and sub-pixel registration of the two images allow the spatial relation matrix between them to be computed precisely, from which the displacement of the measured object is finally obtained. Because image recognition requires no contact measurement and achieves high measurement accuracy, the measuring scheme is simple and effective and the displacement information can be obtained quickly.
In a specific implementation, the present invention also provides an embodiment of an object displacement measuring apparatus based on image recognition. As shown in Fig. 5, the measuring apparatus comprises:
an acquisition module 50, configured to acquire, based on a preset target image, first detection image data of an object at a first position and second detection image data of the object at a second position;
an edge detection module 51, configured to perform edge detection on the first detection image data and the second detection image data to obtain a first edge map corresponding to the first detection image and a second edge map corresponding to the second detection image;
a locating module 52, configured to obtain the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image according to the two edge maps and a sub-pixel centroid method;
a first processing module 53, configured to obtain the best projection relation matrix between the first detection image and the second detection image according to the two feature point sets;
a second processing module 54, configured to obtain the relative distance between the first position and the second position of the object according to a preset camera focal length, a preset camera object distance and the best projection relation matrix.
In a specific implementation, as shown in Fig. 2, the preset target image 4 is projected onto the measured object Object, and the camera device 1, with its optical axis perpendicular to the object plane of the measured object Object, takes a photograph to obtain the data of the first detection image Img1. After the measured object Object undergoes a minute movement, the preset target image 4 is still projected onto it, and the camera device 1, again with its optical axis perpendicular to the object plane, takes another photograph to obtain the data of the second detection image Img2. Fig. 3 shows a schematic diagram of a photographed image.
Specifically, the edge detection is Canny edge detection; that is, the edge detection module 51 is further configured to perform Canny edge detection on the data of the first detection image Img1 and the second detection image Img2 to obtain the first edge map Img1_b corresponding to the first detection image and the second edge map Img2_b corresponding to the second detection image. The concrete steps of the Canny edge detection algorithm are as follows: convert the colour image to a grey-scale image; apply Gaussian blur to the grey-scale image; compute the image gradient, and from the gradient compute the edge amplitude and angle; apply non-maximum suppression (edge thinning); apply double-threshold edge linking; output the binary edge image.
In a specific implementation, the centroids of the target image can be obtained by centroid location, as shown in Fig. 4. Since each centroid is a feature point of the corresponding image, centroid location applied to the first edge map Img1_b yields the n centroids of the first detection image, i.e. the feature point set Pt1 {pt11, pt12, ..., pt1n} corresponding to the first detection image, and centroid location applied to the second edge map Img2_b yields the n centroids of the second detection image, i.e. the feature point set Pt2 {pt21, pt22, ..., pt2n} corresponding to the second detection image.
In a specific implementation, the row coordinate of each feature point, i.e. of each centroid, in the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image is represented by the following formula:
The column coordinate of each feature point, i.e. of each centroid, in the feature point sets Pt1 and Pt2 is represented by the following formula:
where T is the set threshold, g(ui, vj) is the pixel grey value of image g at row ui and column vj, i denotes the image row and ranges from i1 to i2, j denotes the image column and ranges from j1 to j2, and image g is the image to be detected.
Because the viewing angle and illumination conditions of the feature points to be registered are not identical, translation, rotation and scaling may exist between the first detection image and the second detection image. To improve the measurement precision, a sub-pixel extended correlation registration algorithm is used to solve for the spatial relation matrix between the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image.
In a specific implementation, the first processing module 53 is further configured to:
obtain an initial projection relation matrix from the row and column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image, using the phase correlation method;
and, starting from the initial projection relation matrix, solve with the sub-pixel extended correlation registration algorithm to obtain the best projection relation matrix between the first detection image and the second detection image.
Specifically, when the measured object Object undergoes only a micro-displacement between the two photographs, corresponding feature points of the feature point set Pt1 corresponding to the first detection image and the feature point set Pt2 corresponding to the second detection image satisfy the following relation:
where (x1, y1) and (x2, y2) are corresponding points in the two images; p is the initial projection relation matrix; dx is the horizontal distance between the first detection image and the second detection image; dy is the vertical distance between the first detection image and the second detection image; and θ is the rotation angle between the first detection image and the second detection image.
In a specific implementation, the initial projection relation matrix solved by the phase correlation method is refined with the sub-pixel extended correlation registration algorithm to obtain the best projection relation matrix between the first detection image and the second detection image, according to the following formula:
where ||·|| denotes the Euclidean distance; ir = (pt11, pt12, ..., pt1n) is the vector formed by the feature point set corresponding to the first detection image, from which the arithmetic mean of that set is subtracted; iw(p) = (pt21, pt22, ..., pt2n) is the vector formed by the feature point set corresponding to the second detection image warped by p, from which the arithmetic mean of that set is subtracted; and p is the initial projection relation matrix.
Specifically, the arithmetic mean of a vector is the mean of its elements; for example, (a1 + a2 + ... + an)/n is the arithmetic mean of a1, a2, ..., an. The determination of the best projection relation matrix is an iterative process: first the initial projection relation matrix p is solved with the phase correlation method; then the feature point set corresponding to the first detection image, the feature point set corresponding to the second detection image and the initial projection relation matrix p are substituted into formula (4) to obtain a value t measuring the similarity of the two images. If t is less than the given threshold T, the current projection relation matrix p is the best projection matrix and the iteration stops; otherwise the projection relation matrix is updated as pk+1 = pk + Δp, where k is the iteration count, p is the matrix being solved for and Δp is the iteration step, and the next iteration is carried out until the similarity of the two images satisfies the given threshold T.
In a specific implementation, with the focal length f and the object distance H known, the relative distance between the first position and the second position of the object is obtained from the best projection relation matrix of the first detection image and the second detection image: horizontal distance Dx = dx/f*H, vertical distance Dy = dy/f*H, and the rotation angle is θ.
In the object displacement measuring apparatus based on image recognition of the present invention, the preset target image is projected onto the measured object and photographed twice by the camera; through the imaging relation between the two images, the relative position of the measured object is converted into a displacement relation between the images; edge detection, centroid extraction and sub-pixel registration of the two images allow the spatial relation matrix between them to be computed precisely, from which the displacement of the measured object is finally obtained. Because image recognition requires no contact measurement and achieves high measurement accuracy, the measuring scheme is simple and effective and the displacement information can be obtained quickly.
The present invention also provides an embodiment of an object displacement measuring system based on image recognition. As shown in Fig. 6, the system comprises a camera device 1, a light source 2, a beam splitter 3 and the above object displacement measuring apparatus based on image recognition. The beam splitter 3 is arranged along the optical axis of the light source 2; the preset target image 4 is located between the light source and the beam splitter so that the target image is projected onto the measured object Object; the camera device 1 is connected to the object displacement measuring apparatus, and the optical axis of the camera device 1 is perpendicular to the object plane of the measured object Object.
In a specific implementation, taking a micro-displacement of the measured object Object as an example, a self-designed camera with a pixel size of 7 μm × 7 μm and an effective resolution of 4098 pixel × 4098 pixel is used; the camera device 1 is specifically a small plane-array camera with optical focal length f = 70 mm, mounted on a translation stage at a distance H = 2 m from the measured object Object. The light source 2 is a bromine-tungsten lamp, which illuminates the target through an integrating sphere; the designed target image is a circular light-transmitting plate. When the measured object Object has not moved, the first detection image Img1 is photographed; after the measured object Object undergoes a minute movement, the second detection image Img2 is photographed. Image edge detection, feature centroid extraction and sub-pixel registration yield the translation between the two images: dx = 10.25 pixel, dy = 12.35 pixel, rotation angle θ = 1.3 degrees. Finally, according to the focal length and object distance of the camera, the displacement of the measured object is:
Dx = 10.25 × 7 × 10⁻⁶ × 2 / (70 × 10⁻³) = 2.05 mm, Dy = 12.35 × 7 × 10⁻⁶ × 2 / (70 × 10⁻³) = 2.47 mm, and the rotation angle is 1.3 degrees.
In the object displacement measuring system based on image recognition of the present invention, the preset target image is projected onto the measured object and photographed twice by the camera; through the imaging relation between the two images, the relative position of the measured object is converted into a displacement relation between the images; edge detection, centroid extraction and sub-pixel registration of the two images allow the spatial relation matrix between them to be computed precisely, from which the displacement of the measured object is finally obtained. Because image recognition requires no contact measurement and achieves high measurement accuracy, the measuring scheme is simple and effective and the displacement information can be obtained quickly.
The above embodiments and description merely illustrate the principles and preferred embodiments of the present invention. Various changes and improvements may be made without departing from the spirit and scope of the invention, and such changes and improvements all fall within the claimed scope of the invention.
Claims (9)
1. An object displacement measuring method based on image recognition, characterized in that the measuring method comprises the following steps:
based on a default target image, acquiring first detection image data of an object at a first position and second detection image data of the object at a second position;
performing edge detection processing on the first detection image data and the second detection image data, to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
performing positioning according to the first edge map data corresponding to the first detection image, the second edge map data corresponding to the second detection image and a sub-pixel centroid method, to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
according to the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image, obtaining a best projection relationship matrix between the first detection image and the second detection image;
according to a default focal length and a default object distance and the best projection relationship matrix between the first detection image and the second detection image, obtaining the relative distance between the object at the first position and the object at the second position.
2. The measuring method as claimed in claim 1, characterized in that the edge detection processing specifically includes Canny edge detection.
3. The measuring method as claimed in claim 1, characterized in that the row coordinate of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image is:
u_0 = \frac{\sum_{i=i1}^{i2} \sum_{j=j1}^{j2} u_i \, [g(u_i, v_j) - T]^2}{\sum_{i=i1}^{i2} \sum_{j=j1}^{j2} [g(u_i, v_j) - T]^2}
and the column coordinate of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image is:
v_0 = \frac{\sum_{i=i1}^{i2} \sum_{j=j1}^{j2} v_j \, [g(u_i, v_j) - T]^2}{\sum_{i=i1}^{i2} \sum_{j=j1}^{j2} [g(u_i, v_j) - T]^2}
wherein T is a set threshold, g(u_i, v_j) is the pixel gray value of image g at row u_i and column v_j, i denotes the image row index with range i1 to i2, j denotes the image column index with range j1 to j2, and image g is the image to be detected.
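For illustration only (not part of the claims), the u_0 / v_0 weighted-centroid formulas above can be sketched in NumPy; the function name and window arguments are assumptions:

```python
import numpy as np

def weighted_centroid(g, T, rows, cols):
    """Sub-pixel centroid of image g over the window rows x cols.

    Each pixel is weighted by the squared excess of its gray value over
    the threshold T, matching the u0 / v0 expressions of claim 3.
    """
    rows = np.asarray(rows)
    cols = np.asarray(cols)
    w = (g[np.ix_(rows, cols)].astype(float) - T) ** 2
    u0 = (rows[:, None] * w).sum() / w.sum()  # row coordinate
    v0 = (cols[None, :] * w).sum() / w.sum()  # column coordinate
    return u0, v0
```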
4. The measuring method as claimed in claim 1, characterized in that the step of obtaining the best projection relationship matrix between the first detection image and the second detection image according to the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image specifically includes:
obtaining an initial projection relationship matrix from the row and column coordinates of each feature point in the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image, using a phase correlation method;
according to the initial projection relationship matrix, solving the initial projection relationship matrix using a sub-pixel extended correlation registration algorithm, to obtain the best projection relationship matrix between the first detection image and the second detection image.
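For illustration only, one common reading of the phase correlation step above is the normalized cross-power spectrum; the sketch below (function name and constants assumed, not the patented algorithm) recovers an integer-pixel translation between two images:

```python
import numpy as np

def phase_correlation_shift(img1, img2):
    """Integer-pixel translation taking img1 to img2, estimated from the
    peak of the inverse FFT of the normalized cross-power spectrum."""
    F1 = np.fft.fft2(img1)
    F2 = np.fft.fft2(img2)
    cps = np.conj(F1) * F2
    cps /= np.abs(cps) + 1e-12          # keep phase, discard magnitude
    corr = np.fft.ifft2(cps).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts past half the image size to negative displacements.
    if dy > img1.shape[0] // 2:
        dy -= img1.shape[0]
    if dx > img1.shape[1] // 2:
        dx -= img1.shape[1]
    return int(dy), int(dx)
```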
5. The measuring method as claimed in claim 4, characterized in that the formula of the initial projection relationship matrix P is as follows:
P = \begin{pmatrix} \cos\theta & -\sin\theta & dx \\ \sin\theta & \cos\theta & dy \\ 0 & 0 & 1 \end{pmatrix}
wherein dx is the horizontal distance between the first detection image and the second detection image, dy is the vertical distance between the first detection image and the second detection image, and θ is the rotation angle between the first detection image and the second detection image.
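For illustration only, the homogeneous rotation-plus-translation matrix of claim 5 and its action on a pixel coordinate can be sketched as follows (names assumed, θ in radians):

```python
import numpy as np

def initial_projection_matrix(theta, dx, dy):
    """Homogeneous 2-D rotation + translation matrix, as in claim 5."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c,  -s,  dx],
                     [s,   c,  dy],
                     [0.0, 0.0, 1.0]])

# Mapping a homogeneous pixel coordinate [u, v, 1] between the two images:
P = initial_projection_matrix(0.0, 10.25, 12.35)
mapped = P @ np.array([100.0, 200.0, 1.0])
```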
6. The measuring method as claimed in claim 4, characterized in that the initial projection relationship matrix is solved using the sub-pixel extended correlation registration algorithm to obtain the best projection relationship matrix of the first detection image and the second detection image through E_CC(p), with the following specific formula:
E_{CC}(p) = \left\| \frac{i_r}{\|\bar{i}_r\|} - \frac{i_w(p)}{\|\bar{i}_w(p)\|} \right\|^2
wherein ||·|| denotes the Euclidean norm, i_r = (pt11, pt12, …, pt1n) is the vector formed by the feature points extracted from the first image, and \bar{i}_r is i_r minus the arithmetic mean of the set; i_w(p) = (pt21, pt22, …, pt2n) is the vector formed by the feature points extracted from the second image, \bar{i}_w(p) is i_w(p) minus the arithmetic mean of the set, and p is the initial projection relationship matrix.
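For illustration only, the E_CC residual above can be sketched in the usual enhanced-correlation-coefficient convention, where each feature vector is mean-centred and scaled to unit norm before the squared distance is taken (function name assumed; a sketch, not the patented solver):

```python
import numpy as np

def ecc_residual(i_r, i_w):
    """E_CC-style residual between two feature vectors.

    Both vectors are mean-centred and normalized, so the residual is
    invariant to gain and offset, following the claim 6 expression.
    """
    a = i_r - i_r.mean()
    b = i_w - i_w.mean()
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.sum((a - b) ** 2))
```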
7. An object displacement measurement apparatus based on image recognition, characterized in that the measurement apparatus includes:
an acquisition module, for acquiring, based on a default target pattern, first detection image data of an object at a first position and second detection image data of the object at a second position;
an edge detection processing module, for performing edge detection processing on the first detection image data and the second detection image data, to obtain first edge map data corresponding to the first detection image and second edge map data corresponding to the second detection image;
a locating module, for performing positioning according to the first edge map data corresponding to the first detection image, the second edge map data corresponding to the second detection image and a sub-pixel centroid method, to obtain a feature point set corresponding to the first detection image and a feature point set corresponding to the second detection image;
a first processing module, for obtaining a best projection relationship matrix between the first detection image and the second detection image according to the feature point set corresponding to the first detection image and the feature point set corresponding to the second detection image;
a second processing module, for obtaining the relative distance between the object at the first position and the object at the second position according to a default camera focal length and camera object distance and the best projection relationship matrix between the first detection image and the second detection image.
8. The measurement apparatus as claimed in claim 7, characterized in that the edge detection processing module is further configured to perform edge detection processing on the first detection image data and the second detection image data using Canny edge detection, to obtain the first edge map data corresponding to the first detection image and the second edge map data corresponding to the second detection image.
9. An object displacement measuring system based on image recognition, characterized by including a camera system, a light source, a beam splitter and the object displacement measurement apparatus based on image recognition as claimed in claim 7 or 8, wherein the beam splitter is arranged along the optical axis of the light source, a default target image is located between the light source and the beam splitter so as to project the target image onto the measured object, the camera system is connected with the object displacement measurement apparatus, and the optical axis of the camera system is perpendicular to the object plane of the measured object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710285370.8A CN107101584B (en) | 2017-04-27 | 2017-04-27 | Object displacement measurement method, device and system based on image recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710285370.8A CN107101584B (en) | 2017-04-27 | 2017-04-27 | Object displacement measurement method, device and system based on image recognition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107101584A true CN107101584A (en) | 2017-08-29 |
CN107101584B CN107101584B (en) | 2020-06-12 |
Family
ID=59656813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710285370.8A Expired - Fee Related CN107101584B (en) | 2017-04-27 | 2017-04-27 | Object displacement measurement method, device and system based on image recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107101584B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107633507A (en) * | 2017-09-02 | 2018-01-26 | 南京理工大学 | LCD defect inspection methods based on contour detecting and characteristic matching |
CN109959335A (en) * | 2017-12-22 | 2019-07-02 | 北京金风科创风电设备有限公司 | Method, device and system for measuring displacement of tower top |
CN110415264A (en) * | 2018-04-25 | 2019-11-05 | 奇景光电股份有限公司 | Motion detector circuit and method |
CN111829439A (en) * | 2020-07-21 | 2020-10-27 | 中山大学 | High-precision translation measuring method and device |
CN114264227A (en) * | 2021-11-26 | 2022-04-01 | 武汉联影生命科学仪器有限公司 | Device and method for measuring size and position of focus and measuring module |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1664494A (en) * | 2005-03-23 | 2005-09-07 | 西安交通大学 | Laser dam safety monitoring method |
CN1818546A (en) * | 2006-03-02 | 2006-08-16 | 浣石 | Small-displacement measuring method in long-distance plane |
CN101408985A (en) * | 2008-09-22 | 2009-04-15 | 北京航空航天大学 | Method and apparatus for extracting circular luminous spot second-pixel center |
CN104482861A (en) * | 2014-12-08 | 2015-04-01 | 苏州市计量测试研究所 | High-precision long-distance moving measurement system and method for measuring displacement, deformation and length by using same |
CN204718553U (en) * | 2015-06-12 | 2015-10-21 | 北京光电技术研究所 | Buildings displacement measurement system |
DE102015110289A1 (en) * | 2015-06-26 | 2016-12-29 | Werth Messtechnik Gmbh | Method for determining measuring points on the surface of a tool piece with an optical sensor |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1664494A (en) * | 2005-03-23 | 2005-09-07 | 西安交通大学 | Laser dam safety monitoring method |
CN1818546A (en) * | 2006-03-02 | 2006-08-16 | 浣石 | Small-displacement measuring method in long-distance plane |
CN101408985A (en) * | 2008-09-22 | 2009-04-15 | 北京航空航天大学 | Method and apparatus for extracting circular luminous spot second-pixel center |
CN104482861A (en) * | 2014-12-08 | 2015-04-01 | 苏州市计量测试研究所 | High-precision long-distance moving measurement system and method for measuring displacement, deformation and length by using same |
CN204718553U (en) * | 2015-06-12 | 2015-10-21 | 北京光电技术研究所 | Buildings displacement measurement system |
DE102015110289A1 (en) * | 2015-06-26 | 2016-12-29 | Werth Messtechnik Gmbh | Method for determining measuring points on the surface of a tool piece with an optical sensor |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107633507A (en) * | 2017-09-02 | 2018-01-26 | 南京理工大学 | LCD defect inspection methods based on contour detecting and characteristic matching |
CN109959335A (en) * | 2017-12-22 | 2019-07-02 | 北京金风科创风电设备有限公司 | Method, device and system for measuring displacement of tower top |
CN110415264A (en) * | 2018-04-25 | 2019-11-05 | 奇景光电股份有限公司 | Motion detector circuit and method |
CN110415264B (en) * | 2018-04-25 | 2023-10-24 | 奇景光电股份有限公司 | Motion detection circuit and method |
CN111829439A (en) * | 2020-07-21 | 2020-10-27 | 中山大学 | High-precision translation measuring method and device |
CN114264227A (en) * | 2021-11-26 | 2022-04-01 | 武汉联影生命科学仪器有限公司 | Device and method for measuring size and position of focus and measuring module |
CN114264227B (en) * | 2021-11-26 | 2023-07-25 | 武汉联影生命科学仪器有限公司 | Device and method for measuring focal spot size and position |
Also Published As
Publication number | Publication date |
---|---|
CN107101584B (en) | 2020-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107101584A (en) | Object displacement measuring method based on image recognition, apparatus and system | |
CN106441138B (en) | The deformation monitoring method of view-based access control model measurement | |
CN107452024B (en) | Visual measurement method for full-field motion tracking of rotating object | |
CN106595528B (en) | A kind of micro- binocular stereo vision measurement method of telecentricity based on digital speckle | |
CN104657711B (en) | A kind of readings of pointer type meters automatic identifying method of robust | |
CN107025670A (en) | A kind of telecentricity camera calibration method | |
CN107543496B (en) | A kind of binocular stereo vision three-dimensional coordinate measurement method | |
CN108535097A (en) | A kind of method of triaxial test sample cylindrical distortion measurement of full field | |
WO2019050417A1 (en) | Stereoscopic system calibration and method | |
CN103512548A (en) | Range measurement apparatus and range measurement method | |
CN106815866B (en) | Calibration method of fisheye camera, calibration system and target thereof | |
CN110223355B (en) | Feature mark point matching method based on dual epipolar constraint | |
CN110033407A (en) | A kind of shield tunnel surface image scaling method, joining method and splicing system | |
CN103258328A (en) | Method for locating distorted center of wide-field lens | |
CN110533686A (en) | Line-scan digital camera line frequency and the whether matched judgment method of speed of moving body and system | |
CN109974618A (en) | The overall calibration method of multisensor vision measurement system | |
CN111080711A (en) | Method for calibrating microscopic imaging system in approximately parallel state based on magnification | |
JP6942566B2 (en) | Information processing equipment, information processing methods and computer programs | |
JP5367244B2 (en) | Target detection apparatus and target detection method | |
JP2008139194A (en) | End position measuring method and size measuring method | |
JP6425406B2 (en) | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM | |
CN104937608B (en) | Road area detection | |
CN115631245A (en) | Correction method, terminal device and storage medium | |
CN115307865A (en) | Model deformation measurement method for high-temperature hypersonic flow field | |
CN111179347B (en) | Positioning method, positioning equipment and storage medium based on regional characteristics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20200612 |